Choosing Impactful Career Paths
00:00:06
Speaker
I'm Ariel Conn with the Future of Life Institute. The world is full of problems, but each of us has only so much time available to make it better. If you wanted to improve the world as much as possible, what would you do with your career? Should you become a doctor, an engineer, or a politician? Should you try to end global poverty, climate change, or international conflict?
Guiding Graduates for Maximum Impact
00:00:26
Speaker
These are the questions that the research group 80,000 Hours tries to answer.
00:00:30
Speaker
They try to figure out how talented graduates in their 20s can set themselves up to help as many people as possible in as big a way as possible. To learn more about their research, I'm happy to have Rob Wiblin and Brenton Mayer of 80,000 Hours joining me today. Before taking on the role of Director of Research at 80,000 Hours, Rob was the Research Director and Executive Director of the Centre for Effective Altruism. He also recently launched a new podcast for 80,000 Hours, which I highly recommend checking out and which we'll link to on the site.
00:01:00
Speaker
Brenton's background is in clinical medicine and he has co-founded two nonprofits, including Effective Altruism Australia. At 80,000 Hours, he's a career coach who gives people personalized advice about how to have those high-impact careers. Rob and Brenton, thank you so much for being here. Thanks so much, Ariel. It's a pleasure to be on the show. Thanks for having us.
Founding of 80,000 Hours and Effective Altruism
00:01:20
Speaker
Yeah. So first, can you give us just a bit more background about what 80,000 Hours is and how it started?
00:01:27
Speaker
So 80,000 Hours has been around for about five or six years now, and it started when two colleagues of mine, Benjamin Todd and Will MacAskill, who were the founders, were finishing their undergraduate degrees at Oxford and Cambridge respectively.
00:01:43
Speaker
And they wanted to figure out how they could do as much good as possible. Both of them had been studying philosophy and had a real interest in ethics, in thinking about what is valuable in the world and how we can contribute to making the world a more moral place. But when they looked into it, they couldn't really find much research to guide them: if you wanted to help a lot of people in a really big way, if you wanted to raise the welfare of humans, and I guess animals as well, what would you actually do?
00:02:12
Speaker
So they started doing some investigation of this, looking into things like, you know, what are the odds of becoming an MP in the UK if you tried to do that? Or if you became a doctor, how many lives would you save? And if you went into different professions, how much good would you do? And what are the most important problems that they could focus their career on? And pretty quickly, they were, you know, learning things with just a couple of months' work that really no one else had written up, because this whole research area had barely been investigated at all.
00:02:38
Speaker
Or where it had been investigated, it was only done very indirectly, and no one was pulling it together into an actual guide to how you can do good with your career.
Research and Resources for Impactful Careers
00:02:46
Speaker
So having realized that they could make a whole lot of progress on these questions really quite quickly, they decided to actually start an organization, 80,000 Hours, which would conduct this research in a more systematic way and then share it with people who wanted to do more good with their career. And 80,000 Hours has also ultimately become part of the effective altruism community. And effective altruism, as many of our listeners will know, is a social movement that's about using reason and evidence and analysis to figure out how you can do as much good as possible.
00:03:16
Speaker
And there's different groups who are taking different angles on this. There's people who are looking at how you can donate your money in ways that will do the greatest good. There's other research groups who are looking at things like what kind of policy changes you could push for in government that would be most valuable. We're taking the angle of: if you're a talented graduate in your 20s and you wanted to help as many people as possible in as big a way as possible with your career, what kind of professions would you go into? What strategies would you adopt? And what kind of problems would you be working to solve?
Understanding the '80,000 Hours' Philosophy
00:03:44
Speaker
And so real quick, 80,000 hours is roughly how much you estimate the average person will spend in a lifetime on their careers, right?
00:03:52
Speaker
That's it. So 80,000 hours is roughly the average number of hours that you would work in a full-time professional career. So I think it's 40 years times 40 hours a week times 50 weeks a year. So on the one hand, that's an awful lot of time that you're potentially going to spend over the next 40 or 50 years of your career. So it pays off to spend quite a while thinking about what you're actually going to do with all of that time when you're in your late teens or early 20s.
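As a quick sanity check of that arithmetic, here is a minimal sketch in Python (the figures are the round numbers quoted above, not precise labour statistics):

```python
# Rough arithmetic behind the name "80,000 Hours",
# using the round figures quoted above.
years_in_career = 40   # roughly 40 working years
hours_per_week = 40    # a standard full-time week
weeks_per_year = 50    # allowing a couple of weeks off each year

total_hours = years_in_career * hours_per_week * weeks_per_year
print(total_hours)  # 80000
```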
00:04:16
Speaker
On the other hand, 80,000 hours is just not that long relative to the scale of the problems that the world faces. So it also suggests that you actually have to be quite careful about where you're going to spend your time because you can't tackle everything. You've only got, you know, one career to spend. So you should be quite judicious about what problems you try to solve and how you go about solving them. Okay. And so how do you actually try to help people have more of an impact with their careers?
Tools and Coaching for Career Impact
00:04:42
Speaker
So I suppose we break this up into a couple of things. The main one is just the website, and people can go to this at 80000hours.org. And the second one is that sometimes we do individualized coaching for people. So on the first, the main product and the main thing that people read on the website is a career guide, which is just a set of 12 articles that raise the considerations that are pretty important for you to have in mind when you're thinking about how to have a high-impact career.
00:05:08
Speaker
So the Career Guide will go over just a bunch of ideas that people probably need in order to have high-impact careers. So we'll talk about how to have careers which are satisfying, how to have a career which is working on one of the world's most important problems, how to set yourself up earlier in your career so that later on you can have a really large impact as well.
00:05:25
Speaker
And then we also look into a bunch of different specific careers. So we write career reviews on things like practicing as a doctor or trying to become a politician. And then we look at problem profiles as well. So say the problem of climate change or say the problem of artificial intelligence.
00:05:41
Speaker
And then the second thing that we do is career coaching, where we try to apply our advice to individuals. So people can apply for this through a link that hopefully we'll be able to put into the show notes of this podcast. And with this, we help people go through applying these considerations to their career specifically, and try to point them in the best direction that we can for how they can have a really impactful career.
00:06:04
Speaker
Okay. And so can you walk us through what the process is for someone who comes to you either through the website or directly to you for help choosing a career path? So if someone came through career coaching, for example, the first thing I'd try to do is figure out where people are coming from. So what causes they're interested in, what their background is or what they could become good at. And then I'll try to think about kind of what's standing between them and having a really high impact career.
00:06:32
Speaker
And sometimes this could be a choice of cause area. Sometimes this could be a specific job opportunity that I'm aware of. And sometimes it's a few small considerations which people probably don't think about enough. An example of something like this is our advice that you want to do a bunch of exploration early on in your career, and that it's better to do exploration of things before you do postgraduate study rather than after. And then hopefully I try to give them some resources and some connections to people working in the relevant fields, and just try to leave them with a few actionable steps.
00:07:04
Speaker
Okay. So yeah, you mentioned the exploration pre-graduate versus post-graduate. How does your advice change if someone comes to you, say, early in their undergraduate career versus at the end, or if they're considering going into graduate school or as a recent graduate?
Adapting Career Advice with Age
00:07:21
Speaker
Right. So it varies quite a lot, whether you're advising someone who's 20 years old versus 60 years old.
00:07:28
Speaker
And some of the big picture things that change is that when you're 20, you don't yet really know what you're good at. It turns out that people are just actually very bad at anticipating what they're going to enjoy and where their strengths are.
00:07:41
Speaker
Also, you have so much time ahead of you in your career that if you manage to gain any skills, then almost all of the benefit of gaining those skills comes far in the future. You have 30 or 40 years ahead of you during which you can benefit from learning to write better or become a better public speaker or do better analysis.
00:07:58
Speaker
So the things we tend to emphasize early on are exploration, so trying to collect information, firsthand information from having a variety of different experiences to figure out what you flourish at and what you enjoy doing, and also building up career capital.
00:08:14
Speaker
And career capital is this broader concept of anything that puts you in a better position to make a difference in the future. And that includes kind of skills. It includes your network and who you know and what kind of jobs you can get. It includes credibility, things like having a degree, the ability to be taken seriously. And it also actually includes just having some money in the bank to potentially finance you changing what you're doing in your career so that you don't just get stuck because you're living paycheck to paycheck.
00:08:41
Speaker
Now if someone comes to us and they're 60, by that stage there's not that much time for them to benefit from gaining new skills or from exploring completely different areas. So by that stage we're focused mostly just on how can they use what they already have to have a really large impact right away.
00:08:58
Speaker
So we'll be thinking, what jobs can they go into right now, given the skills that they have, given the network they have, where they can just immediately go and have a lot of impact. And of course, in the middle of your career, when you're 40, it's somewhere between these two things. You can still potentially go into other areas, you can specialize in different problems. And especially if you've developed transferable skills, like you're a good writer, or you're a good speaker, or you're just generally well informed, then potentially you can apply those transferable, kind of generic skills to different kinds of problems.
00:09:27
Speaker
But at the same time, by that stage, you do want to be starting to think, you know, how can I actually have an impact now? How can I actually help people directly so that you don't actually just run out of time by the time that you're retiring? Okay.
Earning to Give and Comparative Advantage
00:09:39
Speaker
And so I actually did go through and try some of the stuff you have on the website. And one of the things I did was filled out the questions on your career quiz, which I recommend because that I think was the shortest questionnaire on your site possibly. So that was kind of nice.
00:09:56
Speaker
And conveniently, the feedback I got was that I should pursue work at an effective nonprofit. Sounds like you're doing well. Yeah, thanks. I know there are other cases where it makes more sense for you to encourage people to earn to give. And I was wondering if you could talk about the difference between, say, working directly for a nonprofit or earning to give, or if there's other options as well for people.
00:10:20
Speaker
Right. So, earning to give, for those who don't know, is the career approach where you try to make a lot of money and then you give it to organizations that can use it to have a really large impact on the world, a really large positive impact. And it's one of the ideas that 80,000 Hours had relatively early on that was quite uncommon. And it was also somewhat controversial because it sounded like we were saying, you know, maybe the most moral thing that you could do is to go into finance and make a ton of money and then give it away.
00:10:45
Speaker
And some people found this idea really captivating and interesting and other people found it kind of bizarre and counterintuitive. But either way, it got a lot of media attention. And there's a significant number of people who are out there making money and then donating it. And that's the main way that they have impact. But of course, if we have people out there who are making money and donating it, there have to be people who are receiving that money as salaries and then doing really useful things. So we can't just have people making money and we can't just have people doing direct work. You need both people and money.
00:11:13
Speaker
in most cases in order to make a non-profit function. So we think about this in terms of your comparative advantage and also how funding-constrained an area is. So I'll just go through those in turn.
00:11:27
Speaker
So there are some people who are much better placed to make money and give it away than they are to have a direct impact. And for example, I know some people who've chosen to go earning to give who are extremely good at maths. They are very good at solving mathematical puzzles and they have a massive personal passion about working in finance.
00:11:45
Speaker
And for them, they could potentially be making millions of dollars a year doing the thing that they love and donating most of that money to effective nonprofits, supporting 5, 10, 15, possibly even 20 people to do direct work in their place. And on the other hand, there's other people, perhaps you're an example of this, who are much better placed to do directly useful work, spreading really important ideas about how we can guide the future of humanity in a positive direction, than they are to make a whole lot of money.
00:12:15
Speaker
I don't think that I could really make six figures with the skills that I've built. So I'm probably much better placed to be doing directly useful research and promoting important ideas than I am to make money and support other people to do that in my place. So if you're someone who can make seven figures and donate more than a million dollars a year, then probably you should be seriously thinking about earning to give as a way of making a difference.
00:12:36
Speaker
And the other element here is, depending on what problem you want to solve, there's some problems in the world that are flooded with money already, that there's lots of other donors who want to support people to work on those problems, but there's almost no one who they can find to hire. And then there's other problems where there's almost no money, but lots of people who want to work in the area.
00:12:54
Speaker
I think an example of the latter there might be animal welfare, where there's a lot of people who would really like to directly do work trying to improve the welfare of farmed animals and, I guess, pets as well. But there are relatively few wealthy funders who are backing them. So you end up with this problem of having lots of kind of non-professional volunteers working in that area and not really enough money, at least in the past, to support professionals to really take it to the next level.
Focusing on Neglected Areas
00:13:18
Speaker
On the other hand, there's other areas, and I think artificial intelligence is a little bit like this, where there's a lot of really wealthy people who have realized that there's significant risks out there from artificial intelligence, especially superhuman artificial intelligence, which might come in the future. But they're struggling to find people who have the necessary technical abilities to actually solve the problem.
00:13:39
Speaker
So if you're someone who has a background in machine learning and is potentially able to have really valuable insights that no one else could have about how to make artificial intelligence safe, then probably we need your expertise more than we need extra money because there's just not that much that money can buy right now.
00:13:55
Speaker
So I definitely want to come back to all of this, and I'm going to here in just a minute. But before we go too far into some of the different areas that people can go into, Brenton, I know you have a background in medicine. And I know one of the examples that 80,000 Hours gives of people misunderstanding how they can do the most good.
00:14:14
Speaker
is how many people choose to become doctors, though there are other paths that they could take that might actually help people more. And I was hoping you could talk a little bit about the paradox there, since obviously we do need doctors. We don't want to discourage everyone from becoming a doctor. Yeah.
00:14:30
Speaker
So I suppose what's going on here is that we need more of lots of different things. We do need more doctors in the developed world, but we also need lots of people working on lots of problems. And the question is, where do you want the next additional talented and altruistic person to go?
00:14:52
Speaker
So we actually looked into how much good we could expect a doctor in the developed world to do. And the answer is something like producing about five to ten years of quality life per year that they work. And when I say quality life, I mean a measure that takes into account the amount that you extend people's lives as well as the amount that you improve their lives through improving their health in some way. Sorry, is that ten years per person that they see, or just ten years of life in general?
00:15:21
Speaker
10 years per person that they see would be amazing. It is about 10 years over the course of a year of working in a hospital in the developed world, in the rich world.
00:15:32
Speaker
So this is pretty good, and I mean, most professions can't beat this. This is much better than most people will be able to do. But that said, you can probably do better than this. So as a doctor who's giving away 10% of their income, you can probably do several times better than this if you make sure that you give it to the most evidence-backed organizations that are out there, like those evaluated by GiveWell.
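To make that comparison concrete, here is a minimal sketch. The five-to-ten-year figure is the estimate quoted above; the salary, donation rate, and cost-per-year-of-life numbers are purely hypothetical placeholders, not 80,000 Hours' or GiveWell's published figures:

```python
# Comparing a doctor's direct clinical impact with the impact of
# donating part of their income. All figures are illustrative only.

direct_impact = 7.5         # years of quality life per year worked
                            # (midpoint of the 5-10 estimate quoted above)

salary = 150_000            # hypothetical annual salary, USD
donation_rate = 0.10        # giving away 10% of income
cost_per_life_year = 1_000  # hypothetical cost for a highly effective
                            # charity to produce one year of quality life

donation_impact = salary * donation_rate / cost_per_life_year  # 15.0

print(f"Direct clinical impact: ~{direct_impact} life-years/year")
print(f"Impact via donations:   ~{donation_impact} life-years/year")
# Under these assumptions, the donations alone roughly double the
# doctor's impact; the "several times better" claim depends heavily
# on the assumed cost per year of quality life.
```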
00:15:52
Speaker
And then on top of this, you could work in other areas. So if I had the choice between a talented and altruistic person going into medicine or a talented and altruistic person working on some area to influence the long-run future, such as reducing existential risk, then I would prefer to have them working on the latter, you know, making it so that we have more academic papers released on reducing the risk of human extinction than there are on dung beetles every year.
00:16:17
Speaker
I assume, though, that if someone came to you and said, but what I really want to do is help people directly, would you still try to encourage them to go into existential risk research, or would you say, okay, then you're probably a better fit to be a doctor?
00:16:32
Speaker
Yeah, I mean, probably not. So some people, yeah, just feel like they only wanna be a doctor, and this is the only thing that they're gonna do. And I went to uni with a bunch of these people. And for them, it very plausibly is the best thing to do, to become a doctor. I mean, I would encourage them to make the best use out of these qualifications, and they could do this through things like working in the developing world. So say in Rwanda, there are something like six doctors per 100,000 people, and it'd be better to work there than in Australia where I worked, where there are something like 270 doctors per 100,000 people.
00:17:01
Speaker
Or they could use the money that they earn to give, or they could work in public health, where I expect the impact per doctor is significantly better. But yeah, I suppose the general point is that if someone's really passionate about something and this is the only thing that they can see themselves doing in order to be happy, then they should probably just do that thing. Okay.
Research on Global Issues and Long-term Impact
00:17:20
Speaker
You've also been doing research into what the most pressing problems facing the world are.
00:17:25
Speaker
presumably like existential risks. Can you talk a bit about what those issues are and why 80,000 Hours has taken an interest in doing this research yourselves? Sure. So we've done some of this ourselves, but the reality is we've been drawing a lot on work that's been done by other groups in the effective altruism community, including, I guess, GiveWell and the Open Philanthropy Project and the Future of Humanity Institute at Oxford. I can talk a little bit about the journey that our ideas have taken over the last five or six years.
00:17:55
Speaker
Yeah, please. So one of the first things that we realized, as Brenton mentioned, is that it seems like if you're just trying to help people alive today, your money can go an awful lot further if you spend it in the developing world rather than the developed world. Because there's just so much less money being spent per person in those countries trying to solve the problems that they have.
00:18:16
Speaker
And also, the issues that you find in rural Kenya are by and large issues that have been partially or almost completely solved in developed countries. So the issues are neglected, and we also know that they're solvable. We basically just need to scale up solutions to basic health problems and economic issues that have been resolved elsewhere in the world. You already have a pretty good blueprint.
00:18:39
Speaker
And I think that that's basically, that's just correct, that if you focus on the developing world, rather than the developed world, you would increase your impact maybe 10 or 100 fold, unless you're doing something unusually valuable in a rich country.
00:18:50
Speaker
Then moving beyond that, we thought, well, what about looking at other groups in the world that are extremely neglected, other than just people living in rural areas in very poor countries? And if you look at that question, then factory farmed animals really stand out. There's basically billions of animals in factory farms that are killed every year, and they're treated just really overwhelmingly horrifically. There's a small handful of farmed animals in the world that are raised humanely.
00:19:17
Speaker
or at least somewhat humanely, but 99-point-something percent are raised in horrific conditions where they're extremely confined, they can't engage in natural activities, and they have body parts cut off just for the convenience of the farmers. They're treated in ways that, if you treated your pet like that, would send you to jail.
00:19:34
Speaker
And they're very numerous as well. So potentially you could have even larger impact again by working on trying to improve the welfare of animals and human attitudes towards animals. And that issue is also extremely neglected. There's actually very little funding and there's very few charities globally that are focused on improving farm animal welfare.
00:19:55
Speaker
The next big idea that we had was thinking, of all of the people that we could help, or of all of the groups that we could help, what fraction of them are actually alive today? And we think that it's only basically a small fraction, that there's every reason to think that humanity could live for another 10 generations or 100 generations on Earth, and possibly even have humans and our descendants alive on other planets.
00:20:19
Speaker
And so there's a long time period in the future in which humans and animals could have good lives, and there could just be so many of them at any point in time. And that gets you to thinking about what can you do that will have very persistent impacts, impacts that don't just help this generation, but also help our children, our grandchildren, and potentially generations in thousands of years' time.
00:20:42
Speaker
And that brings us to the kind of things that the Future of Life Institute works on. So we worry a lot about existential risks and ways that civilization can go off track and never be able to recover. And I think actually that broad class of things, thinking about the long-term future of humanity, is where just a lot of our attention goes these days.
00:21:03
Speaker
And that's where I think people can have the largest impact with their career: if your main goal is to have a large impact, then I think thinking about long-term impacts is the way to have the largest impact, basically. How does that get measured? If you're trying to think that far out into the future, how do you figure out if your career is actually that impactful, versus, say, a doctor being able to say, well, I saved X number of lives this year?
00:21:26
Speaker
Yeah, so the outcome measure obviously depends on what problem you're working on. So if you're focused on developing world health, then you can talk about lives saved. If you're thinking about farm animals, then you can think about, you know, how many horrible lives did you prevent from existing? And when it comes to the long-term future, that's kind of the hardest thing to measure, because you're not going to be around to see whether your actions actually prevented a disaster. And a disaster was kind of unlikely anyway.
00:21:52
Speaker
If you try to do things to reduce the risk of nuclear war and then next year there isn't a nuclear war, I mean it's only a very weak signal that you succeeded. It's basically no signal at all. So I don't think we have a really snappy answer to this. Basically we try to have people who are experts in these different problems, so people who
00:22:08
Speaker
understand, or think that they have a good understanding of, what kinds of things could precipitate a nuclear war. They've spoken to the actors that are involved, so they know what the Chinese or Russians are thinking and under what circumstances they might start a nuclear war. And then they try to change the conditions on the ground today to make everyone a bit less skittish, or to reduce the number of nuclear weapons that are on hair-trigger alert, and hope that that is going to lower the risk of a global catastrophe.
00:22:38
Speaker
But this is one of the ways in which this problem area is less promising than others. It's quite hard to measure success and it's hard to know whether the things that you're doing are really helping and how much. So that makes it hard to do a really good job. So I want to stay on this topic for a little bit, not the measurements, but these issues, because obviously they're very important to the future of life.
Careers in AI Safety and Alignment
00:23:01
Speaker
You know, I think most of our listeners are aware that we're very concerned about artificial intelligence safety.
00:23:07
Speaker
We're very concerned about nuclear weapons. We also worry about biotechnology and climate change. And I was hoping you could take each of those areas, maybe individually, and consider different ways that people could pursue either careers or earn to give options for these various fields.
00:23:26
Speaker
Sure. Basically, there's just so much to say about each of these. And that's one reason why I've started our podcast where I have conversations between one and three hours with experts in these areas to really pick their brains about all the different ways you could try to make a difference and compare the pros and cons and offer really concrete advice to people. So if this is a topic that's interesting to you, if you're considering pursuing a career to reduce the risk of a global catastrophe and ensure that the future of humanity is good,
00:23:52
Speaker
then I definitely recommend subscribing to the 80,000 Hours podcast. We've got a lot of really long episodes coming out about these topics. So that's my pitch. But I'll try to just give you a taste of the different options within these.
00:24:05
Speaker
So for artificial intelligence safety, we've had a couple of interviews about this one, and we've got quite a lot of articles on the site. Broadly speaking, there's two different classes of careers that you can take here. I suppose actually three. So one would be to specialize in machine learning or some other kind of technical artificial intelligence work, and then use those skills to figure out not so much how we can make artificial intelligence more powerful, but how we can make artificial intelligence aligned with human interests.
00:24:34
Speaker
So I had a conversation with Dr. Dario Amodei at OpenAI, and he's a really top machine learning expert. And he spends his time working on machine learning, basically in a similar way to how other AI experts do. But his angle is: how do we make the AI do what we want, and not things that we don't intend? And I think that's one of the most valuable things that really anyone can do at the moment.
00:25:00
Speaker
So then there's the policy and strategy side. And I had a conversation with Miles Brundage, a researcher at the Future of Humanity Institute at Oxford, about this class of careers. This is basically trying to answer questions like: how do we prevent an arms race between different governments or different organizations where they all want to be the first one to develop a really intelligent artificial intelligence, so they kind of scrimp on safety?
00:25:23
Speaker
They scrimp on making it do exactly what we want so that they can be the first ones to get there. How would we prevent that from happening? Do we want artificial intelligence running military robots or is that a bad idea? I guess this is probably a bad idea. But there's some people who don't agree with that.
00:25:40
Speaker
You know, do we want the government to be more involved in regulating artificial intelligence or less involved? Do we want it to be, you know, regulated in some different way? All of these kinds of policy and strategy questions. And it's helpful to have a good understanding of the machine learning technical side if you're doing that. You can also approach this just if you have a good understanding of politics and of policy and political economy and economics and law and that kind of thing.
00:26:06
Speaker
So that's a second class of careers, where you can potentially work in government or the military or think tanks or non-profits, that kind of thing. And then, as you said, the third class is earning to give. So trying to make as much money as you can and then supporting other people to do these other kinds of work.
00:26:24
Speaker
And I think at the moment, that wouldn't be my top suggestion for people who wanted to work in artificial intelligence, because there's already quite a lot of large donors who are interested in supporting this kind of work. But there are some niches in artificial intelligence safety work that don't have a whole lot of money in them yet. So you could potentially make a difference by earning to give there.
00:26:42
Speaker
Actually, I forgot that there is kind of a fourth class, which is doing supporting roles for everyone else. So doing things like communications and marketing and organization and project management and fundraising operations, all of those kinds of things can actually be quite hard to find skilled, reliable people for. So if you have one of those other kinds of skills, you know, possibly even web design, right, then you can just find an organization that's making a big difference and then, you know, throw your skills behind them.
00:27:10
Speaker
Yeah, that's definitely something that I personally have found a need for: more people who can help communicate, especially visually. Yeah, definitely. Our web designer left a couple of years ago, and then we spent about six months trying to find someone who was suitable to do that well. And we didn't find them, and fortunately the original web designer came back.
Creative Skills for Impactful Causes
00:27:34
Speaker
But it can be surprisingly hard to find people who have really good comms ability or can handle media or can do art and design. So if you have one of those skills, you should seriously consider just applying to whatever organizations you admire, because you can potentially be much better than the alternative candidate that they'll hire, or potentially they won't hire anyone. Shall we move on to nuclear weapons and biotech and climate change?
00:27:58
Speaker
Yes, and especially nuclear weapons. I'm really interested in what you have to say there because we are an organization that is attempting to do stuff with nuclear weapons. And even for us, it's difficult to have an impact. So I'm curious how you suggest individuals can also help.
00:28:16
Speaker
Unfortunately, this is probably the one that we know the least about.
Nuclear and Biotech Risks and Opportunities
00:28:20
Speaker
It's kind of upcoming, something that we want to look at more over the next six months. I've got an interview with someone who works on nuclear non-proliferation at the Nuclear Threat Initiative in a couple of weeks, and I was hoping to get a lot of information there.
00:28:32
Speaker
I guess, broadly speaking, there's going into the military, if you have a shot at getting into the strategic side of nuclear weapons control, and then there are groups like the Nuclear Threat Initiative and the Skoll Global Threats Fund. It is very tricky. I guess I would say a lot of people who work on nuclear weapons stuff are, in my view, too focused on preventing terrorists from getting nuclear weapons, and perhaps also, you know, smaller states like North Korea.
00:29:01
Speaker
The thing that I'm most worried about is an accidental all-out nuclear war between the US and Russia or the US and China.
00:29:08
Speaker
because that has the potential to be much more destructive than a single nuclear weapon going off in a city. I guess that's not a super strong view, because you have to weigh up which of these things is more likely to happen. But I'm very interested in anything that can promote peace between the United States and Russia and the United States and China, because a war between those groups, or an accidental kind of nuclear incident, seems like the most likely thing to throw us back to the Stone Age or even pre-Stone Age.
00:29:35
Speaker
Well, in that case, I will give a couple of quick plugs for the International Campaign to Abolish Nuclear Weapons, which played a huge role in the treaty that was just passed at the UN to ban nuclear weapons. And also, we've done a lot of work with the Union of Concerned Scientists. They're bigger organizations, so they might have opportunities for people, and they've focused a lot on things like accidental nuclear war and hair-trigger alert.
00:29:58
Speaker
Yeah, to be honest, I'm not that convinced by total disarmament as a way of dealing with the threat from nuclear weapons. The problem is, even if you could get the US and China and Russia to destroy all of their nuclear weapons today, they would always be within a few months of being able to recreate them. And the threat that another country might rebuild their nuclear arsenal before you do might actually make the situation even more unstable.
00:30:22
Speaker
But the things that I would focus on are ensuring that they don't get false alarms, where other things trigger warnings that they're suffering a nuclear attack; trying to increase the amount of trust between the countries in general, and the communication lines, so that if there ever are false alarms, they can communicate as quickly as possible and defuse the situation. And actually, the other one is making sure that these countries can always retaliate, even at a delay.
00:30:46
Speaker
So Russia is actually in a tricky situation at the moment, because its nuclear technology is quite antiquated, and they are at risk of the US basically destroying all of their land-based nuclear weapons. They have a very short period of time between when they are notified about a potential nuclear attack and when the nuclear weapons would hit their own nuclear silos on land.
00:31:07
Speaker
And they don't have many nuclear submarines that they could use to then fight back at a delay. Because, of course, you can hit nuclear silos on the ground and just disable them so that they can't retaliate. But it's much harder to disable nuclear submarines. They can basically always retaliate, even weeks or months later.
00:31:23
Speaker
So one interesting thing that would be really helpful would just be for the Russian Federation to have nuclear submarines. I mean, they'd never accept nuclear submarines taken from the United States, but I'd be really happy if they would actually build some, because then they would always know that they could offer a full retaliation even weeks later. And so they don't have to always be on hair-trigger alert. They don't have to retaliate within a few minutes of receiving a warning to make sure that they can retaliate at any point.
00:31:48
Speaker
The other thing would be to improve their monitoring ability. So, you know, give them more satellites and better radars so that they can see nuclear weapons incoming to Russia sooner, and so they have a longer lead time before they have to decide whether to retaliate. But it's interesting. I mean, this stuff is not so much about disarmament, but about, you know, having, in a way, better nuclear technology. And I think that's another direction that you can go. I don't think that the US or China or Russia are going to disarm, and I'm not sure, even if they did, that it would be that helpful. So I would focus on other approaches.
00:32:18
Speaker
Yeah, I would still advocate for decreasing the number of nuclear weapons. Yeah, I think, I mean, they should do that anyway, or at least they should decrease the number of land-based nuclear weapons, because it's basically just a waste of money. They have far more than they actually need. And as far as I can tell, at least in the United States, it's just a boondoggle to the nuclear industry, which wants to build more and more of this nuclear machinery and just costs the taxpayer a lot of money without any increase in security. And I certainly agree it would be fantastic if we could get nuclear weapons taken off of hair-trigger alert.
00:32:46
Speaker
I think China, at least historically, has had far fewer nuclear weapons that are able to respond really quickly. And their approach is that if they're attacked with nuclear weapons, they will potentially spend days considering what their response is going to be under a mountain and then retaliate with some delay once they've fully figured out who attacked them and why and what they should do. That's a much safer situation than having nuclear weapons that can be fired, you know, with a few minutes notice.
00:33:14
Speaker
Yeah, the Union of Concerned Scientists has come out with some reports, though, that indicate they think that the Chinese may be changing their policy. Yeah, no, I have heard that as well, that they're modernizing, which in this case means making it worse. But at least historically, there has been another way of dealing with this. Again, I'm not sure about the political practicalities of that. And to be honest, this isn't so much my area of expertise. Maybe you should get someone from the Nuclear Threat Initiative on to talk about these careers. But it's a very interesting topic.
00:33:45
Speaker
Yes, it's a really interesting challenge, a depressing challenge, but an interesting one. Yeah. So then we've got biotech and climate change? Yes. Okay, so biotech. The risks here are primarily that we would either deliberately or accidentally produce new diseases using synthetic biology or disease breeding.
00:34:10
Speaker
And I have a two-and-a-half-hour-long conversation with Howie Lempel, who was a program officer working on these kinds of risks at the Open Philanthropy Project. And so if you're interested in this, I strongly recommend listening to that episode and then applying for coaching so we can give you more information.
00:34:26
Speaker
Broadly speaking, I think the best opportunities here are in early surveillance of new diseases. So at the moment, if there's a new disease that's coming out, a new flu, for example, it takes us quite a long time to figure out that that's what's happened, because obviously people come into the hospital with flu symptoms all the time. We don't typically, you know, take assays from them to figure out whether it's a new strain of flu or an old strain of flu.
00:34:52
Speaker
And it takes a long time for enough people to be dying or showing unusual symptoms for us to realize that there's something unusual going on here and then start actually testing. And just when it comes to controlling new diseases, time is really of the essence. If you can pick it up within a few days or weeks, then you have a reasonable shot at quarantining the people and following up with everyone that they've met.
00:35:12
Speaker
and containing it. And we have successfully done that in a couple of cases. Older people among your audience might remember SARS, Severe Acute Respiratory Syndrome, which spread through Hong Kong and Singapore, I think in around 2003, 2004. The authorities there were pretty on the ball, and they caught it early enough, and they managed to do follow-up with everyone whom the people who had the disease had met, and to contain it. And even though it was a very deadly disease, it didn't end up killing that many people.
00:35:43
Speaker
But if it had taken them an extra month to find out that there was this new disease that was spreading, it might have just reached too many people for it to be practical to follow up with all of them, quarantine them, and bring them all into hospital, because there just wouldn't be enough beds for them. And at that point, really, it's like once a fire gets out of control: it just becomes massively harder to contain.
00:36:01
Speaker
You need to catch a fire when it's only just in one part of a room, before it's spread to a whole building. And so any technologies that we can invent, or any policies that we can make, that allow us to identify new diseases before they've spread to too many people are going to help with both natural pandemics, where there's significant risk every year that we're just going to have a new strain of flu or other kinds of new diseases that could create big problems, and also any kind of synthetic biology risks or accidental releases of diseases from biological researchers.
00:36:31
Speaker
So those are the risks, but what's perhaps less well known is that FLI is also looking at existential hope. And I think biotech offers good opportunities for that as well. Are there career paths you recommend for people who want to do the most good that way, health-wise or anything else?
00:36:51
Speaker
Interesting. I mean, this isn't as much my area. The suggestions that I've heard there, I guess, are research into longevity, so trying to slow down aging so that people might hope to live significantly longer, potentially hundreds of years. And I guess, you know, in the very long term, maybe even living for thousands of years.
00:37:09
Speaker
That is maybe good in itself, because people don't want to die; most people would rather live longer than they're going to. It's also good in that if people expect to live hundreds of years, then it will make them more cautious about the future and more concerned about where humanity is going, because they might actually benefit from it themselves. So there's some indirect effects there that could be positive, in my view. Another thing I've heard suggested, I guess, is human enhancement.
00:37:33
Speaker
So you could try to use biotechnology to maybe make people more moral, to make them less selfish and less violent and less cruel. I don't know how practical that is, whether that's something that's going to come anytime soon. But if it were possible to make the next generation more moral than the present generation,
00:37:52
Speaker
That seems like it would be really helpful in terms of guiding humanity in a positive direction in the long term. But of course, there's pretty big problems that you could immediately see there, where, for example, if you're the Chinese government and you can just tweak the knobs on the next generation's personalities, then you could just make them very compliant and unwilling to ever rebel against the existing system. So there's both potentially big upsides and big downsides. Okay. And then climate change?
Addressing Extreme Climate Risks
00:38:17
Speaker
Yeah. I think Brenton wanted to chime in. Oh, yes, please do.
00:38:20
Speaker
Yeah, so climate change seems like quite a big problem. So in this framework where we try to assess which problems are the most important to work on, as Rob's kind of alluded to, we try to think about how solvable they are, try to think about how large they are in scale, like how big of a problem it is, and finally how neglected they are.
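As a rough illustration of how that framework can be applied, here is a minimal sketch. The three factors are the ones just named; the problem names and scores are hypothetical placeholders on a log scale, not 80,000 Hours' published ratings:

```python
# Toy scale / neglectedness / solvability comparison.
# Scores are hypothetical and on a rough logarithmic scale, so adding
# them corresponds to multiplying the underlying factors.

problems = {
    # name: (scale, neglectedness, solvability)
    "problem_a": (14, 2, 4),   # huge but crowded
    "problem_b": (12, 8, 4),   # large and very neglected
    "problem_c": (10, 6, 6),   # smaller but tractable
}

def priority_score(scale, neglectedness, solvability):
    """Sum of log-scale factor scores; higher means more promising."""
    return scale + neglectedness + solvability

for name, factors in sorted(problems.items(),
                            key=lambda kv: priority_score(*kv[1]),
                            reverse=True):
    print(name, priority_score(*factors))
```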
00:38:39
Speaker
Of the ones that we've listed here, climate change actually does the worst when you look at the standard case, and this is probably because it just does badly on neglectedness: there's something like 300 billion dollars or so spent on this problem per year, because it is such a large problem. However, a much more neglected case, and one that we are really worried about, is these extreme risks of climate change.
00:39:01
Speaker
So if you look at not just the median outcome but some of the worst forecasts, what you find is that that's where most of the damage is coming from. So there's a Wagner and Weitzman paper which suggests that there's about a 10% chance of us being headed for warming of more than 4.8 degrees Celsius, or a 3% chance of warming of more than six degrees Celsius. And these are really disastrous outcomes. And this is at about 560 ppm, which it seems like there's a
00:39:31
Speaker
pretty decent chance of us getting to. So I suppose our take on this is that if you're interested in working on climate change, then we're pretty excited about you working on these very bad but non-median scenarios. So how would you do this? The first answer is that it's a bit hard. Sensible things to do would be improving our ability to forecast; it would be thinking about the positive feedback loops that might be inherent in the Earth's climate;
00:39:57
Speaker
It might be thinking about how to enhance international cooperation. And then another angle on this is doing research into geoengineering. So if it turned out that we were in a disastrous scenario and we were getting warming of something like five degrees, what could we do about that? And there are a few options that are pretty scary to think about, like trying to throw dust of calcium carbonate up into the stratosphere to try to reduce the amount of sunlight that's getting through to the Earth, but these might be the kind of things that we need to consider.
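To see why the non-median scenarios can dominate, here is a minimal sketch of an expected-damage calculation. The probabilities follow the Wagner and Weitzman figures quoted above; the damage values are purely hypothetical and only meant to illustrate how steeply damage might grow with warming:

```python
# Illustrative expected-damage calculation for climate outcomes.
# Probabilities follow the figures quoted above; damage numbers are
# hypothetical, in arbitrary units, and rise steeply with warming.

scenarios = [
    # (label, probability, hypothetical damage)
    ("below 4.8C warming", 0.90, 1),
    ("4.8C to 6C warming", 0.07, 20),
    ("above 6C warming",   0.03, 100),
]

expected_damage = sum(p * d for _, p, d in scenarios)

for label, p, d in scenarios:
    share = p * d / expected_damage
    print(f"{label}: {share:.0%} of expected damage")
# With these placeholder numbers, the two tail scenarios account for
# over 80% of expected damage, which is why work targeting the
# worst-case outcomes can matter so much.
```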
00:40:27
Speaker
And these are the kind of things where it'd be really good to have good research now, before we're in a situation where we've got a very badly warmed Earth, where we maybe have problems with political systems, where we have countries not seriously taking into account how bad it could be to do things like geoengineering that could seriously mess with the world's climate even more. So getting that research done now seems sensible. And are there timeline estimates for when these potentially catastrophic temperatures could be hit?
00:40:57
Speaker
Yeah, I'm trying to think. I think we're talking kind of 50 to 150 years here, unless we get extremely unlucky and hit some, you know, intense feedback loops really fast. This is more towards the later part of this century. Okay. I'll just add some other comments on climate change. It's one that we know a bit less about because, as Brenton said, there's already in the hundreds of billions of dollars being spent on tackling climate change every year, so it doesn't seem as extremely neglected as some of these other issues.
00:41:25
Speaker
I worry a bit about geoengineering. I think it could end up being extremely helpful in cases where maybe climate change turns out to be much worse than we thought, and we can try to slow it down or contain it a bit. But it also creates serious problems itself.
00:41:40
Speaker
Geoengineering actually is disturbingly cheap, such that any medium-sized country could actually run an almost global scale climate engineering project itself. And that means that there's a real risk that it will be done too quickly and too recklessly. Because South Korea could do it, Australia could do it, and they don't really need anyone else to agree. They can just go ahead and start releasing these chemicals into the atmosphere themselves.
00:42:01
Speaker
And inasmuch as we develop this technology and it becomes acceptable, I'm not sure whether it reduces the risk from normal climate change more than it increases the risk by creating the possibility that a single country will somewhat foolishly do it, just because one leader thinks that it's a good idea. So if you wanted to go for stuff that's potentially high impact on climate change but seems a bit less likely to backfire, of course you could do research into the risks from geoengineering, thinking that, well, we might use it at some point in the future anyway, so it's better to be prepared and aware of the downsides.
00:42:31
Speaker
But it also does just seem like solar power and storage of energy from solar power is the stuff that's going to have the biggest impact on emissions over the next 100 years or at least the next 50 years. It's already having a pretty large impact. It's getting close to cost parity in a lot of cases and every year the cost of batteries gets cheaper, the cost of solar panels gets cheaper and just in more and more places in the world it becomes more sensible to build solar panels than to build coal plants.
00:42:55
Speaker
So anything that can speed up that transition, any new technologies that you can invent that mean it's just profitable to replace coal plants with solar panels, I think makes a pretty big contribution. And the investments in solar R&D in the past look like really good investments today. Okay.
Discouraged and Encouraged Career Choices
00:43:13
Speaker
And so we've looked a lot at suggested career paths. I'm wondering, especially when you mentioned things like coal plants, are there career paths that you discourage?
00:43:24
Speaker
Yeah, there are career paths we discourage. In an old talk, we used to do this kind of lineup of several careers that we didn't encourage. And it was this really depressing game of career bingo: you'd go through them and various people would be upset.
00:43:37
Speaker
So, I mean, I suppose the answer is that we discourage quite a lot of careers. I mean, we think that people should be trying to help other people with their career, basically because it's a good thing to do, and because there are pretty good reasons to think that it increases how satisfied you are with your career. A lot of careers just don't help people, and that's almost a good enough reason in and of itself for us not to be excited about them.
00:43:59
Speaker
Then on top of that, most jobs out there aren't working on any of our priority areas, or they're working on things where people haven't tried to think about how large the problem is that they're trying to tackle, so we encourage people to try to work on problems that are quite neglected. And then on top of this, it seems like there are a few careers that are just dominated by similar options. An example of this is that you could be thinking about earning to give, and I think in this case consulting beats investment banking in almost every case.
00:44:28
Speaker
You might be particularly excited about investment banking, but otherwise the earnings are similar to consulting, and in consulting you learn a bunch of other skills which you can then take to careers later on, which might be higher impact than consulting is. Another example like this is going into corporate law for earning to give. Again, it takes quite a long time to get there, and the skills that you get aren't very transferable, as is the case with investment banking; instead, consulting just seems to be better there.
00:44:55
Speaker
And then, on top of this, there are a bunch of careers that we just think do harm. And we've written an article on this called 'What are the 10 most harmful jobs?', because we're certainly not above clickbait. And all of these seem pretty bad. So, just scanning through it now, we've got marketing R&D for compulsive behaviors, factory farming, homeopathy, patent trolls, lobbying for rent seeking, weapons research, fundraising for a charity that achieves nothing or does harm,
00:45:21
Speaker
forest clearing, and tax minimization for the super rich, all of which seem pretty robustly bad. So if any of your listeners were planning on going into tobacco marketing, then probably don't do that. But I would imagine they probably have more positive intentions than that. Probably. So I want to go back to something you were saying earlier, comparing consulting to these other careers.
00:45:44
Speaker
So if you think investment banking sounds interesting, do you then go into consulting for finance in general or what type of consulting work are you talking about? Because that can be sort of broad. Yeah, sure. So the case that I've got in mind is someone that's interested in earning to give, looks at a bunch of different careers and how much you can earn at various points, and then concludes that investment banking would be a sensible thing to go into.
00:46:09
Speaker
Yeah, in this particular case, I think even though consulting and investment banking would look similar, consulting just strictly dominates on these other things that we care about, like your ability to take the skills somewhere else. And the kind of things that I've got in mind are just for-profit strategy consulting, or, yeah, obviously for-profit investment banking. Alright, so, you know, I have a science background, but I actually started out doing advertising and marketing, and I worked as a writer, and I did a lot of creative things. And it was honestly a lot of fun.
00:46:39
Speaker
And so I was wondering, what advice do you give people who are interested in more creative pursuits? Or, I guess, the second part of the question I have is that I've also found
00:46:52
Speaker
entertainment can be sort of hit or miss. Sometimes it's just a mind sucking waste of time, but sometimes it actually does help you escape from reality for a little while, or it helps spread ideas that can later help society. And so I was just sort of curious how you give advice to people who are interested in more creative fields. Right. So I have a podcast episode, or at least it was a pre-podcast episode, just an interview I did with one of our freelance artists about a year ago.
00:47:22
Speaker
There's a couple of different ways you can think about this. So I'd say there's three broad ways that you can try to do good through creative arts. One is to just try to make money and give it away. So just take the earning to give path. And there it just depends on whether actually you do have opportunities to make any significant amount of money. Most artists, of course, are not getting rich, but a few of them, you know, the most talented ones do have reasonable prospects of making money.
00:47:45
Speaker
The second one would be to try to entertain people and do good directly. You know, I watch Game of Thrones and I love Game of Thrones, and I guess it's good that I have a good time watching Game of Thrones. So that would be another angle: just try to make people happy.
00:47:59
Speaker
That one I think we're a bit more skeptical of, because it doesn't seem like it has very long-term impacts on human well-being. You make some good music, and you know, I listen to music all the time. Someone makes a great house remix, I listen to it and enjoy it, and I get immediate happiness in my brain. But it doesn't seem like this is helping future generations, and it doesn't seem like it's helping the worst-off people in the world in particular, or anything like that. It doesn't seem very highly leveraged, right?
00:48:26
Speaker
But then you've got kind of the third angle where you try to do good through art somewhat indirectly. So you could try to make, for example, documentaries that promote really important ideas and that change people's attitudes. You could try to tell stories that open people's eyes to injustice in the world that they weren't previously aware of. Or you could just produce marketing materials for a really important organization that's doing valuable work, perhaps a nonprofit like the Future of Life Institute, or you can run this podcast, which has a creative element to it.
00:48:53
Speaker
that helps to spread important ideas and draw attention to organizations that are doing good work. And I think there we're actually quite positive, because at least when it comes to the effective altruism community, we're fairly short on creative skills. We don't tend to have people who are great at producing music or songs or visual design, or at making beautiful pieces of art, or even just beautiful, functional websites.
00:49:16
Speaker
So I'm always really enthusiastic when I find someone who can do visual design or has a creative streak, because I think that's a very important complementary skill that almost any campaign or organization is going to need at some point. And so, you know, Maria, the freelance artist whom I interviewed about this.
00:49:35
Speaker
She produces lots of designs for the 80,000 Hours website, and it means that more people are interested in reading our content, more people come to the site, and it looks more professional. It increases our audience, and that's all extremely valuable. Okay, excellent. So I'm going to completely switch gears here. Rob, I know you're especially interested in these long-term, multi-generational indirect effects. Can you talk first about just what that means?
Foresight and Decision-making for Future Challenges
00:50:04
Speaker
So as I was saying earlier, we think that one of the highest leverage opportunities to do good is to think about how we can help future generations. And if you're trying to help people and animals hundreds of years, thousands of years in the future, it's not really possible to help them directly because they don't exist yet. You have to help them through a causal chain that involves helping or changing the behavior of someone today, and then they'll help the next generation, and then they'll help the next generation, and so on.
00:50:30
Speaker
And I have a long talk on YouTube where I work through this framework of long-term indirect effects and ask what we could do today that would really change the trajectory of civilization in the long term. It's quite a tricky issue, and I can already feel myself slightly tying myself in knots trying to make sense of it. But I'll try to run through some of the thinking that people have done on long-term indirect effects and a couple of the lessons that have come out of it.
00:51:00
Speaker
So one way that people might try to improve the long-term future of humanity is just to do very broad things that improve human capabilities, like reducing poverty, improving people's health, making schools better, and so on. And I think that kind of thing is likely to be very effective if the main threats that humanity faces are from nature. So we've got diseases, asteroids, super volcanoes, that kind of thing.
00:51:25
Speaker
Because in that case, if we improve our science and technology, and if we improve our education system, then we're better able to tackle those problems as a species. But I actually think we live in a world where most of the threats that humanity faces are from humanity itself. So we face threats from nuclear weapons. We face threats from climate change, which is caused by us. We face threats from diseases that we might ourselves invent and spread either deliberately or unintentionally.
00:51:55
Speaker
And in a world where the more science and technology we develop, the more power we have to harm ourselves, to basically destroy our own civilization, it becomes less clear that just broadly improving human capabilities is such a great way to make the future go better.
00:52:11
Speaker
Because improving science and technology both improves our ability to solve problems and means that we're creating new problems for ourselves more quickly. We're inventing more and more powerful ways of potentially just ending the human story.
00:52:26
Speaker
And so for that reason, I tend to focus on differential technological change. So I think about what technologies can we invent as soon as possible that disproportionately make the world safer rather than more risky. So for example, I think it's great to improve the technology to discover new diseases quickly and to produce vaccines for them quickly. But I'm perhaps less excited about just generically pushing forward all of the life sciences, because I think there's a lot of potential downsides there as well.
00:52:56
Speaker
And I also think a lot about human values, because it's harder to see how these backfire. So I think it's really useful if we can make people care about the welfare of people in other countries, or about animals in factory farms, or about the welfare of future generations, so that we're more likely to act in a prudent and responsible way that shows concern for the welfare, not just of ourselves and our family and our friends, but all beings as a whole.
00:53:22
Speaker
That seems fairly robustly a valuable thing to do that could improve the long term, especially if those values are then passed down to future generations as well. All right, so that's pretty abstract. How would you suggest that people actually get involved or take action here?
00:53:39
Speaker
Yeah, that's a fair cop. It was a pretty abstract talk, and a lot of the thinking in this area is pretty abstract. To try to make it more concrete: earlier we were talking about specific careers that you might pursue to work on global catastrophic risks and risks from new technologies. And we've got the podcast, as I mentioned, which goes into even more detail, with specific courses you might study, PhDs you might do, and places you might work.
00:54:02
Speaker
But another option here, another way that we can, I think, robustly prepare humanity to deal with the long-term future is just to have better foresight about the problems that we're going to face in the future. And there are really good people in psychology and in the intelligence services and in government trying to figure out how we can get reliable intelligence about what threats we're going to face next year and in five years and in 10 years and what things we might do to tackle them and whether those are actually going to work.
00:54:28
Speaker
and not just have them be individual opinions, but actually robustly good answers. So some people in your audience might have heard of Professor Philip Tetlock, who has spent decades studying people's predictions about the future and figuring out under which circumstances they're accurate. What is it for a prediction to be accurate? How do you even measure that? What kinds of errors do people systematically make? And he's been funded lately by IARPA, the Intelligence Advanced Research Projects Activity.
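[Editor's note: for readers curious how "accuracy" is actually scored in this research, one standard metric is the Brier score: the mean squared error between the probabilities a forecaster states and the 0-or-1 outcomes that occur. Below is a minimal illustrative sketch in Python; the function and the example numbers are ours, not drawn from Tetlock's data.]

# Illustrative sketch of the Brier score, a common measure of forecast
# accuracy. Lower is better; always answering 50% on binary questions
# scores 0.25, so a skilled forecaster should come in well under that.

def brier_score(forecasts, outcomes):
    """forecasts: probabilities in [0, 1]; outcomes: 1 if the event happened, 0 if not."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# A forecaster who said 90%, 70%, and 20% for three events, where the
# first two happened and the third did not:
print(brier_score([0.9, 0.7, 0.2], [1, 1, 0]))  # ~0.047, i.e. quite accurate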
00:54:58
Speaker
And so they've had thousands of people online participate in these prediction contests. And then they study basically whose judgment is reliable and on what kinds of questions can we reliably predict the future. We have a profile coming out actually later this week where we describe how you can pursue careers of this kind. So like what kinds of things would you study at undergrad, post-grad level? What kind of labs would you go to to try to improve human decision making and human foresight?
00:55:24
Speaker
And I think that's a very concrete thing you can do that puts humanity in a better position to tackle its problems in the future: just being able to anticipate those problems well ahead of time so that we can actually dedicate some resources to averting them. Okay, so I'll give another plug. I don't know if you've visited Anthony Aguirre's site, Metaculus? I have, only briefly though.
00:55:46
Speaker
Yeah, I know that's one of the things he's working on with his site. So that's definitely a fun one if you want to practice your predictive skills. I'm pretty sure it's an opportunity to either prove that you're good at predicting things or to test yourself and compare yourself to other people. Right, right. And just try to improve the skill.
00:56:09
Speaker
Yeah, so Professor Tetlock's version of this is called the Good Judgment Project. I've signed up to that, and you can go into their interface and make predictions about all kinds of things: elections, geopolitical events, economic events. And I tell you what, it's a chastening experience to actually have to go in there and give concrete percentage probabilities on all kinds of different outcomes.
00:56:29
Speaker
It makes you realize just how ignorant you are. And if you don't realize that upfront, then you'll realize it once you start getting the answers to the various things you've predicted. Once you know it's on the record, in a database, and you're going to be told when you were wrong, I think some of the bravado that people naturally have goes out the window. Yeah, I haven't tried that one, but I've opened Metaculus quite a few times and not had the courage to actually make a prediction.
00:56:54
Speaker
Yeah, but I'll tell you what: I do that, but then I still go down to the pub and offer very strong opinions about everything. Exactly. All right, so is there anything else that either of you want to add? Yeah, so I suppose one plug to give is that a whole bunch of the thinking that we do is on our website, and you can see lots of it there. So as I said, there's the career guide, which goes through a bunch of considerations that you need to have in mind when you're thinking about how to have an impactful career.
00:57:22
Speaker
There are these problem profiles, where we think about various problems and how valuable it is to have another person or another dollar working on them. And we think about specific careers, so a particular career path you might be interested in might be on there as well, and it's worth looking at from that angle. Yeah, we've got an article about almost every question that we discussed here. So if you feel like we bungled our answer and tied ourselves in knots, then we've almost certainly written an article where our considered view is expressed properly, which you can find.
00:57:51
Speaker
So check out 80,000hours.org. We've got the main career guide, which goes through all of our key content. And then if you're still keen to get more personalized advice, so you can figure out, you know, "How am I going to reduce the risk from artificial intelligence by working in policy?", that's the kind of question that's really good to deal with one on one, once we understand your specific opportunities and your specific skills. So
00:58:11
Speaker
definitely go out there and apply for coaching. And of course, I've got lots of interviews on topics very similar to what the Future of Life Institute works on. We've got upcoming episodes on risks from biotechnology, risks from artificial intelligence, nuclear security, and climate change as well. So subscribe to the 80,000 Hours podcast, and you can enjoy a couple of hours of my voice every week. Well, we'll definitely add a link to that as well. And Rob, Brenton, thank you so much for joining us today. It's been a great pleasure. Thanks for having us.
00:58:41
Speaker
To learn more, visit futureoflife.org.