00:00:00.000 Hello and welcome to ToKCast episode 57. This is a bonus episode just for Easter 2021.
00:00:18.520 It was first released to my generous patrons a week early, but now it is available everywhere.
00:00:23.600 If you'd like to support the podcast and get earlier access to episodes like this one,
00:00:31.040 simply do a Google search for my name, Brett Hall, or for ToKCast along with Patreon,
00:00:37.040 that's Patreon, P-A-T-R-E-O-N, and you will find a way to support me and my ongoing work.
00:00:44.960 This podcast was recorded at the request of one of my patrons. It's all about existential
00:00:50.280 risk, as it's called, and how to think about existential risk in the context of a rational
00:00:55.680 but critical worldview, and especially an optimistic worldview.
00:01:00.160 As you will hear, the audio quality is a little bit different to my other podcasts, in large part
00:01:04.000 because I'm outside, and so you can hear the sounds of the Sydney suburbs, which are quite audible.
00:01:10.320 I also hope it's in reasonably good humour for such a serious topic. But without further ado, here it is.
00:01:20.720 One of my Patreon subscribers some months ago asked me to speak about existential
00:01:27.760 risk, and I've been unable to get around to doing that until now. So this is largely off
00:01:33.600 the cuff: my ideas about how to think about existential risk. Existential risk is
00:01:41.400 something that certain scientists are extremely interested in trying to explain to the
00:01:46.920 rest of us: how there won't merely be problems that are going to distract governments
00:01:55.600 and people, but problems which could wipe out all of civilization.
00:02:03.800 And if we want to think seriously about this, we're going to have to calculate the likelihood of such events.
00:02:11.920 When we're talking about existential risk, what we mean is an end to existence: to a
00:02:17.720 large extent, the end of the existence of humanity.
00:02:20.680 So we're talking about the biggest possible problems that we can conceive of.
00:02:25.760 We're trying to understand, analyse and then predict the probability that these things
00:02:32.120 are actually going to wipe us out at some point in the future.
00:02:35.400 So what things appear in this category? One thing that appears in this category is a supervolcano.
00:02:42.160 There is a supervolcano, so we are told by the geologists, lurking beneath Yellowstone National
00:02:47.760 Park in the United States, and this thing is so vast that, should it erupt in the way
00:02:52.320 in which some volcanologists think that it may one day,
00:02:55.440 it could lead to an outgassing event. In other words, huge amounts of gas, like carbon dioxide
00:03:01.800 and various other noxious volatiles, not to mention the dust and ash that this sort of
00:03:07.040 thing kicks up into the atmosphere, would be released on such a scale that
00:03:12.280 they would blot out the sun for some time, and this could lead to global catastrophe.
00:03:17.160 It could lead to crop failure in particular, so that's one thing that might happen.
00:03:21.680 We could have agriculture suffering across the entire planet. Not only agriculture,
00:03:26.480 but the vegetation that wild animals eat could suffer as well, and indeed the aquatic
00:03:32.400 environment might suffer, because there won't be enough light getting down to the oceans
00:03:37.680 where the algae produce food for the fish to eat.
00:03:43.040 There are certain other apocalyptic scenarios from such a volcanic event, like huge molten
00:03:47.840 balls of rock being thrown up into the air and coming back down and setting ablaze large
00:03:53.280 areas of forest. This might happen as well, and so that is only going to exacerbate
00:03:57.280 the problem. This could be an existential threat to human civilization, and even if
00:04:02.680 it didn't kill everyone outright, it could cause such disruption to the economy that it
00:04:07.960 could lead to wars, and it could precipitate something that eventually leads to a huge decline
00:04:14.560 in global progress, global well-being, and perhaps global population.
00:04:19.360 So how could you calculate the probability of something like this happening?
00:04:22.400 Well, you look through the geologic record. You look at what has hitherto happened in the
00:04:26.600 past with naturally occurring events, whenever volcanoes have gone off.
00:04:34.360 Rock strata are a wonderful record, preserving precisely what was going on at the time.
00:04:41.000 We know when there have been huge volcanic eruptions because we can see the ash deposits:
00:04:46.000 the remnants of those volcanoes fall down to earth and then gradually get compacted between
00:04:52.720 layers of different rock over millions of years. So the probability calculation
00:04:58.320 on this account is a pretty straightforward exercise. You simply look for the evidence,
00:05:03.480 you look for how frequently huge eruptions have occurred in the past. And it might very well
00:05:07.720 be the case, and indeed it is the case, that a planet like the earth, being geologically
00:05:12.120 active, becomes less geologically active over time, over the timescale of millions, hundreds
00:05:18.640 of millions, billions of years. There are good geophysical reasons for this: the earth is gradually
00:05:24.040 cooling over time, so in fact the earth is becoming geologically more stable over time, and
00:05:28.640 the number of huge volcanic eruptions is decreasing. But that said, we do know that volcanoes
00:05:34.800 will continue to erupt, and indeed some big volcanoes we would expect to erupt bigger than
00:05:40.040 they have in the lifetime of any human currently alive.
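If you wanted to sketch that naive frequency calculation in code, it might look something like the following. The numbers are placeholders rather than real geological data; the point is only the shape of the reasoning:

```python
# A minimal sketch of the frequency argument: if the rock strata record
# n super-eruptions over T years, treat eruptions as a Poisson process
# and estimate the chance of at least one in the next century.
import math

n_eruptions = 40           # hypothetical count of super-eruptions in the record
record_years = 36_000_000  # hypothetical span of the rock record, in years

rate_per_year = n_eruptions / record_years
p_next_century = 1 - math.exp(-rate_per_year * 100)  # P(at least one in 100 years)
print(f"Estimated chance of a super-eruption this century: {p_next_century:.3%}")
```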
00:05:43.840 So apparently we can do some sort of probability calculation, we can make some sort of estimate
00:05:49.400 of when the next eruption is going to happen. And looking into the past, we can see there
00:05:54.040 have been certain eruptions that have been so huge that they have caused the extinction
00:05:58.240 of entire species, indeed vast numbers of species. I think it is relatively well known
00:06:03.360 in the geology, paleontology and astrophysics communities that soon after the KT event, the event
00:06:11.560 that wiped out the dinosaurs, when the asteroid crashed into the Yucatan Peninsula,
00:06:16.840 there were large volcanic eruptions.
00:06:20.520 Sometime within a million years, I think, of that KT event. And this has caused debates in
00:06:27.080 the community of geologists and paleontologists about what really was the proximate
00:06:32.680 or the ultimate cause of the demise of the dinosaurs. Was it indeed that asteroid crashing down
00:06:38.520 to the earth and kicking up a whole bunch of dust into the atmosphere
00:06:42.160 that wiped out the dinosaurs, because the vegetation died? Or was it the fact that the
00:06:46.400 asteroid actually precipitated a whole bunch of different volcanic eruptions around
00:06:50.960 the world, and it was the volcanic eruptions that actually caused the demise of the dinosaurs?
00:06:55.800 I think the consensus now is that it was the asteroid that was directly responsible for the extinction
00:07:01.000 of the dinosaurs, but it's interesting that serious scientists did indeed come up with
00:07:05.640 the idea that volcanoes could have wiped out the dinosaurs. So volcanoes are in that category. What else is?
00:07:17.800 Well, asteroids, as we just mentioned. It has certainly been the case over the geological and
00:07:22.440 cosmological history of the universe, and the history of the planet obviously, that many,
00:07:27.600 many species have been wiped out simultaneously by asteroid collisions.
00:07:33.120 In fact, there's something like five or six so-called mass extinction events.
00:07:37.920 Many of these may have been caused by asteroids.
00:07:40.560 They may also have been caused by other cosmological events.
00:07:44.240 One of my favorite examples is of course the possibility that just perhaps it was a supernova that caused one of these mass extinctions.
00:07:52.840 I've often thought about all the disaster movies that have so far been made: disaster movies
00:07:56.960 about volcanoes going off, or disaster movies about sudden rapid climate change, or earthquakes,
00:08:04.360 or name the disaster and you will have a disaster movie about it. But as far as I know,
00:08:09.120 we don't have a disaster movie yet about a supernova going off somewhere nearby. Because
00:08:14.800 if a supernova went off somewhere nearby right now, or rather, when I say right now, if the explosion
00:08:20.240 reached us right now, what would happen? Depending upon the intensity of that supernova explosion,
00:08:26.600 you would have irradiation on one side of the planet, and perhaps this would cause severe
00:08:31.160 disruption to the atmosphere. But largely speaking, the people on the other
00:08:35.600 side of the planet, not facing the supernova explosion, wouldn't be affected immediately.
00:08:40.760 They would gradually come to learn that, as the earth rotates, the supernova radiation
00:08:45.800 would begin to exterminate things that came within sight of the supernova. So it could
00:08:49.960 very well be the case, for example, that if Australia was facing the direction of the supernova
00:08:55.040 blast, everyone in Australia and on that side of the planet would be wiped out by the
00:08:59.440 supernova explosion. But the frightening thing would be for the people on the other side
00:09:03.080 of the planet, let's say in the United States, trying to phone or Zoom their friends:
00:09:07.760 they would find that no one in Australia was answering, because we
00:09:10.960 would have been almost immediately exterminated by the supernova. Depending upon the conditions,
00:09:15.080 who knows what this supernova is like, but for argument's sake, this is a possibility:
00:09:19.120 you can have people wiped out on one side of the planet, while on the other side of the
00:09:22.360 planet, people would have time to dig deep beneath the earth and try to escape
00:09:27.520 from the supernova radiation. And the supernova radiation would probably only last for hours
00:09:32.240 to days, perhaps, so in theory they could go beneath the surface of the earth and escape
00:09:37.080 the worst effects, while in the meantime the people on the other side of the planet would
00:09:41.520 have been largely exterminated, unless they happened to already be underground or perhaps
00:09:46.320 inside, I'm not sure. But this is my idea for a disaster movie. So how do we think about
00:09:52.000 the probability, the existential threat, of supernovae as well? Well, we can rely upon the astrophysical
00:09:58.520 knowledge of the community of astronomers out there, and we have a good idea
00:10:03.920 of the stability of the stars that we can see. There are some unstable stars: there are
00:10:10.080 some stars that are large red giants, there are some stars that are variable stars,
00:10:14.760 and we can see most of these, and we understand the astrophysical processes. We understand
00:10:18.720 well what it takes for a star to explode in a supernova blast such that the amount
00:10:23.600 of radiation reaching the earth would be a threat to life here on earth. And there is nothing,
00:10:28.840 there is no star within the minimum radius required that is likely to explode any time
00:10:35.080 within the next few hundred thousand years that we need to be worried about. All the stars
00:10:40.120 that are relatively close to the earth, within the danger zone so to speak, are not anywhere
00:10:45.080 near the point where they are about to explode any time soon, within the lifetime of
00:10:48.600 anyone alive, or the lifetime of anyone's grandchildren.
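To give a sense of the inverse-square reasoning behind that danger radius, here is a small hedged sketch. The magnitudes are rough textbook figures, and visible brightness is not the actual danger (ionizing radiation is, with a quoted danger zone of a few tens of light-years), but the scaling logic is the same:

```python
# Using the standard distance-modulus relation m - M = 5*log10(d / 10 pc),
# ask how close a typical supernova (peak absolute magnitude around -19)
# would have to be to appear as bright in our sky as the sun.
M_SUPERNOVA = -19.0      # rough peak absolute magnitude of a supernova
M_APPARENT_SUN = -26.74  # apparent magnitude of the sun
PC_TO_LY = 3.26          # parsecs to light-years

d_pc = 10 * 10 ** ((M_APPARENT_SUN - M_SUPERNOVA) / 5)
print(f"Distance at which it rivals the sun: ~{d_pc * PC_TO_LY:.1f} light-years")
```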
00:10:54.000 Now what about asteroids? Well, the interesting thing about asteroids is that they used to strike the earth far more frequently
00:10:58.400 than they do today. Now why is that? Well, the reason is that the planets, as they orbit
00:11:02.840 the sun, hoover up, or vacuum up, all the debris in their orbits. So along the orbit that the
00:11:08.360 earth takes, which is in between Venus and Mars obviously, once upon a time there were
00:11:12.720 lots and lots of rocks, lots of debris, lots of asteroids out there, but they've long since
00:11:17.680 been attracted to the earth and have crashed into the earth. There was a period billions
00:11:21.680 of years ago called the late heavy bombardment, and it's just like it sounds: the earth
00:11:26.040 was bombarded by huge asteroids crashing into it, and this pretty much sterilized the surface
00:11:31.400 of the planet. But since then, most of the remaining asteroids have been shepherded
00:11:35.880 into that region between Mars and Jupiter, into the asteroid belt, and they tend not to
00:11:40.440 leave that asteroid belt. They can be pushed towards the earth; it can happen, but it
00:11:45.880 doesn't happen frequently. The other place to find asteroids, apart from the asteroid belt,
00:11:49.760 is the Kuiper belt. The Kuiper belt is out beyond the orbit of Neptune. Now, asteroids
00:11:56.640 could be kicked out of their orbits from there and head towards the earth as well. And
00:12:00.640 then of course, going out further beyond the Kuiper belt, we have the mysterious Oort cloud.
00:12:04.640 The Oort cloud sits at the very boundaries of the solar system;
00:12:09.080 it's certainly part of the solar system, because it's gravitationally bound to the sun,
00:12:12.280 which makes it part of the solar system. But out there we have rocky, icy bodies which,
00:12:17.080 when they come towards the sun, end up being comets, and so they could crash into the earth
00:12:21.400 as well. And we know that this has happened in the past, and we know that these objects
00:12:25.560 are still out there. And so you can do some kind of calculation about the frequency
00:12:29.800 with which objects like this come towards the earth and potentially cross the earth's
00:12:35.160 orbit, and how often, out of all those crossings of the earth's orbit, they actually intersect
00:12:39.000 with the earth itself, causing a mass extinction event.
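For a flavour of that kind of calculation, here is a hedged back-of-envelope sketch. The once-per-hundred-million-years figure for 10-kilometre-class impacts is a commonly quoted order of magnitude, not a precise measurement:

```python
# Model 10 km-class impacts as a Poisson process with an assumed average
# interval, and ask for the chance of at least one in the next century.
import math

mean_interval_years = 100_000_000  # assumed average gap between such impacts
rate_per_year = 1 / mean_interval_years
p_next_century = 1 - math.exp(-rate_per_year * 100)
print(f"Chance of a 10 km-class impact this century: ~{p_next_century:.6%}")
# => roughly one in a million, before accounting for surveys and deflection
```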
00:12:44.600 Now, of all the ways of considering existential risk, I do like the analysis of the asteroid question, because, absent people,
00:12:51.680 and this is a theme in the work of David Deutsch and in my attempts to explain it, absent
00:12:57.440 people we can do these kinds of calculations: the frequency with which asteroids intersect
00:13:03.320 with the earth, crash into the earth, and wipe out some huge proportion of the life on earth.
00:13:09.000 For example, at the time of the dinosaurs, some 66 million years ago, when the asteroid crashed
00:13:13.880 into the southern part of Mexico at the Yucatan Peninsula, that, we know, caused the extinction
00:13:19.720 of the dinosaurs and a whole bunch of other, related species as well. It was an
00:13:23.880 extinction-level event. That kind of asteroid would be an existential threat. But is such
00:13:28.760 an asteroid an existential threat today? Well, it would be if we didn't find out about it before
00:13:34.280 it crashed into the earth. But we do have people who are interested in searching for such asteroids:
00:13:40.280 NASA has programs scanning the skies for dangerous asteroids which
00:13:46.280 might crash into the earth. So can we put a number, a probability, one in a thousand, say, on this
00:13:53.320 happening within the next century, of a huge asteroid coming and wiping out all life on earth?
00:13:58.040 Well, we can certainly put a number on the probability that such an asteroid could intersect
00:14:04.600 with the point at which the earth is on its orbit as it goes around the sun. So let's say such
00:14:09.880 a calculation is done, where we find, for example, that a particular asteroid is located,
00:14:16.840 and this asteroid is large, let's say 10 kilometers across; that's large by the standards of asteroids that
00:14:21.960 crash into the earth. And let's say we picked a number, one in 100, and that one in 100 was the
00:14:28.600 chance that the asteroid was going to crash into the earth 100 years from now. That's quite a
00:14:34.200 high probability. And consider the potential effects of such an asteroid crashing into the earth,
00:14:39.240 namely the wiping out of civilization. A 10-kilometer asteroid rushing at 10 times the speed of
00:14:45.160 a bullet straight into the earth is going to do untold damage. If it hits the oceans, it's going
00:14:50.040 to create tsunamis around the entire world; if it hits a particular country on land, it's going to
00:14:54.840 kick up dust. Either way it's going to kick up dust, including large amounts of molten or fiery
00:15:00.600 rock, which will come crashing down onto the forests of the world, setting them ablaze, and so
00:15:05.400 simultaneously we'll have forest fires all over the earth caused by the fallout from such an
00:15:10.280 asteroid. And the cloud of dust that would be kicked up into the atmosphere would blot out the sun
00:15:15.080 for perhaps some weeks, destroying much of the plant life on earth, and certainly much of the food
00:15:21.320 for animals and for people. So if we had such a probability, a one in 100 chance,
00:15:27.320 given the fact that it could potentially wipe out all of civilization, we would want to do
00:15:32.520 something about that. And if we had a 100-year lead time, we would start doing things
00:15:38.840 about it. And probably the first thing we would do is refine the probability:
00:15:43.400 we would have more astrophysicists with better telescopes refining the probability, because you
00:15:48.920 need to be able to do extra-precise calculations in order to refine that probability. And by
00:15:54.200 refine that probability, I mean find out if it really is one in 100, or something different.
00:15:59.080 Now, if after 20 years we'd built different telescopes, and we'd built better supercomputers,
00:16:04.040 and we'd engaged the help of ever brighter astrophysicists to come up with a new calculation,
00:16:09.240 and if we found, to our relative horror, that in fact the probability wasn't one in 100
00:16:14.040 but one in three of this asteroid actually intersecting with the earth,
00:16:19.640 then I think we'd start to take steps to mitigate the effects. And by mitigate the effects, I mean
00:16:25.640 push the asteroid off course. Our creativity would then begin to change that probability.
00:16:33.560 And there are all sorts of interesting ways that engineers and astrophysicists and clever people have
00:16:39.320 thought about in order to push an asteroid out of the way. This is an engineering problem:
00:16:44.920 there's nothing in the laws of physics that says asteroids can't be pushed out of the way by human
00:16:48.680 beings. It may be difficult, it may be a bit of an engineering challenge, but one can imagine
00:16:52.760 people like Elon Musk and Jeff Bezos actually putting rockets up there and just physically
00:16:57.960 pushing the asteroid out of the way. And you don't have to push the asteroid very far,
00:17:01.480 and you don't need much rocketry power in order to push an asteroid out of the way,
00:17:05.240 because an asteroid isn't being propelled anywhere. All it's doing is following a particular trajectory,
00:17:10.600 affected almost solely by the gravitation of the sun. It's following an orbit around the sun,
00:17:16.840 and sadly for us that orbit intersects with the earth's own orbit at the time when the earth
00:17:21.880 is actually at the place where the asteroid is going to cross it.
00:17:26.440 Indeed, scientists have done calculations showing that if you simply covered one half of the asteroid
00:17:31.480 in aluminium foil or white paint, the differential effect of sunlight on such an object would be
00:17:38.440 enough to push the asteroid one way or the other, such that it would no longer intersect
00:17:43.800 with the orbit of the earth. So there are solutions to this, and we know what these solutions are.
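Here is a toy illustration of why only a small nudge is needed. It ignores real orbital mechanics (which generally makes the job easier still, since small changes compound over many orbits) and just treats the deflection as a straight-line drift:

```python
# If a velocity change dv is applied t years before the predicted impact,
# the arrival point shifts by roughly dv * t in this simplified picture.
SECONDS_PER_YEAR = 3.156e7
EARTH_RADIUS_M = 6.371e6

lead_time_years = 90                   # the lead time from the example above
required_shift_m = 2 * EARTH_RADIUS_M  # aim for a comfortable miss

dv = required_shift_m / (lead_time_years * SECONDS_PER_YEAR)
print(f"Required velocity change: ~{dv * 1000:.1f} mm/s")
# => a few millimetres per second, which is why decades of warning matter
```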
00:17:47.880 And whatever the level of technology is today, it's going to be vastly greater in 10 years.
00:17:53.000 And if we know there's a one in three chance, 10 years from now, of the asteroid crashing into
00:17:58.040 the earth 90 years hence, then if we wait yet another 10 years we might find that our calculation
00:18:03.400 of one in three is different again; it might in fact be two in three. So it seems more and more
00:18:08.840 likely that indeed this asteroid is going to crash into the earth, in which case we will have to use our
00:18:13.720 technology to get up there, to get rockets up there. We'll have a huge global effort in order to push
00:18:18.920 this thing out of the way, and that chance would then change. It would no longer be two in three;
00:18:24.920 it would be two in three absent us doing anything, and we would do something. This is the way
00:18:31.080 in which critical rationalism and optimism in the style of David Deutsch deals with existential
00:18:37.640 risks: we have to have a stance of problem solving. We cannot calculate the probability of things
00:18:45.160 absent people, because people exist and people actually have effects in the world. This is why
00:18:51.080 many of us have a little giggle at things like the Doomsday Clock. The Doomsday Clock was a
00:18:56.920 serious endeavor, invented back in 1947 by atomic scientists who were very concerned, rightly
00:19:04.680 concerned, with global nuclear war. And so, in order to push home the point to the rest of society
00:19:12.760 that these weapons were terribly catastrophic, terribly worrisome, that the damage they could do could
00:19:19.480 indeed lead to the collapse of civilization if they were used just as conventional weapons were,
00:19:25.880 the Doomsday Clock was supposed to shock people into realizing how scary nuclear weapons were,
00:19:32.760 how seriously we should take the threat of nuclear war. Saying, for example, that
00:19:37.080 we're at five minutes to midnight was supposed to be a measure of how un-seriously the governments
00:19:42.680 of the world and other people in the world were taking the seriousness of
00:19:49.080 the threat of nuclear catastrophe: that via accident or indeed intention, the governments of the world
00:19:56.920 could precipitate a nuclear winter. They could precipitate something that no one in their right
00:20:02.760 mind would want to happen. And so this was the seriousness of the Doomsday Clock. But since then, of
00:20:07.960 course, the Doomsday Clock has been used for just about any kind of threat, and so all different
00:20:15.000 threats are now incorporated into the Doomsday Clock. So now scientists get up in front of the
00:20:20.600 cartoonish Doomsday Clock and even include things like climate change, climate change which
00:20:26.040 might wipe us out. But in the same way that it was always people who were in charge of whether or
00:20:32.360 not a global nuclear winter was going to happen after global nuclear war, it is still people who
00:20:39.560 are in charge of whether or not we're going to have catastrophic global climate change. And the
00:20:46.120 scientists and others who set the Doomsday Clock never consider that people's creativity
00:20:53.640 has a real effect, and that we should be optimistic about that. So instead of the minute hand getting
00:20:57.880 ever closer to midnight, it should be getting further and further away from midnight, and this is why
00:21:02.680 some of us giggle at it now. They're not taking seriously the notion that people are becoming more
00:21:08.920 moral, more risk-averse; we are more willing to engage in serious solutions to the most pressing
00:21:14.600 problems. So rather than getting closer to global catastrophic climate change, we are getting further
00:21:21.320 away, despite all the political noise, despite the concerns about whether or not particular
00:21:27.480 governments are enacting this or that policy. We are gaining the knowledge and the power
00:21:33.880 to literally change the thermostat of the planet. People already laugh at certain wealthy individuals,
00:21:40.840 like Bill Gates, I think was one, who suggested that we put a kind of aerosol into the atmosphere
00:21:47.320 in order to reflect some of the sunlight. Now, this may not be a good idea, but at least in principle
00:21:53.240 people who are wealthy enough are thinking about doing this kind of thing, and it might
00:21:58.120 be the case that one day we do something similar. I wouldn't suggest aerosols, because I don't know
00:22:03.080 that people have done sufficient scientific work to figure out what negative effects the aerosols
00:22:07.960 might have. We want, of course, a solution which fixes the problem of climate change without
00:22:12.680 causing as a side effect a whole bunch of problems that are even worse than climate change itself.
00:22:17.800 What kinds of global risks, of existential risks, are there? Well, I've just gone to the Wikipedia
00:22:23.800 page. It's titled Global Catastrophic Risk, which is where you get taken if you do a Google search
00:22:30.360 for existential risk: you end up at that Wikipedia page. And there is a section there on likelihood,
00:22:36.680 so let me read through some of these. These are the estimated probabilities for human extinction before
00:22:42.680 2100. Now, there are a whole bunch of respectable scientists and philosophers who make similar
00:22:49.240 calculations. Martin Rees is one: the great astrophysicist, the British cosmologist Martin Rees,
00:22:54.280 who wrote a book called On the Future: Prospects for Humanity in 2018, which talks about
00:22:59.320 all the ways in which we might die. And before that, in fact, he had Our Final Hour; that was back
00:23:04.680 in 2003. Both of these are about the potential for human beings going extinct, and so he makes some calculations,
00:23:14.120 some educated guesses, about the potential for humans being wiped out through either their own actions
00:23:21.400 or indeed inactions. So we're clobbered either way, and I think he's right to say that we would be
00:23:26.200 clobbered either way. But can he make such a prediction, such a probability assessment? Now, someone
00:23:33.560 else who speaks in these terms is of course Nick Bostrom, and Nick Bostrom has written books and papers;
00:23:39.640 he's probably one of the world's foremost thinkers on this topic of existential risk. He's
00:23:45.640 written a book called Global Catastrophic Risks; that's one of his books. He has serious academic
00:23:51.960 papers, for example Existential Risks: Analyzing Human Extinction Scenarios, which has been cited by
00:23:58.040 641 people; in terms of citations, that's pretty serious. We might have a look at that one shortly.
00:24:03.480 But there are other books as well, published just recently, one by Toby Ord; that book is called The
00:24:09.480 Precipice: Existential Risk and the Future of Humanity. Now, I put all of these ways of speaking about
00:24:16.040 existential risk into the same category as disaster movies, and I think that they sell really well,
00:24:22.920 and people are very excited about them, because it is thrilling. I honestly think it's
00:24:27.000 emotionally thrilling to go along to a disaster movie. I know I love it; I loved Deep Impact,
00:24:31.800 the movie all about an asteroid, or is it a comet? I think it was a comet coming to earth
00:24:36.520 and literally crashing into the earth, and you get to see the wonderful special effects
00:24:40.360 and consider all the ways in which civilization could be upended, and how impotent people are
00:24:45.240 in the face of some cosmological event like this. It's fun. But of course, in disaster movies,
00:24:51.560 people fail to solve the problem, and often they fail to solve the problem because they just don't
00:24:55.880 try hard enough. They just don't put the right effort into doing what needs to be done in time, because
00:25:01.320 otherwise there wouldn't be a movie. It wouldn't be much of a movie if the terrible event
00:25:05.480 didn't actually happen; it would be an anticlimax. But of course, in our world we want the anticlimax.
00:25:11.240 We do not want the end to come, so we are going to put in more effort than the people in
00:25:17.080 any disaster movie ever do before they end up failing. And before I get to the example probabilities from
00:25:22.600 the Wikipedia article, let me just skim through Nick Bostrom's professional, peer-reviewed,
00:25:29.000 well-cited article on this topic, and let's just go through specific examples of existential risks.
00:25:37.560 So he has different categories of existential risk that he calls bangs, crunches, shrieks and whimpers.
00:25:45.400 That is just his way of saying how quickly the extinction event is going to happen. A bang is
00:25:52.440 something that happens immediately, and there's nothing anyone can do about it: a huge explosion
00:25:57.400 across the entirety of the earth, and we don't have enough time to respond. All the way through to
00:26:02.280 a whimper, which is where some post-human civilization arises but evolves in a direction that leads
00:26:08.360 gradually but irrevocably to either the complete disappearance of the things we value, or to a state
00:26:14.120 where those things are realized to only a minuscule degree of what could have been achieved. In
00:26:18.520 other words, what he's hinting at there is the gradual takeover of humanity by some kind of artificial intelligence,
00:26:23.800 and that could happen so slowly and imperceptibly that by the time we recognize
00:26:30.520 the AI as a danger, it will be too late for us to do anything about it. Nick Bostrom is very
00:26:35.160 animated by this kind of science fiction scenario. But here are particular forms of bangs:
00:26:41.000 these include deliberate misuse of nanotechnology, nanotechnology which might itself cause
00:26:48.120 some sort of pandemic, some way of getting into our bloodstream and destroying our bodies.
00:26:54.280 Nuclear holocaust, that favourite of people over many decades now. The fact that we're living in a
00:27:01.160 simulation, perhaps, and it gets shut down: that could be a bang event that causes the end of
00:27:06.760 existence; this is a serious philosophical paper. Badly programmed superintelligence, which could indeed
00:27:12.360 take over much more quickly than people think. Genetically engineered biological agents.
00:27:19.160 One category is something unforeseen, and note what he says about this. This is a serious suggestion
00:27:27.000 for the way in which we might all go extinct, and we should consider it very seriously.
00:27:31.800 He says we need a catch-all category. It would be foolish to be confident that we have already
00:27:36.040 imagined and anticipated all significant risks. Future technological or scientific developments
00:27:41.080 may very well reveal novel ways of destroying the world. So that's interesting and I couldn't agree
00:27:45.640 with him more there. There could always be something unforeseen that could wipe us all out immediately.
00:27:52.120 But of course, with any of these claims about something unforeseen that could wipe us
00:27:56.680 all out immediately, something else unforeseen could happen: namely, the creative output of people, which could
00:28:01.480 solve that thing in ways which Nick Bostrom hasn't thought of, or which I haven't thought of.
00:28:06.600 Either you can go down the pessimistic route, the exciting, thrilling way of thinking that
00:28:11.800 Nick Bostrom could be right and we're all going to die at some point, which is a pessimistic way to
00:28:16.040 live your life; or you can go down an alternative route, where you think, yes, there are dangers out
00:28:21.160 there, but people, people are grand. Not exactly gods, but we share some features of what traditional
00:28:27.560 gods were like. We do have control over nature to a very large extent: we're able to
00:28:33.880 move matter, create knowledge and solve problems. We are not being blown around like leaves in the
00:28:40.920 wind. We actually have control. What else does Nick Bostrom have among his ways in which we might
00:28:46.280 all die? Plain old physical disasters: for example, a particle accelerator experiment might
00:28:53.080 produce something strange and unforeseen. This too is something that physicists considered:
00:28:58.200 when the Large Hadron Collider was switched on, people said it might create little black holes
00:29:02.120 which could swallow the earth. Of course, the particle physicists should have been talking to the
00:29:05.640 astrophysicists, because the astrophysicists knew that there were particles with much, much higher
00:29:10.920 energy than anything the LHC, the Large Hadron Collider, was producing, crashing into the
00:29:15.720 upper atmosphere, and that they had been crashing into the upper atmosphere ever since the earth began.
00:29:21.400 And because we knew that this had been going on for billions of years and hitherto had never
00:29:25.960 created any black hole, we know that this isn't going to happen. And so Nick Bostrom simply has the
00:29:30.920 physics wrong there. There's nothing that a particle accelerator can do artificially, man-made,
00:29:37.160 that is not already occurring in nature, even on the earth, in the upper atmosphere.
00:29:43.160 What else do we have on the list? Naturally occurring diseases: what if AIDS was as contagious as
00:29:47.400 the common cold? That could wipe us out. Finally he gets to asteroid and comet
00:29:51.880 impacts, runaway global warming, resource depletion or ecological destruction, and misguided world
00:29:58.680 government or another static social equilibrium that stops technological progress. I agree with
00:30:03.560 him there, and in fact I would say that a lot of Nick Bostrom's own solutions to these questions
00:30:10.600 of global catastrophic risk in fact involve stasis. They involve us not making further progress,
00:30:18.200 because progress is something that he regards as being particularly hazardous. So
00:30:23.160 there's an irony lurking here. And he continues: takeover by a transcending upload. I think he's
00:30:31.240 starting to repeat himself now; this is more about artificial intelligence taking over the
00:30:35.400 world. Another one, which is in a separate category: flawed superintelligence. I think that's
00:30:39.160 the same sort of thing. Repressive totalitarian global regime. Now he's really running out of
00:30:44.440 ideas; he really is repeating himself. Something unforeseen, again. Okay, he's really starting
00:30:49.720 to wind down. I'm down into the category which is labeled whimpers now. He mentions being killed by an
00:30:57.320 extraterrestrial civilization, so that makes the list as well; we have to watch out for that.
00:31:03.080 Okay, I think that will do for now. I'll leave behind Nick Bostrom's long, interesting list of
00:31:10.520 science fiction scenarios. They're basically interesting premises; I think that Hollywood producers could
00:31:15.720 do well reading through such a list and finding good directors to make stories out of those
00:31:22.760 ways in which we can die. I'd certainly watch some of those movies. But as for taking them
00:31:27.080 seriously in a scientific sense: well, of course we should take problems seriously, but it's the
00:31:32.200 unforeseen ones I'd rather be focused on. Not focused on by being fixated upon them, but focused
00:31:37.960 on to the extent that it means we need to have continued scientific research. We need to have
00:31:43.080 continued open, flourishing societies. We need to give ourselves the best opportunity to be able to
00:31:49.160 create the knowledge in time when these unforeseen things happen. So, back at Wikipedia, we have
00:31:56.440 the estimated probabilities for human extinction before 2100, and the source for this, the source for
00:32:03.000 this table in Wikipedia, is the Future of Humanity Institute, in something published in 2008.
00:32:09.080 And so what they say is: the chance of nuclear terrorism causing the extinction of human beings
00:32:15.560 before 2100 is 0.03 percent; a natural pandemic, 0.05 percent; a
00:32:24.200 nanotechnology accident, 0.5 percent; nuclear war, 1 percent; an engineered pandemic, 2 percent; all wars, including
00:32:34.120 civil wars, 4 percent; superintelligent AI, 5 percent; molecular nanotechnology weapons, 5 percent;
00:32:43.080 and the overall probability, taking all these things together and somehow summing them,
00:32:47.640 19 percent. So there's a 19 percent chance of humans going extinct before 2100.
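As an aside, you can't simply add numbers like these. Treated as independent risks, the combined chance is one minus the product of the survival probabilities. A quick sketch, using the figures as read out above:

```python
# Combine the per-cause extinction risks, assuming (dubiously) that
# they are independent: P(extinction) = 1 - prod(1 - p_i).
risks = {
    "nuclear terrorism": 0.0003,
    "natural pandemic": 0.0005,
    "nanotechnology accident": 0.005,
    "nuclear war": 0.01,
    "engineered pandemic": 0.02,
    "all wars, incl. civil": 0.04,
    "superintelligent AI": 0.05,
    "nanotech weapons": 0.05,
}

survive_all = 1.0
for p in risks.values():
    survive_all *= 1 - p

print(f"Naive sum of risks:    {sum(risks.values()):.1%}")  # ~17.6%
print(f"Assuming independence: {1 - survive_all:.1%}")      # ~16.4%
# Neither reproduces the quoted 19%, which appears to have been elicited
# as a separate overall estimate rather than computed from the parts.
```

Either way, it's the numbers themselves, not the arithmetic, that are doing the questionable work.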
00:32:56.120 There's a question that looms large here, isn't there: how do they know? How do they know any of this?
00:33:01.320 We could probably have a reasonable way in which we could calculate the frequency at which
00:33:10.040 asteroids collide with the earth, and how that frequency has decreased for good astrophysical reasons
00:33:16.520 over time. We know the reasons why the frequency has decreased, and nonetheless we
00:33:21.800 could indeed calculate, given what we know about the asteroids that exist in the solar system,
00:33:26.600 what the chance is of an asteroid crashing into the earth. I think the scarier asteroids, of course,
00:33:30.920 are the ones, and there's been a recent example, that come from well outside the solar system.
00:33:37.720 Goodness knows where these asteroids are coming from: other solar systems, or
00:33:42.360 parts of exploded stars, and so on and so forth, traveling far more quickly, at much higher
00:33:46.760 velocity, than anything within the solar system. So if these things start heading towards the earth,
00:33:51.640 well, that's a bit of a worry. I don't know how you mitigate against those. Some of these things
00:33:55.800 might have relativistic speeds: something that is coming from a supernova,
00:34:00.760 traveling at half the speed of light, traveling across the entire galaxy, in fact
00:34:05.800 possibly from another galaxy, there's a scary option to consider. It could crash into the
00:34:09.960 earth and we probably wouldn't see it coming. That would be one of those bang events, quite literally
00:34:14.760 a bang. But I don't think it's worth
00:34:18.040 worrying about. I mean, if you seriously tried to calculate the probability of something like that
00:34:22.360 intersecting with the earth, you'd probably need to wait many times the lifetime of the universe
00:34:28.840 before an event like that actually happened, given the size of the universe and the small
00:34:34.760 size of such an asteroid. Okay, but asteroids: at least we have the potential of trying to come up
00:34:40.440 with the number of asteroids in the solar system, or indeed, you know, in the universe; we could
00:34:45.480 probably have some sort of estimate of that, and of the chance of any of those intersecting with
00:34:51.000 the orbit of the earth. But when it comes to something like nuclear terrorism, putting the number
00:34:56.120 at 0.03 percent, how on earth is that done? This is an in-principle impossible number to use. It's an
00:35:05.000 in-principle impossible number to calculate. Why? Because nuclear terrorism is predicated upon
00:35:12.680 bad ideologies, bad ideas: people who have some ridiculous idea about how, perhaps,
00:35:20.120 if they were to wipe a whole bunch of people out, you know, with nuclear terrorism, they're going to
00:35:25.000 receive some metaphysical reward: you know, they're going to be welcomed into heaven by the creator
00:35:29.240 of the universe, who wants them to do something like that. Now, I don't know what the chances are of
00:35:33.720 such ideologies taking hold on the planet. I'd like to think that as time goes on, the number of
00:35:40.920 people subscribing to such ridiculous ideas actually decreases over time. People become more
00:35:46.760 enlightened, more moral; that's been the lesson of the history of humanity. But I don't know
00:35:52.600 that in 10 years everyone won't have converted to some terrible fundamentalist form of worshipping
00:36:00.200 the great spaghetti monster. And perhaps these Pastafarians, as they're called,
00:36:05.640 become genocidal and just want to wipe out all of humanity. The chance of that happening
00:36:11.400 is not knowable. We don't know, because people are creative, and they can create terrible ideas.
00:36:17.320 And those terrible ideas can lead to things like terrorist acts. And in theory, those terrorist acts
00:36:24.680 could be so severe, they could involve nuclear weapons, that they could actually wipe out large
00:36:29.720 portions of humanity. But one reason to think that this is not as likely as people think
00:36:35.960 is that the good guys have an advantage. The Pastafarians, the
00:36:41.400 fundamentalists, the people who worship the great spaghetti monster,
00:36:47.160 are always at a disadvantage because, number one, they're very, very focused on ensuring the
00:36:51.720 purity of their religion, of Pastafarianism. And because they're so focused on the purity of being
00:36:58.600 a Pastafarian, they're not focused on how to get past the security of an open, dynamic
00:37:07.720 society. And the governments of open, dynamic societies are very, very focused and very, very
00:37:13.320 skillful and knowledgeable in trying to prevent terrorists from ever succeeding. And they do
00:37:18.200 thwart the terrorists. It's not perfect; of course, now and again, the terrorists actually do win.
00:37:23.080 But by and large, people who are terrorists are not all that bright. They are at a disadvantage.
00:37:28.040 Now, it's not to say that terrorists are uneducated. We know very well; Sam Harris speaks
00:37:32.360 eloquently on the fact that there is a certain breed of terrorist who is highly educated,
00:37:36.440 highly educated indeed. But this is the exception rather than the rule. Usually, people who become
00:37:44.280 better at their understanding of science also become better at their understanding of at least
00:37:50.200 a folk form of philosophy. They become better, therefore, in terms of reasoning in the moral
00:37:55.400 sphere too. They generally just become better people and less likely to be captured by terrible
00:38:02.040 ideologies. Now, it's not to say that it's impossible. We know of famous examples where, indeed,
00:38:07.080 terrorists have been among the most educated people in society. But this will become
00:38:11.560 diminishingly unlikely to be the case. So how do you calculate something like
00:38:16.840 the chance of nuclear terrorism wiping out humanity before 2100? This is called
00:38:22.040 mathematicism. It's a form of scientism, really. It's applying statistical methods in places
00:38:27.960 where one has no business applying them, because they cannot possibly account for all the ways
00:38:33.640 in which people would use knowledge creation, their personal creativity in order to mitigate
00:38:38.120 the chance of this terrible thing happening. Now, the chance of superintelligent AI wiping us
00:38:43.000 out by 2100, they said, was 5 percent. But what is the chance of superintelligent AI arising at all
00:38:47.960 within the next century? I don't know. How do they know? We don't know anything about what it
00:38:53.320 takes to produce superintelligent AI, much less the chance of superintelligent AI
00:38:59.400 actually being an existential threat to humanity. And even then, whatever the chances of
00:39:04.520 superintelligent AI arising, how do we know that superintelligent AI is not going to
00:39:11.000 make it less likely that superintelligent AI wipes out humanity? Maybe there
00:39:16.040 will be many such superintelligent AIs and, just like people, maybe some of them will be
00:39:20.840 genocidal and want to wipe out humanity, but maybe they will be in the severe minority of
00:39:25.960 superintelligent AIs. So maybe there will be these other AIs who are good and who will keep in check
00:39:32.200 the bad superintelligent AIs. So I don't know how we calculate the probability of any of this
00:39:38.440 happening. It seems a little absurd. They say that wars have a 4 percent chance of wiping us out before 2100.
00:39:46.440 But the number of wars keeps on decreasing, and certainly the number of large-scale wars keeps
00:39:51.400 on decreasing. As trade increases, people realise that everyone's boat rises with the same
00:39:58.280 tide. And although at the moment, for example, there is much friction between, let's say, China and
00:40:04.440 the United States, it's not like China and the United States are on the brink of nuclear war.
00:40:10.200 Nor do I think they're anywhere near it. I think they're as far from having a nuclear war as any two
00:40:15.800 countries have been over the existence of nuclear weapons. I don't think the war's about to become
00:40:21.720 hot. The worst thing that the Chinese can do to their so-called opponents over in the US is to
00:40:28.680 have a trade embargo, to affect trade. This is what the Chinese are doing in order to affect
00:40:34.760 the United States, and people in the United States are far more concerned about that; and indeed
00:40:38.680 the people in China are far more concerned about that too. Everyone wants to make money.
00:40:43.080 Everyone wants to be healthier, wealthier, and wiser. They want to learn more. They want to make
00:40:48.760 more widgets. They want things to be better for the people that are around them.
00:40:53.000 They're not so interested in taking over more territory. Modulo, of course, yes, the Chinese are
00:40:58.680 having issues over Taiwan; they're having issues in the seas around Southeast Asia. Yes, these
00:41:05.800 political things happen. But it really is nothing like the Second World War. There's nothing
00:41:11.480 about China, for example, that resembles anything like Germany at the beginning of the Second
00:41:17.560 World War, just deciding to take over one territory after another. China knows that the free
00:41:24.120 world is arrayed against them when it comes to taking over land. The free world remains far more
00:41:30.600 powerful than China. There is a huge alliance of people: Australia, India, the United Kingdom,
00:41:37.720 the United States, Canada, Europe, just about all of these countries. These free democracies
00:41:44.120 are arrayed against totalitarian regimes who might decide to encroach upon the territory of smaller
00:41:50.920 nations. And insofar as things rise to anywhere near the level of true conflict,
00:41:57.560 that true conflict is played out diplomatically. It may be played out, at worst, in terms of trade;
00:42:04.280 as I say, sanctions might be put on one country or another. There might be mean things said in
00:42:11.080 the political sphere, but there does not seem to be a path to global war. So not
00:42:18.520 only is there not a path towards global war, there's not a path towards global war such that it
00:42:22.920 would ever wipe out humanity. There's no 4 percent chance of wars wiping out civilization before 2100.
00:42:31.400 Not only is this not known, but of all the explanations of the ways in which nations interact
00:42:37.400 in the modern era, one with another, the last thing any of them want to do is go to war. No matter
00:42:43.640 how bellicose they happen to be, no matter how much they talk big, it is all talk, it is all talk.
00:42:49.800 And so it's kind of laughable that here we have an attempt at a calculation. But how?
00:42:55.320 This number is plucked out of the air. Now, I'm just clicking on the reference
00:42:59.960 to this table, and I'm taken to another Wikipedia page: the Future of Humanity Institute at Oxford
00:43:06.120 University. Now, why should I be unsurprised? The Future of Humanity Institute.
00:43:14.600 It turns out I didn't know who these people were; I should have guessed: Oxford University.
00:43:21.880 Its director is the philosopher Nick Bostrom, and its research staff and associates include
00:43:28.040 Toby Ord from the table. So here are all the people that are in charge of this. Oh, and here we go:
00:43:40.680 the Future of Humanity Institute. The logo is there, a lovely diamond logo; it looks like
00:43:46.360 it could be a logo for a bank. But the introduction says that, sharing an office
00:43:53.080 and working closely with the Centre for Effective Altruism, the Institute's stated objective
00:43:57.960 is to focus research where it can make the greatest positive difference. I'm sorry, this is very serious,
00:44:06.360 this is about existential risk; I can't laugh about this. But I should have guessed that
00:44:12.040 they're associated with the Centre for Effective Altruism. Now, I've written and spoken about
00:44:17.640 Effective Altruism before. Effective Altruism is, I would say, effectively socialism. But,
00:44:23.880 putting that aside, they want to make a positive difference for humanity in the long term. Well,
00:44:29.560 the positive difference they want to make of course comes out of the deepest form of pessimism.
00:44:35.400 So if you can extract positivity
00:44:40.680 out of pessimism, well, excellent, wonderful, more power to them. But I think there's a better way.
00:44:47.880 I don't think we need to start with a premise about all the ways in which we are possibly going to die,
00:44:55.560 and from there somehow figure out ways in which to, well, let's be clear, control society,
00:45:02.920 and to come up with solutions which, no doubt, are about trying to have these philosophers,
00:45:08.760 these philosopher kings, advise governments on what kind of policies will best redistribute wealth
00:45:15.960 and control people, because that is almost always what their prescriptions are. Now,
00:45:23.320 I shouldn't make light of it. I think that we can come together. I think that optimists in
00:45:28.200 the David Deutsch sense, or to some extent the Steven Pinker sense, or to some extent the Matt Ridley
00:45:35.000 sense, can indeed find common cause here. We can indeed find common cause with what I would
00:45:41.320 regard as the purest pessimists that I know of in academia: the people who are very, very,
00:45:48.360 very fixated upon this notion of existential risk, upon the ways in which people are going to
00:45:55.160 destroy themselves, either through creating technology (technology is always a thing that these
00:45:59.960 people worry about: the AI is going to not only take our jobs, it could potentially wipe us out,
00:46:04.040 so that's our actions doing things; or nuclear weapons, our actions doing things) or through our inaction:
00:46:09.480 for example, not doing the thing that we need to do in time to prevent the global catastrophe,
00:46:15.240 not solving the global pandemic when it comes, not doing sufficient
00:46:21.640 things about climate change in order to prevent catastrophes in the economy. The prescriptions
00:46:27.880 that are on offer, and we won't go through the prescriptions, this podcast is already
00:46:31.160 far longer than I ever intended it to be, usually do come down to some
00:46:37.880 kind of social control. And although I have had serious disagreements with the ways in which
00:46:43.880 governments around the world have handled the pandemic, one thing you can take away from this
00:46:49.080 is that people do want to try and solve a problem. Once it is apparent that the problem
00:46:54.520 is affecting their lives, their personal lives, they are willing to do what it takes:
00:46:59.080 when the government says to lock down, they lock down. There wasn't
00:47:03.960 huge civil unrest. There were some protests here and there,
00:47:08.760 and I think it's healthy that those sorts of things happened. And it wasn't like
00:47:13.160 there was a huge movement against developing a vaccine; quite the opposite: people worked on making
00:47:19.320 multiple vaccines faster than it had ever been done before. The lesson of this cannot go unnoticed:
00:47:25.000 we do not need a stance of pessimism. We do not need to think that the only option here
00:47:30.520 is completely upending the global economy and free trade and freedom and democracy
00:47:36.760 in order to go down a totalitarian route, which is what these people typically suggest.
00:47:41.720 Instead, we need more of what it took to do what we did, which was freely enabling
00:47:48.440 researchers to come together to come up with solutions, in this case, of course, the very
00:47:53.320 fast production of a vaccine. Where did that come from, by the way? Where did these vaccines
00:47:57.800 come from? Two places: the United States and Great Britain, two of the freest countries in the
00:48:03.720 world. Even the technological powerhouses of Asia were not able to do it as quickly. And why?
00:48:10.280 It's purely down to freedom, and the way in which people think, or are willing to think and
00:48:16.040 criticize ideas. It's not an accident that Oxford University was one of the places where
00:48:23.080 a vaccine was produced most quickly, despite the fact that that's where Nick Bostrom sits.
00:48:29.320 I don't know what Nick Bostrom might have been arguing if he ever had a chance to cross
00:48:34.040 paths with one of these vaccine researchers: perhaps that they should be very careful about producing the
00:48:37.960 vaccine; after all, who knows what terrible existential threat a vaccine could be if it wasn't
00:48:42.840 perfectly tuned to dealing with the coronavirus; maybe the vaccine could cause all sorts of problems.
00:48:47.960 And of course there are people like this, by the way. Of course there are the anti-vax people who
00:48:51.640 think that the cure is worse than the disease itself, or think the cure is some sort of way for
00:48:58.200 the government to get into your bloodstream, and silly stuff like that. So there will always be
00:49:02.920 pessimists and naysayers, and all the rest of us can do, I suppose, is remain positive in the
00:49:09.080 face of having existential risks listed. We don't need to analyze them mathematically, for the
00:49:15.800 most part. What we need to do is to have a stance not merely of focusing on any specific problem, but
00:49:22.280 of generally creating more knowledge in all areas. And that's why we need to continue to fund
00:49:28.520 basic science, basic research at the most fundamental level, because the deeper our scientific
00:49:33.720 understanding, the more widely those solutions can be applied to practical applications.
00:49:40.040 Okay, I think that will do for today. This was largely off the cuff, as I say, but if you want to
00:49:45.720 read more, simply type in global existential risk, or just existential risk, and perhaps
00:49:51.000 Nick Bostrom in there as well. He is a serious philosopher, and despite the fact that I laugh, I think
00:49:55.560 we need to laugh sometimes, because I think the topic is dealt with in a far too serious way, when
00:50:04.600 there are many things to be concerned about: particular problems of, let's say, for example,
00:50:09.880 local poverty, disease, people dying too early. This kind of thing is actually going on right
00:50:17.720 now, all the time. We don't need to be fixated upon how the entire planet, all simultaneously,
00:50:23.800 might get wiped out by the grey nanotechnology goo. Okay, until next time, bye bye.