00:00:09.160 because there are many things that we take for granted today
00:00:19.800 So I thought, well, if I can do some of those things,
00:00:26.480 then that's like magic and that would be really cool.
00:00:29.280 And I was in sort of a slight existential crisis
00:00:34.680 because I was trying to figure out what does it all mean?
00:00:38.680 And I came to the conclusion that if we can advance
00:00:43.240 the knowledge of the world, if we can do things
00:00:46.520 that expand the scope and scale of consciousness,
00:00:50.560 then we're better able to ask the right questions
00:00:59.680 Welcome to ToKCast, part two of chapter nine, Optimism.
00:01:02.800 That was Elon Musk who, of course, is just getting on
00:01:16.520 there's a real focus on optimism as applied to people,
00:01:22.080 People and knowledge creators, and we just heard from Elon
00:01:28.720 That's us; that's our nature, if anything is our nature,
00:01:32.400 We don't all have to be creating knowledge about rockets
00:01:36.840 We might just be creating knowledge in our own lives
00:01:38.840 about how to best have tomorrow be better than today.
00:01:53.200 between our civilization and pre-enlightenment civilizations.
00:02:05.000 to a post-enlightenment one, skipping a little bit,
00:02:08.800 and David writes, in the case of our civilization,
00:02:15.640 Since our civilization has not been following it,
00:02:25.640 And such a change has never been successful before.
00:02:30.160 So a blind pessimist would have to oppose it on principle.
00:02:37.080 just for a little bit of exposition on just that bit,
00:02:50.680 the proposed principle says avoid making big changes
00:02:54.720 because they're dangerous, avoid making progress
00:03:04.800 which is the very society that's undergoing big changes.
00:03:12.560 And David writes, this may seem like logic chopping,
00:03:27.080 unknowable things about the future of knowledge.
00:03:31.080 our best knowledge contains both truth and misconception,
00:03:36.760 is always the same as prophetic optimism about the other.
00:03:39.720 For instance, Rees's worst fears depend upon the
00:03:42.320 unprecedentedly rapid creation of unprecedentedly powerful
00:03:45.280 technology, such as civilization-destroying bioweapons.
00:03:49.440 If Rees is right that the 21st century is uniquely dangerous,
00:03:58.000 Our Final Century mentions only one other example
00:04:05.600 Yet by that standard civilization must already have had
00:04:07.960 a similarly narrow escape during the Second World War.
00:04:20.040 and had plans to use it against the United States.
00:04:22.640 Many feared that even a conventionally won victory
00:04:25.080 by the Axis powers could bring down civilization.
00:04:32.960 Though as an optimist, he worked to prevent that.
00:04:42.800 because they considered civilization to be already doomed.
00:04:46.080 So that would make it three narrow escapes in a row.
00:05:01.560 He had calculated that the exponentially growing population
00:05:06.320 of various technological and economic improvements,
00:05:08.360 was reaching the limit of the planet's capacity
00:05:14.000 He believed that he had discovered a law of nature
00:05:18.360 First, the net increase in population in each generation
00:05:38.000 It is not proportional to whatever the population
00:05:46.760 And argued that population, when unchecked, increases in a geometrical
00:05:52.320 ratio while subsistence increases only in an arithmetical ratio; a slight acquaintance with numbers,
00:06:00.520 His conclusion was that the relative well-being
00:06:03.120 of humankind in his time was a temporary phenomenon,
00:06:05.840 and that he was living at a uniquely dangerous moment
00:06:14.840 and on the other starvation, disease, murder, and war.
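Just to make those two ratios concrete, here is a minimal worked example, with illustrative numbers of my own rather than anything from Malthus or from the book: let population double each generation while the food supply grows by a fixed increment,

\[ P_n = P_0 \cdot 2^n \ \text{(geometrical)}, \qquad F_n = F_0 + n\,c \ \text{(arithmetical)}. \]

With \(P_0 = F_0 = 1\) and \(c = 1\), after ten generations the population has multiplied by \(2^{10} = 1024\) while the food supply has reached only \(11\): the "immensity of the first power in comparison of the second" that a slight acquaintance with numbers shows.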
00:06:43.080 who've made similar prophecies of doom over the decades.
00:06:48.520 There's people who appear in the media all the time,
00:06:59.640 If you want to read some more criticism of Malthus
00:07:17.840 And alternatives to it, following David, of course,
00:07:25.680 or as I like to say, pessimism broadly speaking,
00:07:39.440 or other forms of authoritarianism or anti-natalism
00:07:42.400 and certain kinds of effective altruism and so on.
00:08:12.840 and they think that we need to have some form of redistribution.
00:08:31.960 I always get suspicious when altruism doesn't begin
00:08:35.960 with either one's self or with one's local community,
00:08:39.560 but is then focused on these massive global things.
00:08:44.440 Because the more distant you get from the problem
00:08:47.680 you're trying to solve, the less you tend to know about it.
00:08:50.480 Will MacAskill, the philosopher, one of the people
00:08:52.960 who are credited with coming up with a lot of the philosophy,
00:09:06.600 seemed like a good idea at the time to help people
00:09:14.800 so they invented this thing called the PlayPump
00:09:16.520 that kids could jump on and play like a merry-go-round
00:09:19.840 and push around, and it would pump water as well.
00:09:22.720 As it turned out, the PlayPump is a terrible idea
00:09:39.280 The lesson here is that
00:09:44.560 if you yourself are not experiencing the problem
00:09:55.280 is to simply give them money, to give them cash.
00:09:57.440 If you're going to force a solution onto anyone,
00:10:06.560 The actual solution, of course, to people in poverty
00:10:09.520 is to trade with them, is to find something of value
00:10:20.960 and given the opportunity, they will sell you something.
00:10:25.320 And so this is the way, this idea of having free trade
00:10:33.800 who themselves aren't wealthy and then to improve their lot.
00:10:45.160 Top down solutions tend to do quite the opposite.
00:10:48.880 They're pessimistic ideas about how people's creativity
00:10:53.880 doesn't have sufficient value that we would like to invest in
00:10:57.640 or that we, more wealthy people would like to invest in,
00:11:06.840 Malthus had accurately foretold the one phenomenon,
00:11:17.160 In 1798, the forthcoming increase in population
00:11:19.560 was more predictable than the even larger increase
00:11:21.560 in food supply, not because it was in any sense
00:11:23.840 more probable, but simply because it depended less
00:11:26.160 on the creation of knowledge. By ignoring that structural
00:11:38.880 into believing that they had discovered an objective asymmetry
00:11:50.560 they all thought they were making sober predictions
00:11:58.080 of the human condition that we did not yet know
00:12:08.480 who thinks they can extrapolate based upon the best
00:12:14.240 ignores that we can't know what we have not yet
00:12:18.360 And so it wouldn't matter if you've got a wonderfully
00:12:31.760 all the way back in 1798, can lead you into gross error.
00:12:53.320 It's not Bayesian reasoning, it's Bayesian unreason.
00:12:59.400 And I just want to go off on a little tangent about that as well.
00:13:04.480 This would be a very narrow genre of epistemology.
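Just for reference on that tangent: Bayes' theorem itself is the standard identity

\[ \Pr(H \mid E) = \frac{\Pr(E \mid H)\,\Pr(H)}{\Pr(E)}, \]

which says to update one's credence in a hypothesis \(H\) in proportion to how well it predicted the evidence \(E\). The theorem is uncontroversial as a piece of probability; the criticism here is of using that updating rule as a universal account of how knowledge grows, since it can only redistribute probability over hypotheses that have already been conjectured.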
00:13:20.280 and I'm solving a particular problem in front of you
00:13:23.240 So if we don't get the answer within the next few minutes
00:13:25.240 or hours or, you know, at most days, the patient might die.
00:13:29.720 It's important we distinguish between wild guessing,
00:13:35.280 if we're trying to figure out what's wrong with us medically.
00:13:39.400 And educated guesswork, which is what the experienced doctor
00:13:48.480 is really about when we have competing good explanations.
00:13:55.280 We want to know the cause so we can find the appropriate treatment
00:14:18.720 It's a time horizon within which knowledge can be created
00:14:24.520 And lots of knowledge can be created in years and decades,
00:14:27.560 the time in which climate change is supposed to happen
00:14:39.120 who has just come into the emergency department.
00:14:43.920 has just turned up at the emergency department.
00:14:49.000 because no new knowledge is going to be created.
00:14:52.160 No new explanatory theory is going to be sent down
00:14:57.040 by the oncologists or by the medical scientists
00:15:01.720 that is going to change the decision of that doctor
00:15:04.400 within the next couple of minutes, extremely unlikely.
00:15:10.320 that are trying to figure out what's going to happen
00:15:17.560 years to decades, centuries, a time during which
00:15:21.880 the educated guesswork can be completely undone.
00:15:37.360 over the next few years that can completely change
00:15:43.360 Neither Malthus nor Rees intended to prophesy.
00:15:46.960 They were warning that unless we solve certain problems
00:15:51.560 But that has always been true, and always will be.
00:15:59.040 all of our sister species, such as the Neanderthals,
00:16:13.000 which reduced its total numbers to only a few thousand,
00:16:15.440 being overwhelmed by these and other kinds of catastrophe
00:16:37.160 as the natural disasters of drought and famine.
00:16:40.160 But it was really because of what we would call
00:16:43.600 In other words, lack of knowledge, skipping a bit now.
00:16:50.400 If a one kilometer asteroid had approached the Earth
00:16:52.640 on a collision course at any time in human history
00:16:56.360 it would have killed at least a substantial proportion
00:17:09.680 from such impacts, which occur once every 250,000 years or so.
00:17:24.760 of an asteroid impact than in an airplane crash.
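As a rough back-of-envelope check on that comparison, using illustrative assumptions of my own rather than figures from the book: suppose a civilization-threatening impact arrives about once every 250,000 years and would kill, say, half of the people alive. Then an individual's annual risk of dying that way is roughly

\[ \frac{1}{250{,}000} \times \frac{1}{2} = 2 \times 10^{-6}, \]

or about \(1.5 \times 10^{-4}\) over a 75-year lifetime, which is indeed in the same range as, or larger than, the lifetime odds of dying in an airplane crash for a typical person who flies rarely.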
00:17:34.600 Civilization is vulnerable to several other known types
00:17:39.160 For instance, ice ages occur more frequently than that,
00:17:46.520 that they can happen with only a few years' warning.
00:17:48.480 A supervolcano, such as the one lurking under Yellowstone
00:17:50.960 National Park, could blot out the sun for years at a time.
00:17:56.560 using artificial light and civilization could recover.
00:17:59.320 Many would die, and the suffering would be so tremendous
00:18:04.480 almost as much preventative effort as an extinction.
00:18:08.160 We do not know the probability of a spontaneously occurring
00:18:16.160 Since pandemics such as the Black Death in the 14th century
00:18:30.760 We have a chance because we are able to solve problems.
00:19:24.680 This is what is astonishing with the peculiar way
00:19:26.720 of thinking that animates people that are engaged
00:19:31.880 Rather, so often, the focus is simply on this so-called
00:19:40.400 to curb wealth creation, the very thing that can help us
00:19:43.480 with the problems that we don't yet know about.
00:19:47.240 Wealth to many people is, of course, a dubious thing.
00:20:05.720 But the assumption is that the problems we have now
00:20:08.560 are the biggest problems we will have tomorrow.
00:20:12.200 And the heart of the problem, rather too often,
00:20:18.320 And all of that is, in fact, the biggest problem.
00:20:21.280 This idea that we are the cause of the biggest problems
00:20:37.200 we need so much more of it to help fund people's creativity.
00:20:42.040 So we can engage with these problems we haven't foreseen
00:20:52.880 because we have slowed down our wealth creation
00:20:59.360 Any amount of slowing wealth creation and progress
00:21:14.160 It's sort of this idea from Thomas Hobbes almost
00:21:32.400 as we've polluted the earth, polluted this planet.
00:21:38.360 because people just exist in a state of suffering
00:21:48.000 The anti-natalists, I think, are like the environmentalists.
00:21:50.320 They all think they're on the side of morality.
00:21:57.680 But morality is really, as David says elsewhere
00:22:39.560 and then there is absolutely no problem at all.
00:22:49.760 Whereas I think that people are a source of progress
00:22:57.760 So those who want to curb human life in one way
00:23:11.880 error correction or perhaps stopping error correction
00:23:17.880 But given that people are the means of error correction,
00:23:27.320 Okay, skipping a little bit and then David writes,
00:23:33.960 if we cannot derive them from our best existing knowledge
00:23:43.160 Like scientific theories, policies cannot be derived
00:23:46.800 They are conjectures and we should choose between them
00:23:51.160 but according to how good they are as explanations,
00:23:58.200 End of the idea that knowledge is justified true belief.
00:24:01.280 Understanding that political policies are conjectures
00:24:03.680 entails the rejection of a previously unquestioned
00:24:07.440 Again, Popper was a key advocate of this rejection.
00:24:10.640 He wrote, and this is one of my favorite passages
00:24:22.160 In fact, it really does form the core of Popper's epistemology.
00:24:36.840 The question about the sources of our knowledge
00:24:54.840 I propose to assume instead that no such ideal sources exist.
00:25:09.360 Therefore, the question of the sources of our knowledge
00:25:24.560 The question, how can we hope to detect and eliminate error
00:25:35.040 for human decision making as it is for science.
00:25:41.640 For example, explanations of what has gone wrong,
00:25:46.040 what effect various policies have had in the past
00:26:06.920 The misconception that evidence can play no legitimate role
00:26:11.680 Objective progress is indeed possible in politics,
00:26:14.240 just as it is in morality generally and in science.
00:26:27.360 or priests or a dictator or a small group or the people
00:26:38.920 How does one ensure an informed and responsible electorate?
00:26:41.880 Just pause there, of course, there's an article
00:26:46.080 that David and I like to tweet rather regularly
00:27:05.640 how to most easily remove rulers without violence
00:27:17.800 I know the same discussion goes on in the United Kingdom
00:27:20.160 and you hear it from certain people in the United States as well
00:27:24.840 about how we need more scientifically literate politicians.
00:27:30.040 We need politicians who understand science more.
00:27:38.440 The person who should rule should have some sort of scientific
00:27:40.840 understanding and I don't understand that at all.
00:27:48.240 if you listen to the RSA discussion between Martin Rees
00:27:52.280 and David Deutsch, it was heartening to hear, actually,
00:27:55.200 that they both agree, and Martin Rees talked about how
00:28:03.240 for politicians to have a background in science.
00:28:06.920 He would much rather they have a background in, let's say,
00:28:10.720 history and I suppose if I had a bias one way or the other,
00:28:25.920 is rooted in the same misconception as the question,
00:28:28.440 how are scientific theories derived from sensory data,
00:28:35.200 It is seeking a system that derives or justifies
00:28:40.120 from an existing idea, such as inherited entitlements,
00:28:44.680 The same misconception also underlies blind optimism and blind pessimism.
00:28:47.960 They both expect progress to be made by applying a simple rule
00:28:50.880 to existing knowledge to establish which future possibilities
00:28:53.880 to ignore and which to rely on. Induction, instrumentalism,
00:28:57.520 and even Lamarckism all make the same mistake.
00:29:02.640 They expect knowledge to be created by fiat, with few errors,
00:29:07.920 that is making a continual stream of errors and correcting them.
00:29:22.120 and I'll just point people to, of course, the book
00:29:24.920 and to Popper's important essay in The Economist as well.
00:29:30.000 I'll have the link there at the bottom of this video
00:29:38.200 But the central point here that David is making
00:29:45.320 is that democracy is about trying to avoid violence.
00:29:49.520 It's about this political system of nonviolence.
00:29:57.720 then they should be kept there by some sort of force.
00:30:03.760 And so the 'who should rule?' question, as Popper says,
00:30:13.040 and has often received them, as David has written there
00:30:21.680 where David writes, Popper therefore applies his basic,
00:30:28.520 how can we rid ourselves of bad governments without violence?
00:30:36.000 So a rational political system makes it as easy
00:30:42.480 and to remove them without violence if they are.
00:30:44.680 So there's this real philosophy of nonviolence.
00:30:56.680 And I've had some disagreements with people over the years
00:30:58.800 about to what extent Popper believed in coercion.
00:31:04.880 it depends in some senses, he seems to have been
00:31:16.240 what Popper actually thought I'd rather concentrate
00:31:23.040 what is the place of force and coercion in politics, if any?
00:31:28.200 Well, it's remarkable to me how many people seem
00:31:36.920 But I'm not just talking about violence in response to violence.
00:31:41.840 you need some kind of police force to stop that violence
00:31:48.640 or some, you yourself take personal responsibility
00:32:12.320 just to see how optimistic they are on this point.
00:32:19.440 Stephen Pinker, probably the so-called most prominent,
00:32:28.800 with David Deutsch has been labeled an optimist.
00:32:35.320 universal optimists in the way David Deutsch is.
00:32:42.000 not to mention everyone else who's not an optimist,
00:32:47.560 they believe in the initiation of force by the state
00:32:55.360 You know, they think there's some kind of inherent evil
00:32:57.640 lurking there in people that needs to be tamed and controlled.
00:33:03.160 it's the norm, I think, but he, like many others, seems to
00:33:08.280 think that what evolution has written onto our blank slates
00:33:11.760 before birth, really is quite bad in many ways.
00:33:21.120 And so we've got these genes for rape and genes
00:33:51.880 without some sort of imposition of power from the top,
00:33:56.080 people will fall into violent anarchy and tyranny
00:33:58.160 because their genes are compelling them into violence.
00:34:06.560 It's been kept under control by a certain amount of violence.
00:34:10.160 The state is built on this kind of philosophy of violence.
00:34:13.160 It's certainly built on a philosophy of pessimism,
00:34:20.480 people do rebel against the state and the controlling state.
00:34:24.560 Especially when they feel that the only way to rebel
00:34:33.000 viewed people as inherently immoral in some way.
00:34:44.000 it's not just Christianity here; certain Asian cultures
00:34:54.040 saw people as immoral as well.
00:34:57.920 And so the idea there is that you need the strong state
00:35:10.560 There's something wrong with people inherently.
00:35:25.000 And this is a pessimistic view of what humans are
00:35:54.280 But it also assumes that improving upon them is possible.
00:36:05.160 but that when it does, it will be an opportunity
00:36:13.040 and yet there's still much more to read and to comment upon.