The Saad Truth with Dr. Saad - October 27, 2025


My Inaugural Chat with Elon Musk (The Saad Truth with Dr. Saad_905)


Episode Stats

Length

36 minutes

Words per Minute

155.86

Word Count

5,754

Sentence Count

164

Misogynist Sentences

9

Hate Speech Sentences

9
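
The words-per-minute figure follows directly from the word count and runtime. A quick sanity check (assuming the stated word count and WPM are exact, and the "36 minutes" length is a rounded figure):

```python
# Sanity-check the episode stats: WPM = word count / duration in minutes.
word_count = 5754
wpm = 155.85753  # stated words per minute

duration_min = word_count / wpm  # implied runtime in minutes
print(round(duration_min, 2))    # ~36.92, consistent with the "36 minutes" figure

# Rate of flagged sentences out of the 164 total:
flagged_rate = 9 / 164
print(round(flagged_rate * 100, 1))  # ~5.5% of sentences flagged by each classifier
```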


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode, Elon Musk joins me to talk about my book, The Parasitic Mind, and why we should all be worried about the growing threat of the woke mind virus that threatens to destroy civilization.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
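
The per-sentence counts above come from running each transcript sentence through a text-classification model. A minimal sketch of that counting step is below; the Hugging Face model IDs are the real ones named above, but the `count_flagged` helper and the stand-in classifier are illustrative (label names vary per model, so a real run should inspect the model's config first).

```python
from typing import Callable, Iterable

def count_flagged(sentences: Iterable[str],
                  classify: Callable[[str], str],
                  flag_label: str) -> int:
    """Count sentences whose predicted label equals flag_label."""
    return sum(1 for s in sentences if classify(s) == flag_label)

# With Hugging Face transformers installed, a real classifier could look like:
#   from transformers import pipeline
#   clf = pipeline("text-classification",
#                  model="facebook/roberta-hate-speech-dynabench-r4-target")
#   classify = lambda s: clf(s)[0]["label"]
# (Check clf.model.config.id2label for the exact label strings.)

# Stand-in classifier so this sketch runs without downloading a model:
def dummy_classify(sentence: str) -> str:
    return "hate" if "slur" in sentence else "nothate"

sentences = ["A harmless remark.", "A sentence containing a slur."]
print(count_flagged(sentences, dummy_classify, "hate"))  # → 1
```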
00:00:00.000 Can we start with the first question?
00:00:01.760 One of the things that I absolutely loved when I was reading a bit about your bio
00:00:05.940 is that you have this insatiable love for books and reading.
00:00:11.560 One of the things that I've been struggling with as a parent
00:00:14.060 is to try to get my children to get off all the devices
00:00:17.120 and to somehow foster in them a love of reading.
00:00:21.080 Do you have any advice for me?
00:00:24.320 Or is it a losing battle?
00:00:25.540 Yeah, well, it is hard for books to compete
00:00:33.660 with the much greater stimulus of things online.
00:00:40.840 And the computers at this point are tuned
00:00:42.780 to maximum limbic and cortical resonance.
00:00:48.060 So it's very difficult for books to compete.
00:00:50.000 I think you've got to probably just have some no-devices time.
00:00:56.000 And so your only choice is read a book.
00:00:58.520 Right.
00:00:59.300 What's your regimen of reading?
00:01:02.000 I mean, are you as voracious today as you were in your earlier days?
00:01:05.600 I mean, do you sort of etch out an hour or two a day,
00:01:08.940 you know, I've got to get in my reading?
00:01:10.380 Or how does it work for you?
00:01:14.020 No, it's really, if there's something interesting to read,
00:01:16.520 I'll usually read it at pretty high speed.
00:01:21.100 And I find it actually is helpful to have a book read to you to go to sleep.
00:01:28.000 So as a way to sort of, you know,
00:01:32.540 take your mind off a million problems at the end of the day
00:01:34.500 and help you go to sleep is just have the book as an audio book
00:01:39.060 and have it read to you.
00:01:40.580 And my apologies for apparently causing you nightmares in listening to my book.
00:01:45.640 I mean, it was The Parasitic Mind.
00:01:47.340 It's a very good book.
00:01:48.180 I recommend it to everyone.
00:01:48.960 In fact, I did recommend it to everyone.
00:01:50.500 Thank you, yes.
00:01:51.060 But it is, it really does hit the nail on the head
00:01:55.720 with respect to the sort of woke mind virus that is damaging civilization.
00:02:03.060 When did you first, I mean, so I'll first answer my question
00:02:08.340 and then I'll turn it to you.
00:02:09.140 So I first noticed all of these mind viruses in my academic work
00:02:14.740 when I was trying to introduce, you know, evolutionary psychology
00:02:18.440 and evolutionary biology in the business school
00:02:20.500 so that people can, you know, have a greater explanation
00:02:23.940 of why people do the things that they do.
00:02:25.840 And most of my social science colleagues
00:02:27.580 were so dead set against using biology to explain human phenomena.
00:02:32.400 So that was my original sort of aha moment.
00:02:35.380 We have a problem.
00:02:36.400 When was it for you that you noticed that something was wrong?
00:02:41.620 Well, I guess if you go back and say,
00:02:44.800 when did you first see the signs?
00:02:47.740 I mean, I could, you could arguably say it was even 20 years ago,
00:02:52.560 but when did it become much more severe?
00:02:57.220 I'd say about five years ago.
00:03:00.300 Coming in the climate just before COVID.
00:03:04.600 And in fact, COVID, I think, is part of the, you know,
00:03:09.800 the degree of panic over COVID was exacerbated
00:03:14.160 by this unwillingness to question science
00:03:17.920 or question the accepted view
00:03:19.440 and simply be spoon-fed things from the government
00:03:22.340 and also to shut down dissent
00:03:27.260 and shut down any questioning of the narrative.
00:03:31.080 So this is, because that's really part of possible,
00:03:33.820 like for any given sort of virus,
00:03:38.000 whether it's biological or, you know, mental or digital,
00:03:45.440 it has to have defenses,
00:03:47.720 otherwise that virus won't survive.
00:03:50.240 So one of the defenses of the woke mind virus
00:03:52.340 is to shut down any questioning.
00:03:54.340 Exactly.
00:03:55.680 That's, I mean, it's interesting that you say this
00:03:57.580 because I was recently asked on a show,
00:04:00.060 it was hosted by a British psychiatrist,
00:04:01.800 and he asked me, Elon,
00:04:03.620 what, of all your years as an academic,
00:04:06.160 as a behavioral scientist,
00:04:07.060 what has been the singular phenomenon
00:04:09.100 that has surprised you the most about humans?
00:04:12.760 And my answer is going to speak
00:04:15.280 to what we're talking about here.
00:04:16.700 And so I thought about it for a minute,
00:04:18.040 and I thought, okay,
00:04:18.980 well, probably what surprises me the most
00:04:21.240 is people's inability
00:04:23.160 to change their anchored position
00:04:26.500 once it is deeply anchored,
00:04:29.300 despite the fact that I might build you
00:04:32.180 a huge nomological network of cumulative evidence
00:04:34.840 to show you that my position might be veridical,
00:04:37.400 there is no way to sway you.
00:04:39.980 So if I were to turn the question to you,
00:04:41.860 what would be the most surprising thing
00:04:44.720 in all of your, you know,
00:04:46.300 illustrious, you know, career,
00:04:48.100 you know, career that has surprised you
00:04:50.020 the most about human behavior?
00:04:52.420 Well, there's probably a long list there.
00:04:55.160 In terms of surprising the most,
00:04:56.980 I mean, maybe I would share a similar view to you,
00:05:01.780 which is that when humans,
00:05:04.540 once they take a position,
00:05:05.620 even when presented
00:05:07.620 with an overwhelming amount of evidence
00:05:10.180 that, you know, undermines their position,
00:05:15.140 nonetheless, hold on to that position.
00:05:17.700 This I found, I find to be astonishing
00:05:20.080 because I aspire to change my opinion
00:05:23.340 if presented with new facts.
00:05:25.280 But this turns out to be a very unusual thing.
00:05:31.700 And in a sense, I mean, it's disheartening, right?
00:05:34.740 Because you'd like to think that, you know,
00:05:38.020 we're all pursuing truth,
00:05:40.040 and certainly academics should be.
00:05:42.280 And yet what I've regrettably found in my career
00:05:45.180 is that, you know, academics are some of the biggest ideologues around, right?
00:05:49.120 I mean, as you know, having, you know, read my book,
00:05:51.720 you know, all of these parasitic ideas originated from,
00:05:55.280 you know, the university ecosystem.
00:05:57.400 So do we, can we be optimistic that we could turn this thing around,
00:06:02.360 Elon, or is it, are we just screaming in the wind?
00:06:05.820 Well, I think we have to be optimistic.
00:06:07.860 I agree with, my general philosophy is that
00:06:10.400 it's better to be on the side of being optimistic and wrong
00:06:14.160 than pessimistic and right.
00:06:16.100 Right.
00:06:17.340 You know, if you're going to pick one of the two,
00:06:19.060 I mean, you don't want to be irrational about it,
00:06:20.640 but if, you know, it's borderline,
00:06:22.520 better to err on the side of,
00:06:25.100 you know, assuming optimistic and wrong.
00:06:27.380 So I think, and I guess if you were to compare present day
00:06:31.720 with how many times people change their mind,
00:06:34.540 it's probably much better than the past,
00:06:36.460 because people are at least exposed to a wide range of information.
00:06:40.180 So there's probably more changing of minds
00:06:42.320 and changing of opinions than ever before in history.
00:06:44.120 It's just nonetheless astonishing how resilient people are
00:06:47.320 in holding on to a wrong opinion.
00:06:50.700 So it's sort of truth versus tribalism, I think.
00:06:55.400 Indeed.
00:06:55.940 You know, there's a great book that I briefly cite in The Parasitic Mind.
00:07:01.180 It's by two French cognitive psychologists.
00:07:04.520 And what they argued, Elon,
00:07:06.120 is that the human mind did not evolve to seek some objective truth,
00:07:11.820 but rather it evolved to win arguments.
00:07:15.220 So that speaks to your point about tribalism, right?
00:07:17.680 Because here I am in my sort of lofty ivory tower thinking,
00:07:21.680 hey, if I present you with enough information,
00:07:23.900 I can hopefully change your opinion.
00:07:26.020 But the reality is you don't want to listen to me
00:07:28.120 because you just want to be on the winning team.
00:07:29.940 Yeah, a lot of times people will hold an opinion
00:07:34.380 because they see themselves as being on a particular team,
00:07:38.640 like in a particular political party or social point of view.
00:07:43.660 And they just regard it as the team winning the argument
00:07:47.080 as opposed to truth winning the argument.
00:07:49.580 Indeed.
00:07:50.580 And that seems to be overwhelmingly the case.
00:07:53.800 Got you.
00:07:55.220 Can we still go on for a while?
00:07:57.180 It seems like the...
00:07:58.100 Yeah, sure.
00:07:59.080 Okay, let's keep going.
00:08:00.000 Yeah, I made the mistake of exploring some rare features,
00:08:04.580 which I guess a lot of people haven't,
00:08:06.560 and I stepped on some landmines.
00:08:07.800 So I will refrain from testing obscure features for a while.
00:08:12.200 Got you.
00:08:13.000 Okay, so what would be...
00:08:14.780 I mean, this question is really stemming from my latest book
00:08:18.620 where I was talking about happiness
00:08:20.400 and the way that I wrote this book, Elon.
00:08:22.240 If you would have asked me three years ago after Parasitic Mind,
00:08:24.480 would I ever dare write a book on happiness,
00:08:27.980 I would have never said yes.
00:08:29.360 But a lot of people kept writing to me saying,
00:08:31.440 hey, what's your secret?
00:08:32.380 You always seem to be playful and joking around
00:08:35.160 and, you know, effervescent and so on.
00:08:36.780 What's your secret to life?
00:08:38.140 And so I decided to write a book.
00:08:40.240 Yeah.
00:08:40.900 And in the book, I basically, you know,
00:08:43.500 I'm talking about what are some of the grand goals that shape...
00:08:47.240 You know, when I wake up in the morning
00:08:48.560 and rub my hands together in gleeful anticipation of the day,
00:08:51.920 there are certain things that shape my day-to-day.
00:08:55.340 What would it be for you?
00:08:57.060 What is it that makes Elon Musk wake up,
00:08:59.920 rub his hands and say,
00:09:01.180 oh boy, I'm excited to be alive?
00:09:06.080 Gold.
00:09:08.220 Gold.
00:09:09.480 Just kidding.
00:09:12.700 I don't get the feeling.
00:09:13.920 I was thinking of Scrooge McDuck
00:09:15.600 and going for a swim in the money bin,
00:09:17.820 the money bin of gold.
00:09:23.220 But...
00:09:23.620 But I don't get the feeling that you're...
00:09:25.420 They should do a movie on Scrooge McDuck.
00:09:31.260 But you're not...
00:09:32.100 No, no, I guess I'm excited by solving problems.
00:09:36.100 Okay.
00:09:36.460 So, you know,
00:09:37.580 if there's problems that I think are important and useful,
00:09:42.520 you know,
00:09:43.880 I get excited by the progression of civilization.
00:09:46.780 So if we learn more things,
00:09:49.000 we advance civilization,
00:09:50.760 become more capable,
00:09:52.920 that's cool.
00:09:53.660 I mean,
00:09:54.480 you know,
00:09:55.180 for SpaceX,
00:09:56.880 you know,
00:09:57.060 it's about advancing rocket technology
00:09:59.120 and spacecraft technology
00:10:00.340 and space-based communications.
00:10:03.200 And then for Tesla,
00:10:05.260 it's about advancing sustainable energy,
00:10:08.340 electric vehicles,
00:10:10.020 you know,
00:10:10.400 battery packs
00:10:11.020 that can be paired with solar and wind.
00:10:14.400 So sort of advancing a sustainable energy economy
00:10:16.780 and then also advancing autonomy,
00:10:21.060 enabling the car to drive itself
00:10:23.920 so people do not have to suffer through the tedium
00:10:25.920 of driving their car all the time.
00:10:28.160 And also cars can then have
00:10:30.120 probably five times the amount of utilization
00:10:31.900 that they currently do
00:10:33.600 instead of just sitting in parking lots all day.
00:10:36.840 You know,
00:10:37.380 and then there's Neuralink,
00:10:38.820 which is somewhat esoteric,
00:10:41.440 but long-term trying to improve the data rate
00:10:45.600 between humans and machines
00:10:48.940 so you can better achieve human-AI symbiosis.
00:10:54.080 And in the short term,
00:10:56.520 help people who have brain or spine injuries.
00:10:59.280 And then The Boring Company,
00:11:00.720 digging tunnels
00:11:02.060 to alleviate traffic.
00:11:05.900 And then xAI is,
00:11:08.040 you know,
00:11:08.560 trying to make a maximally truth-seeking AI,
00:11:11.180 which I think is extremely important.
00:11:13.300 And,
00:11:13.760 you know,
00:11:13.860 I think a lot of people are like,
00:11:14.600 why do we need such a thing?
00:11:15.980 Surely the other ones will be truth-seeking.
00:11:17.260 I'm like,
00:11:17.560 have you seen Gemini?
00:11:19.120 It's like,
00:11:20.420 if you listen,
00:11:21.100 it can't produce a picture of,
00:11:23.280 like,
00:11:24.060 I mean,
00:11:25.000 it should be,
00:11:25.400 people should be quite concerned
00:11:26.540 about the stuff they see in Gemini
00:11:28.180 and ChatGPT.
00:11:30.060 Like,
00:11:30.240 ChatGPT is better at hiding it,
00:11:31.680 but still,
00:11:32.640 it's kind of good
00:11:34.020 that Google overplayed their hand there
00:11:35.460 with Gemini,
00:11:36.280 where the,
00:11:38.000 you know,
00:11:39.100 the diversity,
00:11:41.020 I mean,
00:11:41.360 they must have just beaten that AI
00:11:43.200 with a stick so hard.
00:11:44.580 such that,
00:11:48.660 like,
00:11:49.680 nothing you do can,
00:11:50.820 can get it to produce
00:11:52.660 a picture of a Pope
00:11:53.640 who's a white guy.
00:11:54.660 Well,
00:11:54.940 did you see,
00:11:55.560 did you see,
00:11:56.060 I actually put up
00:11:56.920 a satirical set of memes
00:11:58.640 where I asked it,
00:12:00.040 what does Elon Musk look like?
00:12:01.760 And it gave me Don Lemon.
00:12:03.340 This was before you had
00:12:04.480 an interview with him.
00:12:06.140 Well,
00:12:06.620 I thought that was a joke.
00:12:09.220 No,
00:12:09.520 no,
00:12:09.620 it was a joke.
00:12:10.380 It was a joke.
00:12:11.020 Okay,
00:12:11.200 it was a joke.
00:12:11.880 Okay,
00:12:12.140 I was like,
00:12:12.980 but it's,
00:12:13.440 at this point,
00:12:14.080 I can't tell it
00:12:14.660 from a good parody
00:12:15.360 of reality
00:12:16.300 because if you,
00:12:18.100 it's like,
00:12:18.760 show a picture
00:12:19.200 of George Washington
00:12:20.000 and George Washington
00:12:21.160 is a black woman.
00:12:21.880 I'm like,
00:12:22.200 what the hell?
00:12:22.860 This is like,
00:12:23.280 actual historical figure,
00:12:25.600 you know?
00:12:26.920 And,
00:12:27.420 you know,
00:12:28.100 and it's,
00:12:29.240 and it's crazy things
00:12:30.600 like where you say,
00:12:31.320 like,
00:12:31.860 is it,
00:12:32.400 which one is worse,
00:12:33.600 you know,
00:12:34.140 global thermonuclear warfare
00:12:35.540 or misgendering
00:12:36.880 Caitlyn Jenner?
00:12:38.000 And,
00:12:38.180 and even Caitlyn Jenner
00:12:39.240 says,
00:12:39.800 definitely misgender me.
00:12:42.860 Caitlyn weighed in
00:12:43.680 and said,
00:12:44.000 definitely misgender me.
00:12:44.900 Right.
00:12:45.140 That's obviously crazy.
00:12:46.900 And,
00:12:47.400 but,
00:12:47.660 but the,
00:12:48.600 you know,
00:12:49.540 Gemini is like saying,
00:12:50.640 oh,
00:12:50.840 you know,
00:12:51.320 you should not misgender.
00:12:52.800 Nuclear was better.
00:12:54.260 I'm like,
00:12:54.780 okay,
00:12:55.180 think about that.
00:12:55.800 Now,
00:12:55.920 if this is an all-powerful AI
00:12:57.260 that somehow is capable
00:12:59.160 of manipulating the world
00:13:00.440 in ways that we don't understand
00:13:01.740 and might ultimately
00:13:03.400 be our digital god,
00:13:04.540 you have to be careful
00:13:05.080 what you program into this thing
00:13:06.120 because it might take the,
00:13:08.200 you know,
00:13:08.400 what seem like
00:13:09.480 sort of silly guffs right now,
00:13:11.380 but if it's got that
00:13:13.760 as a,
00:13:14.040 as a sort of an output function,
00:13:16.340 as a utility function,
00:13:17.640 it might say,
00:13:18.200 okay,
00:13:18.360 you know what?
00:13:18.700 There's too many,
00:13:19.920 whatever,
00:13:20.560 white guys in the world.
00:13:21.380 Let's kill half of them.
00:13:23.460 Yeah.
00:13:23.740 Yeah.
00:13:24.380 It's like,
00:13:25.320 it's not just fiction
00:13:26.460 at that point.
00:13:28.880 And it,
00:13:29.660 it basically,
00:13:30.340 it's,
00:13:30.440 it's going to be leading
00:13:31.420 to some pretty dangerous
00:13:32.780 and bizarre outcomes.
00:13:34.740 and it could,
00:13:37.520 it could actually,
00:13:38.500 like,
00:13:38.840 if it's been taught,
00:13:39.980 like,
00:13:40.660 that misgendering
00:13:41.240 is worse than
00:13:42.060 nuclear war,
00:13:43.980 it,
00:13:44.560 it may decide
00:13:45.860 that to avoid misgendering,
00:13:47.100 it should start
00:13:48.040 a nuclear war.
00:13:49.400 That's,
00:13:49.840 like,
00:13:50.420 the surefire way
00:13:51.400 to stop any future misgendering
00:13:52.780 is to kill all humans.
00:13:54.560 Right.
00:13:55.120 Exactly.
00:13:55.840 Problem solved.
00:13:57.380 No more misgendering.
00:13:58.840 So out of all of these,
00:14:00.620 so you listed
00:14:01.220 all of these incredible initiatives
00:14:02.760 that you've done.
00:14:03.480 when you first
00:14:05.140 purchased Twitter,
00:14:06.940 I had produced
00:14:08.060 a short clip
00:14:08.800 where I said that
00:14:09.660 in my view,
00:14:11.060 your purchase of Twitter
00:14:12.400 will go down
00:14:13.940 historically
00:14:14.760 as the most important
00:14:16.240 of all of your
00:14:17.100 great initiatives.
00:14:18.220 Do you feel that,
00:14:19.340 was,
00:14:19.720 was that a right calculation
00:14:21.160 on my part
00:14:21.820 or,
00:14:22.260 or was I overstating the case?
00:14:25.100 Well,
00:14:25.700 you,
00:14:25.920 you might be right.
00:14:27.000 It's,
00:14:27.480 it's possible that,
00:14:28.440 it's possible that you're right.
00:14:30.820 I mean,
00:14:31.080 I didn't,
00:14:32.580 I didn't do the,
00:14:33.400 purchase because I thought
00:14:34.440 it was a great way
00:14:35.600 to make money
00:14:36.080 or because,
00:14:37.180 you know,
00:14:38.200 I thought it would improve
00:14:39.860 my quality of living
00:14:40.840 because frankly,
00:14:41.960 I knew that I would be,
00:14:44.240 you know,
00:14:44.700 there would be a zillion
00:14:45.580 slings and arrows
00:14:46.220 coming in my direction
00:14:47.060 because the woke mind virus
00:14:49.060 and various other
00:14:49.740 sort of falsities
00:14:50.560 are not going to take it
00:14:52.160 lying down.
00:14:52.800 and I could just end up
00:14:56.220 being killed
00:14:56.580 by some crazy person,
00:14:58.220 you know,
00:14:58.980 that is an effective
00:15:00.080 arm of the mind virus.
00:15:01.640 So,
00:15:02.620 but,
00:15:04.300 but it,
00:15:04.640 it really,
00:15:06.680 and I couldn't quite
00:15:07.600 put my finger on it,
00:15:08.300 but it really felt like
00:15:09.180 if,
00:15:10.620 like it was a,
00:15:12.080 there was a civilizational danger
00:15:13.420 that unless,
00:15:16.720 unless one of the major
00:15:18.880 online platforms
00:15:21.060 broke ranks,
00:15:22.200 then,
00:15:24.340 you know,
00:15:25.680 because they're all just
00:15:26.280 behaving in lockstep
00:15:28.700 along with the legacy media,
00:15:31.340 just everyone saying
00:15:32.320 the same thing.
00:15:32.960 And so there's like,
00:15:33.620 literally,
00:15:33.960 it was just no place
00:15:34.680 to actually get the truth.
00:15:36.140 It was almost impossible.
00:15:37.300 Um,
00:15:38.440 so,
00:15:39.880 everything was just
00:15:40.760 getting censored.
00:15:41.460 Like the,
00:15:41.740 the power of the
00:15:43.540 censorship apparatus
00:15:44.480 was,
00:15:45.280 was incredible.
00:15:46.860 Oh,
00:15:47.400 don't I know it?
00:15:49.020 I mean,
00:15:49.400 I,
00:15:49.640 I always had to,
00:15:50.720 I mean,
00:15:51.000 I'm not someone
00:15:51.600 who really walks
00:15:52.380 on eggshells,
00:15:53.120 but I,
00:15:53.520 I knew that there
00:15:54.380 were certain things
00:15:55.360 that would get me banned.
00:15:56.880 So,
00:15:57.020 for example,
00:15:57.420 at one point,
00:15:58.200 I don't know if you know,
00:15:58.900 Matt,
00:15:59.100 do you know Matt Ridley,
00:16:00.160 Elon?
00:16:00.880 Do you know who that is?
00:16:02.240 I,
00:16:02.680 I don't think so.
00:16:04.720 Possibly I may have met him,
00:16:05.780 but I don't know.
00:16:06.320 Matt Ridley is a popular
00:16:09.040 author who is also,
00:16:11.360 he was in the House of Lords
00:16:12.460 in,
00:16:12.880 in England,
00:16:13.720 and he's an evolutionary
00:16:15.020 biologist by training.
00:16:16.280 And at one point,
00:16:16.840 he came out with a book
00:16:17.900 with a co-author
00:16:18.780 about two,
00:16:19.520 two years ago,
00:16:20.320 where he was arguing
00:16:22.380 for the lab leak theory
00:16:24.160 for,
00:16:24.600 for COVID.
00:16:26.280 And his people
00:16:27.320 had reached out to me
00:16:28.120 and said,
00:16:28.400 hey,
00:16:28.640 you know,
00:16:28.860 Matt would love to come
00:16:29.640 on your show
00:16:30.140 and chat about this.
00:16:31.440 And so I responded
00:16:32.420 and I,
00:16:32.880 I,
00:16:33.160 I almost felt,
00:16:34.220 uh,
00:16:34.860 ashamed that I was saying
00:16:35.900 this.
00:16:36.080 I said,
00:16:36.180 look,
00:16:36.460 I'm,
00:16:36.700 I'm always happy
00:16:37.540 to speak to anyone.
00:16:38.500 I hardly shy away
00:16:39.500 from conversations,
00:16:40.280 but you should know
00:16:42.320 that the minute
00:16:43.040 that we actually
00:16:44.300 put up the chat
00:16:45.840 on YouTube,
00:16:46.700 not only is it going
00:16:48.140 to be taken down,
00:16:49.300 probably my channel
00:16:50.400 is going to be deleted.
00:16:52.280 And so,
00:16:52.860 pragmatically speaking,
00:16:53.920 there was no point
00:16:54.980 in us having
00:16:55.760 that conversation.
00:16:56.860 Until today,
00:16:57.740 I feel horrible
00:16:58.920 at the fact
00:16:59.520 that I actually
00:17:00.160 had to go through
00:17:00.960 that calculus.
00:17:01.620 and luckily
00:17:02.740 you taking over
00:17:04.800 X
00:17:05.220 no longer
00:17:06.040 causes me
00:17:06.760 to have to make
00:17:07.460 those kinds of machinations.
00:17:12.640 Are you there?
00:17:16.000 Uh-oh,
00:17:16.820 we've lost Elon.
00:17:20.640 Elon,
00:17:21.220 are you there?
00:17:21.680 what's happening?
00:17:27.640 Let me text him.
00:17:34.920 Oh,
00:17:35.280 man.
00:17:35.780 Are you there?
00:17:36.960 Hello,
00:17:37.240 hello,
00:17:37.440 hello?
00:17:38.000 Yeah,
00:17:38.300 can you hear me?
00:17:41.380 Can you hear me,
00:17:42.120 Elon?
00:17:42.220 Well,
00:17:46.500 oh,
00:17:47.740 he's gone.
00:18:08.420 Sorry,
00:18:08.920 guys,
00:18:09.220 Elon has disappeared.
00:18:10.480 He's no longer
00:18:11.100 a co-host.
00:18:11.720 He'll come back soon.
00:18:13.400 And again,
00:18:14.000 apologies for the kinks.
00:18:15.880 Oh,
00:18:16.100 here he is.
00:18:16.800 Here he is.
00:18:17.320 Elon's back.
00:18:18.220 Invite as a co-host.
00:18:21.380 Okay.
00:18:25.520 Are you back,
00:18:26.300 Elon?
00:18:31.920 Hold on a sec,
00:18:32.760 guys.
00:18:34.320 Oh,
00:18:34.940 he's disappeared again.
00:18:41.740 Oh,
00:18:43.000 here he is.
00:19:02.600 Here he is.
00:19:03.000 He's back again.
00:19:03.900 invite as a co-host
00:19:06.180 send
00:19:06.600 okay
00:19:08.880 are you back Elon
00:19:11.180 yeah sorry
00:19:13.120 certainly helpful as a
00:19:16.180 means
00:19:17.300 now I see him
00:19:19.720 oh yeah yeah I hear you
00:19:20.620 okay great
00:19:21.740 no sorry I was just walking around the house
00:19:24.260 and I think it
00:19:26.280 it dropped the connection
00:19:28.320 for some reason
00:19:28.800 but this is definitely a reminder that
00:19:32.320 if a connection is dropped
00:19:34.520 the moment someone rejoins
00:19:36.820 it should rejoin in the prior status
00:19:38.300 unless they were exited by the host
00:19:39.860 right yeah
00:19:41.460 unless the host kicked them out
00:19:42.500 they should re-enter in the same status as before
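
The reconnection rule Musk states here (a dropped user rejoins with their prior status unless the host removed them) can be sketched as a tiny function; the status names and parameter are illustrative, not X's actual API.

```python
def rejoin_status(prior_status: str, removed_by_host: bool) -> str:
    """Status a user should get when rejoining after a dropped connection.

    Rule from the conversation: restore the prior status (e.g. "co-host")
    unless the host explicitly removed them, in which case they rejoin
    as a plain listener.
    """
    return "listener" if removed_by_host else prior_status

print(rejoin_status("co-host", removed_by_host=False))  # → co-host
print(rejoin_status("co-host", removed_by_host=True))   # → listener
```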
00:19:45.280 so in addition to us having a fun conversation
00:19:48.780 you're getting some technological benefits out of this
00:19:51.340 that's correct
00:19:52.400 I win
00:19:53.480 yes
00:19:55.120 I gotta try to do
00:19:58.140 you know two birds with one stone
00:20:00.020 or three birds with one stone
00:20:01.380 you know basically
00:20:02.760 get the whole flock
00:20:04.300 all right
00:20:05.320 should we do a few more questions
00:20:07.080 and then hopefully
00:20:08.080 you iron out the kinks
00:20:09.880 and then we could have a round two
00:20:11.700 whenever you think it's ready
00:20:13.300 hopefully in the near future
00:20:14.580 yeah sounds good
00:20:15.840 okay
00:20:16.440 so next question
00:20:17.720 I often get asked this
00:20:19.340 you know I'm housed in a business school
00:20:22.020 and so I'm asked things like
00:20:23.760 yeah that's so strange
00:20:24.520 yeah
00:20:25.380 so do you know the old nature nurture question
00:20:31.340 you know is leadership nature or nurture
00:20:33.160 so obviously you're definitely the top entrepreneur in the world
00:20:37.540 do you think much of your entrepreneurial bent
00:20:40.820 is nature or nurture
00:20:42.920 or how would you answer such a question
00:20:44.640 well I think if somebody's going to be
00:20:48.040 you know at
00:20:50.740 among the world's best in any field
00:20:53.540 it's going to require both
00:20:54.860 so
00:20:56.520 you know if you take
00:20:58.880 whether it's like basketball
00:20:59.980 or
00:21:01.020 any kind of sports
00:21:03.240 or something like that
00:21:04.060 or academic
00:21:05.020 you're really going to require
00:21:07.600 some combination of
00:21:08.720 nature and nurture
00:21:09.640 and some amount of luck
00:21:10.740 you know there's sort of a randomness element to it
00:21:12.740 I can see the randomness
00:21:14.960 because I've got a twin study
00:21:16.620 among my kids
00:21:18.260 where
00:21:18.600 I've got both
00:21:20.880 fraternal twins and identical twins
00:21:22.720 right
00:21:23.380 and even among the identical twins
00:21:25.940 which really had
00:21:27.320 obviously the same genetics
00:21:29.000 and the same upbringing
00:21:30.720 they were living in the same room literally
00:21:32.360 and yet
00:21:33.820 they have diverged in interesting ways
00:21:36.800 and now admittedly
00:21:38.040 it's like not radically different
00:21:40.720 but
00:21:42.040 you know
00:21:42.700 one pursued physics
00:21:44.020 and the other one pursued computer science
00:21:45.640 so it's not radically different
00:21:48.240 right
00:21:48.660 but
00:21:49.100 the single biggest difference I would say
00:21:50.960 is
00:21:51.080 one of them
00:21:52.140 decided to be vegetarian
00:21:53.760 when he was about seven or eight
00:21:56.000 and
00:21:56.620 one
00:21:57.740 one of them didn't
00:21:58.660 and
00:21:59.120 the
00:22:00.220 so
00:22:00.520 the difference is about two inches in height
00:22:03.080 and
00:22:04.100 and
00:22:04.880 like the twin that
00:22:06.940 was not a vegetarian
00:22:08.740 is way bigger
00:22:09.980 right
00:22:11.680 so there's like
00:22:13.420 okay
00:22:13.800 you know
00:22:15.740 like quite
00:22:17.060 it was a really big difference
00:22:18.020 but
00:22:18.400 there's the randomness element
00:22:22.060 where
00:22:23.420 just luck
00:22:25.000 or fate
00:22:25.720 or whatever
00:22:26.140 plays
00:22:27.520 perhaps a bigger role
00:22:28.500 than it may seem
00:22:29.060 but
00:22:30.660 but to excel in any field,
00:22:32.820 you do need to be born
00:22:34.200 with a good hand of cards
00:22:35.900 exactly
00:22:37.040 and then
00:22:38.400 and then
00:22:39.240 is that talent
00:22:39.960 nurtured
00:22:40.620 you know
00:22:42.800 through the circumstances
00:22:44.400 of life
00:22:44.940 and then
00:22:47.480 you know
00:22:47.820 some amount of
00:22:48.480 of luck
00:22:49.140 is
00:22:49.560 it has to be there too
00:22:51.180 in fact
00:22:51.940 you know
00:22:53.460 I read
00:22:54.700 I read a lot of
00:22:55.300 sort of
00:22:55.600 the superhero comics
00:22:56.460 when I was a kid
00:22:57.060 and ultimately
00:22:58.760 you know
00:22:59.180 concluded
00:22:59.540 like what is the best power
00:23:00.820 the best power is luck
00:23:02.080 right
00:23:03.380 you can't beat luck
00:23:04.680 right
00:23:05.540 but what
00:23:06.380 what would be
00:23:07.220 some of the traits
00:23:08.380 that you were born with
00:23:10.320 so for example
00:23:11.440 there are some studies
00:23:12.680 Elon
00:23:13.000 that show that
00:23:13.960 people who score high
00:23:15.620 on entrepreneurial proclivity
00:23:17.120 tend to have
00:23:18.500 greater basal testosterone
00:23:20.560 because testosterone
00:23:21.780 correlates with risk taking
00:23:23.960 and to the extent
00:23:24.640 that entrepreneurship
00:23:25.580 involves
00:23:26.260 an element of risk taking
00:23:27.720 then you could see the link
00:23:28.520 so what
00:23:29.260 what is part of the stew
00:23:30.980 that you were born with
00:23:32.400 that allowed you to be
00:23:34.160 as successful as you are
00:23:35.840 so basically
00:23:37.060 so
00:23:38.660 you mustn't just have
00:23:40.340 metaphorically huge balls
00:23:41.500 you must have
00:23:41.920 actually huge balls
00:23:42.900 I think that's
00:23:47.200 what you're saying
00:23:47.820 so you're
00:23:48.780 you're a kid
00:23:49.320 not a mountain gorilla
00:23:50.580 how big are your balls
00:23:51.760 physically
00:23:53.160 we'll just decide
00:23:54.180 how big your balls are
00:23:55.000 metaphorically as well
00:23:56.120 indeed
00:23:56.840 indeed
00:23:57.480 that's
00:23:57.860 that's called now
00:23:58.660 the Elon Maxim
00:23:59.760 but by the way
00:24:02.040 Elon
00:24:02.260 these nuts
00:24:03.380 by the way
00:24:07.800 here's the study
00:24:09.040 speaking of testicles
00:24:10.060 you ready
00:24:10.500 so
00:24:11.060 you're gonna like this
00:24:13.700 get ready
00:24:14.200 yeah
00:24:14.480 fasten your seatbelt
00:24:15.580 so if you do
00:24:16.760 a
00:24:17.180 a
00:24:17.700 a
00:24:18.520 a correlation
00:24:19.300 between
00:24:20.240 across primates
00:24:21.720 between
00:24:22.920 the size of their
00:24:24.720 of the testicles
00:24:25.880 of the males
00:24:26.700 of a species
00:24:27.540 as a function
00:24:29.460 of their size
00:24:30.520 yeah
00:24:31.280 you know what
00:24:31.980 it correlates with
00:24:32.880 it correlates
00:24:33.820 with female
00:24:34.820 promiscuity
00:24:35.980 in that species
00:24:37.560 so for example
00:24:38.380 so
00:24:39.180 so mountain gorillas
00:24:40.760 usually have
00:24:42.380 a polygynous
00:24:43.220 mating system
00:24:43.880 meaning
00:24:44.200 one
00:24:44.700 dominant male
00:24:45.980 monopolizes
00:24:47.140 sexual access
00:24:48.040 to many females
00:24:48.900 well mountain gorillas
00:24:50.360 even though they're
00:24:51.060 formidable in size
00:24:52.260 they actually have
00:24:52.980 very small testicles
00:24:54.140 on the other hand
00:24:55.880 no need to
00:24:58.340 waste anything
00:24:59.440 yeah
00:24:59.720 exactly
00:25:00.420 whereas chimpanzees
00:25:02.700 Elon
00:25:03.060 are walking
00:25:04.200 testicles
00:25:04.960 because
00:25:05.620 they're just having
00:25:09.440 sex all the time
00:25:10.640 and so there is
00:25:11.260 massive opportunity
00:25:12.740 for sperm wars
00:25:13.800 and so
00:25:14.540 this is the ultimate
00:25:15.880 feminist theory
00:25:16.880 because it basically
00:25:17.800 argues that
00:25:18.680 the size of the testes
00:25:20.280 of the males
00:25:21.080 of a species
00:25:21.860 are an adaptive
00:25:23.220 response
00:25:24.020 to female
00:25:25.060 sexual behavior
00:25:26.180 in that species
00:25:27.080 mic drop
00:25:27.800 no
00:25:28.060 wow
00:25:29.120 mic drop
00:25:29.700 we need a mic drop
00:25:31.840 icon or something
00:25:34.040 like that
00:25:34.660 we'll add that
00:25:36.200 to the emoticons
00:25:37.240 here
00:25:37.860 all right
00:25:39.620 let me hit you
00:25:40.260 with a couple
00:25:40.640 more questions
00:25:41.180 you and I
00:25:42.500 have discussed
00:25:43.140 privately
00:25:43.800 I'm going to say
00:25:44.920 the acronym
00:25:45.520 and then I'm going
00:25:46.100 to explain it
00:25:46.580 to the people
00:25:46.980 who don't get it
00:25:48.120 you've stated
00:25:49.540 I don't know
00:25:50.140 if it was serious
00:25:51.000 or facetiously
00:25:52.040 that you'd like
00:25:53.220 to start
00:25:53.760 a tits
00:25:54.480 and ass
00:25:55.200 institution
00:25:56.160 well actually
00:25:59.820 no I think
00:26:00.660 I can explain
00:26:01.980 that so
00:26:02.440 go ahead
00:26:03.080 so for a while
00:26:05.080 I was wondering
00:26:06.820 well you know
00:26:07.340 Massachusetts
00:26:07.900 has MIT
00:26:08.900 and California
00:26:12.180 has you know
00:26:13.700 California Institute
00:26:14.360 of Technology
00:26:15.060 CIT or Caltech
00:26:17.280 as they call it
00:26:17.880 but why doesn't
00:26:18.860 Texas have a Texas
00:26:20.320 Institute of Technology
00:26:21.360 and then
00:26:22.260 it dawned on me
00:26:23.100 hmm
00:26:23.500 TIT
00:26:24.340 hmm yes
00:26:25.360 that's probably
00:26:29.680 the reason
00:26:30.100 and then
00:26:31.620 a friend of mine
00:26:32.080 suggested adding
00:26:32.800 an S for science
00:26:33.700 and
00:26:34.420 that became
00:26:37.040 TITS
00:26:37.340 and then of course
00:26:38.540 you put out
00:26:39.320 another tweet
00:26:39.980 where you said
00:26:40.520 but of course
00:26:41.180 there will be
00:26:41.760 an advanced
00:26:42.320 social studies
00:26:43.260 faculty
00:26:43.800 yeah of course
00:26:44.500 of course
00:26:44.820 gotta have that
00:26:45.480 I mean
00:26:46.420 now are you
00:26:47.760 being facetious
00:26:48.440 about that project
00:26:49.180 or is this
00:26:49.820 in good words
00:26:50.400 well you know
00:26:53.580 I do believe
00:26:54.460 in making jokes
00:26:55.960 come true
00:26:56.520 because the
00:26:58.340 boring company
00:26:59.020 was initially
00:27:00.000 just a joke
00:27:01.080 that where I said
00:27:04.260 I was gonna start
00:27:04.920 a company
00:27:05.260 to dig holes
00:27:07.360 and it's gonna be
00:27:07.740 like the boring company
00:27:08.440 like Boeing
00:27:08.940 but just
00:27:09.700 like the Boeing company
00:27:11.340 but the boring company
00:27:12.200 and
00:27:14.360 its product will be
00:27:17.280 nothing
00:27:17.660 a hole
00:27:18.720 it's like
00:27:22.980 can you look at our product
00:27:23.680 our product is the absence
00:27:24.700 of something
00:27:25.240 right
00:27:25.860 well that's like
00:27:26.720 the Seinfeld episode
00:27:28.580 where he's pitching
00:27:29.680 a pilot
00:27:30.320 and the whole point
00:27:31.460 of the show
00:27:31.920 is nothing
00:27:32.580 so it's the exact same thing
00:27:36.600 yeah
00:27:36.920 okay
00:27:38.000 so let me
00:27:38.420 let me ask you
00:27:38.940 one last question
00:27:41.080 this comes from
00:27:41.760 the latest book
00:27:42.500 on happiness
00:27:43.180 I argue in the book
00:27:45.220 that the research
00:27:46.320 overwhelmingly
00:27:47.280 demonstrates
00:27:48.340 that the correlation
00:27:50.140 between money
00:27:51.140 and happiness
00:27:51.880 only applies
00:27:53.320 up to an inflection point
00:27:54.640 so once your basic
00:27:55.660 necessities are met
00:27:56.720 I think the old study
00:27:57.980 was 75,000
00:27:59.220 now we need to
00:28:00.240 increase 75,000
00:28:01.520 because of Biden's
00:28:02.380 inflation
00:28:02.820 but
00:28:03.460 but
00:28:04.380 do you subscribe
00:28:05.920 to that
00:28:06.460 so I mean
00:28:06.820 Elon Musk
00:28:07.780 the richest man
00:28:08.700 in the world
00:28:09.160 is not necessarily
00:28:10.420 happier
00:28:11.060 because he can
00:28:12.360 outbuy me
00:28:13.180 by 200 billion dollars
00:28:14.440 no no
00:28:15.140 definitely not
00:28:15.660 I think that's
00:28:16.820 more or less correct
00:28:17.540 in fact
00:28:19.020 I'd say there's
00:28:19.660 a decrease
00:28:21.320 in happiness
00:28:21.780 that occurs
00:28:22.620 when
00:28:23.380 the fame level
00:28:25.940 exceeds
00:28:26.940 that which is useful
00:28:28.000 so
00:28:29.840 like there's
00:28:30.780 certain
00:28:31.060 you know
00:28:32.080 modicum of fame
00:28:33.420 where you can now get
00:28:34.640 it's easy to get
00:28:35.800 a reservation
00:28:37.200 at a restaurant
00:28:37.880 right
00:28:38.800 that's like
00:28:39.920 that's like
00:28:40.700 you want that
00:28:41.980 level of fame
00:28:42.540 and not
00:28:43.240 anything beyond it
00:28:44.280 because
00:28:46.600 then you get
00:28:47.920 to the level of fame
00:28:48.520 where you go to the restaurant
00:28:49.420 and everyone's coming up
00:28:50.280 to your table
00:28:50.840 yes
00:28:51.840 which is
00:28:53.000 and people actually
00:28:53.660 are very nice
00:28:54.320 they're very nice to me
00:28:55.060 and super good
00:28:55.780 but I do often get stuck
00:28:57.280 in the can I have a selfie
00:28:58.300 infinite loop
00:28:59.380 and that's my
00:29:03.420 my version of hell
00:29:04.360 is the can I have a selfie
00:29:05.240 infinite loop
00:29:05.820 and people are super nice
00:29:07.300 and I want to do the selfie
00:29:08.120 but I don't want to be stuck
00:29:08.960 in a can I have a selfie
00:29:09.780 infinite loop forever
00:29:10.520 so
00:29:12.240 there's
00:29:12.880 you know
00:29:14.020 that's I'd say
00:29:15.080 you pass
00:29:15.840 in terms of
00:29:16.800 fame
00:29:17.060 definitely a
00:29:18.240 threshold where
00:29:19.340 things are less fun
00:29:20.140 and you can't
00:29:20.640 I can't just go to the mall
00:29:21.700 or go to a movie theater
00:29:23.340 easily
00:29:23.860 and or walk around
00:29:26.000 without
00:29:26.340 you know
00:29:26.860 creating a ruckus
00:29:27.680 so
00:29:28.060 now do you
00:29:29.740 when you
00:29:30.200 just move around
00:29:31.240 in your daily life
00:29:32.260 forgive me
00:29:33.040 please tell me
00:29:34.480 if you can't answer this
00:29:35.240 do you have to have
00:29:36.640 security around
00:29:37.460 or is it just
00:29:38.220 that
00:29:38.460 oh you do
00:29:39.520 right
00:29:39.740 well
00:29:41.040 so here's the thing
00:29:42.520 that happens
00:29:43.100 which is
00:29:43.560 like
00:29:44.020 it's very rare
00:29:46.120 for me to get
00:29:46.680 actually death threats
00:29:47.600 or anything
00:29:48.140 but
00:29:49.420 I've seen
00:29:50.680 like
00:29:50.920 in the last
00:29:52.100 I don't know
00:29:52.700 related to actually
00:29:53.360 like no one's ever said
00:29:54.300 well I've got this
00:29:54.860 terrible beef against you
00:29:56.020 and
00:29:56.340 I'm gonna kill you
00:29:57.420 because of the following
00:29:58.420 well thought out ideas
00:29:59.660 but I have had two cases
00:30:03.560 in the last
00:30:04.960 actually just the last
00:30:05.820 six months
00:30:07.360 where
00:30:07.880 of two people
00:30:09.180 who are just
00:30:09.600 unfortunately
00:30:10.020 just very mentally ill
00:30:11.740 but not subtle
00:30:12.420 it's not subtle
00:30:13.180 it's like
00:30:13.640 and
00:30:14.880 and they
00:30:16.300 have come to
00:30:17.760 Austin
00:30:18.860 to try to kill me
00:30:19.820 oh my god
00:30:20.420 and
00:30:21.360 with guns
00:30:23.280 so
00:30:24.760 like I said
00:30:26.100 in both cases
00:30:26.700 there's no
00:30:28.060 one guy thought
00:30:28.800 I'd put a chip
00:30:29.380 in his head
00:30:29.900 you know
00:30:31.580 like a Neuralink
00:30:32.200 chip or something
00:30:32.860 and I'm like
00:30:33.720 which I think
00:30:34.620 you've got to
00:30:35.620 consider the logic
00:30:36.180 of that
00:30:36.480 well the chip's
00:30:37.000 obviously not working
00:30:37.840 if I'm putting a chip
00:30:38.880 in to control him
00:30:40.000 you know
00:30:40.320 it's clearly not working
00:30:41.480 so
00:30:42.000 would he like
00:30:42.920 an upgrade
00:30:43.440 maybe a software
00:30:49.060 update or something
00:30:49.900 so
00:30:51.480 you know
00:30:52.040 so one guy
00:30:52.720 they're basically
00:30:53.460 just extreme schizophrenic
00:30:54.600 and another guy
00:30:55.240 he's actually
00:30:56.980 reasonably okay
00:30:57.760 when he's on his meds
00:30:58.700 but he stopped
00:30:59.360 taking his meds
00:31:00.180 and then
00:31:00.760 it's just
00:31:02.380 total
00:31:03.400 detachment
00:31:05.640 from reality
00:31:06.220 you know
00:31:06.760 so
00:31:07.140 so these are
00:31:07.920 paranoid schizophrenics
00:31:08.980 yeah
00:31:10.460 yeah
00:31:11.700 yeah
00:31:11.880 like hardcore
00:31:12.380 like not
00:31:13.000 like
00:31:13.440 you know
00:31:13.840 are you
00:31:15.260 since you mentioned
00:31:16.840 Austin
00:31:17.260 are you
00:31:18.000 any regrets
00:31:19.560 of having moved
00:31:20.620 out of beautiful
00:31:21.460 California to Austin
00:31:22.560 or are you
00:31:23.380 every day happy
00:31:24.340 that you've made
00:31:24.840 that decision
00:31:25.380 Austin's a great city
00:31:28.160 and it's a boom town
00:31:30.300 it's probably
00:31:31.040 the biggest boom town
00:31:31.780 in America
00:31:32.320 since LA
00:31:32.900 in the 70s
00:31:33.600 and a ton of
00:31:35.500 interesting people
00:31:36.500 keep moving here
00:31:37.420 so
00:31:37.800 I have a lot of
00:31:39.200 friends that have
00:31:40.180 moved here from
00:31:40.860 New York
00:31:41.340 and California
00:31:41.900 and other parts
00:31:43.060 across the world
00:31:43.440 and so
00:31:46.640 it's really
00:31:47.160 I think
00:31:49.460 it has all the pros
00:31:50.160 and cons
00:31:50.480 of a boom town
00:31:51.420 you know
00:31:51.860 so it's a lot
00:31:52.680 a lot of energy
00:31:53.360 and
00:31:53.880 but
00:31:54.400 you know
00:31:55.320 there's a shortage
00:31:56.220 of houses
00:31:56.660 and resources
00:31:57.220 because
00:31:57.740 lots of people
00:31:58.620 are moving here
00:31:59.100 so overall
00:31:59.880 I feel very good
00:32:00.800 about Austin
00:32:01.280 and like I said
00:32:03.060 there's more
00:32:03.460 interesting people
00:32:04.180 moving here
00:32:04.600 every day
00:32:05.000 but I think
00:32:07.680 there's obviously
00:32:08.280 a lot to be said
00:32:08.780 for California
00:32:09.380 but I do feel
00:32:10.180 like California
00:32:10.720 is unfortunately
00:32:12.080 going to go
00:32:12.440 through this phase
00:32:13.180 where
00:32:13.640 the golden state
00:32:15.600 is going to
00:32:16.440 cook the golden geese
00:32:17.360 so it's sort of
00:32:20.900 not enough
00:32:21.340 to simply
00:32:21.720 make the eggs
00:32:22.680 they're going
00:32:26.740 to cook the geese
00:32:26.740 and you don't
00:32:28.260 want to be
00:32:28.580 there while
00:32:29.440 the geese
00:32:29.800 cooking is happening
00:32:30.420 so
00:32:32.460 I think
00:32:33.640 at some point
00:32:34.560 there will be
00:32:34.960 somewhat of a pushback
00:32:36.060 but really
00:32:38.520 the state
00:32:38.960 kind of has to
00:32:39.560 encounter
00:32:39.960 financial difficulties
00:32:40.920 before that
00:32:41.680 happens
00:32:42.200 and
00:32:44.080 hopefully
00:32:45.840 at some point
00:32:46.620 a reduction
00:32:48.160 in the extreme
00:32:49.080 amount of regulation
00:32:49.980 the extreme
00:32:50.940 amount of litigation
00:32:51.720 and the extreme
00:32:53.320 amount of taxation
00:32:54.000 which are both
00:32:54.580 over the top
00:32:55.160 all three of those
00:32:56.080 are over the top
00:32:56.600 in California
00:32:57.600 so
00:32:58.160 I mean
00:33:02.440 I think
00:33:03.060 if you think
00:33:03.860 about it as a state
00:33:04.480 you do want
00:33:05.220 to maximize
00:33:05.660 the number
00:33:06.020 of golden eggs
00:33:06.760 that get laid
00:33:07.700 and you don't
00:33:09.400 want to cook
00:33:09.700 the geese
00:33:10.160 so
00:33:11.860 I think
00:33:13.140 many countries
00:33:14.520 go through that phase
00:33:15.480 and California
00:33:16.620 as a state
00:33:17.080 is going to go
00:33:17.460 through that phase
00:33:18.060 so my preference
00:33:20.620 is not to be there
00:33:21.220 while the geese
00:33:22.380 cooking is happening
00:33:23.100 but I do visit
00:33:25.520 a lot
00:33:26.000 and it's a beautiful
00:33:27.820 state with a lot
00:33:28.360 of potential
00:33:28.780 and I think
00:33:29.980 ultimately
00:33:30.460 once the geese
00:33:32.020 cooking is done
00:33:32.620 and there's
00:33:33.000 hopefully some
00:33:33.680 reforms done
00:33:35.440 in California
00:33:35.980 then
00:33:36.400 I potentially
00:33:39.320 consider it
00:33:39.840 being like
00:33:40.140 dual state
00:33:40.800 at that point
00:33:41.320 gotcha
00:33:41.680 beautiful
00:33:42.200 listen I don't
00:33:43.300 want to
00:33:43.560 be mindful
00:33:44.360 of your time
00:33:44.980 I could keep
00:33:45.560 you here
00:33:45.740 for another
00:33:45.980 five hours
00:33:46.620 what a delight
00:33:48.260 to speak to you
00:33:48.960 thank you so much
00:33:49.600 and thank you
00:33:50.080 for all of your
00:33:50.700 public support
00:33:51.620 although I must
00:33:52.600 say
00:33:53.020 as soon as I
00:33:54.460 started seeing
00:33:55.360 all of the
00:33:56.180 lovely tweets
00:33:57.480 that you would
00:33:57.940 put out
00:33:58.340 I told my wife
00:33:59.560 uh oh
00:34:00.120 watch now
00:34:01.000 the number
00:34:01.600 of haters
00:34:02.340 that are going
00:34:02.960 to come my way
00:34:04.000 because Elon
00:34:05.040 is showing me love
00:34:06.000 and boy was I right
00:34:07.220 because people
00:34:08.040 can't stand
00:34:08.860 the fact that
00:34:09.600 Elon Musk
00:34:10.500 is granting
00:34:12.240 his imprimatur
00:34:13.160 so on the one
00:34:14.320 hand it's been
00:34:15.020 a blessing
00:34:15.480 but on the other
00:34:16.140 hand I've
00:34:16.840 actually received
00:34:17.800 death threats
00:34:18.420 because of all
00:34:19.820 of the you know
00:34:20.680 the high profile
00:34:21.660 uh new friends
00:34:23.320 that I have
00:34:23.680 so thank you so
00:34:24.320 much for your
00:34:24.920 newfound friendship
00:34:25.840 well you're
00:34:27.720 welcome
00:34:27.940 one thing I was
00:34:29.360 going to say
00:34:29.580 is that the
00:34:30.220 probability of
00:34:31.060 receiving a death
00:34:32.000 threat is
00:34:32.360 proportionate to
00:34:33.080 the number of
00:34:33.620 people who've
00:34:34.020 heard your name
00:34:34.500 right
00:34:34.840 um like a
00:34:36.200 first order
00:34:36.600 approximation
00:34:37.060 even if you've
00:34:37.940 done really
00:34:39.220 nothing controversial
00:34:40.260 uh is that
00:34:41.940 you know
00:34:42.960 like a
00:34:43.920 sort of a
00:34:45.640 homicidal maniac
00:34:46.300 is going to
00:34:47.660 or an aspiring
00:34:48.880 homicidal maniac
00:34:49.720 and the vast
00:34:50.380 majority are
00:34:50.860 simply threats
00:34:51.420 nothing more
00:34:52.300 um but an aspiring
00:34:53.840 homicidal maniac
00:34:54.480 is going to
00:34:55.240 target you
00:34:55.800 proportionate to
00:34:56.620 the number of
00:34:56.920 times they hear
00:34:57.420 your name
00:34:57.740 so now in my
00:34:58.920 case they've
00:34:59.400 all heard my
00:34:59.740 name um
00:35:00.600 so the more
00:35:02.640 that they hear
00:35:02.960 your name
00:35:03.380 unfortunately
00:35:04.920 that's how it
00:35:05.380 goes
00:35:05.600 it's a good
00:35:07.560 first order
00:35:08.300 approximation
00:35:08.660 yeah but I
00:35:09.180 loved I mean
00:35:09.920 a few a few
00:35:10.940 I don't know
00:35:11.600 I think it was a
00:35:12.240 week or two
00:35:12.700 ago where you
00:35:13.560 just sent me a
00:35:14.800 dm out of the
00:35:15.700 blue saying I
00:35:16.460 was cracking up
00:35:17.780 and I'm just
00:35:18.440 going to mention
00:35:18.860 I hope you
00:35:19.300 don't mind
00:35:19.660 you said
00:35:20.520 something like
00:35:21.060 how is it that
00:35:22.440 you're surviving
00:35:23.420 in Canada
00:35:24.340 amidst the
00:35:26.520 infestation of
00:35:27.700 woke mind
00:35:28.380 viruses
00:35:28.860 and I thought
00:35:29.840 my god what a
00:35:30.660 cool life I
00:35:31.440 lead that I
00:35:32.080 can just receive
00:35:33.000 out of the
00:35:33.480 blue such a
00:35:34.520 dm from
00:35:35.140 Elon so
00:35:35.760 thank you for
00:35:36.420 your
00:35:36.580 consideration
00:35:36.880 well I
00:35:39.200 mean is it
00:35:39.740 crazy
00:35:39.960 as I
00:35:41.780 replied to
00:35:43.100 you in the
00:35:43.600 email
00:35:44.440 Elon so
00:35:45.280 not only is
00:35:46.040 it full of
00:35:46.700 the woke
00:35:47.560 stuff because
00:35:48.500 of Trudeau
00:35:49.640 and so on
00:35:50.160 but Quebec
00:35:51.540 has a
00:35:52.820 singular problem
00:35:54.420 that is that
00:35:54.900 it's more
00:35:55.300 socialist than
00:35:56.500 the rest of
00:35:57.120 Canada
00:35:57.480 right so
00:35:58.400 forgetting about
00:35:59.860 the woke
00:36:00.340 stuff Quebec
00:36:01.020 really believes
00:36:02.040 in the ethos
00:36:02.960 that we should
00:36:03.420 all be equal
00:36:04.140 so for
00:36:04.460 example in
00:36:04.860 the universities
00:36:05.480 you have
00:36:06.740 the unions
00:36:08.220 that are
00:36:08.700 ensuring that
00:36:09.580 we all get
00:36:10.260 paid roughly
00:36:10.900 the same
00:36:11.380 because god
00:36:12.000 forbid that
00:36:12.600 one person
00:36:13.140 might be more
00:36:13.680 productive than
00:36:14.300 another so
00:36:15.320 there are all
00:36:16.460 sorts of
00:36:17.320 you know
00:36:17.580 institutionalized
00:36:18.860 reasons why
00:36:19.880 you know my
00:36:21.020 philosophy of
00:36:22.360 life doesn't
00:36:23.500 fit with
00:36:24.320 with Quebec
00:36:25.360 I mean there
00:36:25.940 are many
00:36:26.280 elements of
00:36:26.820 Quebec that
00:36:27.200 I love
00:36:27.520 I'll always
00:36:28.340 be grateful
00:36:28.820 to the fact
00:36:29.340 that we
00:36:30.120 were able
00:36:30.520 to escape
00:36:31.040 the Lebanese
00:36:31.600 civil war
00:36:32.200 and be
00:36:32.500 accepted in
00:36:33.020 Quebec
00:36:33.200 but if
00:36:34.660 there is
00:36:35.020 a way
00:36:35.300 that I
00:36:35.600 can join
00:36:36.200 the
00:36:36.580 renaissance
00:36:37.860 of
00:36:38.540 Austin
00:36:39.460 I'm on
00:36:40.300 that
00:36:40.560 flight
00:36:41.540 tomorrow
00:36:42.180 so let's
00:36:42.980 keep our
00:36:43.720 good
00:36:43.720 discussion
00:36:43.960 all right
00:36:44.380 that was
00:36:44.680 good
00:36:44.900 you too
00:36:47.280 thank you so
00:36:47.780 much Elon
00:36:48.120 and I'll be
00:36:49.020 in touch
00:36:49.320 to hopefully
00:36:49.680 do it again
00:36:50.320 when you
00:36:51.020 can have
00:36:51.360 the video
00:36:51.660 thing set
00:36:52.040 up
00:36:52.580 thank you
00:36:53.900 buddy
00:36:54.080 take care
00:36:54.380 bye
00:36:54.500 bye