Real Coffee with Scott Adams - November 21, 2022


Episode 1934 Scott Adams: The News About Twitter, Trump, Alex Jones, Musk, Ye And More


Episode Stats

Length

1 hour and 18 minutes

Words per Minute

142.2

Word Count

11,142

Sentence Count

940

Misogynist Sentences

9

Hate Speech Sentences

27


Summary

A play about a Twitter quitter looking for a new job, the disappearance of the Dilbert website, and why Elon Musk is the funniest person in the world. Plus, a story about a man who thinks he's a comedian.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of civilization.
00:00:05.160 It's called Coffee with Scott Adams, if you didn't know that already.
00:00:08.980 And you probably come here to have your dopamine faucet turned on.
00:00:15.360 Came to the right place.
00:00:17.200 Do you feel your dopamine starting to get a little active?
00:00:21.340 Feel a little bit of it?
00:00:23.040 Yeah.
00:00:23.820 Watch.
00:00:24.420 Watch what happens after the simultaneous sip.
00:00:26.980 You can almost feel the tingle.
00:00:28.400 And all you need to get that tingle is a cup or mug, a glass, a tank or chalice or stein, a canteen, jug or flask, a vessel of any kind.
00:00:37.080 Fill it with your favorite liquid.
00:00:39.160 I like coffee.
00:00:40.940 Join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that makes everything better.
00:00:46.880 It's going to give you a little tingle.
00:00:48.500 It's called the simultaneous sip.
00:00:50.800 It happens now.
00:00:51.560 Go.
00:00:57.680 Yep.
00:00:58.400 Dopamine faucet on.
00:01:03.800 We're going to leave that on for the entire live stream.
00:01:07.360 What you do afterwards?
00:01:08.680 Well, that's your own decision.
00:01:11.440 All right.
00:01:13.940 So I would like to begin with a one-act play.
00:01:18.920 This play I call a Twitter quitter who's looking for a new job.
00:01:27.180 Yeah.
00:01:27.480 Twitter quitter.
00:01:28.300 I just coined that.
00:01:30.060 The people who left because Musk told them that they had to work hard.
00:01:35.580 They're the Twitter quitters.
00:01:36.940 So now my impression of a Twitter quitter going for a new job.
00:01:43.040 I will need to do this visually, so I give you the return of Dale.
00:01:51.540 Well, I'm applying for a job at your company, and I have very good qualifications.
00:02:00.120 Well, I see from your resume that you had a job at Twitter.
00:02:06.540 Twitter.
00:02:06.840 So could you tell me what was it about that job?
00:02:10.660 Why did you quit that job?
00:02:11.800 Um, something about President Trump?
00:02:23.980 Hmm.
00:02:24.900 That doesn't seem terribly relevant.
00:02:27.880 Um, was there anything else?
00:02:30.740 Any other reason you left Twitter?
00:02:36.820 I'm not a huge fan of hard work.
00:02:39.420 And scene.
00:02:49.860 Isn't it going to be a little bit awkward?
00:02:53.040 It's going to be a little bit awkward, isn't it?
00:02:56.000 Because they all quit for the same reason.
00:02:59.460 Not big fans of working hard.
00:03:05.180 How's that going to work?
00:03:06.400 How's that play out?
00:03:07.740 I don't know.
00:03:08.140 I'm just curious.
00:03:09.420 All right.
00:03:12.140 Um, you may have already noticed that the Dilbert website has been down since Friday, I think.
00:03:19.320 And you might say to yourself, how is that even possible?
00:03:23.640 In 2022, how can anybody bring a website down for like three days?
00:03:30.540 Like a major professional commercial website.
00:03:34.560 Three days.
00:03:35.280 I don't even think they're close to fixing it.
00:03:39.740 I don't think they're close.
00:03:42.060 Now, I don't know the details.
00:03:45.360 Um, I don't know.
00:03:46.160 Actually, I don't even know if it's confirmed that it's a hack.
00:03:49.260 But it looks like it, you know, from the outside, it looks like a hack.
00:03:53.760 Um, apparently the DNS pointer just disappeared.
00:03:57.660 So basically, the website just doesn't exist on the internet anymore.
00:04:02.260 Somebody took down the DNS pointer or could have been maybe some weird, there's a possibility that it happened naturally.
00:04:12.680 I don't know what the odds of that are.
00:04:14.900 But there was a weird chain of custody of the website that went from, you know, one company to another.
00:04:22.260 And the old company disappeared, but it's still in their name.
00:04:25.880 And God knows there might be some problem like that.
00:04:28.660 I don't know.
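A minimal sketch of what "the DNS pointer just disappeared" looks like from the outside, assuming the domain in question is dilbert.com and using a hypothetical check rather than a diagnosis of what actually happened: if the name no longer resolves, the site is effectively unreachable by name even when the servers behind it are still running.

```python
# Minimal sketch: check whether a hostname still resolves in DNS.
# Assumes the domain is dilbert.com; this only illustrates the symptom described
# above (a missing DNS record), not what actually caused the outage.
import socket

def resolves(hostname: str) -> bool:
    """Return True if DNS can map the hostname to an IP address."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:  # lookup failed: no record, no site by that name
        return False

for host in ("dilbert.com", "example.com"):
    status = "resolves" if resolves(host) else "does not resolve (no DNS pointer)"
    print(host, status)
```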
00:04:29.980 But it also could have been targeted at me.
00:04:34.760 Could you rule that out?
00:04:37.020 Could you rule out that somebody targeted me?
00:04:39.620 Now, they did get all of the, you know, the entire comic infrastructure at United, I'm sorry, at Universal.
00:04:50.980 So they got all the comics.
00:04:52.920 It wasn't just me.
00:04:54.260 But I'm the only one that you would target.
00:04:57.280 Right?
00:04:57.560 The rest of them were not anybody making any trouble.
00:05:01.340 I don't know.
00:05:02.260 Could be anything.
00:05:03.680 You never know.
00:05:04.860 But they'll get it fixed.
00:05:05.820 So, as you know, I think this may have been reversed.
00:05:13.440 But didn't CBS say they were going to suspend their Twitter activity due to the, quote, uncertainty under Elon Musk's leadership?
00:05:22.620 Did they reverse that already?
00:05:27.240 Yeah.
00:05:28.020 It says I'm the registrant, but that's just on paper.
00:05:31.240 I don't manage the site.
00:05:32.200 Well, anyway, regardless of what CBS has done recently, I only want to talk about how Musk made fun of them.
00:05:43.000 I used to think that you couldn't top Trump for funny tweeting.
00:05:50.240 But I feel as though Musk is funnier.
00:05:54.440 And who saw that coming?
00:05:56.460 I mean, he claims he's on the spectrum, you know, Asperger's or something.
00:06:00.240 But I'm not so sure.
00:06:03.160 I know.
00:06:03.640 His facility with humor is so strong.
00:06:09.500 And for that, you would have to have an understanding of how other people's minds work.
00:06:14.480 So I'm just not sure about this Asperger's on the spectrum thing.
00:06:18.920 But here's what he said about CBS.
00:06:23.040 And I laughed until I cried.
00:06:25.800 This morning, I laughed.
00:06:27.420 I had tears streaming out of my eyes.
00:06:29.260 This is the funniest frickin' set of tweets I've ever seen, all right?
00:06:34.980 There are three tweets, and they have to be seen, you know, as a whole.
00:06:41.680 So when CBS, you know, said it was going to suspend its Twitter activity, Musk first tweeted this.
00:06:48.720 Who made this decision?
00:06:51.440 Perfectly good thing to ask.
00:06:52.620 Who made this decision?
00:06:53.500 But within a very short time, Musk tweeted a follow-up.
00:06:58.840 Oh, forget it.
00:06:59.620 Who cares?
00:07:04.100 Because Musk owns Twitter, and CBS just isn't really relevant.
00:07:09.440 But it gets better.
00:07:11.160 His third follow-up, or second follow-up, he goes,
00:07:14.140 They should bring Walter Cronkite back.
00:07:17.700 Oh, my God.
00:07:21.960 That is so fucking funny.
00:07:24.500 That is fucking hilarious.
00:07:29.600 Now, I don't know how you interpret that.
00:07:32.480 You know, you can interpret it different ways.
00:07:34.160 The way I interpret it is the funniest way, right?
00:07:39.120 The funniest way.
00:07:40.440 The last time you didn't suck, Walter Cronkite was still alive.
00:07:46.480 So that's one way to interpret it.
00:07:48.420 The last time you didn't suck, Walter Cronkite was alive.
00:07:52.000 Here's the second way to interpret it.
00:07:54.500 And I don't know which way he meant.
00:07:56.120 The second way to interpret it is you could dig up his corpse,
00:07:59.480 and he would do a better job than whatever they're doing now.
00:08:01.800 Well, also good.
00:08:04.180 Also perfectly good.
00:08:07.400 And then the third way is that CBS is so unimportant and irrelevant
00:08:12.020 that, you know, acting like you didn't even know that Walter Cronkite wasn't an option anymore.
00:08:20.700 Which would be funny.
00:08:22.500 So this is funny any way you want to interpret it.
00:08:25.820 This is fucking hilarious.
00:08:27.360 I've never seen any corporation dismissed so effectively.
00:08:34.360 They should bring back Walter Cronkite.
00:08:40.060 Oh, come on.
00:08:41.540 Is there anybody who doesn't think that's hilarious?
00:08:45.000 Now, it's partly hilarious because of who says it, right?
00:08:47.880 Like, if I had tweeted that, it wouldn't be funny
00:08:51.160 because the context would be different.
00:08:52.980 But just the fact that the guy who just bought Twitter,
00:08:56.540 richest man in the world,
00:08:58.480 he says they should bring Walter Cronkite back,
00:09:01.300 that's everything.
00:09:03.580 All right.
00:09:05.700 Adam Schiff was on TV saying he's not too pleased
00:09:09.540 that Trump is allowed back on Twitter.
00:09:11.340 Which brings us to a really, really interesting thing
00:09:18.720 that's happening.
00:09:20.540 And don't let me forget to talk about Ye,
00:09:24.720 because I'm going to skip him for a moment.
00:09:26.960 So Twitter has a, let's call it a fact-checking feature now
00:09:32.480 that used to be part of that bird watch,
00:09:34.820 is what they called it.
00:09:35.660 But now it's renamed to Community Notes.
00:09:40.860 And here's what Community Notes on Twitter does.
00:09:44.780 If someone makes a claim that needs a fact-check,
00:09:50.620 somebody can add a note with context.
00:09:54.840 But other people will vote
00:09:56.960 on whether they agree on that added note.
00:10:00.180 So the ones that most people agree with
00:10:02.520 would be surfaced to the top.
00:10:04.440 But I think I'm describing it a little bit wrong.
00:10:08.600 And here's the part I'm going to need a,
00:10:10.840 I'm going to need a little bit of a tweaking
00:10:12.800 on my understanding, okay?
00:10:14.260 So maybe you can help me with this.
00:10:16.080 I don't believe that the notes go to the top,
00:10:19.980 you know, the correction notes.
00:10:24.620 I didn't skip the sip.
00:10:26.620 What are you talking about?
00:10:31.280 Why do you think I skipped the simultaneous sip?
00:10:34.440 Did I?
00:10:39.520 Yeah.
00:10:40.640 There's some kind of weird thing happening over on locals.
00:10:43.260 There's a bunch of them who think they didn't see the sip.
00:10:46.640 You saw it, right?
00:10:49.300 Yeah.
00:10:51.600 Oh, weird.
00:10:52.340 There's like a,
00:10:53.480 there's a whole like hallucination thing happening in real time.
00:10:56.180 There's a whole bunch of people on locals
00:10:59.320 who believe that it didn't happen.
00:11:01.560 Oh, and over here,
00:11:02.440 there's a bunch of people who say it didn't happen as well.
00:11:04.620 So YouTube sees the same thing?
00:11:10.660 Really?
00:11:12.400 Now, it's not because of a time lag, right?
00:11:14.260 Now, did you,
00:11:16.420 did you see me do the wind-up,
00:11:18.820 but not the actual sip?
00:11:22.340 You,
00:11:22.840 you saw the introduction to the sip, right?
00:11:26.780 You saw the song,
00:11:28.220 you know,
00:11:28.480 the,
00:11:28.700 the poem or whatever.
00:11:29.920 This is the weirdest thing.
00:11:35.180 Wow.
00:11:36.540 So,
00:11:37.360 you just,
00:11:38.440 you're watching reality fork.
00:11:40.980 Reality is forking right now,
00:11:42.520 right in front of you.
00:11:43.480 I don't think I've ever seen this before.
00:11:45.840 This is in real time.
00:11:47.580 You're watching reality fork.
00:11:49.420 There's some people who just are leaving your reality.
00:11:52.560 Some people didn't see it,
00:11:53.660 and some people saw it,
00:11:54.560 and it was,
00:11:55.480 it either did or did not happen,
00:11:57.100 and it was right here.
00:11:58.880 Wow.
00:12:00.440 So interesting.
00:12:02.220 I wonder what triggered it.
00:12:03.960 Was,
00:12:04.220 was there a delay in the feed?
00:12:06.700 This is now the most interesting thing
00:12:08.500 that's happening today.
00:12:10.420 Is it because of a glitch?
00:12:13.400 Yeah,
00:12:13.680 it might have been,
00:12:14.120 it might have been a glitch in the stream.
00:12:15.840 So,
00:12:16.300 anyway,
00:12:17.540 back to my point.
00:12:19.900 So,
00:12:20.440 the,
00:12:20.560 the Twitter fact-checking,
00:12:22.000 which is really a context addition,
00:12:24.380 not a fact-check so much,
00:12:26.040 but I believe it's not based on just popularity.
00:13:29.920 I believe it's going to be based on a system called Polis,
00:13:34.600 or Pol.is,
00:12:35.960 that got rolled out in Taiwan.
00:12:38.600 So,
00:12:38.920 Taiwan will ask its citizens a question such as,
00:12:42.500 how would you regulate Uber?
00:12:45.080 By the way,
00:12:46.340 what I'm,
00:12:46.680 what I'm telling you now comes from an excellent article in Wired.
00:12:50.560 So,
00:12:51.000 let me recommend Wired for the,
00:12:52.640 the full detail of this.
00:12:54.080 Very good,
00:12:54.780 very good reporting in Wired on this topic.
00:12:57.940 So,
00:12:58.440 this Polis system in Taiwan,
00:13:00.160 they can ask the public,
00:13:01.140 how would you regulate Uber,
00:13:03.820 would be one example.
00:13:05.640 But,
00:13:06.200 instead of just taking the most popular suggestion,
00:13:09.880 they find the suggestion
00:13:11.300 that has popularity across demographics.
00:13:16.560 So,
00:13:17.100 they're finding the thing that has the most widespread agreement,
00:13:21.520 as opposed to the thing that got the most trolls to vote for it.
00:13:24.980 Does that make sense?
00:13:26.740 So,
00:13:27.340 they can,
00:13:27.740 they can handle the troll problem,
00:13:30.060 by not looking at the most votes,
00:13:33.020 but rather looking at the votes that come from the most places,
00:13:36.580 the most types of people.
00:13:38.720 Which is,
00:13:40.460 frankly,
00:13:41.140 brilliant.
00:13:42.420 I'd never even thought of that.
00:13:44.280 Had you?
00:13:45.620 Yeah,
00:13:46.200 I mean,
00:13:46.380 that's just brilliant.
00:13:47.660 So,
00:13:48.040 if you start with things that people on all sides can agree with,
00:13:52.080 and then you start narrowing it down from that point,
00:13:54.540 apparently you get a better outcome.
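A minimal sketch of the "agreement across demographics" idea described above, using made-up data and a hypothetical scoring rule rather than the actual Community Notes or Pol.is math: instead of ranking a proposed note by raw vote count, rank it by how well it does across voter groups, so a note flooded by one faction scores worse than one with broad agreement.

```python
# Minimal sketch of cross-group ("bridging") scoring with hypothetical data.
from collections import defaultdict

def bridging_score(votes):
    """votes: list of (voter_group, agreed) pairs for one proposed note.

    Raw popularity can be gamed by one group voting in bulk, so score the note
    by its approval rate in its *least* supportive group: it only ranks highly
    if every group tends to agree with it.
    """
    by_group = defaultdict(list)
    for group, agreed in votes:
        by_group[group].append(agreed)
    rates = [sum(v) / len(v) for v in by_group.values()]
    return min(rates) if rates else 0.0

# Note A: 90 yes-votes, but all from one group. Note B: fewer votes, spread out.
note_a = [("left", True)] * 90 + [("right", False)] * 10
note_b = [("left", True)] * 30 + [("right", True)] * 25 + [("right", False)] * 10
print(bridging_score(note_a))  # 0.0   -- no cross-group agreement
print(bridging_score(note_b))  # ~0.71 -- agreed with on both sides
```

Under a rule like this, the note with the most total votes is not automatically the one that surfaces, which is the "most widespread agreement, not most trolls" point being made here.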
00:13:56.980 Now,
00:13:57.600 it looks like,
00:13:59.360 Twitter might follow that model.
00:14:01.920 How amazing would that be?
00:14:04.260 I mean,
00:14:04.620 seriously.
00:14:05.480 This is the first time I've heard an actual system plan,
00:14:11.200 that,
00:14:11.560 in my opinion,
00:14:12.380 could totally work.
00:14:14.500 Am I wrong?
00:14:16.120 Doesn't that look like that could totally work?
00:14:19.040 I think it could.
00:14:20.760 I think it could totally work.
00:14:22.020 I think it could totally work.
00:14:23.120 So,
00:14:23.580 imagine that.
00:14:26.400 It's not a coincidence that CBS is wondering whether they can participate.
00:14:31.600 because,
00:14:33.080 here's the thing.
00:14:38.620 How can,
00:14:39.600 how can the mainstream news survive this?
00:14:44.420 Literally.
00:14:45.900 Literally.
00:14:46.380 how can any news organization,
00:14:48.600 left or right,
00:14:49.580 how could they survive this?
00:14:51.840 I don't know.
00:14:52.960 I don't think they could survive it.
00:14:55.220 Left or right.
00:14:56.420 Unless they,
00:14:57.240 you know,
00:14:57.460 start doing real news and correcting and stuff.
00:15:00.540 But,
00:15:01.080 I don't know.
00:15:01.480 So,
00:15:03.500 I'll give you a,
00:15:05.000 a make up simultaneous sip as soon as I,
00:15:07.300 I'm done with this topic.
00:15:08.640 Because I know,
00:15:09.260 a number of people are,
00:15:10.340 are addicted to the sip.
00:15:13.800 So,
00:15:14.400 even if it happened,
00:15:15.500 if they missed it,
00:15:17.440 as a,
00:15:18.140 citizen of,
00:15:20.180 earth,
00:15:20.560 I feel I need to,
00:15:22.220 accommodate them.
00:15:23.080 So,
00:15:23.280 we'll do that in a minute.
00:15:24.380 I know I sipped,
00:15:25.540 but some people missed it,
00:15:26.600 so we'll,
00:15:26.900 we'll accommodate them.
00:15:28.060 But here's my point.
00:15:32.120 If Twitter actually pulls this off,
00:15:34.700 and does actual,
00:15:35.840 honest to God,
00:15:36.680 useful,
00:15:37.820 context,
00:15:38.460 and I like calling it context better than fact checking.
00:15:41.240 Who would agree with me on that?
00:15:43.700 I love context.
00:15:46.100 Fact checking gets a little too,
00:15:47.700 into opinion.
00:15:49.140 Doesn't it?
00:15:50.360 Like,
00:15:50.700 as soon as you say,
00:15:51.700 fact check,
00:15:52.560 hmm,
00:15:53.740 but context is good every time.
00:15:55.980 Then you make up your own mind.
00:15:58.060 All right.
00:16:00.620 I don't know if,
00:16:01.580 I don't know if I have a way to express how important this is.
00:16:05.140 This would actually fix our government.
00:16:07.740 It would fix the news.
00:16:10.060 It might even get rid of,
00:16:11.340 like,
00:16:12.060 divisiveness.
00:16:13.480 It might actually make things work.
00:16:15.960 Like,
00:16:16.340 I know that's,
00:16:17.140 that's a lot.
00:16:18.800 I don't want to,
00:16:19.600 I don't want to get ahead of it with my optimism,
00:16:22.060 but,
00:16:22.480 I don't think I've ever been more optimistic about the country,
00:16:26.100 than right now.
00:16:28.060 Honest to God,
00:16:29.460 I've never felt more optimistic about America,
00:16:32.760 at least,
00:16:33.980 than right now.
00:16:35.880 I think Musk did that.
00:16:38.460 I think Musk did that.
00:16:40.940 Because this is so important.
00:16:44.300 It's like,
00:16:44.900 huge.
00:16:46.460 Anyway.
00:16:46.660 So imagine that happens.
00:16:52.440 All right.
00:16:57.060 But there's more.
00:16:59.060 We might have a contest for the presidency,
00:17:03.060 that would involve Trump,
00:17:04.960 probably.
00:17:06.220 We don't know for sure,
00:17:07.360 but probably Trump running against a Democrat.
00:17:09.760 You know what's interesting?
00:17:12.840 If that happens,
00:17:14.220 we would have the most vetted candidate of all time.
00:17:19.580 Trump is the most examined,
00:17:23.020 researched,
00:17:23.860 vetted candidate of all time.
00:17:27.160 So let me,
00:17:27.860 let me pull the story together here.
00:17:29.760 So we could have a Twitter that works to keep everybody else honest.
00:17:35.820 We could.
00:17:36.780 Don't know that yet,
00:17:37.840 but it's possible.
00:17:39.220 We could have a president who doesn't have any secrets left.
00:17:44.020 How much would you like that?
00:17:45.900 Because my biggest problem with Biden is that I don't know what's up with him in Ukraine,
00:17:51.140 and what's up with him in China.
00:17:53.860 But if he had been investigated as thoroughly as Trump had been investigated for Russia,
00:17:59.440 I feel like I would trust him on that issue anyway.
00:18:03.000 Don't you?
00:18:04.140 If Biden had ever gone through the scrutiny that Trump has already gone through,
00:18:09.820 I wouldn't probably be asking any questions about China or Ukraine.
00:18:13.900 I'd say,
00:18:14.380 okay,
00:18:14.560 we looked into it.
00:18:15.600 Good enough.
00:18:16.180 So we might have radical transparency of one candidate who could actually become president again,
00:18:23.740 while we have radical,
00:18:25.880 you know,
00:18:26.760 context checking on Twitter,
00:18:28.600 which would make everybody else have to be honest,
00:18:30.820 because it'd be too embarrassing to,
00:18:32.180 to be out of step with Twitter fact checking.
00:18:35.800 And then we have the GOP taking over the House,
00:18:40.400 so they're going to start investigating,
00:18:42.220 you know,
00:18:43.160 all the bad behavior of the Democrats.
00:18:46.180 Do you know how much transparency we're about to get whacked with?
00:18:52.620 We've never had anything like this.
00:18:55.940 Usually you're just guessing.
00:18:58.040 You're just guessing.
00:18:59.000 If I vote for this candidate,
00:19:00.420 I have no idea what's in that closet,
00:19:03.460 right?
00:19:04.080 That closet might open up halfway through the presidency.
00:19:07.540 But we wouldn't have to worry about,
00:19:09.560 at least in my opinion,
00:19:10.560 others will disagree.
00:19:11.600 But in my opinion,
00:19:13.000 you wouldn't have to worry about the president having any hidden secrets.
00:19:16.180 If it were Trump,
00:19:17.900 the GOP is going to unravel any bad behavior.
00:19:21.620 The Democrats did maybe,
00:19:23.240 at least they'll try.
00:19:24.600 And then Twitter will give us some kind of actual context for the first time.
00:19:28.140 This is all amazing stuff.
00:19:32.460 But can we have all of this optimism without a second simultaneous sip?
00:19:40.660 No.
00:19:41.640 No.
00:19:42.460 I believe that we deserve another sip.
00:19:46.480 And so for anybody who didn't see it the first time,
00:19:50.200 for whatever reason,
00:19:51.940 reality forked.
00:19:53.940 You're going to get it this time.
00:19:55.820 Everybody,
00:19:56.480 get ready for the simultaneous sip.
00:19:59.340 Make up.
00:20:00.180 Go.
00:20:00.320 Good stuff.
00:20:06.040 I said fact-checking instead of context.
00:20:08.680 Yeah,
00:20:09.000 I'm going to still use them interchangeably when I'm talking in public.
00:20:13.120 So let me tell you,
00:20:14.560 I will use them interchangeably when I'm casually talking in public.
00:20:18.300 But I do prefer context.
00:20:20.980 They work similarly.
00:20:21.940 All right.
00:20:25.180 I'm also fascinated by the Alex Jones edge case.
00:20:31.160 Now,
00:20:31.980 I believe that Twitter was solidly in the right.
00:20:36.520 Like,
00:20:36.980 no doubt about it.
00:20:38.460 Bring you back,
00:20:39.280 you know,
00:20:39.620 the Babylon Bee and bring you back Trump.
00:20:42.880 Totally,
00:20:43.500 totally on board with those decisions.
00:20:46.520 Alex Jones,
00:20:47.320 however,
00:20:48.340 is not like the others.
00:20:50.700 He's not like the other situation.
00:20:51.940 And I believe this is a problem for Musk.
00:20:58.520 Because Musk answered why he wasn't bringing Alex Jones back.
00:21:02.220 And if he didn't see it,
00:21:03.660 it's one of the most powerful tweets you'll ever see in your life.
00:21:07.860 I'm going to read it to you.
00:21:09.600 And it might,
00:21:10.080 you know,
00:21:10.380 take you out of your happy moment for a bit.
00:21:12.900 But we'll get you back.
00:21:14.020 Okay.
00:21:15.980 So,
00:21:16.620 Kim.com,
00:21:18.660 if you know him,
00:21:20.340 famous internet user,
00:21:21.920 and famous for some other stuff in his past.
00:21:25.480 We won't get into that.
00:21:26.380 But anyway,
00:21:26.700 he's a well-known character.
00:21:28.440 And he was suggesting that free speech suggests that Alex Jones should come back.
00:21:34.460 Now,
00:21:34.860 nobody is defending Alex Jones.
00:21:38.020 As far as I know,
00:21:39.600 I've never heard anybody say,
00:21:40.860 yeah,
00:21:41.160 what Alex Jones did with that Sandy Hook thing was pretty good.
00:21:44.840 I mean, nobody.
00:21:46.420 Nobody defends him.
00:21:47.940 So we are strictly talking about free speech here, right?
00:21:51.580 Can we agree?
00:21:52.680 That if you talk about Alex Jones on Twitter,
00:21:55.060 it's not about defending him.
00:21:57.200 It's very much not about defending him.
00:21:58.960 It's about the best edge case we have for free speech.
00:22:03.400 So here's what Elon said in a tweet.
00:22:08.240 Musk said,
00:22:09.060 my firstborn child died in my arms.
00:22:11.900 I felt his last heartbeat.
00:22:14.360 I have no mercy for anyone who would use the deaths of children for gain,
00:22:19.040 politics,
00:22:19.600 or fame.
00:22:20.080 Is that case closed?
00:22:25.580 He owns the company.
00:22:27.760 He says that's his line in the sand.
00:22:30.340 No way.
00:22:32.380 Here's the problem.
00:22:34.480 That's a personal opinion, isn't it?
00:22:36.680 It's a personal opinion.
00:22:38.920 Right?
00:22:39.800 If Musk can ban somebody because of how he personally feels,
00:22:44.780 then he's not the right person to own Twitter.
00:22:48.320 End of story.
00:22:50.080 If he can ban Alex Jones only because of his personal feeling,
00:22:54.620 which he may have other reasons, right?
00:22:56.860 So he led with that,
00:22:58.340 but he might have other valid reasons.
00:23:00.120 I think there are other valid reasons, by the way.
00:23:02.540 So we can talk about whether he should come back.
00:23:06.160 But if that's his reason,
00:23:08.740 that's not good enough.
00:23:10.860 That's not good enough.
00:23:12.360 Free speech is a little bit bigger than that.
00:23:15.340 Now, as strong as this is,
00:23:18.140 I mean, I can't even imagine this.
00:23:20.460 I mean, I almost lost it just reading the fucking tweet.
00:23:23.220 I can't even imagine it.
00:23:25.060 I mean, it's literally,
00:23:26.300 literally unimaginable.
00:23:28.980 Right?
00:23:29.960 Now, I got to see my dead stepson in person.
00:23:33.520 Got to actually see him dead.
00:23:34.760 And, you know, I'm not a biological parent.
00:23:40.280 You know, I raised him since he was little.
00:23:42.600 But I can't even imagine a newborn.
00:23:46.620 That's beyond my ability to imagine.
00:23:50.040 So do I blame Musk for taking this stance?
00:23:54.100 Absolutely not.
00:23:55.860 Absolutely not.
00:23:56.760 I have no criticism against him personally
00:24:00.320 for taking this stance.
00:24:02.120 I can imagine I might have done it myself.
00:24:05.220 But,
00:24:06.740 free speech is bigger.
00:24:10.060 It's bigger.
00:24:12.000 You know?
00:24:12.380 As big as,
00:24:14.660 and we all feel it, right?
00:24:16.840 You know,
00:24:17.120 this is the most visual,
00:24:20.020 it's almost tactile.
00:24:22.000 I mean, you can almost feel.
00:24:23.140 You almost feel this situation
00:24:26.240 when he describes it.
00:24:27.620 Like, it puts you right in the scene.
00:24:30.300 This is one of the most
00:24:31.600 emotionally powerful statements
00:24:34.420 I've ever heard.
00:24:37.780 The most emotionally powerful reason
00:24:40.420 I've ever heard.
00:24:41.980 And it's not good enough.
00:24:43.440 It's not good enough.
00:24:45.000 It really isn't good enough.
00:24:46.960 Free speech is bigger than this.
00:24:50.200 But, at the same time,
00:24:51.480 I completely support him as a human.
00:24:54.340 As a father,
00:24:56.480 I'm on his side.
00:24:58.880 As a citizen,
00:25:00.760 I'm not.
00:25:03.520 So it's a tough one.
00:25:05.040 This is the best,
00:25:06.320 I say best in maybe,
00:25:08.280 you know,
00:25:08.560 in an ironic way,
00:25:09.540 but it's the best edge case
00:25:10.920 I've ever seen.
00:25:12.900 I mean,
00:25:13.240 this is the best edge case
00:25:15.180 I've ever seen.
00:25:17.020 And I will also tell you
00:25:18.400 that I don't think Musk
00:25:19.940 will change his mind
00:25:20.740 for other reasons.
00:25:22.360 I think Twitter,
00:25:23.260 the value of Twitter
00:25:24.360 would go down
00:25:25.200 if Alex Jones
00:25:27.080 was allowed back on.
00:25:29.180 So there's a business decision
00:25:30.920 that can't be ignored.
00:25:32.760 Because as soon as you let
00:25:34.000 Alex Jones on,
00:25:35.320 everybody on the left
00:25:36.540 says,
00:25:37.000 that's it.
00:25:37.860 That's it.
00:25:38.800 You know,
00:25:39.120 I can kind of understand
00:25:40.480 why a national politician
00:25:42.520 has to be on Twitter.
00:25:43.400 I can kind of get that.
00:25:46.540 But you're not going to change
00:25:48.840 anybody's mind
00:25:49.700 about Sandy Hook.
00:25:51.580 Nobody's changing their mind
00:25:52.760 about that.
00:25:53.800 So if you take that side,
00:25:55.600 or it looks like you're taking
00:25:56.640 Alex Jones' side,
00:25:58.400 even if your point
00:25:59.620 is free speech,
00:26:00.840 I don't think it's good enough.
00:26:03.740 Because there are so many
00:26:06.080 other parents out there.
00:26:08.400 Right?
00:26:09.200 This,
00:26:10.720 unfortunately,
00:26:11.360 Alex Jones is not about
00:26:12.860 free speech.
00:26:14.140 It's not about commerce.
00:26:16.680 It's about parents,
00:26:18.460 isn't it?
00:26:19.440 This is a parent-to-parent problem.
00:26:21.420 And maybe,
00:26:21.760 maybe Musk got it right.
00:26:23.580 Maybe Musk got it right.
00:26:25.340 This isn't about politics.
00:26:26.880 It's not about Twitter.
00:26:27.760 It's just parent-to-parent.
00:26:29.220 And as a parent,
00:26:30.240 he just said,
00:26:30.800 fuck you forever.
00:26:34.140 All right.
00:26:34.620 You know,
00:26:34.940 I just talked myself
00:26:35.720 into Musk's opinion.
00:26:37.800 Yeah.
00:26:38.600 I just talked myself into it.
00:26:40.260 While I still say
00:26:42.560 free speech says
00:26:43.860 bring Alex Jones back,
00:26:45.580 as horrible as that sounds,
00:26:49.620 I'm going to say that
00:26:51.420 maybe parent-to-parent
00:26:53.380 is just too strong.
00:26:55.340 Maybe parent-to-parent
00:26:56.200 is all that matters.
00:26:58.160 I don't know.
00:26:58.920 I think I'm going to revise
00:27:00.760 my opinion as I talk about it.
00:27:02.800 And I'm going to say
00:27:03.400 that free speech
00:27:04.180 feels like the most important thing,
00:27:05.900 but, you know,
00:27:07.560 just one person
00:27:08.460 isn't really going to affect
00:27:09.660 free speech, right?
00:27:11.480 This might be just
00:27:12.520 the special case
00:27:13.400 of all special cases.
00:27:15.180 It could be the case
00:27:15.980 where you just say
00:27:16.860 dad is more important
00:27:19.120 than all of that other stuff.
00:27:21.900 And dad-to-dad,
00:27:23.060 we're just going to
00:27:24.060 squash the shit out of this.
00:27:26.580 Maybe.
00:27:27.580 I wouldn't complain about that.
00:27:29.060 All right.
00:27:32.440 I watched,
00:27:33.400 I think it was
00:27:34.480 Shannon Bream's show
00:27:35.900 last night on Fox,
00:27:37.620 and she interviewed
00:27:38.800 a top Republican
00:27:41.200 and a top Democrat.
00:27:43.480 Senator Tom Cotton
00:27:44.600 was the Republican,
00:27:45.980 and he was saying
00:27:46.860 about TikTok
00:27:47.440 that TikTok
00:27:48.560 should be banned.
00:27:50.880 So he was anti-TikTok,
00:27:52.880 thought it was dangerous
00:27:53.760 for TikTok
00:27:54.400 to be operating
00:27:55.160 in this country.
00:27:56.280 So then she had
00:27:57.220 a prominent Democrat on,
00:27:59.500 maybe you can tell me
00:28:01.580 which one,
00:28:02.640 whose name I can't remember.
00:28:04.160 Prominent,
00:28:04.840 the one I don't see
00:28:05.700 that often.
00:28:07.540 Anyway,
00:28:07.920 somebody,
00:28:08.700 a senator,
00:28:09.280 I believe,
00:28:10.080 a senator, right?
00:28:12.520 Somebody saw it.
00:28:13.420 You'll tell me the name.
00:28:14.600 Anyway,
00:28:14.900 it doesn't matter.
00:28:15.900 But he agreed.
00:28:17.400 Mark Warner,
00:28:18.040 thank you.
00:28:18.580 So Mark Warner,
00:28:20.180 prominent Democrat,
00:28:22.460 said exactly the same thing.
00:28:24.460 So now you've got
00:28:25.020 a prominent Republican
00:28:26.000 saying get rid of TikTok.
00:28:28.180 You've got a prominent
00:28:28.960 Democrat who says
00:28:29.940 get rid of TikTok.
00:28:31.200 Do you think you could find
00:28:32.380 anybody who would disagree?
00:28:35.080 I've never seen one.
00:28:38.480 I've never seen one.
00:28:40.360 Is there even one person
00:28:41.880 who disagrees?
00:28:45.180 Swalwell?
00:28:45.740 I don't think so.
00:28:46.760 I don't think anybody
00:28:47.540 disagrees.
00:28:48.040 So if you have
00:28:50.040 bipartisan support
00:28:51.460 to get rid of TikTok,
00:28:52.560 why is it still here?
00:28:54.620 Give me any hypothesis
00:28:56.640 why it's still here
00:28:57.800 when everybody
00:28:59.020 wants to get rid of it.
00:29:01.460 Bipartisan.
00:29:04.900 Now, yeah,
00:29:05.660 you might have some
00:29:06.320 like free marketer
00:29:07.280 people who disagree.
00:29:08.420 I can only think
00:29:11.660 of one reason.
00:29:13.780 I can only think
00:29:14.600 of one reason.
00:29:15.880 It has to do,
00:29:17.160 there's,
00:29:17.640 by default,
00:29:18.440 there's only one reason.
00:29:19.640 It has to do
00:29:20.400 with something
00:29:20.820 about Hunter
00:29:21.580 and China.
00:29:23.740 China must have
00:29:24.580 some blackmail
00:29:25.280 against Biden
00:29:26.020 because there isn't
00:29:27.360 any other reason.
00:29:29.260 Now,
00:29:29.800 if Biden were
00:29:30.740 to offer a reason,
00:29:31.880 I would take that
00:29:32.600 under consideration.
00:29:34.360 But if he doesn't
00:29:35.480 offer a fucking reason,
00:29:36.640 and I've never heard one,
00:29:38.420 have you?
00:29:39.760 There are a lot of people
00:29:40.980 who watch the news here.
00:29:42.400 Tell me,
00:29:42.760 have any of you
00:29:43.360 heard a reason
00:29:44.000 for keeping TikTok?
00:29:46.420 Anybody?
00:29:48.300 It doesn't exist, right?
00:29:50.140 It literally doesn't exist.
00:29:52.620 And,
00:29:54.120 right?
00:29:54.900 It doesn't exist.
00:30:02.860 Now,
00:30:03.620 if you think
00:30:04.240 it's because
00:30:04.700 of Generation Z voters,
00:30:06.040 I disagree
00:30:06.680 because it's bipartisan.
00:30:09.380 It's bipartisan.
00:30:11.800 Everybody agrees.
00:30:14.000 All right?
00:30:14.680 So,
00:30:16.100 I think we,
00:30:18.040 here's what I think.
00:30:20.080 If the news industry
00:30:21.540 doesn't ask
00:30:22.120 the right questions
00:30:22.820 and our government
00:30:24.620 doesn't give us reasons,
00:30:26.380 you should,
00:30:27.960 by default,
00:30:28.820 make the worst
00:30:29.680 assumption about them
00:30:30.960 because that's
00:30:32.260 where the evidence points.
00:30:33.960 It doesn't mean
00:30:34.620 you'd be right,
00:30:35.920 but everything
00:30:36.740 that we do
00:30:37.300 with politics
00:30:38.000 is operating assumptions,
00:30:40.240 right?
00:30:40.840 You have to take
00:30:41.620 an operating assumption
00:30:42.760 on every topic
00:30:43.760 until you find out
00:30:45.120 you're wrong,
00:30:46.520 you know,
00:30:46.700 that there's something
00:30:47.280 wrong with the data
00:30:48.000 or your analysis,
00:30:49.180 but you have to take
00:30:50.180 an operating,
00:30:51.960 you know,
00:30:52.920 assumption
00:30:53.360 to move forward.
00:30:54.200 my operating assumption,
00:30:56.360 which my government
00:30:57.120 has given me,
00:30:58.340 is that they're not
00:30:59.100 going to give me
00:30:59.520 a reason why
00:31:00.280 we're keeping TikTok.
00:31:01.840 Therefore,
00:31:02.460 it must be a reason
00:31:03.280 that the public
00:31:03.940 should not see.
00:31:05.860 Give me one other
00:31:06.760 example of what
00:31:07.680 the public should not
00:31:08.940 know about TikTok.
00:31:11.920 Well,
00:31:12.220 it's got to be,
00:31:12.960 yeah,
00:31:13.480 it's either something
00:31:14.260 nefarious
00:31:14.960 or something else
00:31:16.540 nefarious,
00:31:17.620 right?
00:31:18.540 But it's within
00:31:19.520 the nefarious category.
00:31:21.000 We don't know
00:31:22.500 exactly what
00:31:23.300 nefarious thing
00:31:24.180 is happening,
00:31:25.520 but it's definitely
00:31:26.180 nefarious.
00:31:27.800 Definitely nefarious.
00:31:31.760 So,
00:31:33.000 and I also noticed
00:31:34.200 that neither
00:31:34.820 Senator Cotton
00:31:35.780 nor Warner
00:31:36.620 mentioned the risk
00:31:38.240 of persuasion
00:31:39.180 from TikTok.
00:31:40.380 They both mentioned
00:31:41.340 the data privacy element,
00:31:44.040 which has its own,
00:31:45.060 you know,
00:31:45.480 risks.
00:31:46.960 But
00:31:47.340 why didn't
00:31:49.500 either of them
00:31:50.300 mention the
00:31:51.380 persuasion risk?
00:31:53.700 Why do you think?
00:31:55.760 Tell me why
00:31:56.600 you don't think
00:31:57.180 either of them
00:31:57.740 mentioned the
00:31:58.360 persuasion risk.
00:32:05.020 They don't,
00:32:05.900 they don't
00:32:06.560 understand it?
00:32:08.180 Well,
00:32:08.760 they certainly
00:32:09.200 understand that
00:32:09.940 social media
00:32:10.660 influences opinions.
00:32:13.060 Everybody
00:32:13.580 understands that.
00:32:15.680 You're okay with it,
00:32:16.800 somebody says.
00:32:17.520 All right,
00:32:20.920 well,
00:32:21.160 I don't know
00:32:21.520 the answer to that.
00:32:22.740 So,
00:32:23.240 you want to hear
00:32:24.320 something amazing?
00:32:26.720 All right,
00:32:27.120 I want you to just
00:32:28.040 feel this next story
00:32:30.360 for how amazing it is,
00:32:32.760 in a good way.
00:32:34.220 So this is going to be
00:32:35.060 the good amazing thing.
00:32:37.260 So,
00:32:37.820 because Twitter exists,
00:32:40.000 people like me
00:32:40.900 can contact
00:32:42.520 people I normally
00:32:43.600 would not be able
00:32:44.200 to contact.
00:32:44.560 And the reason
00:32:46.060 that I have
00:32:46.720 lots of reach
00:32:47.460 on Twitter
00:32:48.000 is,
00:32:49.140 why?
00:32:49.860 Why do I have
00:32:50.640 lots of followers
00:32:51.540 on Twitter?
00:32:52.680 Is it because
00:32:53.480 of Dilbert?
00:32:55.080 I think Dilbert
00:32:56.000 gives me
00:32:56.940 50 to 100,000
00:32:58.160 users.
00:32:59.480 And I'm
00:32:59.960 pushing
00:33:00.520 800,000 now.
00:33:02.020 I think most
00:33:02.900 of it,
00:33:03.700 most of it
00:33:04.900 is based on
00:33:05.920 I say things
00:33:06.840 that people
00:33:07.220 want to hear.
00:33:08.780 I think.
00:33:10.000 You know,
00:33:10.220 I say things
00:33:10.780 that people
00:33:11.100 want to hear.
00:33:11.500 Now,
00:33:13.280 this allowed
00:33:14.400 me
00:33:14.820 to have
00:33:16.580 a sort
00:33:17.480 of a voice
00:33:18.720 where normally
00:33:20.440 if you went
00:33:20.900 back,
00:33:21.320 you know,
00:33:21.540 50 years
00:33:22.040 or whatever,
00:33:22.640 the only people
00:33:23.340 who would
00:33:23.640 influence politics
00:33:24.680 would be
00:33:25.980 people they
00:33:26.580 knew personally.
00:33:28.140 Right?
00:33:28.900 So in order
00:33:29.600 to influence
00:33:30.180 politics,
00:33:30.740 you had to
00:33:31.000 be in politics
00:33:31.980 or an advisor
00:33:33.020 or physically
00:33:34.720 nearby.
00:33:35.900 You know,
00:33:36.200 maybe the news
00:33:36.800 a little bit.
00:33:37.940 But an ordinary
00:33:38.800 person
00:33:39.440 couldn't just
00:33:40.660 do good
00:33:41.340 work
00:33:41.760 and then
00:33:42.740 have an
00:33:43.120 influence.
00:33:44.320 You know,
00:33:44.500 maybe an
00:33:44.860 expert in
00:33:45.320 some cases.
00:33:47.340 So I'm
00:33:48.080 going to use
00:33:48.460 my universal
00:33:49.860 analogy
00:33:52.140 person.
00:33:53.460 You know,
00:33:54.100 who I always
00:33:54.560 use for my
00:33:55.100 universal
00:33:55.600 analogies.
00:33:56.920 I don't know
00:33:57.680 why this is
00:33:58.240 true,
00:33:59.500 but Mike
00:34:00.120 Cernovich works
00:34:00.920 as an example
00:34:02.220 of so many
00:34:02.880 things.
00:34:03.820 I use them
00:34:04.600 all the time.
00:34:05.900 It's amazing
00:34:06.500 how often it
00:34:07.000 works.
00:34:07.820 Mike Cernovich
00:34:08.360 has never
00:34:08.860 been in
00:34:09.220 politics
00:34:09.760 or never
00:34:11.200 been elected
00:34:11.620 to office
00:34:12.100 and he
00:34:15.140 wasn't famous
00:34:15.940 before Twitter.
00:34:18.460 But now
00:34:19.220 he's, you
00:34:19.840 know,
00:34:19.940 got a big
00:34:20.340 influence,
00:34:20.980 big audience.
00:34:22.040 Why?
00:34:23.300 Because he
00:34:24.460 does a really
00:34:24.960 good job
00:34:25.440 of tweeting
00:34:25.840 and, you
00:34:27.260 know,
00:34:27.500 related stuff,
00:34:28.780 you know,
00:34:29.120 Epstein stuff,
00:34:30.060 a lot of,
00:34:30.640 you know,
00:34:31.060 genuine
00:34:31.400 accomplishments.
00:34:32.940 So somebody
00:34:34.940 like Cernovich
00:34:35.540 has a voice
00:34:36.920 in politics
00:34:37.660 entirely based
00:34:39.380 on competence.
00:34:41.920 Right?
00:34:43.120 It was just
00:34:43.880 that he did
00:34:44.400 good work
00:34:45.100 that allowed
00:34:46.280 his voice
00:34:46.800 to have
00:34:47.820 meaning.
00:34:48.860 Now,
00:34:49.360 here's where
00:34:49.800 I'm going to
00:34:50.040 pull it all
00:34:50.480 together.
00:34:51.680 Because of
00:34:52.560 this Twitter
00:34:53.400 effect where
00:34:54.040 ordinary people's,
00:34:55.280 you know,
00:34:55.700 voices can be
00:34:56.620 elevated.
00:34:57.800 Now,
00:34:58.100 who elevated
00:34:58.680 my voice?
00:34:59.320 Did I do
00:34:59.700 that myself?
00:35:01.080 I did not.
00:35:02.040 I did not
00:35:02.840 do it myself.
00:35:03.460 I didn't have
00:35:03.780 that power.
00:35:05.020 You did.
00:35:06.140 Right?
00:35:06.500 The people
00:35:06.880 who follow me
00:35:07.700 decided that
00:35:09.020 by following
00:35:09.680 or retweeting
00:35:10.380 me,
00:35:10.960 they would
00:35:11.340 boost my
00:35:11.900 voice.
00:35:13.120 So,
00:35:14.200 there's this
00:35:15.000 weird kind
00:35:15.460 of group
00:35:16.060 phenomena
00:35:16.960 in which
00:35:18.340 the same
00:35:18.820 way that
00:35:19.280 Twitter,
00:35:19.900 you know,
00:35:21.680 Twitter
00:35:22.080 context will
00:35:23.140 be sort
00:35:23.640 of voted
00:35:24.000 up,
00:35:24.360 but across
00:35:24.820 demographics.
00:35:27.140 Would I
00:35:27.760 do anything
00:35:28.220 good on
00:35:29.300 Twitter,
00:35:29.860 or useful,
00:35:30.600 let's say
00:35:30.880 useful,
00:35:31.220 then it
00:35:32.620 gets boosted.
00:35:34.000 So,
00:35:34.780 you have
00:35:35.300 allowed me
00:35:36.040 to do
00:35:36.600 the thing
00:35:36.940 that I
00:35:37.260 did today.
00:35:38.780 So,
00:35:39.120 this is
00:35:39.560 what you
00:35:40.060 allowed me
00:35:40.660 to do.
00:35:41.980 Send a
00:35:42.660 direct message
00:35:43.360 to Tom
00:35:44.020 Cotton,
00:35:44.840 and ask
00:35:45.640 him why
00:35:45.960 he doesn't
00:35:46.480 mention the
00:35:47.380 influence part
00:35:48.480 of TikTok,
00:35:49.420 and he
00:35:49.880 concentrates
00:35:50.520 on the
00:35:51.200 privacy part.
00:35:53.020 Now,
00:35:53.300 I got to
00:35:53.680 ask him
00:35:54.060 the direct
00:35:54.460 question because
00:35:55.120 he follows
00:35:55.640 me on
00:35:56.040 Twitter,
00:35:57.440 and vice
00:35:57.940 versa.
00:35:58.180 right?
00:36:00.460 Now,
00:36:01.720 doesn't that
00:36:02.220 just blow
00:36:02.640 you away?
00:36:04.260 Just think
00:36:04.720 about that.
00:36:05.860 I asked
00:36:06.520 him directly.
00:36:08.900 He gets
00:36:10.620 off of
00:36:11.000 Fox News
00:36:11.640 talking about
00:36:12.240 one of the
00:36:12.580 biggest issues,
00:36:13.540 in my opinion,
00:36:14.140 in the
00:36:14.400 country,
00:36:14.960 and I can
00:36:15.880 just ask
00:36:16.320 him directly
00:36:16.820 the exact
00:36:18.500 question that
00:36:19.080 I want to
00:36:19.400 ask.
00:36:20.380 But why
00:36:21.340 did I have
00:36:21.840 to ask
00:36:22.160 him that
00:36:22.420 question?
00:36:23.800 There's
00:36:24.240 nobody in
00:36:24.720 the whole
00:36:25.000 news business
00:36:25.640 who would
00:36:25.980 ask him
00:36:26.360 that
00:36:26.500 question,
00:36:27.580 and the
00:36:27.780 answer is
00:36:28.140 there wasn't.
00:36:29.340 There was
00:36:29.600 nobody in
00:36:30.100 the news
00:36:30.400 business to
00:36:30.900 ask that
00:36:31.240 question.
00:36:32.500 Now,
00:36:33.420 I don't
00:36:34.740 think he'd
00:36:35.120 mind if I
00:36:36.040 tell you
00:36:36.340 what I
00:36:36.680 said.
00:36:37.160 I'm not
00:36:37.500 sure I'll
00:36:37.780 tell you.
00:36:38.260 If he
00:36:38.460 responds,
00:36:39.820 that would
00:36:40.200 be a
00:36:40.440 private
00:36:40.760 response,
00:36:41.780 and I
00:36:41.960 wouldn't
00:36:42.140 tell you
00:36:42.400 what he
00:36:42.640 said
00:36:42.820 necessarily.
00:36:44.300 So he
00:36:44.720 hasn't
00:36:44.920 responded,
00:36:45.380 but let
00:36:45.560 me tell
00:36:45.780 you what
00:36:45.980 I said.
00:36:46.640 All right?
00:36:47.380 So this
00:36:47.800 is what
00:36:48.300 I sent
00:36:48.620 to
00:36:48.860 Senator
00:36:49.280 Cotton.
00:36:50.380 I
00:36:50.560 said,
00:36:50.820 about
00:36:50.940 TikTok,
00:36:51.540 I see
00:36:51.840 you
00:36:51.940 explain
00:36:52.280 the
00:36:52.480 data
00:36:52.720 security
00:36:53.200 risk,
00:36:53.680 but not
00:36:53.940 the
00:36:54.100 greater
00:36:54.340 risk
00:36:54.760 of
00:36:55.380 Chinese
00:36:55.940 weaponized
00:36:57.040 persuasion
00:36:57.860 via content
00:36:59.220 delivery.
00:37:01.080 There's
00:37:01.560 a reason
00:37:01.860 Chinese users
00:37:02.700 don't have
00:37:03.220 access to
00:37:03.880 the same
00:37:04.240 TikTok we
00:37:05.020 do.
00:37:06.340 Now,
00:37:06.720 I assume
00:37:07.160 he knew
00:37:07.440 that,
00:37:08.520 because he's
00:37:09.000 well-informed,
00:37:10.180 but maybe
00:37:10.700 not.
00:37:11.340 I just
00:37:11.720 wanted to
00:37:12.060 cover that,
00:37:12.660 that he
00:37:12.880 knows that
00:37:13.380 China doesn't
00:37:14.860 get the
00:37:15.160 same version
00:37:15.960 we do.
00:37:16.400 And I
00:37:19.040 said,
00:37:19.420 do you
00:37:21.300 have a
00:37:21.640 different
00:37:21.940 view of
00:37:22.440 the
00:37:22.600 persuasion
00:37:23.140 risk?
00:37:24.040 And then
00:37:24.460 I said,
00:37:24.960 parenthetically,
00:37:25.680 I'm a
00:37:25.960 trained
00:37:26.480 hypnotist,
00:37:27.440 and I
00:37:27.900 see the
00:37:28.220 risk as
00:37:28.700 WMD
00:37:29.420 level.
00:37:30.840 So then
00:37:31.460 I went
00:37:31.720 on.
00:37:31.980 I said,
00:37:32.260 if you
00:37:32.520 haven't
00:37:32.740 used
00:37:33.000 TikTok,
00:37:33.520 because I
00:37:33.780 think this
00:37:34.140 might be
00:37:34.440 the problem.
00:37:35.980 This
00:37:36.220 might be
00:37:36.600 the
00:37:36.760 problem.
00:37:38.380 I said,
00:37:39.020 if you
00:37:39.260 haven't
00:37:39.520 used
00:37:39.800 TikTok,
00:37:40.420 or the
00:37:40.860 similar
00:37:41.160 feature on
00:37:41.660 Instagram,
00:37:42.400 you might
00:37:43.020 not realize
00:37:43.700 how powerful
00:37:44.560 they have
00:37:45.020 become.
00:37:45.440 It isn't
00:37:46.480 just content
00:37:47.220 now.
00:37:48.140 They can
00:37:48.540 turn on
00:37:49.000 your dopamine
00:37:49.580 faucet and
00:37:50.400 just keep
00:37:50.880 it there.
00:37:51.820 I'm 65,
00:37:53.400 and it
00:37:53.880 zombifies me
00:37:54.860 instantly,
00:37:55.860 now that it
00:37:56.580 learned my
00:37:57.000 content
00:37:57.460 preferences.
00:37:58.260 True story.
00:37:59.460 If I
00:38:00.020 were 12,
00:38:01.240 it would
00:38:01.580 own my
00:38:02.020 brain
00:38:02.300 completely.
00:38:05.680 And it
00:38:06.400 does,
00:38:06.920 for 12
00:38:07.500 year olds.
00:38:08.280 You might
00:38:08.720 have a,
00:38:09.360 and then I
00:38:10.240 said to
00:38:10.800 the senator,
00:38:12.320 you might
00:38:12.700 have a
00:38:13.040 serious
00:38:13.380 education
00:38:14.060 issue in
00:38:14.620 Congress,
00:38:15.440 on the
00:38:15.860 current
00:38:16.200 power of
00:38:17.000 content
00:38:17.540 persuasion.
00:38:18.620 I don't
00:38:19.260 yet see a
00:38:19.840 sign that
00:38:20.240 China
00:38:20.540 weaponized
00:38:21.160 TikTok
00:38:21.520 persuasion.
00:38:22.360 I haven't
00:38:22.600 seen it
00:38:22.900 yet.
00:38:23.740 But if
00:38:24.260 they do,
00:38:25.020 it will
00:38:25.480 not be
00:38:25.860 obvious,
00:38:26.860 and it
00:38:27.200 will be
00:38:27.580 profound
00:38:28.140 and nearly
00:38:28.760 instant.
00:38:31.160 Now,
00:38:31.680 how was
00:38:32.000 that?
00:38:33.540 How did
00:38:34.000 I do?
00:38:35.400 Did I
00:38:35.820 make my
00:38:36.160 case?
00:38:40.480 Okay,
00:38:41.260 I think I
00:38:42.040 made my
00:38:42.380 case.
00:38:42.680 things.
00:38:43.600 So,
00:38:44.240 it could
00:38:44.700 be that
00:38:45.180 there's
00:38:45.380 actually a
00:38:45.900 really good
00:38:46.360 reason for
00:38:47.720 why things
00:38:48.280 are where
00:38:49.680 they are.
00:38:50.860 And because
00:38:51.320 I have a
00:38:51.660 lot of
00:38:51.840 respect for
00:38:52.500 Senator
00:38:52.880 Cotton,
00:38:53.880 I suppose
00:38:54.460 he'll tell
00:38:54.820 me.
00:38:56.320 I know.
00:38:56.740 He's
00:38:56.940 responded
00:38:57.380 before to
00:38:58.020 a compliment
00:38:58.680 I sent
00:38:59.140 by DM.
00:39:00.500 So,
00:39:01.700 he might
00:39:02.080 see it,
00:39:02.380 he might
00:39:02.620 respond,
00:39:03.220 and if
00:39:03.560 he does,
00:39:05.080 I'll decide
00:39:06.040 whether I
00:39:06.420 can tell
00:39:06.720 you.
00:39:06.960 It'll be a
00:39:07.380 private message,
00:39:08.020 so don't
00:39:08.980 assume that
00:39:09.420 I'll tell
00:39:09.700 you.
00:39:15.300 So,
00:39:16.220 let's talk
00:39:17.720 about
00:39:17.920 fentanyl.
00:39:19.380 My opinion
00:39:20.080 on fentanyl
00:39:20.740 might be
00:39:21.420 drifting a
00:39:22.020 little bit,
00:39:22.800 which will
00:39:23.340 surprise you.
00:39:27.060 So,
00:39:27.840 here's the
00:39:28.860 thing.
00:39:30.740 There's a
00:39:31.400 question that
00:39:32.240 if we had
00:39:32.780 a free
00:39:33.160 press,
00:39:34.400 they would
00:39:34.760 be asking
00:39:35.320 every politician
00:39:36.320 this question.
00:39:37.800 You ready
00:39:38.120 for this?
00:39:38.520 And when I
00:39:39.280 tell you
00:39:39.540 what the
00:39:39.780 question is,
00:39:41.520 don't hate
00:39:42.860 the current
00:39:43.440 press too
00:39:44.000 much.
00:39:45.920 Because you're
00:39:46.620 going to
00:39:46.800 start hating,
00:39:47.860 you're going
00:39:48.200 to hate
00:39:48.660 your journalists
00:39:50.300 as soon as
00:39:50.860 you hear the
00:39:51.240 question.
00:39:51.820 Because it's
00:39:52.220 so obvious,
00:39:52.980 and nobody's
00:39:53.420 asked it.
00:39:54.300 Here's the
00:39:54.720 question.
00:39:56.000 Hey,
00:39:56.260 politician,
00:39:57.640 why do you
00:39:58.420 rank the
00:39:58.820 safety of
00:39:59.380 the Mexican
00:39:59.920 cartels above
00:40:00.980 the safety
00:40:01.440 of Americans?
00:40:04.260 There it
00:40:04.860 is.
00:40:05.960 That question
00:40:06.900 needs to be
00:40:08.240 asked of
00:40:09.520 every politician
00:40:10.380 every time
00:40:11.060 they go in
00:40:11.440 public.
00:40:12.400 Because they
00:40:13.000 are.
00:40:13.920 And I don't
00:40:14.460 know the
00:40:14.760 answer to
00:40:15.160 it.
00:40:16.100 There might
00:40:16.500 be a good
00:40:16.840 reason.
00:40:17.880 What if they
00:40:18.300 have a good
00:40:18.620 reason?
00:40:19.880 Maybe they
00:40:20.500 say, oh,
00:40:21.200 you don't
00:40:21.500 know,
00:40:21.840 Scott,
00:40:22.560 if we went
00:40:24.360 against the
00:40:24.820 cartels,
00:40:25.740 they might
00:40:26.460 kill even
00:40:27.000 more people
00:40:27.560 than they're
00:40:27.940 killing now.
00:40:29.260 I'm not
00:40:29.700 saying that
00:40:30.040 would be a
00:40:30.420 good argument,
00:40:31.120 but maybe
00:40:31.480 it exists.
00:40:32.420 Maybe there's
00:40:32.820 some kind of
00:40:33.280 argument like
00:40:33.880 that.
00:40:34.260 Or they
00:40:35.800 might say,
00:40:36.840 oh, we
00:40:37.740 can't set a
00:40:38.340 precedent.
00:40:39.660 Maybe.
00:40:40.660 Maybe that's
00:40:41.260 the argument.
00:40:41.800 It wouldn't
00:40:42.260 convince me,
00:40:43.480 but I'd like
00:40:44.100 to hear the
00:40:44.460 argument.
00:40:46.340 Now, how
00:40:47.580 much do you
00:40:48.480 hate the
00:40:49.420 fact that
00:40:50.680 that's never
00:40:51.180 been asked?
00:40:52.860 You've never
00:40:53.580 seen that
00:40:54.000 asked, have
00:40:54.540 you?
00:40:55.440 Right?
00:40:56.680 No one has
00:40:57.200 ever asked
00:40:57.720 that question.
00:40:58.820 And that's
00:40:59.280 exactly what's
00:41:00.020 happening.
00:41:00.320 They are
00:41:02.040 putting the
00:41:02.700 well-being of
00:41:03.440 the cartels,
00:41:05.060 and presumably
00:41:05.820 there would
00:41:06.240 be, you
00:41:06.900 know, non-cartel
00:41:08.440 collateral damage
00:41:09.520 that nobody
00:41:10.000 wants.
00:41:10.840 But let me
00:41:11.440 tell you how
00:41:11.900 I would do
00:41:12.380 it.
00:41:13.320 If I were
00:41:13.980 going to
00:41:14.340 attack the
00:41:14.840 cartels, I
00:41:15.440 would do it
00:41:15.780 this way.
00:41:16.460 I would give
00:41:17.160 them warning
00:41:17.760 so that the
00:41:19.240 innocents can
00:41:20.040 get away.
00:41:21.340 You know, give
00:41:21.640 them plenty of
00:41:22.200 time, say,
00:41:23.200 this facility
00:41:24.380 will disappear
00:41:25.820 in 24 hours.
00:41:27.160 Make sure
00:41:27.640 you're not
00:41:27.940 standing there.
00:41:29.700 Now, what
00:41:30.120 would they
00:41:30.380 do?
00:41:30.760 Well, they'd
00:41:31.120 have a day
00:41:31.640 to collect
00:41:32.220 all their
00:41:32.680 valuable, you
00:41:33.660 know, fentanyl
00:41:34.320 making stuff
00:41:35.040 and quickly
00:41:36.020 move it to
00:41:36.580 another site.
00:41:38.380 Excellent.
00:41:39.820 As soon as
00:41:40.420 they set up
00:41:40.920 that new
00:41:41.300 site, you
00:41:42.540 say, you
00:41:43.440 got 24
00:41:44.100 hours, this
00:41:45.400 new site is
00:41:46.320 going to
00:41:46.620 disappear.
00:41:47.920 Then they
00:41:48.200 got to
00:41:48.420 collect it
00:41:48.840 all up and
00:41:49.280 move again.
00:41:50.360 Fine.
00:41:51.400 I'm okay
00:41:51.860 with that.
00:41:53.040 Let's just
00:41:53.460 keep doing
00:41:53.860 that.
00:41:55.040 Because, you
00:41:55.580 know, you
00:41:55.960 don't mow
00:41:56.400 the lawn
00:41:56.780 once.
00:41:58.240 You've got
00:41:58.760 to mow
00:41:59.000 it every
00:41:59.300 week.
00:42:00.120 That's
00:42:00.500 what a
00:42:00.820 lawn does.
00:42:01.660 Fentanyl is
00:42:02.140 the same
00:42:02.440 thing.
00:42:02.980 Cartels.
00:42:03.800 You don't
00:42:04.240 mow them
00:42:04.660 once.
00:42:05.960 You've just
00:42:06.320 got to go
00:42:06.680 back every
00:42:07.160 day and
00:42:07.580 keep mowing
00:42:08.060 them, mowing
00:42:08.540 them, mowing
00:42:08.940 them.
00:42:09.380 And that's
00:42:09.780 just your
00:42:10.120 full-time
00:42:10.500 job.
00:42:11.460 Mowing the
00:42:12.020 lawn.
00:42:13.400 Now, anyway,
00:42:16.620 so if the
00:42:18.060 problem is
00:42:18.820 collateral
00:42:19.340 damage, I
00:42:21.380 feel like that
00:42:21.960 could be
00:42:22.260 handled.
00:42:22.560 like, you
00:42:23.940 could minimize
00:42:24.560 that.
00:42:29.440 But I'm
00:42:30.080 also feeling
00:42:30.740 that the
00:42:31.740 fact that
00:42:32.440 fentanyl is
00:42:33.220 not being
00:42:33.600 dealt with,
00:42:34.600 there could
00:42:35.060 be like an
00:42:36.600 unspoken
00:42:37.300 strategy here.
00:42:39.380 And I think
00:42:40.080 that's a
00:42:40.460 thing.
00:42:41.400 And I think
00:42:41.840 that there's an
00:42:42.780 unspoken strategy
00:42:43.840 in Israel on
00:42:44.940 birth rate.
00:42:45.600 We'll talk
00:42:45.900 about that
00:42:46.300 pretty soon.
00:42:46.700 But sometimes
00:42:47.880 there are
00:42:48.240 unspoken
00:42:48.860 strategies.
00:42:50.440 And here's
00:42:50.780 the unspoken
00:42:51.520 strategy of
00:42:52.960 American
00:42:53.400 politicians.
00:42:55.460 I think
00:42:56.420 they want a
00:42:57.520 better ratio
00:42:58.340 of non-addicts
00:43:00.560 to addicts.
00:43:02.300 And the
00:43:02.660 more addicts
00:43:03.600 or people
00:43:04.440 who could
00:43:04.860 become addicts,
00:43:06.280 the more
00:43:06.680 of them that
00:43:07.120 die, the
00:43:08.860 healthier the
00:43:09.440 country is.
00:43:11.920 And that
00:43:12.800 nobody's
00:43:13.320 thinking maybe
00:43:13.980 in that exact
00:43:15.540 way,
00:43:16.700 but that
00:43:17.600 at a
00:43:17.900 subconscious
00:43:18.360 level,
00:43:19.940 at a
00:43:20.300 subconscious
00:43:20.780 level,
00:43:22.040 we all
00:43:22.480 know the
00:43:22.800 country's
00:43:23.200 better off
00:43:23.760 if these
00:43:24.280 addicts
00:43:24.680 die.
00:43:26.740 Now let
00:43:27.320 me say the
00:43:27.720 harshest thing
00:43:28.360 you'll ever
00:43:28.720 hear.
00:43:29.840 My stepson
00:43:30.640 died of
00:43:31.640 an overdose
00:43:32.620 that included
00:43:33.300 fentanyl.
00:43:37.420 And in
00:43:38.420 my opinion,
00:43:39.120 the world
00:43:39.380 is better
00:43:39.780 off without
00:43:40.260 him, like
00:43:41.140 a lot
00:43:41.400 better.
00:43:43.160 And I've
00:43:43.700 said this
00:43:44.020 before,
00:43:44.680 it's
00:43:45.340 horrible.
00:43:46.700 Because I
00:43:47.000 loved that
00:43:47.440 kid like
00:43:48.400 crazy.
00:43:49.580 I loved
00:43:50.080 him, but
00:43:51.200 there wasn't
00:43:51.740 any doubt
00:43:52.320 that he was
00:43:52.840 bad for
00:43:53.280 the world.
00:43:54.300 Like he
00:43:54.560 was really
00:43:54.880 bad for
00:43:55.260 the world.
00:43:55.980 I think
00:43:56.480 he could
00:43:56.740 have killed
00:43:57.000 somebody,
00:43:57.740 like
00:43:57.920 accidentally,
00:43:58.680 maybe with
00:43:59.200 like sharing
00:44:00.420 a drug or
00:44:01.080 driving
00:44:01.920 unsafely or
00:44:02.700 something.
00:44:02.980 He was
00:44:03.280 terribly,
00:44:04.140 terribly
00:44:04.340 dangerous.
00:44:05.520 And the
00:44:06.100 world is
00:44:06.460 better off
00:44:06.900 without him.
00:44:08.100 Now his family, maybe not, who knows.
00:44:10.760 But I believe that that attitude is behind the lack of fentanyl action.
00:44:18.160 I believe that everybody who thinks, yeah, on a public and logical level, we can't let China and the cartels be killing our young people.
00:44:28.340 On the other hand, subconsciously, do I care if those people die?
00:44:33.100 I actually believe that we have a subconscious strategy of reducing the addicts to non-addicts ratio, because it makes the country healthier.
00:44:46.920 Yeah, I don't want to say cull the herd, because that's just too fucked up.
00:44:50.980 But we do need fewer addicts relative to non-addicts.
00:44:55.080 Would you agree?
00:44:56.300 The ratio of addicts to non-addicts could reach a tipping point, and I don't know, we could be close.
00:45:02.940 We might be close.
00:45:05.060 Am I wrong?
00:45:06.660 Because one addict destroys maybe an entire family.
00:45:11.500 So the ratio of addict to destruction is like one to five, something like that.
00:45:17.180 Like every addict will destroy five other lives indirectly.
00:45:22.080 So at what point do you have enough addicts where basically everybody's life is destroyed?
00:45:28.020 Like nobody is free from their destruction.
00:45:31.900 You know, if you look at the streets, you see this zombie apocalypse.
00:45:35.660 People walk around like that.
00:45:38.820 How many of them is too many?
00:45:42.740 Now, as harsh as this sounds, I'm starting to realize that the country doesn't want it to stop, or else it would have already happened.
00:45:54.280 I believe that subconsciously we want these people to die.
00:45:58.160 Not consciously.
00:46:00.200 If you ask somebody, they would say, honestly, no, I don't want this to happen.
00:46:03.840 We try to get them in treatment.
00:46:04.840 But I don't believe that subconsciously they believe that.
00:46:09.260 What do you think?
00:46:11.380 Do you think I'm on to anything or no?
00:46:13.540 Because people say it directly, but it's usually trolls and assholes on social media.
00:46:18.860 But in theory, I would be the last person to adopt that point of view because I lost a family member.
00:46:27.320 But even I have some sympathy for it.
00:46:30.820 We actually don't have any way to stop addiction.
00:46:34.120 Nothing.
00:46:34.600 I don't know of any way.
00:46:36.520 And at the same time, we can't have this many addicts.
00:46:40.000 It might be solving itself.
00:46:43.060 As awful as that sounds, it might be solving itself.
00:46:46.880 This might be a self-correcting problem.
00:46:50.200 I hate to say it.
00:46:51.540 Now, that's not to say there won't be plenty of people who weren't going to become addicts, who got a bad pill and died when they would have actually turned it around and become productive citizens.
00:47:03.540 And we all understand that.
00:47:05.420 But I think overall, I think the country has taken a Darwinian opinion on this because there's no other explanation.
00:47:13.700 There's no other explanation.
00:47:15.400 I think people at a base level want the addicts to be dead because they don't want them in their neighborhood.
00:47:21.780 They don't want to deal with them.
00:47:23.020 They don't want to pay for them.
00:47:24.240 They don't want to go through the frustration of trying to clean them up because it doesn't work very often.
00:47:29.740 So, but I think we just have to understand who we are.
00:47:38.320 If that's who we are, maybe that's who we are.
00:47:44.040 All right.
00:47:45.340 Let's talk about the declining birth rate, which seems to be like the biggest problem in the world, some would say, and I would agree.
00:47:52.860 It's up there with the biggest problems in the world.
00:47:54.680 Why is it that birth rates are declining everywhere, it seems, except Israel?
00:48:00.780 Anybody have a theory for that?
00:48:03.800 I had a fascinating exchange today with David Boxenhorn, who lives in Israel, and we're speculating why Israel has good population growth.
00:48:20.380 Not enough, by the way.
00:48:22.020 Even in Israel, I don't think it's enough, but it's better than even, but think about this, it's even better than the Arab birth rate within Israel.
00:48:33.540 So, here's my hypothesis.
00:48:39.220 You ready?
00:48:40.680 Follow the money.
00:48:43.100 That's it.
00:48:45.400 It works every time.
00:48:48.000 All right.
00:48:48.440 In the beginning of Israel, the birth rate was lower.
00:48:52.460 In modern Israel, the birth rate is higher.
00:48:55.700 Follow the money.
00:48:57.620 Follow the money.
00:48:59.100 When you're a new, scrappy, barely-can-survive family, having an extra kid might be a burden, because you don't know if any of this is going to work.
00:49:09.160 Once you're prosperous, as Israel is, and you have this special case in Israel that doesn't exist anywhere else, you have the narrative of the Holocaust saying that there are people in this world who would like to reduce the number of Jews to zero.
00:49:27.220 Who else has that?
00:49:28.820 The Uyghurs?
00:49:30.460 Right?
00:49:30.980 Like, that's totally unique.
00:49:33.420 Nobody else has, like, an operating system for their whole life that says there are plenty of people out there who want to reduce the number of you guys to zero.
00:49:42.780 Right?
00:49:43.600 So, on top of that, they live in this little sand island in the middle of enemies, and some of those enemies are lobbing missiles into Israel every day.
00:49:52.980 If you were under attack, and the main thing you could do to defend yourself was more people, what would make Israel more safe?
00:50:07.840 Well, economics, right?
00:50:10.480 How do you get good economics?
00:50:12.600 More people.
00:50:14.560 Military defense?
00:50:16.540 How do you get a good military defense?
00:50:18.880 More people.
00:50:19.820 Right?
00:50:21.580 And military defense and economics are basically inseparable.
00:50:27.320 You know, a good economy gets you a good defense.
00:50:30.860 They're just inseparable.
00:50:32.580 So, I would say that the birth rate in Israel makes sense on at least three levels.
00:50:40.020 And David Boxenhorn suggested the first one.
00:50:43.520 The first one is that the, what's the right word, the most observant, what's the word, traditional orthodox.
00:50:52.840 The most orthodox Jews have had a high birth rate for a while.
00:51:01.060 So, there's one group in Israel that has a high birth rate, and maybe some of that, you know, cultural influence slopped over.
00:51:11.100 So, some of it could be that there's a group of Jews having a lot of babies and maybe that just looked popular to other people.
00:51:19.740 But I would go further and say I think it's because they're in a unique situation where they need more humans to make sure that there are Jews in a hundred years.
00:51:31.640 How did I do?
00:51:33.440 How's my theory?
00:51:34.400 Because nobody else has that.
00:51:36.120 Nobody else has an operating system that says people want to drive your numbers to zero.
00:51:41.600 I think that would trigger you subconsciously to feel safer if there were just more Jews in the world.
00:51:49.260 And I don't believe anybody else feels like that.
00:51:52.000 Like, I don't think Americans wake up and say, ah, I better have some babies because that'll keep America safe.
00:51:58.040 Now, as it turns out, you should have some babies to keep America safe, but you're not thinking about it.
00:52:03.580 Because in America we don't have this operating system that says Holocaust, Holocaust, Holocaust, Holocaust, it's going to happen to you any minute now.
00:52:10.600 Right?
00:52:10.840 But if I did have that, I would think, have some damn babies.
00:52:15.180 I'm going to need some soldiers.
00:52:17.880 Male and female.
00:52:19.420 Because in Israel, the women join the military, so every baby you have is a soldier.
00:52:25.280 Like, literally, they all have to become soldiers.
00:52:33.800 Yeah, some of you need to start having some babies.
00:52:37.360 All right.
00:52:39.380 That's what I think is happening.
00:52:40.900 Now, the question is, if I'm right, can any of that be generalized to other populations that need more babies?
00:52:48.120 And the answer is, if I'm right, nope.
00:52:51.780 Because there's nothing about Israel that can be generalized to anybody else.
00:52:55.960 How many times have I told you that if you compare the Holocaust to anything, you're wrong?
00:53:03.860 Like, just automatically.
00:53:04.860 You know, that's like the Holocaust.
00:53:07.280 No, it isn't.
00:53:08.500 No, it isn't.
00:53:09.540 Stop it.
00:53:11.040 There's nothing like it.
00:53:13.400 So, all right.
00:53:14.960 I said I'd talk about Ye.
00:53:16.320 Thanks for reminding me.
00:53:17.340 Good job.
00:53:19.260 So, Ye is developing his city or community idea where it would be, you know, a Ye-designed city.
00:53:30.560 Now, if the reporting is right, and it's too early to know this is true, the design that he seems to be settling on are these igloo-looking, concrete-looking homes that are sort of like the Star Wars, you know, the planet of Tatooine or whatever it is.
00:53:48.060 And I've seen a number of them.
00:53:50.520 It looks like he is sort of settling on this dome-like design.
00:53:55.260 But here's the problem.
00:53:56.300 They're without windows.
00:54:01.480 They're basically windowless.
00:54:04.160 I don't know how that ever could be a good idea.
00:54:07.360 I feel like he should be going exactly in the opposite direction.
00:54:11.260 In my opinion, you should look for the best insulated glass, make as many glass, you know, walls as you can, and then have a curtain system.
00:54:22.720 By the way, do any of you have, like, three-day blinds?
00:54:25.300 You know, the automatic curtains?
00:54:28.000 They're awesome.
00:54:29.340 They're great.
00:54:30.800 If you don't have, like, automatic curtains, or blinds, blinds, not curtains, if you don't have automatic blinds, go get some.
00:54:40.720 They're freaking amazing because you can, you know, instantly change the environment, the temperature, you know, block the sun, let it in.
00:54:48.280 Like, it gives you all kinds of control over your room.
00:54:50.380 In fact, I have mine on right now where I couldn't light this room properly.
00:54:54.440 So, yeah, I mean, it does cost money, right?
00:55:00.940 But I think, here's what I think.
00:55:03.040 I think Ye may be designing to design.
00:55:08.700 Let me say that clearly.
00:55:10.000 I think he's trying to create something that looks good when you look at it, design-wise, and different, but I don't know that he's got the user requirement right.
00:55:20.180 It's sort of looking like maybe he should back up to really check on what makes a human healthy and happy.
00:55:29.920 And I don't know that the igloo is the answer to that.
00:55:33.620 Now, keep in mind, I don't know exactly what he's up to.
00:55:37.020 He did talk about having some kind of light access.
00:55:42.020 So I think the current ones maybe are not built the way the final one would be.
00:55:47.700 So he probably will add more light.
00:55:49.420 So I don't want to assume that he hasn't got that figured out.
00:55:53.120 I'm just saying that so far, I don't see light as being a focus.
00:55:58.960 It looks like it's the opposite.
00:55:59.800 He might be talking about Earthships.
00:56:05.100 I think he's probably, I believe he's probably influenced by Earthships.
00:56:09.940 That's a green building concept.
00:56:14.500 All right, let's talk about Sam Harris talking to Bill Maher.
00:56:19.100 I guess Bill Maher had him on his little side show that he does where he talks to one person.
00:56:25.460 And this is the cleanest example of cognitive dissonance, in my opinion, can't know for sure, that you might ever see.
00:56:35.500 So even if this, even if I'm misinterpreting this example, it's the best lesson you'll ever see of a clean-looking, again, I can't read minds, but it looks like a clean case of cognitive dissonance.
00:56:50.880 And here's the setup.
00:56:52.120 The thing that causes cognitive dissonance is, for example, you think you're one kind of person.
00:56:58.380 Let's say a smart person.
00:57:01.700 And then life proves that you were wrong.
00:57:05.360 So your very belief about who you are and what you're capable of is challenged by the facts.
00:57:12.640 There are two ways to go.
00:57:14.160 You can either say, oh, I guess I'm not very good.
00:57:17.600 I guess the facts prove that that's who I am.
00:57:19.960 Or, more likely, cognitive dissonance clicks in and you hallucinate, literally hallucinate, a reason why you were really right all along.
00:57:32.060 You've seen it, right?
00:57:32.980 You've seen it in your own life.
00:57:34.600 If you trap somebody and you prove that they were wrong for years, their brain can't say, oh, wow, that's interesting.
00:57:43.580 Turns out I was totally wrong for years.
00:57:46.660 Almost nobody can do that.
00:57:48.840 A normal human will hallucinate a reason that they were really right all along.
00:57:54.980 Now, let's see if it looks like Sam Harris did that.
00:57:59.040 If you don't know who Sam Harris is, you should know that he's one of the most rational people in the world.
00:58:05.080 That's what he writes about, how to be rational.
00:58:08.280 And you should know that he's very well informed, right?
00:58:12.120 So he's not somebody who's paying no attention to the political news.
00:58:16.000 He's a well-informed, very, very smart person, way above average.
00:58:20.700 He's talking to Bill Maher, and Bill Maher notes that the New York Times and the Washington Post did not talk about the Hunter laptop properly.
00:58:35.800 In other words, they reported that it might be disinformation all the way through the election.
00:58:40.500 Now, how does Sam Harris, how would he explain in his own understanding of himself and the world, don't you think that Sam Harris gets his beliefs from the New York Times and the Washington Post, and publications of that sort?
00:58:57.620 I'll bet he does.
00:58:59.060 I'll bet, again, here we're speculating, nobody can read anybody's mind, right?
00:59:04.440 So I can't actually know what Sam Harris is thinking or his motivations or anything.
00:59:09.180 But I'm giving you, like from the outside, this is a perfect example of what it would look like if cognitive dissonance were involved.
00:59:17.340 So the setup is, Sam Harris, very smart, very knowledgeable, and believes that his sources are generally, let's say, could be wrong, because the news sometimes is wrong, but not intentionally wrong.
00:59:34.580 I guess that's the key.
00:59:37.040 Does Sam Harris believe that the sources he uses for his own understanding of the world are intentionally wrong?
00:59:46.000 And I don't think he would believe that.
00:59:48.700 I think he believes they can make mistakes like everybody does.
00:59:52.180 Again, I can't read his mind, so if tomorrow he says, Scott, you got it all wrong, listen to him, not me.
00:59:59.140 All right?
00:59:59.680 Because it's very unfair to characterize somebody's inner thoughts.
01:00:04.260 So I'm trying to parse that a little bit so I can make my point about cognitive dissonance and give you a little lesson here without saying too much about somebody's inner thoughts.
01:00:17.540 So this is a perfect setup.
01:00:20.360 He's got a situation with this laptop example, as Bill Maher pointed out, and Sam's explanation was that there wasn't enough time for the New York Times and the other major media to vet the laptop before the election.
01:00:38.060 Well, there just wasn't enough time.
01:00:40.080 Then Bill Maher says, well, why did the New York Post have time?
01:00:45.140 Right?
01:00:46.440 So there were entities that had plenty of time, and all they did was say, here, take a look at this, and they brought an expert in.
01:00:54.540 And the expert looked at it and said, yeah, that looks legit.
01:00:57.860 And then they reported it.
01:00:59.860 And then after the election, the other places reported it too.
01:01:03.880 Now, Sam was under the impression that there was actually a reason that the New York Times didn't report it before the election.
01:01:15.680 I think Bill Maher proved to him that that reason was not valid, which should have, you know, in a rational world, led Sam to say, oh, I was not aware that the sources that I use for my understanding of the world could, in a situation so blatant, literally prefer disinformation over news.
01:01:44.340 Right?
01:01:45.220 Now, do you believe, oh, somebody says he lied, and I'm going to disagree with you.
01:01:49.200 There is no indication that Sam Harris is a liar.
01:01:53.040 There's no indication of that.
01:01:54.820 And if you say you see it here, I call bullshit hard on that.
01:01:59.300 I believe he saw the world that way.
01:02:02.380 This is cognitive dissonance.
01:02:04.740 Cognitive dissonance hits very smart people exactly the same as everybody else.
01:02:12.440 And that's the part that you need to learn.
01:02:14.240 If there's one thing I could teach you, it's that Sam Harris being one of the smartest, most rational people in the entire public doesn't help you at all to avoid cognitive dissonance.
01:02:26.800 Do you get that?
01:02:27.800 As soon as you tell yourself, well, a smart person isn't going to fall for that, nope, you don't understand what it is.
01:02:37.220 Cognitive dissonance does not care how smart or well-informed you are.
01:02:41.160 It's completely independent of that.
01:02:44.080 It's an illusion.
01:02:45.660 It's just like an optical illusion.
01:02:47.900 You know, no matter how smart you are, an optical illusion still works, right?
01:02:52.640 Like, you still see it that way, even if you know it's an illusion.
01:02:56.180 So that's the important thing that is really hard to communicate.
01:03:01.100 When you see dumb people who seem to be, you know, having a bad opinion, then you say to yourself, well, the explanation is that they're dumb, right?
01:03:11.240 Don't you do that?
01:03:12.620 Don't you say, the reason we disagree is that I'm smart and well-informed, and you're either not smart or not well-informed?
01:03:20.420 Always.
01:03:20.900 We always think that.
01:03:22.780 But this doesn't work with Sam Harris.
01:03:25.840 That's why I use him for my example.
01:03:28.120 You can't doubt that he's smart.
01:03:29.760 You can't doubt that he's rational, and more rational than you are.
01:03:33.120 I would say he's more rational than I am, probably better informed than I am on most things, right?
01:03:40.380 He's a level of intellect above me, and I recognize that on most things.
01:03:45.900 So I think this is a clean, clean case of cognitive dissonance, but I could be wrong, because remember, I can't read his mind, don't know what's going on in there, but it's the cleanest case, allegedly, I've ever seen.
01:04:06.720 I didn't see the, I only saw the transcript, so I didn't see if he had the reset face.
01:04:11.420 So somebody asked me if Sam had the reset face.
01:04:13.880 Now, I've described that before as a tell for cognitive dissonance.
01:04:18.160 The reset face is where you actually, you're struck dumb for a moment, and you can't say anything, and your face looks like your brain just took a vacation, and when you trigger people into cognitive dissonance, and I do it intentionally fairly often, in person, you see the reset face, and it goes like this.
01:04:39.100 So did you know that the other publications had plenty of time to vet it?
01:04:43.560 Why couldn't the New York Times?
01:04:48.400 Yeah, if you're on Spotify, you don't know what I just did.
01:04:51.480 Yeah, it's like vapor lock.
01:04:53.140 You look like you're going to talk, but nothing can come out.
01:04:57.420 That's how you tell.
01:04:58.960 So I will watch it and see if I see that.
01:05:01.260 That'd be interesting.
01:05:01.900 All right, ladies and gentlemen.
01:05:07.740 Years ago, I heard, I just noticed I have
01:05:10.500 like chocolate on my face.
01:05:12.280 I always eat these protein bars before I
01:05:14.820 go live.
01:05:16.020 I'm like, I always got chocolate like
01:05:17.320 halfway over my mug.
01:05:19.540 It's a good thing I don't feel
01:05:21.040 embarrassment anymore.
01:05:21.960 No problem at all.
01:05:27.800 So years ago, Mark Cuban said something
01:05:30.860 that sounded crazy to me at the time.
01:05:33.380 He said that he doesn't look at regular
01:05:35.680 news sites because Twitter is faster and
01:05:39.100 it's complete.
01:05:40.540 So you could always check the news on
01:05:42.140 Twitter and you don't have to wait for the
01:05:43.880 news to come on when it comes on.
01:05:46.620 And I thought to myself, that doesn't work.
01:05:48.700 You know, there's always going to be
01:05:50.540 something on the news that's not on
01:05:52.100 Twitter.
01:05:52.960 But today, I did a news-related live
01:05:55.920 stream in which I spent hours collecting
01:05:59.020 news-related stuff and I never looked at
01:06:01.620 a news site.
01:06:02.860 It was just all on Twitter.
01:06:05.440 I think Twitter is just going to eat the
01:06:07.320 news.
01:06:11.520 Musk might put the entire news business
01:06:13.600 out of business.
01:06:15.060 Because here's something that could
01:06:16.540 happen on Twitter.
01:06:17.360 In just the same way that Mike Cernovich
01:06:20.840 went from unknown to, you know, a highly
01:06:24.360 useful account.
01:06:29.740 More as well as I'm going to vibrate
01:06:31.300 through the news.
01:06:32.160 I don't care.
01:06:32.920 Look at a paid comment here.
01:06:40.200 Oh, some people say that what Trump did
01:06:43.020 with Trump University was worse than
01:06:45.960 anything that Biden can do.
01:06:49.160 All right.
01:06:49.980 Well, we'll talk about that in a minute.
01:06:52.600 If you remind me to talk about that.
01:06:55.140 So my point was that just the way people
01:06:57.140 become prominent voices on Twitter,
01:06:59.840 just by being useful and people like what
01:07:02.680 they say.
01:07:04.120 Musk is talking about citizen journalism,
01:07:06.480 where anybody who's close to a story can just
01:07:11.580 report it on Twitter and then other people
01:07:14.260 say, hey, that was a good job.
01:07:15.720 And then that becomes the news.
01:07:18.500 Couldn't that happen?
01:07:20.800 Right?
01:07:21.820 Then how about this?
01:07:23.040 I told you there's a question that all
01:07:24.960 politicians should be asked.
01:07:27.000 You know, why do you favor the cartels' lives
01:07:29.560 over American lives?
01:07:30.700 And imagine if Twitter could surface the
01:07:35.680 questions that Twitter users most want asked
01:07:38.460 because other people vote on it across
01:07:40.680 categories, et cetera.
01:07:42.200 And then when there's a press conference,
01:07:45.320 one of the steps is they say, all right,
01:07:49.300 thank you for the media has asked the following
01:07:52.160 questions.
01:07:53.040 Or maybe Twitter needs somebody in.
01:07:55.360 Oh, this is even better.
01:07:56.140 Maybe Twitter needs a human being at the press
01:07:59.040 conference, you know, the presidential press
01:08:01.660 conferences, and then Twitter just has press
01:08:05.500 credentials, except Twitter's press credentials
01:08:08.860 under this scenario would be on behalf of its
01:08:12.380 public so that the Twitter representative would
01:08:17.420 ask the question that the Twitter users had
01:08:19.540 surfaced as the ones they cared about.
01:08:21.820 So it wouldn't be a reporter journalist so much as
01:08:25.360 somebody representing the collective curiosity of
01:08:29.500 Twitter.
01:08:30.400 How awesome would that be?
01:08:32.900 Imagine the questions that come from, you know,
01:08:35.500 NBC News, how, like, boring and stupid those are.
01:08:40.380 And then imagine the questions that would come from
01:08:42.580 Twitter users, like mine.
01:08:44.820 So my question, why do you favor Mexican cartel
01:08:48.980 lives over American lives?
01:08:50.360 That would probably bubble up to the top, if I do say so
01:08:55.900 myself.
01:08:57.840 I'll just say that directly, since I'm putting my ego out
01:09:01.400 there.
01:09:01.960 I believe that question is solid enough that it would
01:09:05.360 bubble to the top, and it would actually get asked in
01:09:07.700 public.
01:09:09.800 Now, if not, somebody else's better question would be
01:09:12.440 asked, and that's fine, too.
01:09:13.380 I don't think, I don't think that America has quite
01:09:19.500 processed how big the Musk takeover of Twitter can
01:09:23.900 be, because it's not going to be just Twitter anymore.
01:09:27.500 Twitter 2.0 is not going to be Twitter 1.0.
01:09:31.180 Like, its effect on everything is incalculably large, and the only thing they have to do to make all that happen, what's the only thing Twitter has to do to make all of that happen? Everything they're doing right now, exactly what they're doing right now, and then all of that will happen.
01:09:50.600 Like, you couldn't stop it, as long as they're doing the
01:09:53.060 things they're doing now.
01:09:54.800 Now, this assumes that, you know, Musk can afford to keep it
01:09:59.520 in business, and I like his odds.
01:10:02.080 Even if he says the odds are low, I like his odds.
01:10:05.140 All right.
01:10:12.600 Demons talk about leaving?
01:10:14.100 Yeah.
01:10:14.680 I don't know if they can.
01:10:19.500 Yeah, the Dilbert website has some problems for days, and I
01:10:24.840 don't even know if we're close.
01:10:26.460 When I say we, I don't directly manage that.
01:10:29.100 But I don't even know if we're close to fixing it.
01:10:32.320 Whatever the problem was, it's catastrophic.
01:10:36.660 I mean, it's comics on a website, so it's not catastrophic for the
01:10:41.940 country.
01:10:42.880 It's just catastrophic in terms of one website.
01:10:48.400 All right.
01:10:51.320 Who are my top three follows?
01:10:53.780 You know, I don't want to say three, because there are too many, and then I
01:10:56.940 leave somebody out, but I think I do owe you my best follow list, don't I?
01:11:04.560 I feel like that would be really useful, because if you think that I say
01:11:10.040 anything useful, you should know that I'm deeply influenced by 20 people
01:11:17.280 every day.
01:11:19.640 You know, just 20 power users.
01:11:21.720 And when I say power users, I just mean they're good at it.
01:11:24.200 That doesn't mean they have a lot of followers.
01:11:26.440 There are a number of people who influence me basically every day who
01:11:30.500 don't have a lot of followers.
01:11:35.420 Yeah, I'm checking my feed.
01:11:37.340 I haven't, I've got no response from Senator Cotton.
01:11:41.360 And again, I don't know if I'll tell you.
01:11:43.820 I might tell you if he responds.
01:11:45.140 I don't know if I'm going to tell you what he says, because I would treat that
01:11:48.460 as a private conversation.
01:11:53.140 All right.
01:11:54.560 Oh, Trump University, thank you.
01:11:57.200 Here's my take on Trump University.
01:12:02.720 That was bad management, but probably that's all it was.
01:12:07.360 Meaning that it's very unlikely that Trump knew the details of what the content of the course was, or even exactly why it didn't match people's expectations, or why it wasn't working.
01:12:24.600 Remember, he was a licensee, I believe.
01:12:27.360 He was either a licensee, meaning not directly involved with the business, or he was sort of
01:12:31.780 a silent partner.
01:12:32.720 It was one of those.
01:12:33.420 I don't know what it was.
01:12:34.500 But whether he was sort of the non-operating partner or the licensee, whichever it was,
01:12:42.060 there's no evidence that he knew what was going on.
01:12:45.500 Am I wrong?
01:12:47.020 Fact check me on this.
01:12:48.500 I don't believe there's any evidence that Trump personally had a good understanding
01:12:53.660 of how bad things were.
01:12:57.180 I don't think that's ever been demonstrated, has it?
01:13:00.660 Now, it has been demonstrated that the students did not get what they thought they were paying
01:13:07.120 for.
01:13:07.380 And then, you know, financial restitution was made.
01:13:12.560 And that's the way the system works.
01:13:14.960 Somebody did a bad job.
01:13:16.700 It was legally actionable.
01:13:19.020 They acted.
01:13:19.760 They got a refund.
01:13:20.940 I don't know.
01:13:22.940 I mean, there's certainly super-sketchy behavior of the people running Trump Organization, but
01:13:28.920 if you extend that to the person who had 400 companies and probably spent one day thinking
01:13:34.400 about it, that's kind of a stretch.
01:13:36.780 Now, if it ever turned out that Trump himself knew exactly what they were doing and how sketchy
01:13:44.000 it was, well, then I'd change my opinion.
01:13:46.840 I don't need any cognitive dissonance for that.
01:13:48.960 I would just say, oh, well, I guess I got that totally wrong.
01:13:53.360 It wouldn't offend my sense of who I am.
01:13:56.580 It would just be new information.
01:14:01.320 All right.
01:14:07.760 Why is Sam Harris fixated on that?
01:14:11.300 I never say anybody's fixated on things.
01:14:14.080 That's sort of a dickish thing to say about anybody.
01:14:16.340 I hate it when people say it about me.
01:14:17.620 It's one thing he uses as an example.
01:14:21.280 I do think, and by the way, I agreed with him.
01:14:24.640 I think I did.
01:14:25.840 When Trump University was mentioned as a dark mark against Trump, I agree with that.
01:14:34.540 I agree with that.
01:14:36.060 That's part of what I factor into my overall decision.
01:14:39.180 And at the time, I said it was the thing that bothered me the most.
01:14:42.740 But I'd also have to know how much he knew about it.
01:14:45.480 And that's never been presented to me.
01:14:47.920 I've never seen the news report how much Trump knew about what was happening.
01:14:55.100 Sam Harris says it's worse than the laptop.
01:14:57.540 Maybe.
01:14:58.700 Maybe.
01:14:59.000 But we've had time to look into it.
01:15:02.120 And nobody's ever told me what he knew and what he didn't.
01:15:05.660 So, I mean.
01:15:08.140 But if he says it's worse, okay.
01:15:10.580 Maybe.
01:15:13.280 But you also have to look at the entire package.
01:15:17.220 All right.
01:15:17.740 Which I do.
01:15:19.540 All right.
01:15:20.100 I'm pretty sure this was the best live stream of all time.
01:15:25.600 I feel like I hit the mark here.
01:15:28.060 Do you all agree?
01:15:29.540 You learned more today.
01:15:31.860 You had more optimism and more happiness than anything else you could have been doing at the same time.
01:15:37.740 It's true.
01:15:43.340 All right.
01:15:43.980 Two sniffs and an exhale.
01:15:48.240 I'm still doing that, by the way.
01:15:50.260 Have any of you gotten hooked on that Andrew Huberman breathing technique that I've mentioned a few times?
01:15:56.280 Where you sniff two inhales and then you do one deep exhale?
01:16:00.760 I'm totally addicted to it.
01:16:03.460 Because the feeling you get is immediate.
01:16:06.720 Like the payoff is immediate.
01:16:09.360 Like, it's not like, oh, I've been doing this for a week and I do think I feel a little better.
01:16:16.140 It's nothing like that.
01:16:17.400 You actually feel immediately better.
01:16:20.520 So, like, I do it all day.
01:16:22.740 All day long.
01:16:24.000 Every time I'm bored.
01:16:25.420 I walk the dog.
01:16:26.840 I breathe.
01:16:27.960 I'm waiting in line.
01:16:29.080 I breathe.
01:16:30.420 I just do it all day.
01:16:32.480 Feels good every time.
01:16:34.780 How do you do it?
01:16:35.440 Belinda says.
01:16:36.320 I'll give you one more lesson for those who have missed it.
01:16:38.800 This is the Andrew Huberman method.
01:16:41.740 Nothing I invented at all.
01:16:43.780 And apparently there's some science behind it.
01:16:46.400 And it's based on the, I guess, the best ratio of, you know, O2 to CO2 or whatever the hell.
01:16:53.480 I don't know.
01:16:54.120 But there's some science to it.
01:16:55.580 But you do two deep nose inhales.
01:17:03.280 Followed by one long deep exhale.
01:17:06.400 There's something about the sniffing in and getting almost a full inhale and then another sniff so that you go beyond a full inhale to, like, extra inhale.
01:17:19.820 There's something about that that gives you the right balance of body chemistry.
01:17:24.240 I don't know.
01:17:28.700 Yeah.
01:17:30.620 All right.
01:17:31.680 I'm going to lock off the locals' feed from the public.
01:17:34.880 Thanks for reminding me.
01:17:36.220 I'm going to go talk to the locals' people about all the good stuff.
01:17:40.120 And then I will talk to you later.
01:17:45.060 If you were on the locals' subscription platform, you could hear this, too.
01:17:50.080 But you're all special.
01:17:51.640 And I'll talk to you tomorrow.