Real Coffee with Scott Adams - December 05, 2022


Episode 1948 Scott Adams: Fake News Out Of China, Fake News in NYT, I'm Starting To See A Pattern


Episode Stats

Length: 1 hour and 2 minutes
Words per Minute: 141.3
Word Count: 8,899
Sentence Count: 646
Misogynist Sentences: 10
Hate Speech Sentences: 26


Summary

In this episode of Coffee with Scott Adams, we talk about artificial intelligence (AI) and the future of education, and how it's going to change the world. Plus, I talk about why I don't want to own a car.


Transcript

00:00:00.000 Da, da, da, da, da, da, da, da, da, da, da, da, da.
00:00:04.780 Good morning, everybody, and welcome to the highlight of civilization.
00:00:09.600 It's called Coffee with Scott Adams.
00:00:11.140 There's never been a finer moment in your life.
00:00:14.500 I know, you'd probably like the birth of your children, maybe your wedding day,
00:00:18.320 but those are nothing compared to this experience.
00:00:21.500 And if you want to take it up to yet another level, and I know you do,
00:00:24.340 all you need is a copper mug or a glass, a tankard, chalice, or stein, a canteen jug or flask,
00:00:30.280 a vessel of any kind, fill it with your favorite liquid.
00:00:33.760 I like coffee.
00:00:35.500 Join me now for the unparalleled pleasure of the dopamine of the day,
00:00:39.120 the thing that makes everything better.
00:00:42.360 It's called the simultaneous sip, and it's going to happen right now.
00:00:44.600 Go.
00:00:49.760 Ah, yeah, that's good.
00:00:51.880 That's good.
00:00:55.720 Well, let's talk about some things.
00:01:00.560 A Twitter user named Peter Wang.
00:01:05.780 Stop it.
00:01:08.180 That's his actual name.
00:01:09.580 Don't make fun of him.
00:01:11.100 Just because his first name and his last name are a penis,
00:01:15.220 Peter Wang, that is no reason to mock him.
00:01:18.700 That's not kind.
00:01:20.340 Don't do that.
00:01:21.880 Because that's not what this is about.
00:01:24.000 But Peter Wang had a very good tweet,
00:01:26.920 which was neither about his Peter nor his Wang.
00:01:31.120 But he said, quote,
00:01:32.800 I just had a 20-minute conversation with ChatGPT,
00:01:36.360 that's an artificial intelligence that's available to the public,
00:01:41.360 about the history of modern physics.
00:01:43.420 If I had this shit as a tutor during high school and college, OMG.
00:01:48.740 I think we can basically reinvent the concept of education at scale.
00:01:54.080 College as we know it will cease to exist.
00:01:57.580 AI will be your only user interface to everything, including education.
00:02:01.300 Yes.
00:02:03.540 Yes, Peter Wang, you have nailed this.
00:02:08.420 The only thing I want for education is AI.
00:02:11.120 Now, you say to yourself, but Scott, AI will not be as good as a human teacher who's reacting and all that.
00:02:21.000 To which I say, you are right on day one.
00:02:25.620 What happens on day two of AI versus all the human teachers?
00:02:34.380 On day two, AI, well, not technically day two, but let's say year two.
00:02:39.960 By year two, AI has already tried 3,000 different ways to explain something.
00:02:49.200 And it tracked who got the best test results based on which way it chose to explain the same topic.
00:02:56.500 Once it finds out that people get much better test results when it explains it this way,
00:03:01.740 it starts doing it that way from then on.
00:03:04.380 Maybe testing a little bit just in case it can do even better.
00:03:07.720 What did the human teachers do?
00:03:09.960 During that year, when the AI went from, I'll just guess what will work, to, oh, I have great data,
00:03:16.320 and I know exactly what worked, and this gets better test results.
00:03:19.340 What did the human teachers do differently?
00:03:23.980 They introduced some gender reassignment classes or something.
00:03:30.500 I don't know.
00:03:32.040 But I'll tell you what they didn't do: know exactly what worked and what didn't,
00:03:36.800 and then change to what worked.
00:03:39.960 But AI did.
00:03:42.220 By year three, no human classroom would be able to compete with AI.
00:03:48.580 Because AI would have tested, you know, very completely, all the way through test scores,
00:03:54.280 to know what worked and what didn't, and then it would be over.
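A side note on the mechanism being described: trying many explanations, tracking test scores, then mostly exploiting the winner while still "testing a little bit" is essentially a multi-armed bandit. Here is a minimal epsilon-greedy sketch in Python; the explanation styles, score model, and run_lesson function are all made-up stand-ins for illustration, not anything from a real tutoring system.

```python
import random

# Hypothetical explanation styles (the bandit's "arms").
EXPLANATIONS = ["analogy", "worked example", "formal definition"]
EPSILON = 0.1  # keep "testing a little bit just in case it can do even better"

# Assumed hidden average test score per style, plus noise.
TRUE_MEANS = {"analogy": 0.72, "worked example": 0.81, "formal definition": 0.65}

def run_lesson(style: str) -> float:
    """Teach one student with this style and return a noisy test score."""
    return min(1.0, max(0.0, random.gauss(TRUE_MEANS[style], 0.1)))

counts = {s: 0 for s in EXPLANATIONS}
totals = {s: 0.0 for s in EXPLANATIONS}

for _ in range(3000):  # the "3,000 different ways" from the transcript
    if random.random() < EPSILON or not all(counts.values()):
        style = random.choice(EXPLANATIONS)                       # explore
    else:
        style = max(counts, key=lambda s: totals[s] / counts[s])  # exploit
    counts[style] += 1
    totals[style] += run_lesson(style)

for s in EXPLANATIONS:
    print(f"{s}: {counts[s]} lessons, avg score {totals[s] / counts[s]:.2f}")
print("Teach with:", max(counts, key=lambda s: totals[s] / counts[s]))
```

After enough lessons the loop settles on whichever explanation produces the best scores, which is the whole argument here: the human teacher never gets this feedback loop.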
00:03:56.480 The only interface I want for anything is AI.
00:04:00.480 I don't want to do a Google search where I open a web page, and there's a little box.
00:04:07.040 I've got to find the box and type in my little thing and type it right.
00:04:10.580 I don't want any of that.
00:04:12.260 I want to be walking to my car, which is a self-driving car, and I want to say,
00:04:17.880 huh, I wonder when the next Warriors game is.
00:04:20.520 Hey, when's the next Warriors game?
00:04:22.460 Our next Warriors game will be Tuesday at 7 o'clock.
00:04:25.940 That's what I want.
00:04:27.140 I just want to say it and have the answer appear in my earpiece.
00:04:32.320 And yeah, it won't be my car, right?
00:04:33.820 I won't own the car and I'll be happier.
00:04:38.700 I don't think any of you quite understand how right Klaus Schwab is about not owning things and being happier.
00:04:48.360 If he's the one who said it, I think he did.
00:04:52.380 Let me give you a little insight from being rich.
00:04:56.580 You ready for this?
00:04:57.340 So I've experienced not having money, you know, the first part of my life, and I've experienced having lots of money.
00:05:07.180 And one of the things I like most about going on vacation is what?
00:05:13.020 What do you think is one of the things I like most about being on vacation?
00:05:18.560 Getting away from my money.
00:05:21.940 Because my money brought all these complications.
00:05:24.880 I've got a car that's breaking down.
00:05:27.380 I've got 600, you know, pieces of appliances that are not working at my house.
00:05:32.820 I've got leaks.
00:05:34.100 I've got shit.
00:05:34.900 I've got mail coming in.
00:05:36.440 I've got to clean the house.
00:05:37.980 I've got a dog.
00:05:40.420 It's all because of money, right?
00:05:42.640 When I go on vacation, I'm actually escaping my own money.
00:05:46.280 It's a vacation from my money.
00:05:48.900 So if you're telling me that, you know, I want to own a lot of things, like having them in my name,
00:05:54.320 because that's what will make me happy,
00:05:55.640 you've never experienced going from poor to rich.
00:05:59.380 If you've ever gone from poor to rich, you know your shit doesn't make you happy.
00:06:03.720 All right?
00:06:04.180 You're still going to buy it because you can, right?
00:06:07.000 I was watching a video of Andrew Tate buying, I don't know, one or three new Bugattis or something.
00:06:17.600 He's got some big fleet of expensive cars.
00:06:21.380 And he looked very happy.
00:06:23.140 He seemed very happy when he was doing it.
00:06:24.740 But I think only the buying it and the talking about it makes him happy.
00:06:29.260 And maybe driving it two or three times.
00:06:31.140 But then after he gets through his happiness and then is parked with his other Bugattis,
00:06:37.600 does it still make him happy?
00:06:39.980 Or is it a little bit of work to have maintenance and, you know, storage and everything for yet another car?
00:06:48.080 I don't know.
00:06:49.160 Yeah, maybe it does work for him.
00:06:50.520 He seems to have a good system.
00:06:52.560 But if you believe that having a lot of stuff makes you happier, I guarantee you it doesn't.
00:06:59.060 You'll still get them.
00:07:00.820 Like, I'm not saying I'm going to stop buying things I want.
00:07:04.100 I mean, I just bought an electric guitar that clearly I did not need.
00:07:09.500 And I liked it.
00:07:10.420 I really enjoyed it.
00:07:11.000 Every time I look at it, it makes me happy.
00:07:13.220 But it's also a pain in the ass, right?
00:07:15.380 It adds a whole bunch of complexity to my life.
00:07:19.540 All right.
00:07:21.040 Yeah, AI will be everything.
00:07:23.580 I've got a question about the Snowden and Assange issues.
00:07:28.440 About Assange mostly.
00:07:30.640 But the question is whether they should be pardoned.
00:07:35.800 Why wouldn't we have a trial?
00:07:38.260 Is the belief that our justice system is so unfair that we can't have a trial that the whole country watches?
00:07:48.600 Because that's what I...
00:07:50.480 Yeah, I've seen a lot of yes.
00:07:52.320 So, now, I agree with you that our justice system can be biased and not get us the best answers.
00:08:00.020 But we use it for everything else.
00:08:03.860 Like, why would you make an exception for this one thing?
00:08:08.100 Don't you think the government itself would be on trial?
00:08:12.640 You know, here's my take.
00:08:18.380 I don't know the whole details of what Assange and Snowden did.
00:08:23.880 I haven't really heard the argument from, let's say, whoever wants to prosecute them, the CIA, etc.
00:08:30.780 So, I mean, I've heard the basic argument that it put people at risk and it was theft of materials and stuff.
00:08:37.460 And then the counter-argument is it was just journalism.
00:08:39.720 So, I'd like to see this case played out in public.
00:08:46.500 Now, I get that you don't trust that this specific case would be handled right.
00:08:51.800 However, I feel like if the whole country is watching,
00:08:56.080 it's a lot harder for anybody to, you know, play fast and loose with the rules.
00:09:00.940 So, as much as I don't want to see somebody put at risk if they have not committed any crimes,
00:09:10.600 I don't know what they have done.
00:09:13.180 Like, I'm a little hesitant to go full pardon without knowing exactly the whole details of the arguments.
00:09:20.760 But I'd like to know the arguments.
00:09:24.800 So, you're saying that government on trial would be a closed-door trial.
00:09:29.040 I'm seeing somebody shouting at me that it would be a closed-door trial.
00:09:32.680 Do you think it could be?
00:09:34.500 Do you think that you could put Assange on trial and say,
00:09:38.140 we're going to show some things to, let's say, the jury,
00:09:41.040 but the public can never know these things because they're secret?
00:09:46.000 Well, if that's the case, then there's no way to have a fair trial.
00:09:50.760 Because I think transparency would be pretty important.
00:09:54.500 Now, that would be interesting.
00:09:55.640 I would think that that would be a Supreme Court situation
00:09:59.340 because that would not be equal justice, would it?
00:10:03.140 I would argue that equal justice requires some amount of transparency for the entire public,
00:10:09.280 not just the people involved in the trial.
00:10:12.140 But, of course, there are plenty of cases where you do have to have that privacy.
00:10:17.520 So, I don't know.
00:10:19.240 Well, I'm biased toward having the trial
00:10:22.220 because I think it's terribly unfair to Assange
00:10:26.420 if it turns out no crimes were committed.
00:10:29.220 It's terribly unfair.
00:10:31.560 But, but,
00:10:34.400 he put himself in a position where he was trying to create a public good
00:10:38.160 and put himself at risk.
00:10:41.020 Would you agree that he knew he was doing that?
00:10:43.660 That Assange knew he was taking a risk,
00:10:45.460 he was doing it for the public good,
00:10:48.340 and maybe his own reasons too.
00:10:50.040 But, and I think this is more of that.
00:10:54.100 I would like, you know, I hate to put other people at risk.
00:10:57.940 You know, he should make his own decisions.
00:10:59.720 But he did make his decision to be in this domain.
00:11:02.120 And if a little bit more risk on Assange
00:11:05.680 could produce yet more benefit of the kind
00:11:09.300 that even Assange would like us to have,
00:11:11.820 I don't know.
00:11:13.920 I don't know.
00:11:14.940 It would be,
00:11:15.240 it would be real interesting
00:11:17.200 to have the trial.
00:11:20.820 I understand the argument for just pardoning him.
00:11:23.780 I totally get it.
00:11:24.720 But I feel like there's more benefit the country could get
00:11:28.940 by just knowing more about this situation.
00:11:32.540 All right.
00:11:35.020 The Iran protests,
00:11:37.100 apparently the New York Times is reporting fake news on this.
00:11:42.220 So Iran has this thing called the Morality Police Unit.
00:11:47.200 I guess they're the ones that make sure the women are wearing their
00:11:49.740 proper headgear and stuff.
00:11:51.960 And they were the sparking point that caused the protests
00:11:58.020 because they arrested some woman and she died in custody,
00:12:01.900 I think is the story.
00:12:03.420 So the protests are raging.
00:12:05.940 And they started with, you know,
00:12:10.720 being against the morality police,
00:12:12.620 but I think it has generalized now to the regime.
00:12:15.540 So now the protests are as much about the regime
00:12:18.860 as they are about this specific problem.
00:12:20.700 So what the regime is doing, wisely,
00:12:24.780 is they're saying,
00:12:25.700 oh, oh, now that it's about the entire regime,
00:12:28.760 maybe we could go back and solve that little problem
00:12:30.960 you were complaining about.
00:12:32.480 You know that morality police problem?
00:12:34.200 Let's get back to that.
00:12:35.900 Let's not overthrow the whole regime.
00:12:39.580 Let's just go back and talk to that thing.
00:12:41.360 We weren't going to talk about it before,
00:12:42.960 but maybe we can talk about it now
00:12:44.940 because you seem to be protesting so much.
00:12:46.860 So Iran apparently said they would sort of reconsider
00:12:50.660 or they've suggested that they're going to put this morality police unit on pause
00:12:57.300 or suspend them.
00:12:59.280 The New York Times, I guess, reported that like it's true.
00:13:02.220 And I think people who are a little more wise about Iran,
00:13:08.880 including the Washington Post,
00:13:10.540 are reporting it as something they said they'd do,
00:13:13.660 which is a big difference.
00:13:16.760 Something they say they'll do versus something they'll do.
00:13:20.240 Big difference.
00:13:21.760 Big difference.
00:13:23.060 So we'll see if that works out.
00:13:24.460 But it looks like the regime is pretty worried about their survival
00:13:30.080 or they wouldn't be giving up anything, would they?
00:13:34.540 It seems like the regime is pretty hard-nosed about this morality stuff.
00:13:41.360 I mean, it's pretty central to the whole theme.
00:13:44.080 I can't see that they would back down on the morality stuff
00:13:47.080 unless they genuinely thought the entire regime could be overthrown.
00:13:53.220 So we might be closer to a total overthrow.
00:13:56.780 Now, here's the most interesting story.
00:13:58.920 It's a few days old, but maybe you didn't see it.
00:14:02.160 When Iran played the U.S. in the World Cup a few days ago
00:14:07.380 and the U.S. won,
00:14:11.420 the Iranian protesters cheered the defeat of their own team
00:14:16.260 because they viewed their team as really being a tool of the regime,
00:14:21.000 not really a team for the people of Iran.
00:14:24.860 And so they were actually chanting and cheering
00:14:27.760 America's victory over an Iranian soccer team.
00:14:32.300 Did you ever think you'd see that?
00:14:34.120 And you know how the Iranians like to do the "death to" chant?
00:14:45.320 You know, death to America, death to Israel.
00:14:48.400 You know, it's always death to somebody.
00:14:50.580 Everybody's chanting, death to somebody.
00:14:52.900 And I just had this image in my head
00:14:54.880 of the Iranian protesters chanting,
00:14:58.060 death to ourselves, death to ourselves, yay America.
00:15:02.960 And I thought, who predicted that?
00:15:07.960 Like, show me somebody who predicted
00:15:09.580 that the Iranian population would be chanting
00:15:12.420 death to ourselves this time.
00:15:14.660 They didn't literally say that,
00:15:16.480 but, you know, there was a sense of it.
00:15:20.000 Could it be that the combined actions of, you know,
00:15:29.880 the several administrations that have had this weird
00:15:33.540 hands-on, hands-off approach to Iran,
00:15:36.700 could it be that it worked?
00:15:39.160 Could it be that the young population of Iran
00:15:43.140 is so pro-American that, you know,
00:15:48.560 even though the sanctions are a hard touch,
00:15:51.340 I think the Iranian people came to see it
00:15:53.200 as a war against their regime and not against them.
00:15:57.840 I don't know how that could have been better.
00:16:00.760 Name me any American outcome,
00:16:02.600 or even Israeli outcome.
00:16:04.960 Name me a better outcome
00:16:06.240 than the citizens protesting their own government
00:16:09.460 and being in favor of the United States.
00:16:12.380 I mean, at least, you know,
00:16:14.240 specifically in terms of the World Cup.
00:16:17.460 But you have to think that generalizes, right?
00:16:20.140 They're not going to do that
00:16:21.100 unless they've got some general good feeling
00:16:22.780 about the United States.
00:16:23.900 And the Iranian protesters are directly asking
00:16:26.940 for U.S. support for the protests.
00:16:30.360 So maybe, maybe just staying the course,
00:16:36.240 putting that consistent pressure on Iran,
00:16:38.700 but not going too far,
00:16:41.960 could it be that it was exactly the right play?
00:16:46.040 I mean, when was the last time you ever heard me say that?
00:16:48.800 And could it be that it was exactly the right play
00:16:51.420 across multiple administrations,
00:16:56.560 you know, both Democrat and Republican?
00:16:58.540 Because I don't think it hurt
00:16:59.940 to have a little Trump tough love
00:17:02.860 in between some Democrat administrations.
00:17:08.280 I don't know.
00:17:09.020 It looks like it might have been exactly the only thing
00:17:11.960 that could have got us to this place
00:17:14.320 that maybe is a better place.
00:17:16.420 Maybe.
00:17:18.000 You never know.
00:17:20.440 All right.
00:17:22.240 There's news out of China
00:17:23.760 that manufacturing orders
00:17:26.560 for Chinese manufacturers are down 40%.
00:17:29.860 Now, this is widely reported.
00:17:34.340 Manufacturing orders down 40%.
00:17:36.600 I'm going to call bullshit on that.
00:17:40.100 Yeah, that sounds like bullshit.
00:17:42.180 And here's why.
00:17:44.800 And see if anybody has the same background for this.
00:17:49.140 Number one, remember, I've got a degree in economics
00:17:52.240 and, you know, an MBA.
00:17:54.220 So on economic stuff, I'm a little bit better than guessing.
00:17:58.420 You know, there's a lot of domains
00:17:59.900 in which I'm pretty much guessing.
00:18:01.860 But in economics, I'm not totally guessing.
00:18:04.620 I've got a little bit of insight there.
00:18:06.260 And I also did a lot of financial reporting for years
00:18:10.420 in my corporate life.
00:18:11.340 So you get this feel about numbers
00:18:14.060 where usually you can call bullshit on stuff
00:18:17.740 before you know why.
00:18:20.100 Like, you can just look at it and say,
00:18:21.500 no, I don't know why this number is wrong,
00:18:25.080 but that number isn't right.
00:18:26.720 I mean, that number is wrong.
00:18:27.640 So I saw two different numbers.
00:18:30.260 One is that something like there's a 21% decrease
00:18:35.360 in U.S. orders or something.
00:18:38.520 And I would believe a 20-something decrease
00:18:41.220 in business in China.
00:18:43.580 That feels about right.
00:18:45.180 But a 40% collapse?
00:18:47.760 If this were true, and the manufacturing orders
00:18:52.180 for all of China's manufacturing,
00:18:54.960 if that were down 40% suddenly,
00:18:58.780 there wouldn't be any other news.
00:19:01.700 There wouldn't be any other news.
00:19:03.860 We would be in a global depression
00:19:05.600 at a scale we've never seen ever.
00:19:10.260 Am I right?
00:19:11.340 Is there anybody who knows enough about economics
00:19:13.640 and, you know, they have enough, let's say,
00:19:17.160 facility with numbers
00:19:18.880 that that 40% number seems obviously wrong?
00:19:23.360 Wouldn't you agree?
00:19:24.000 To me, that seems obviously wrong.
00:19:26.500 Like, way wrong.
00:19:28.260 Now, my...
00:19:29.940 So I'll make another...
00:19:31.620 Here's another very outside the mainstream...
00:19:36.240 Outside the mainstream prediction.
00:19:40.080 My prediction is just the data's wrong.
00:19:43.020 That's all.
00:19:44.180 My prediction is the data's wrong.
00:19:46.180 Now, apparently, there are multiple entities
00:19:48.720 who are confirming it.
00:19:51.180 You know, it's not just one company with data.
00:19:52.940 But that's not good enough for me.
00:19:55.780 That number is so obviously wrong.
00:19:59.040 It's just really obviously wrong
00:20:00.340 that there's something that might be affecting
00:20:03.800 all of the reporting.
00:20:05.800 I don't know what it is.
00:20:08.000 But there's no way 40% is true.
00:20:11.320 Would you buy the...
00:20:12.880 Do you buy the premise
00:20:14.200 that if that number were true,
00:20:16.960 the whole world economy
00:20:19.140 would have collapsed by now?
00:20:20.640 It just can't happen that fast.
00:20:25.060 So, I mean, we're going by feel here, right?
00:20:29.920 We're just going by experience and by feel.
00:20:33.900 My experience and my feel
00:20:36.440 is that number can't be real.
00:20:38.600 Can't be.
00:20:39.520 I wouldn't want it to be real.
00:20:41.220 I want us to pull out of China
00:20:43.220 as quickly as possible,
00:20:44.460 but that's faster than possible
00:20:46.600 in my opinion.
00:20:48.920 So, I don't want to go that fast.
00:20:50.680 I just want to make sure it's happening.
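For what it's worth, here is a rough back-of-envelope version of that gut check, using assumed round numbers (China's share of world manufacturing and manufacturing's share of world GDP below are approximations for illustration, not figures cited in the episode):

```python
# Back-of-envelope sanity check on the reported 40% collapse.
china_share_of_world_mfg = 0.30  # assumed: China produces roughly 30% of world manufacturing output
mfg_share_of_world_gdp = 0.17    # assumed: manufacturing is roughly 17% of world GDP
claimed_order_drop = 0.40        # the widely reported figure being questioned

world_mfg_hit = claimed_order_drop * china_share_of_world_mfg
world_gdp_hit = world_mfg_hit * mfg_share_of_world_gdp
print(f"Implied drop in world manufacturing output: {world_mfg_hit:.0%}")  # ~12%
print(f"Implied direct hit to world GDP: {world_gdp_hit:.1%}")             # ~2%
```

Under those assumptions, a sudden 40% drop implies roughly a 2% direct hit to world GDP before any knock-on effects, depression-scale news that would be impossible to miss, which is the shape of the argument being made here.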
00:20:53.380 All right.
00:20:58.180 Yesterday, midday,
00:21:00.300 I was already commenting to friends
00:21:02.780 that Ye was being too quiet
00:21:05.340 and I didn't think we could go a day
00:21:07.920 without some new controversy.
00:21:11.380 Well, he did not disappoint.
00:21:14.480 So, by the end of yesterday,
00:21:16.600 Ye had used Instagram
00:21:17.600 to post a lengthy message
00:21:19.580 in which he said,
00:21:20.960 I'm sort of paraphrasing it here,
00:21:22.660 but he said,
00:21:23.820 am I the only one...
00:21:25.120 This is Ye talking about Elon Musk.
00:21:27.260 He goes,
00:21:27.720 am I the only one who thinks
00:21:28.940 Elon could be half Chinese?
00:21:30.840 That's the first sentence.
00:21:37.860 His first sentence.
00:21:39.520 Am I the only one who thinks
00:21:40.580 Elon could be half Chinese?
00:21:42.740 Now,
00:21:45.020 is he insulting China
00:21:47.420 or Chinese people?
00:21:50.180 Or is he complimenting them?
00:21:51.700 Let's finish.
00:21:53.660 All right.
00:21:54.260 He goes,
00:21:54.600 am I the only one who thinks
00:21:55.600 Elon could be half Chinese?
00:21:57.660 Have you ever seen his pictures
00:22:00.220 as a child?
00:22:01.960 Now,
00:22:02.380 I have not seen his pictures
00:22:03.340 as a child,
00:22:04.120 but I don't know.
00:22:05.300 Does he look Chinese?
00:22:08.060 Which is hilarious
00:22:09.020 because it's going to make me
00:22:10.400 look for his pictures
00:22:11.380 as a child.
00:22:12.640 You know I'm going to look
00:22:13.460 for those pictures now, right?
00:22:14.980 I have to see it
00:22:15.900 because I, you know,
00:22:17.400 I just have to see it now.
00:22:19.760 And then Ye goes on,
00:22:23.080 take a Chinese genius
00:22:24.600 and mate them
00:22:25.820 with a South African supermodel
00:22:30.000 and we have an Elon.
00:22:31.640 And then,
00:22:32.020 if this wasn't bad enough,
00:22:34.400 he goes,
00:22:34.960 they probably made
00:22:35.760 10 to 30 Elons
00:22:36.960 and he's the first
00:22:38.180 genetic hybrid that stuck.
00:22:41.360 And then he goes,
00:22:42.160 well,
00:22:42.380 let's not forget about Obama.
00:22:43.920 And then he makes
00:22:44.320 an Obama joke,
00:22:45.740 like Obama was a hybrid as well.
00:22:50.660 Now,
00:22:51.600 do you see the pattern?
00:22:55.140 Do you notice the pattern yet?
00:22:56.380 So this was the audio
00:22:58.760 that's from my earlier presentation
00:23:01.400 where I talked about
00:23:02.640 the pattern of Ye is picking,
00:23:04.860 you know,
00:23:05.200 whatever is the most outrageous
00:23:07.180 thing you could say
00:23:08.200 and then saying it,
00:23:10.020 but he's also consistently
00:23:11.380 doing the following thing.
00:23:13.600 He's consistently treating groups
00:23:16.100 like you can talk about them
00:23:18.340 as if they're one group
00:23:19.460 without the obvious knowledge
00:23:21.620 that all the individuals
00:23:23.260 are individual.
00:23:23.940 Now,
00:23:25.400 he just did it again
00:23:26.320 and he just slipped it through
00:23:28.300 because you're thinking
00:23:29.520 about Elon
00:23:30.060 and you're thinking,
00:23:32.080 basically,
00:23:33.000 you're thinking
00:23:33.600 about the accusation
00:23:36.660 that Elon
00:23:37.180 might be half Chinese.
00:23:39.580 But you also say to yourself,
00:23:41.240 wait a minute,
00:23:42.380 did he just say that
00:23:43.600 Chinese people
00:23:45.480 are smarter than average?
00:23:48.520 He didn't really say that,
00:23:50.120 but, you know,
00:23:52.700 the sort of Chinese genius
00:23:54.540 reference
00:23:56.280 kind of said it,
00:23:57.680 kind of said it.
00:23:59.040 Now,
00:23:59.620 what do we do
00:24:00.440 with the fact
00:24:00.960 that once again
00:24:02.060 he has created
00:24:03.960 a group of people
00:24:04.960 like they're one entity,
00:24:07.180 you know,
00:24:07.380 like the Chinese people
00:24:09.180 are extra smart?
00:24:10.520 What do you do with that?
00:24:14.500 Yeah.
00:24:15.760 He's basically
00:24:16.600 already pushed you
00:24:17.420 past the Overton window.
00:24:18.700 He just treated
00:24:21.140 an entire group
00:24:22.200 of people
00:24:22.600 like they were one thing
00:24:23.700 and you didn't even notice.
00:24:27.240 Because he's done it
00:24:28.240 to such an extent
00:24:29.380 that now he can just
00:24:31.340 make you think past it
00:24:32.920 and you don't even notice.
00:24:34.880 You don't even notice.
00:24:36.040 If you don't think
00:24:36.820 he's making his point,
00:24:38.660 you're missing the show.
00:24:40.560 He's totally making his point
00:24:42.040 that he can say
00:24:43.700 whatever the fuck he wants
00:24:44.940 as long as
00:24:46.300 he's willing to pay for it.
00:24:48.700 And he's willing
00:24:49.920 to pay for it
00:24:50.560 with his whole career.
00:24:53.200 So as long as
00:24:53.920 he's willing to pay for it,
00:24:54.900 he gets to be
00:24:55.500 the only person
00:24:56.120 with free speech.
00:24:57.220 And he's going
00:24:58.020 right at it.
00:24:59.240 He's going right
00:24:59.960 at the third rail
00:25:00.940 and he's just standing
00:25:01.700 on it
00:25:02.160 and just like daring you
00:25:03.800 to question him
00:25:05.160 because he's ready
00:25:06.040 for the fight.
00:25:07.420 And he actually
00:25:08.320 has the
00:25:08.920 high ground argument
00:25:11.060 in a weird way.
00:25:12.360 It seems
00:25:13.080 completely opposite.
00:25:14.060 But he has somehow
00:25:15.940 completely unexpectedly
00:25:18.540 taken the high ground,
00:25:20.080 which is,
00:25:21.280 if it's freedom of speech,
00:25:22.520 he can say unpleasant things
00:25:24.100 if he believes them.
00:25:28.420 Nobody's ever accused him
00:25:29.560 of lying.
00:25:30.600 Has anybody ever accused
00:25:31.660 Ye of lying?
00:25:33.060 We just think
00:25:33.840 we don't like his opinion.
00:25:35.460 It's a little dangerous.
00:25:37.640 But so is a lot of free speech.
00:25:39.540 Pretty dangerous.
00:25:40.140 Now what did Elon say
00:25:42.040 in response to
00:25:43.820 you know,
00:25:46.600 being accused,
00:25:47.700 maybe accused
00:25:48.640 is even the wrong word
00:25:49.600 because that's like
00:25:50.140 a biased word.
00:25:51.360 Let's say the suggestion
00:25:52.600 that he's half Chinese.
00:25:55.680 Well, Elon
00:25:56.440 is,
00:25:58.820 I don't know if you've noticed,
00:25:59.860 he's smart.
00:26:01.660 So what did Elon say
00:26:02.920 when he was accused
00:26:03.820 of being,
00:26:05.180 maybe,
00:26:05.560 accused is the wrong word
00:26:06.760 because then it sounds like
00:26:08.080 I'm saying that
00:26:08.780 that's somehow a bad thing.
00:26:10.140 And that's the opposite
00:26:10.960 of the point.
00:26:12.860 The point being
00:26:13.780 it would be potentially
00:26:14.700 a good thing.
00:26:16.280 So the suggestion,
00:26:18.740 he took it as a compliment.
00:26:20.560 He said,
00:26:21.320 I take it as a compliment.
00:26:23.240 Is that the right answer?
00:26:25.520 Oh, yes.
00:26:27.520 Oh, yes.
00:26:29.340 You know,
00:26:29.680 the thing about,
00:26:30.720 the thing about Musk
00:26:32.560 that you don't notice,
00:26:33.740 here's more dog not barking stuff.
00:26:37.200 You know,
00:26:37.600 you always notice
00:26:38.560 when anybody makes a mistake.
00:26:39.940 Right?
00:26:41.080 It just stands out.
00:26:42.540 The thing you don't notice
00:26:43.800 about Musk
00:26:44.620 is how often
00:26:46.000 he doesn't make a mistake.
00:26:48.400 Like where other people
00:26:49.580 could have easily
00:26:50.160 stepped in it
00:26:50.940 and he'll just,
00:26:52.800 boop,
00:26:53.340 just misses it.
00:26:54.640 That when you watch
00:26:55.460 how often he doesn't
00:26:56.560 make a mistake
00:26:57.260 that a smart person
00:26:58.760 could have made,
00:27:00.220 that's where
00:27:00.840 his real genius is.
00:27:02.260 A little of that
00:27:02.660 in engineering,
00:27:03.380 I guess.
00:27:04.320 But,
00:27:05.380 like,
00:27:05.900 when you notice
00:27:06.740 how deftly
00:27:08.200 he handles these things,
00:27:11.140 like the way
00:27:11.580 he handled
00:27:11.980 his ex-wife's comment
00:27:13.440 with just the little weird icon,
00:27:18.720 it was just perfect.
00:27:20.120 All right.
00:27:20.240 Again,
00:27:21.140 there were a hundred ways
00:27:22.520 he could have done
00:27:23.080 that wrong.
00:27:24.340 But he picked
00:27:25.060 the one way
00:27:25.620 that wasn't wrong.
00:27:27.020 And he does that
00:27:27.600 fairly consistently.
00:27:29.800 All right.
00:27:35.640 Were you aware,
00:27:37.640 all right,
00:27:37.820 how many of you
00:27:38.340 know the following facts?
00:27:39.680 Tell me,
00:27:40.380 who owns
00:27:40.880 the New York Times?
00:27:41.880 Go.
00:27:42.100 Who owns
00:27:43.820 the New York Times?
00:27:44.620 This is a test
00:27:45.280 of general knowledge.
00:27:48.880 Big family,
00:27:49.980 some people saying
00:27:50.660 a big family.
00:27:51.820 Somebody says Murdoch.
00:27:53.320 Somebody says Mexico.
00:27:55.880 Somebody says
00:27:56.520 Carlos Slim.
00:27:58.460 The answer is
00:28:00.100 not Bezos,
00:28:02.420 not Sulzberger.
00:28:03.780 The answer is
00:28:04.560 the richest guy
00:28:05.620 in Mexico,
00:28:07.920 Carlos Slim.
00:28:09.740 Now,
00:28:10.000 does he own
00:28:10.440 all of it?
00:28:11.720 Or,
00:28:12.360 like,
00:28:12.640 a controlling
00:28:13.540 interest?
00:28:14.500 Yeah.
00:28:17.700 Carlos Slim.
00:28:18.680 If you haven't
00:28:19.240 heard of him,
00:28:19.720 he's like
00:28:20.140 the big billionaire,
00:28:21.640 one of the richest
00:28:22.220 guys in the world.
00:28:23.600 And he owns it.
00:28:25.680 Now,
00:28:26.180 that's the first question.
00:28:27.280 Carlos Slim
00:28:27.920 owns the New York Times.
00:28:29.340 Okay.
00:28:30.660 Now,
00:28:31.200 I want to see
00:28:31.760 if you can answer
00:28:33.000 the second question.
00:28:34.580 Who was
00:28:35.360 a famous,
00:28:36.220 notable investor
00:28:37.220 who invested
00:28:39.160 in Hunter Biden's
00:28:40.780 entities?
00:28:44.500 Carlos Slim.
00:28:45.680 Yeah,
00:28:45.840 same guy.
00:28:47.300 So,
00:28:48.020 the same billionaire
00:28:48.800 who owns
00:28:49.520 the New York Times,
00:28:51.180 which is very
00:28:51.860 friendly to Biden,
00:28:53.920 is also investing
00:28:55.080 in
00:28:56.580 Hunter Biden's
00:28:58.060 entities.
00:29:00.880 Now,
00:29:01.460 can you give me
00:29:02.200 a fact check on that?
00:29:03.100 I just saw it
00:29:03.640 on social media
00:29:04.380 yesterday,
00:29:05.080 and other people
00:29:05.700 were mocking me
00:29:06.560 for not already
00:29:07.160 knowing that.
00:29:08.220 But somehow,
00:29:08.740 I missed that.
00:29:09.680 That somehow
00:29:10.220 had slipped by.
00:29:10.880 Yeah.
00:29:12.600 And now,
00:29:13.780 tell me,
00:29:14.140 how many of you
00:29:14.640 did not know
00:29:15.560 of that connection?
00:29:18.520 Tell me if you
00:29:19.260 didn't know
00:29:19.920 of both of the
00:29:21.160 connections,
00:29:22.300 that the same
00:29:23.400 person owned
00:29:23.900 the New York Times
00:29:24.840 as invested
00:29:25.520 with Hunter.
00:29:27.920 Some of you knew,
00:29:29.380 but only a few.
00:29:31.480 Right.
00:29:32.340 And somehow,
00:29:33.100 that slipped by me.
00:29:34.080 Now,
00:29:37.460 here's what you
00:29:39.640 can learn
00:29:40.140 when you spend
00:29:40.800 a little time
00:29:41.260 behind the curtain,
00:29:42.260 as I like to call it.
00:29:43.740 When you see the news,
00:29:45.300 the news reports
00:29:46.300 what happened,
00:29:47.860 generally.
00:29:48.500 They try a little bit
00:29:49.660 to tell you why,
00:29:50.500 but they don't get
00:29:51.100 that right.
00:29:52.340 But sometimes,
00:29:53.440 they can at least
00:29:53.940 get right what happened.
00:29:56.540 Right.
00:29:56.840 The part that they
00:29:57.740 never report
00:29:58.420 is the personal
00:30:00.740 connections
00:30:01.360 and the business
00:30:02.220 connections below
00:30:03.140 the hood.
00:30:04.080 So if you open up
00:30:05.120 the hood,
00:30:06.320 everything is a
00:30:07.020 different story
00:30:07.660 because you can see,
00:30:09.040 oh, this billionaire
00:30:09.940 runs this politician
00:30:12.240 but also owns
00:30:14.420 this entity
00:30:15.060 and then this politician
00:30:17.000 said something good
00:30:18.220 about this entity.
00:30:19.600 Oh, now it makes sense.
00:30:20.940 It's just the politician
00:30:21.880 working for their boss.
00:30:24.140 Right.
00:30:24.380 So it's a whole
00:30:24.980 different picture
00:30:25.700 as soon as you know
00:30:26.460 who knows who
00:30:27.200 and who's investing
00:30:28.660 with who
00:30:29.220 and all that stuff.
00:30:30.140 Completely different.
00:30:31.540 Everything you think
00:30:32.480 you know about the news,
00:30:33.380 it's all wrong
00:30:34.500 because of that.
00:30:36.840 Because you don't know
00:30:37.940 who knows who
00:30:38.740 and who just had coffee
00:30:40.540 with who
00:30:41.020 and who's talking about
00:30:42.440 starting a business
00:30:43.240 with whom
00:30:43.940 and all that.
00:30:45.240 Who's married to him.
00:30:46.180 Yeah, marriages.
00:30:47.300 Marriages are a big part of it.
00:30:50.760 Wall Street Journal,
00:30:51.920 to its credit,
00:30:52.820 did report on the
00:30:54.260 Hunter laptop story
00:30:55.500 before the election.
00:30:57.340 Credit.
00:30:58.860 All right.
00:31:00.580 Wall Street Journal.
00:31:02.160 Give them a little
00:31:03.100 sitting ovation.
00:31:05.420 Credit where credit is due.
00:31:07.480 We spend a lot of time
00:31:09.700 on this live stream
00:31:10.700 mocking the news organizations.
00:31:14.180 But in this case,
00:31:15.620 the Wall Street Journal
00:31:16.380 got the correct story
00:31:17.840 at the correct time.
00:31:19.520 They were not bamboozled
00:31:20.720 and they did not try
00:31:21.640 to fool the country.
00:31:23.240 So, good on them.
00:31:25.480 My highest praise
00:31:27.340 for just doing the job
00:31:29.000 they're supposed to do.
00:31:30.420 And now,
00:31:32.300 open up the hood.
00:31:35.260 Oh, wait.
00:31:37.040 What's under the hood?
00:31:39.440 Murdoch.
00:31:41.060 Yeah.
00:31:41.640 So, it's really a story
00:31:42.440 about one guy.
00:31:44.340 Right?
00:31:44.740 It's not really a story
00:31:45.840 about the Wall Street Journal.
00:31:47.400 It's a story
00:31:48.180 about one guy
00:31:48.900 who allowed his publication
00:31:50.260 to write that story.
00:31:51.160 That's about
00:31:53.120 basically what happened.
00:31:54.740 But,
00:31:55.160 to his credit,
00:31:56.360 he did allow
00:31:56.920 real news to come out.
00:31:58.500 So, that was a service
00:31:59.660 to the country.
00:32:03.140 And here's something
00:32:04.220 that the Wall Street Journal
00:32:05.460 reported.
00:32:06.160 So, this is the editorial board.
00:32:08.080 And they said,
00:32:09.040 we now know,
00:32:10.520 and know is the key word.
00:32:12.960 So, this is the Wall Street Journal
00:32:14.280 saying,
00:32:14.820 we know this.
00:32:16.140 We're not guessing.
00:32:17.920 We now know
00:32:18.900 that the Clapper-Brennan claims
00:32:21.240 were themselves disinformation
00:32:23.100 and that the laptop
00:32:24.640 was genuine
00:32:25.360 and not part of
00:32:26.640 a Russian operation.
00:32:28.400 CBS News recently
00:32:29.960 waddled in
00:32:31.960 two years later
00:32:32.920 with a forensic analysis
00:32:34.620 of its own
00:32:35.380 and concluded
00:32:36.120 it is real.
00:32:37.520 Now,
00:32:37.960 I would like to give
00:32:39.060 a second sitting ovation
00:32:40.500 to the board
00:32:43.740 at Wall Street Journal
00:32:45.200 for using the word
00:32:46.500 waddled
00:32:47.180 in the sentence.
00:32:49.100 CBS News
00:32:50.160 recently waddled
00:32:51.820 in two years later.
00:32:53.380 Okay,
00:32:53.960 that was the right word.
00:32:55.580 Now,
00:32:56.800 let me give you
00:32:57.500 a little writing tip.
00:32:59.140 This is in my mind
00:33:00.480 because I was
00:33:01.360 just writing this tip
00:33:02.760 somewhere else
00:33:03.360 in my upcoming book.
00:33:06.020 One of the filters
00:33:07.500 I use for writing
00:33:08.560 is that
00:33:09.980 if I'm writing
00:33:10.580 a humor piece,
00:33:11.760 first I'll write it
00:33:12.840 just to say
00:33:13.760 what I want to say
00:33:14.560 and then I'll go back
00:33:16.120 and I'll change
00:33:16.700 the words
00:33:17.260 into the funny versions.
00:33:19.440 So,
00:33:20.040 here you can see
00:33:21.040 instead of
00:33:21.760 CBS News
00:33:23.680 recently
00:33:25.320 entered the conversation,
00:33:28.800 waddled
00:33:29.400 would be the word
00:33:30.500 that you put in
00:33:31.040 if you're trying
00:33:31.480 to make that funny.
00:33:33.220 Now,
00:33:33.680 here's the second part.
00:33:35.000 If you're not writing
00:33:35.980 a humor piece,
00:33:37.140 and this was not
00:33:37.780 a humor piece,
00:33:39.580 use just one,
00:33:41.320 maybe one per
00:33:43.860 page;
00:33:44.920 just one funny word
00:33:46.660 that sticks out,
00:33:48.140 like waddled,
00:33:48.780 will make your brain
00:33:51.100 spend a whole bunch
00:33:51.900 more time
00:33:52.640 on their point.
00:33:54.580 So,
00:33:54.960 that one word
00:33:55.880 bought them
00:33:57.500 40% more attention.
00:34:01.400 One word.
00:34:02.520 And so,
00:34:02.840 that's my writing tip
00:34:03.640 for you.
00:34:04.820 If you're writing humor,
00:34:06.760 go back and look
00:34:07.460 for all the ordinary words
00:34:08.540 you could replace
00:34:09.340 with funny words.
00:34:10.860 If you're writing
00:34:11.780 for a serious point,
00:34:12.920 one good waddle,
00:34:15.520 you know,
00:34:15.740 one good interesting word
00:34:17.080 can really
00:34:18.640 put a little
00:34:19.940 flavor
00:34:20.760 on a good point
00:34:21.900 without derailing
00:34:23.300 it into humor.
00:34:24.620 So,
00:34:25.320 very good tip there.
00:34:26.720 And also,
00:34:28.040 by the way,
00:34:28.740 if you'd like to become
00:34:29.640 a better writer,
00:34:30.960 you should read
00:34:31.820 good writers.
00:34:33.740 The Wall Street Journal
00:34:34.500 is very famous.
00:34:36.080 Very famous.
00:34:37.540 Damn it.
00:34:39.300 They're famous.
00:34:40.460 Get rid of the very.
00:34:41.280 The Wall Street Journal
00:34:42.860 is famous
00:34:44.120 for good writing.
00:34:46.140 If you want to learn
00:34:46.800 how to write well,
00:34:48.320 read the Wall Street Journal.
00:34:50.500 All right.
00:34:53.160 But,
00:34:53.880 here's my question.
00:34:56.000 How do Clapper
00:34:56.800 and Brennan
00:34:57.500 ever show their face
00:34:58.720 in public again
00:34:59.660 now that the
00:35:01.300 Twitter files
00:35:02.120 revelations
00:35:03.080 have come out?
00:35:04.660 How do they ever
00:35:05.860 appear in public again?
00:35:07.680 Now,
00:35:09.720 everything they did,
00:35:10.640 as far as I know,
00:35:11.600 is legal, right?
00:35:13.460 Because lying to the public
00:35:14.700 is legal.
00:35:17.040 You know,
00:35:17.400 it would be too much
00:35:18.480 of a problem
00:35:19.080 to arrest everybody
00:35:20.540 who lied.
00:35:21.160 So,
00:35:21.400 you can't really have
00:35:21.940 a law like that.
00:35:23.160 So,
00:35:23.640 it was probably
00:35:24.160 completely legal.
00:35:26.040 But,
00:35:26.600 can you think of
00:35:27.280 any examples
00:35:27.920 where things
00:35:28.740 would be legal
00:35:29.680 for every citizen
00:35:31.360 except somebody
00:35:32.900 who maybe
00:35:33.320 had some job?
00:35:34.260 That's a thing,
00:35:36.740 right?
00:35:37.600 Aren't there examples
00:35:38.720 where,
00:35:39.440 you know,
00:35:41.300 something is legal
00:35:42.540 for me,
00:35:44.040 but maybe it
00:35:44.620 wouldn't be legal
00:35:45.420 for an ex-member
00:35:48.260 of Congress?
00:35:49.680 You know,
00:35:49.900 there's talk about
00:35:51.300 limiting Congress
00:35:52.360 from,
00:35:53.580 let's say,
00:35:54.420 doing some kind
00:35:55.000 of lobbying
00:35:55.540 for a few years.
00:35:57.220 But,
00:35:57.600 that wouldn't apply
00:35:58.320 to the rest of us,
00:35:59.300 right?
00:35:59.620 It would be a law
00:36:00.280 just for a person
00:36:01.020 of a certain job.
00:36:02.680 Here's a law
00:36:03.440 I would like.
00:36:05.260 I think that
00:36:06.380 if leadership,
00:36:08.020 not rank and file,
00:36:09.520 but if leadership
00:36:10.460 of our intel agencies
00:36:13.260 run an intentional hoax,
00:36:15.460 even if they're
00:36:16.040 ex-leaders,
00:36:17.140 if they're ex-leaders too,
00:36:18.860 if they run
00:36:19.480 an intentional hoax
00:36:20.700 for the purpose
00:36:22.280 of changing
00:36:22.860 an election outcome,
00:36:24.460 that should be
00:36:25.260 the death sentence.
00:36:28.040 Now,
00:36:28.700 I'm not recommending
00:36:30.080 anything happen
00:36:31.860 to Clapper and Brennan.
00:36:33.060 As far as I know,
00:36:35.060 what they did
00:36:35.780 was legal.
00:36:37.080 And I'm a stickler
00:36:38.760 for the Constitution.
00:36:40.620 No matter how
00:36:41.380 disreputable it was,
00:36:43.200 no matter how awful
00:36:45.520 it was,
00:36:46.060 and it was as bad
00:36:46.940 as anything
00:36:47.340 I've ever seen,
00:36:48.480 it was legal.
00:36:50.360 And I'm never going
00:36:51.800 to leave that standard.
00:36:53.320 Right?
00:36:53.440 If it's legal,
00:36:54.480 you don't touch them.
00:36:56.060 But we really need
00:36:57.200 to reassess the law.
00:36:59.260 Let's make a law
00:37:00.200 that says if something
00:37:01.040 like this happens again,
00:37:02.080 it's the death sentence.
00:37:04.120 Because it's not
00:37:05.660 like regular people.
00:37:07.520 When your own
00:37:08.240 intel agencies
00:37:09.100 turn on your country,
00:37:11.040 that's got to be
00:37:11.720 the death penalty.
00:37:13.560 Right?
00:37:13.800 That's not
00:37:14.380 the Department
00:37:16.440 of Agriculture
00:37:17.380 lost some money.
00:37:19.340 That's not
00:37:20.180 the head of,
00:37:22.560 you know,
00:37:23.560 HUD
00:37:24.140 expensed some things
00:37:27.560 he shouldn't have
00:37:28.140 expensed.
00:37:29.760 This is the worst
00:37:31.360 thing that anybody
00:37:32.140 could do.
00:37:33.160 Because our trust
00:37:35.100 in our system
00:37:35.840 is very importantly
00:37:38.900 connected to trust
00:37:40.020 in our intelligence
00:37:40.740 agencies.
00:37:42.000 You can't separate
00:37:42.820 that.
00:37:43.720 So having the heads,
00:37:45.500 not just people
00:37:46.320 who work for
00:37:46.960 intelligence,
00:37:47.960 but having the very
00:37:48.760 heads of them
00:37:49.500 run a hoax
00:37:51.460 against the American
00:37:52.300 people that,
00:37:53.500 probably,
00:37:54.140 had an effect
00:37:55.180 on the election
00:37:56.060 outcome,
00:37:57.260 that's as bad
00:37:58.240 as any kind
00:37:59.760 of treason
00:38:00.340 I can imagine.
00:38:01.700 It was short
00:38:02.180 of actual
00:38:02.640 physical violence.
00:38:05.380 Yeah,
00:38:05.680 it's a violation
00:38:06.580 of oath,
00:38:07.860 even if they're
00:38:08.540 out of office,
00:38:09.220 I think.
00:38:09.980 So,
00:38:10.520 let me ask you,
00:38:11.640 how many of you
00:38:12.620 would approve,
00:38:13.900 not retroactively,
00:38:15.760 that's no fair,
00:38:17.240 you can't punish
00:38:18.220 those two guys,
00:38:19.340 because they
00:38:19.660 followed the law,
00:38:20.900 unfortunately,
00:38:22.200 but how many
00:38:23.420 would agree
00:38:23.920 with making
00:38:24.420 that the death
00:38:25.100 penalty
00:38:25.480 if it happened
00:38:26.440 again?
00:38:27.660 Leadership only,
00:38:28.580 not rank and file.
00:38:30.980 Yeah.
00:38:33.540 I mean,
00:38:34.180 if treason,
00:38:35.820 treason is still
00:38:36.680 the death penalty,
00:38:37.340 right?
00:38:39.880 That hasn't
00:38:40.540 changed,
00:38:40.920 has it?
00:38:41.800 I mean,
00:38:42.640 if treason is
00:38:43.500 the death penalty,
00:38:44.380 how is this
00:38:44.920 different?
00:38:45.980 And to me,
00:38:46.420 it looks like
00:38:46.840 treason.
00:38:47.660 Now,
00:38:48.000 the difference,
00:38:48.380 of course,
00:38:48.800 is that it
00:38:49.380 wouldn't be
00:38:49.780 treason in
00:38:50.660 favor of a
00:38:51.360 foreign country,
00:38:52.980 but if it's
00:38:56.440 overthrowing
00:38:57.020 an election,
00:38:57.540 it's not that
00:38:58.160 different.
00:39:00.620 All right.
00:39:02.720 Did you know
00:39:03.600 that there
00:39:04.280 were two
00:39:05.320 Antifa groups
00:39:06.300 that had,
00:39:07.620 until recently,
00:39:08.340 been allowed
00:39:08.860 to operate
00:39:09.740 with abandon
00:39:11.300 on Twitter?
00:39:11.980 One of them
00:39:14.140 is the
00:39:14.740 Antifa
00:39:15.620 Feces Group
00:39:16.560 and the other
00:39:17.080 is the
00:39:17.420 Antifa
00:39:18.000 Urine Group.
00:39:19.820 I'm using
00:39:20.300 the technical
00:39:21.720 names.
00:39:22.820 And each of
00:39:23.580 them were
00:39:24.100 dedicated
00:39:24.660 toward getting
00:39:26.060 their people
00:39:26.760 to collect
00:39:27.780 their own
00:39:28.180 feces and
00:39:28.920 urine to
00:39:30.040 use in
00:39:31.160 protest and
00:39:31.820 throw at
00:39:32.240 people and
00:39:33.380 leave in
00:39:33.740 places.
00:39:35.660 So Elon
00:39:36.700 Musk,
00:39:37.160 I guess,
00:39:37.580 banned them.
00:39:40.780 So,
00:39:41.980 if you
00:39:43.360 wondered
00:39:43.680 what is
00:39:44.980 the purpose
00:39:45.620 of Antifa,
00:39:49.480 it's
00:39:50.100 apparently
00:39:50.780 to produce protest materials,
00:39:51.640 so they're
00:39:53.700 not useless.
00:39:55.020 You know,
00:39:55.140 you think of
00:39:55.700 Antifa as
00:39:56.360 like not
00:39:56.780 part of
00:39:57.180 capitalism
00:39:57.740 because what
00:39:58.760 are they
00:39:59.020 producing?
00:40:00.340 Well,
00:40:00.640 they're not
00:40:00.900 producing
00:40:01.260 any products.
00:40:02.120 They don't
00:40:02.300 make any
00:40:02.700 iPhones.
00:40:04.160 But apparently
00:40:04.860 the Antifa
00:40:05.740 people,
00:40:06.180 they do eat
00:40:06.720 food that
00:40:07.420 was purchased
00:40:09.060 with the work
00:40:09.800 of other
00:40:10.180 people.
00:40:11.320 And then
00:40:11.560 when that
00:40:11.920 food that
00:40:12.360 was purchased
00:40:13.100 with the
00:40:13.700 work of
00:40:14.120 other people
00:40:14.640 is put
00:40:15.440 into their
00:40:15.840 bodies,
00:40:16.260 then they
00:40:16.500 can manufacture
00:40:17.260 it into
00:40:17.740 a valuable
00:40:18.500 component,
00:40:20.820 either Antifa
00:40:21.980 feces or
00:40:23.500 Antifa urine.
00:40:24.700 And those
00:40:25.160 two things are
00:40:25.840 valuable commodities
00:40:26.820 in protests
00:40:27.600 as part of the
00:40:28.520 democratic process.
00:40:29.840 So if you
00:40:30.300 thought that
00:40:30.740 Antifa was
00:40:31.540 worthless,
00:40:32.740 you're so
00:40:33.180 wrong.
00:40:33.960 They're like
00:40:34.300 little factories
00:40:35.020 for making
00:40:35.680 protest materials,
00:40:39.240 Antifa feces.
00:40:39.960 Now, if
00:40:40.900 you'd like
00:40:41.200 to get
00:40:41.440 a good
00:40:42.760 jar of
00:40:44.440 Antifa
00:40:44.900 feces for
00:40:45.640 your protest,
00:40:47.100 they don't
00:40:47.920 have a
00:40:48.180 website now
00:40:49.340 because they're
00:40:50.800 off Twitter.
00:40:51.500 Well, they
00:40:51.840 probably have a
00:40:52.240 website, so I
00:40:53.780 think you
00:40:54.060 could find
00:40:54.420 them.
00:40:55.200 So it
00:40:55.640 makes a
00:40:56.020 good gift.
00:40:57.220 If you're
00:40:57.560 looking for
00:40:57.960 a gift for
00:40:58.760 somebody in
00:40:59.300 your family
00:40:59.780 who leans
00:41:00.680 left, and
00:41:02.120 you're thinking,
00:41:02.560 oh, they
00:41:02.940 have everything,
00:41:03.820 what they
00:41:04.200 don't have
00:41:04.840 would be a
00:41:06.640 big jar of
00:41:07.320 Antifa
00:41:07.720 feces to
00:41:09.020 use at
00:41:09.360 their next
00:41:09.700 protest, and
00:41:10.900 maybe they
00:41:11.560 need that.
00:41:12.740 So that's
00:41:13.120 something you
00:41:13.400 could do.
00:41:14.640 Has anybody
00:41:15.300 noticed
00:41:16.140 more racists
00:41:17.580 getting back
00:41:18.300 on Twitter?
00:41:21.080 I've noticed
00:41:21.840 it a little.
00:41:22.920 Have you?
00:41:24.780 Like, just a
00:41:25.460 little.
00:41:26.160 But it was
00:41:26.560 like, I felt
00:41:27.460 like it was
00:41:27.880 fleeting, and
00:41:29.300 hasn't really
00:41:30.000 affected my
00:41:31.120 consciousness
00:41:31.560 since then.
00:41:33.000 So what I
00:41:34.120 noticed when
00:41:34.880 Musk first
00:41:35.860 took over is
00:41:37.640 that there
00:41:37.960 were some
00:41:38.260 people who
00:41:38.720 wanted to
00:41:39.500 immediately
00:41:39.920 test the
00:41:40.800 limits.
00:41:42.220 So some
00:41:42.800 racist said
00:41:43.580 some racist
00:41:44.040 stuff.
00:41:46.300 Maybe they
00:41:47.000 got kicked
00:41:47.600 off, maybe
00:41:48.140 they didn't,
00:41:48.720 but maybe
00:41:49.060 they got
00:41:49.380 bored.
00:41:50.620 I don't
00:41:50.840 know.
00:41:51.680 It doesn't,
00:41:52.620 I would say
00:41:53.160 my overall
00:41:54.000 Twitter experience
00:41:55.000 is kinder and
00:41:56.380 gentler.
00:41:57.800 What would
00:41:58.200 you say?
00:41:59.480 But the
00:42:00.220 left might
00:42:00.700 say it's
00:42:01.120 because it's
00:42:01.940 people who
00:42:02.360 agree with
00:42:02.840 you that got
00:42:03.360 back on
00:42:03.780 there.
00:42:03.940 So far
00:42:06.400 Twitter does
00:42:08.280 not set my
00:42:09.120 stomach on
00:42:10.160 fire like
00:42:10.660 it used to
00:42:11.060 every single
00:42:11.660 day.
00:42:13.060 Like, my
00:42:13.720 brain would
00:42:14.320 just be on
00:42:14.760 fire from
00:42:15.520 something I
00:42:15.980 saw on
00:42:16.340 Twitter like
00:42:16.940 every day.
00:42:18.240 Usually it was
00:42:18.840 some Trump
00:42:19.500 related thing.
00:42:20.700 But that
00:42:21.120 hasn't happened
00:42:21.540 in a long
00:42:21.880 time.
00:42:27.720 Who has
00:42:28.440 blocked more
00:42:29.060 people, Sam
00:42:29.840 Harris or
00:42:30.340 Scott?
00:42:30.660 Do you
00:42:31.800 think that
00:42:32.380 you could
00:42:33.440 embarrass me
00:42:34.080 into blocking
00:42:34.780 fewer assholes?
00:42:36.980 Do you
00:42:37.420 think that
00:42:37.780 there's
00:42:38.000 something you
00:42:38.460 say here
00:42:38.920 about free
00:42:39.440 speech that
00:42:40.060 would make
00:42:40.860 me want to
00:42:41.480 listen to a
00:42:43.000 greater percentage
00:42:43.720 of assholes
00:42:44.360 compared to
00:42:45.720 nice people
00:42:46.340 who have
00:42:46.580 something useful
00:42:47.140 to say?
00:42:48.520 You know,
00:42:48.740 there's somebody
00:42:49.780 over on
00:42:50.220 YouTube who's
00:42:50.700 trying to
00:42:51.040 shame me into
00:42:51.800 unblocking
00:42:52.640 people that
00:42:53.760 I wanted to
00:42:54.740 block.
00:42:56.460 No, I
00:42:57.220 wanted to
00:42:57.560 block them.
00:42:58.640 And you
00:42:59.740 know that I
00:43:00.220 also block
00:43:00.720 my supporters,
00:43:01.560 right?
00:43:03.120 I block
00:43:03.820 my supporters
00:43:04.380 on Twitter
00:43:04.840 who say I'm
00:43:05.420 finally waking
00:43:06.080 up.
00:43:07.440 Because you're
00:43:07.980 not my
00:43:08.340 supporter.
00:43:11.060 That's the
00:43:11.620 last thing I
00:43:12.200 want to hear
00:43:12.540 from you.
00:43:13.640 So I do
00:43:14.120 block people
00:43:14.640 for saying
00:43:15.000 that.
00:43:15.200 I don't
00:43:15.460 block them
00:43:15.900 on locals
00:43:16.940 because they're
00:43:17.360 usually joking.
00:43:18.480 On locals,
00:43:19.120 I know you're
00:43:19.500 joking.
00:43:20.600 But on
00:43:21.180 Twitter, you're
00:43:22.000 not joking.
00:43:22.680 So I block
00:43:23.160 those.
00:43:23.920 And by the
00:43:24.280 way, if you
00:43:24.880 do joke about
00:43:25.600 it on Twitter,
00:43:26.120 I'm still going
00:43:26.500 to block you
00:43:27.060 because I can't
00:43:28.060 tell the
00:43:28.340 difference.
00:43:30.220 I was
00:43:39.700 looking at
00:43:40.040 the Twitter
00:43:40.600 terms of
00:43:41.200 service, and
00:43:43.380 I was
00:43:43.700 wondering
00:43:43.940 specifically
00:43:44.680 about Elon
00:43:47.120 Musk's
00:43:47.940 suspending
00:43:48.980 of Ye.
00:43:50.660 Now, Ye
00:43:51.380 got suspended
00:43:52.120 because of
00:43:52.800 that logo,
00:43:53.740 right?
00:43:54.160 It looked
00:43:55.700 like the
00:43:56.040 Star of
00:43:56.460 David with
00:43:57.100 a Nazi
00:43:57.860 logo on the
00:43:58.520 inside.
00:44:00.440 What do
00:44:00.940 you think
00:44:01.280 the logo
00:44:01.820 was supposed
00:44:02.500 to represent?
00:44:05.360 Interpret the
00:44:06.640 logo in your
00:44:07.480 opinion.
00:44:08.780 It was a
00:44:09.720 swastika.
00:44:10.680 Okay?
00:44:11.100 Anything else?
00:44:12.820 So you
00:44:13.180 saw some
00:44:14.180 violence,
00:44:14.700 maybe?
00:44:16.180 Maybe some
00:44:16.720 suggested
00:44:17.180 violence?
00:44:18.480 But what
00:44:18.800 about the
00:44:19.220 Star of
00:44:19.620 David part?
00:44:20.280 Was that
00:44:20.600 just to
00:44:21.120 further
00:44:21.700 insult the
00:44:22.500 Jewish
00:44:22.740 community by
00:44:24.100 doing the
00:44:24.840 most offensive
00:44:25.500 mashup you
00:44:26.200 could possibly
00:44:26.740 do?
00:44:26.960 Was it
00:44:27.960 art?
00:44:28.380 Some
00:44:28.540 people say
00:44:28.920 it's
00:44:29.040 art.
00:44:33.760 Here's
00:44:34.200 how I
00:44:34.600 interpreted
00:44:35.080 it.
00:44:36.060 Now, I'm
00:44:36.440 not saying
00:44:36.800 my
00:44:37.020 interpretation
00:44:37.480 is correct,
00:44:39.300 but that's
00:44:41.320 how art
00:44:41.780 works,
00:44:42.300 right?
00:44:42.860 If you
00:44:43.160 look at
00:44:43.440 art,
00:44:43.780 people
00:44:43.980 interpret
00:44:44.360 it
00:44:44.540 differently.
00:44:45.120 So I'll
00:44:45.380 tell you
00:44:45.620 my
00:44:45.800 interpretation.
00:44:47.100 When I
00:44:47.460 saw it,
00:44:47.760 if you
00:44:47.960 look at
00:44:48.180 the larger
00:44:48.580 context
00:44:49.140 of what
00:44:49.660 Ye is
00:44:49.940 doing,
00:44:51.080 a Nazi
00:44:53.080 symbol
00:44:53.580 surrounded by
00:44:54.460 Star of
00:44:55.080 David,
00:44:56.200 says to
00:44:57.140 me that
00:45:00.120 Ye would
00:45:00.640 like to
00:45:01.020 bring
00:45:01.220 together the
00:45:02.620 people who
00:45:03.060 are least
00:45:03.480 likely to
00:45:04.080 ever come
00:45:04.500 together.
00:45:06.080 That's what
00:45:06.640 it looked
00:45:06.860 like to
00:45:07.140 me.
00:45:08.080 Now, you might say, Scott, that's the stupidest interpretation. But that was how I interpreted it.
00:45:13.660 That was actually my first impression.
00:45:16.080 My first impression was that it was the opposite of hate.
00:45:19.260 It was, hey, even the worst people should at least be able to come together.
00:45:27.260 And I'm going to take the two things that are most objectionable together, even if not objectionable individually in one case,
00:45:38.420 and I'm going to make you think about whether the worst, least likely people could ever find a way to come together.
00:45:46.180 Now, if you say to yourself, well, Scott, that couldn't possibly be the message, he was sitting next to Nick Fuentes.
00:45:57.140 He's a black rapper who's partnered with Nick Fuentes. He literally is his own logo.
00:46:05.580 He's the Star of David with a swastika.
00:46:10.420 Now, you know, in a general sense, the comparison there is unfair, I know. But you see what I'm doing.
00:46:20.260 So, my interpretation was that it was a very challenging way to say we need to figure out how not to be enemies about everything.
00:46:33.300 That's what it looked like.
00:46:35.820 Now, Elon Musk interpreted it as a sign of hate. So he took him off.
00:46:40.680 And so I wondered, what do the Twitter terms of service say?
00:46:44.380 Well, did you know that the terms of service of Twitter specifically prohibit hate symbols in your profile? Did you know that?
00:46:55.640 You can't put a hate symbol in your profile.
00:46:58.720 Now, it's not prohibited in your feed, which makes sense, because you could put a hate image in your feed because you're talking about it.
00:47:08.140 See the point?
00:47:11.000 That doesn't mean you're necessarily supporting it. Other people tweeted Ye's symbol and they were not kicked off, because they were talking about it.
00:47:18.600 But if you put it in your profile, you're saying, this is me, I associate with this symbol. So that's banned.
00:47:25.460 Now, did Ye put his symbol in his profile? I don't think he did. Did he?
00:47:32.060 He may have tried. No? It was only in his feed.
00:47:36.020 So which Twitter terms of service would apply to that? None that I'm aware of.
00:47:45.320 Well, did it incite violence? Because inciting violence is your interpretation.
00:47:51.940 My interpretation was that it was a call to unity, even with the people who were the least likely ever to find unity. The number one least likely group to find unity.
00:48:03.720 That's what I saw.
00:48:04.740 So why does one person, Elon Musk, get to impose his opinion of a symbol that had never existed in that full mashup form? Well, I guess it had existed with the Raelians, but that's a different story.
00:48:19.860 Why does one person get to decide that's a hate symbol, when if I had been the head of Twitter, I would have said, no, no, no, that's ambiguous, and it's in the feed, not in the profile?
00:48:29.380 Well, if it's ambiguous, and it's in the feed, the terms of service don't touch it.
00:48:38.200 You say it's not ambiguous? I told you I interpreted it a different way. How can you argue it's not ambiguous?
00:48:50.820 Do you think I'm lying to you? Do you think that I really didn't interpret it that way? No, I actually, literally did.
00:48:59.560 So you can't argue that it's not ambiguous. I just told you I have a different view of it than you do; thus, proven ambiguous.
00:49:07.980 Now, you can say that most people don't see it as ambiguous, and that's why Musk could ban it, and everybody agreed with him.
00:49:14.020 I agree with you that most people would not see that as ambiguous. I agree with that.
00:49:18.820 But it is ambiguous, because I disagree, so that's a fact.
00:49:24.800 Yeah, if you buy Twitter, you get to decide the content.
00:49:27.520 All right, so here's my bottom line on this.
00:49:31.600 If you own Twitter, you have more than one master you have to serve.
00:49:37.920 One master is Twitter. It's a company; it needs to run as an entity that can survive.
00:49:45.440 But you're also a human being. You're a citizen of Earth and a citizen of this country.
00:49:50.700 So you have a different responsibility that's even bigger than your own company, which is to the citizens.
00:50:01.360 I think Musk made the right citizen decision.
00:50:08.020 In other words, he saw something that could be hate, that most people interpreted as hate, and he took an anti-hate stand.
00:50:17.940 And as a human, very good. Very good.
00:50:21.740 I support Elon Musk as a human being who took a strong and fairly rapid stand against what looked like hate.
00:50:33.760 As a human being, A+. As a protector of free speech, a little less clear.
00:50:44.120 So when I said the other day that he fucked up on this decision, Elon, I do think it's wrong on a freedom of speech level.
00:50:56.060 How many agree with my take that he was wrong on a freedom of speech level?
00:51:03.580 A number of people agree.
00:51:06.120 All right.
00:51:08.120 Now, do you think freedom of speech in this case should have overridden what I think is a genuine good intention to keep people safe?
00:51:18.880 Do you think freedom of speech should have been the higher standard?
00:51:24.760 Okay? I think reasonable people can say, yes, it should be the higher standard.
00:51:28.400 So I respect that opinion, and I'm not going to disagree with it, but I'm going to say that I also respect somebody who would put humanity above a concept.
00:51:43.180 I respect anybody who would do that. I just might have chosen differently. I might have.
00:51:49.260 But I do respect that opinion on a human level, though I don't think it's technically the right freedom of speech decision.
00:51:59.960 Yeah.
00:52:00.800 Now, I would be especially happy if someday Ye clarifies what he was up to, and it turned out to be, you know, an acceptable clarification, and then he was let back on.
00:52:12.040 Then I think the loop would be complete, and I would be happy with everybody's performance. I'd be happy with everybody.
00:52:20.500 I'd say, okay, you did what you thought you needed to do, you had good intentions, we fought it out, and here we are, and maybe we learned something in the process.
00:52:30.540 So that's the most optimistic thing I can say.
00:52:33.660 And there's not much else going on. Can we get some new news, please? Could we have something else going on?
00:52:49.620 Well, Ye did mention loving everybody, yes, but he didn't complete the circuit and say how loving everybody can be made compatible with the things he's been saying and tweeting. Yeah, he hasn't quite done that.
00:53:06.560 Intentionally. I think intentionally, because he would know how to do that, of course.
00:53:16.600 Someone should announce for the presidency.
00:53:20.880 How many of you think that Trump can still win? How many of you think Trump can win?
00:53:28.760 I would say that he's not demonstrating a will to win.
00:53:38.400 If he were demonstrating a will to win, I think I would say, well, you can never count him out, and no obstacle is too big.
00:53:48.440 If he survived "grab him by the pussy," and all the legal challenges, and two impeachments, if he survived all of that and he still had the fight in him, then I wouldn't count him out.
00:54:02.240 But the evidence suggests he doesn't have the will, for whatever reason.
00:54:09.000 And I don't think he has the family support either, and that probably makes a difference.
00:54:12.920 So I don't think he has the will to push through it.
00:54:15.740 I think he might be able to get the nomination, but he doesn't have any chance of winning in a general. I don't think.
00:54:24.620 That's my prediction.
00:54:27.860 Now, if he runs in the primary, I don't think DeSantis will run against him, even though DeSantis could beat him in the primary, because DeSantis would be so wounded by the fight.
00:54:42.200 I don't know, it would be like a, what do you call it, a Pyrrhic victory, the kind where you win, but in the long run you wish you hadn't, because you got so crippled in the winning that you end up dying in the long run.
00:54:56.520 So I guess the real question is whether Trump goes ahead and gets into the primary.
00:55:03.740 What do you think? Do you think Trump will actually run in the primary all the way through to nomination?
00:55:13.740 It would be his character to not quit, wouldn't it? It would be within his character to not quit.
00:55:19.500 And it would probably be good for Truth Social, so just for his business interests, he probably has to play it.
00:55:26.480 So I think he's going to get into the primaries.
00:55:33.620 Now, here's another wild card.
00:55:36.800 We imagine Trump at full speed, but when we think of DeSantis, we think of him running his state.
00:55:44.080 When you think of those two things, you think of Trump winning easily, right?
00:55:47.680 Trump at full strength versus a governor we've mostly seen at full strength as just a governor.
00:55:56.180 But what we haven't seen is DeSantis going full strength at Trump.
00:56:01.280 What would that look like?
00:56:04.060 Because I don't think anybody's ever done a good job of going after Trump.
00:56:09.900 Because the left goes after him with hoaxes. And the right says, okay, that's just a hoax. Okay, that's just fake news. Okay, that's just leaving out context.
00:56:20.660 So when the left goes after him, you just dismiss it, because it's just silly. They're usually missing the point. They're lying.
00:56:27.380 But if DeSantis goes after Trump, it's going to be on a solid Republican argument.
00:56:35.660 And that we haven't seen. We haven't seen any conservative make the good argument against Trump.
00:56:43.300 But DeSantis could. I think he could bring it home.
00:56:46.680 So you don't know what kind of impact that would have.
00:56:50.260 Here's the other thing you don't know: what kind of advisors DeSantis would have.
00:56:56.600 Because if you're looking at DeSantis, his native ability is very high. But you also don't know who's advising him.
00:57:05.460 If his advisors also go up to the presidential level, he gets maybe even better advisors.
00:57:12.360 You don't know what that turns him into. He could turn into anything.
00:57:17.780 So, I don't know.
00:57:18.840 I think it would be a disservice to the people of Florida if DeSantis ran, because they're pretty happy with their governor, and it would, you know, take him out of the job for a long time.
00:57:32.240 So, Trump's advisors have been either bad recently, or he doesn't have any, or he's ignoring them.
00:57:44.800 Yeah. So, we'll see.
00:57:48.880 All right, ladies and gentlemen, is there anything I've missed? Any big story that's happening that I forgot to talk about?
00:57:56.460 I don't think so.
00:57:58.440 The World Cup? Well, what about it?
00:58:05.740 U.S. lost, right? To, who did we lose to? We lost to the Netherlands, the Dutch.
00:58:17.460 Did you watch the U.S. play the Netherlands? The U.S. looked like a high school team playing a professional team.
00:58:25.500 Every time the U.S. team would have the ball, you'd say, oh, here's the part in soccer where the U.S. team makes these clever passes and beats their defender and does something you didn't even think was possible, and then they make an attack on the goal.
00:58:44.000 And then you'd watch the Dutch just, like, take the ball away from them. Like they weren't even playing somebody at the same level.
00:58:49.680 You saw that too, right? Yeah.
00:58:54.000 As soon as you turned it on, you said to yourself, these don't look like the same level at all. Not at all.
00:59:01.860 North Carolina substations attack.
00:59:13.760 So there's a story about a bunch of power substations in North Carolina that were attacked, as if it were some organized attack.
00:59:23.520 But, I don't know, that sounds more like individuals. Yeah, that sounds like drunk rednecks or something.
00:59:33.320 That doesn't sound like the beginning of a revolution to me.
00:59:39.780 Oh, Kyle Rittenhouse is going viral after asking if the Twitter Files will reveal any hidden censoring against him.
00:59:50.380 You think Kyle Rittenhouse was censored? Or accounts talking about him?
01:00:00.160 I don't know. I didn't notice it.
01:00:06.740 I saw massive tweeting, you know, in his favor. You didn't see massive pro-Rittenhouse tweeting? Like, it was everywhere.
01:00:20.100 But I didn't see any censoring.
01:00:21.320 So I'm not saying it didn't happen, because what we know now suggests that it did. It suggests that every major topic probably had a little bit of that problem.
01:00:32.920 But I didn't notice it.
01:00:36.100 Oh, you were suspended for tweeting free Kyle? No, you weren't.
01:00:41.460 Nobody was suspended for hashtag free Kyle. Well, that didn't happen. So don't tell me that happened, because that didn't happen.
01:00:49.500 So, I'm going to tell the Locals people something dog-and-tree related after I get off of YouTube, because I can't tell the YouTube people.
01:01:14.540 The Locals people are subscribers, so I give them the secret stuff that, so far, they have not shared outside of Locals.
01:01:22.460 Which, by the way, do you know how amazed I am? This will amaze you, YouTube people.
01:01:29.140 So, I've got over 6,000 subscribers on the Locals platform, and I often ask them not to tell anybody the things that I'm telling them. And so far, I don't think it's ever happened.
01:01:41.900 I can't think of one example where I told somebody on Locals, you know, don't tell anybody, and it actually got on Twitter. I haven't seen it once.
01:01:50.480 And I am so impressed. I am so impressed about that. The odds of that are just shocking.
01:01:58.360 Yeah.
01:01:58.980 And I think I have sort of the perfect size of subscriber base, because it's still very personal. You know, I see the same characters and we interact.
01:02:10.620 If it were 100,000, I don't think it would be nearly as fun.
01:02:14.280 You know, 6,000 is a really good number. Under 10,000, I think, is where I'd like to keep it for sure.
01:02:21.360 All right.
01:02:22.200 YouTube, I'm going to say goodbye, and Spotify and all you folks, and I'll talk to Locals a little bit.
01:02:29.040 All right.