Real Coffee with Scott Adams - May 20, 2023


Episode 2114 Scott Adams: Trump vs DeSantis Strategy, Tim Scott Joins The Race, Kari Lake Might Win


Episode Stats


Length: 52 minutes
Words per minute: 140.6
Word count: 7,405
Sentence count: 565

Harmful content
Misogyny: 8 sentences flagged
Hate speech: 15 sentences flagged


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Join me for the dopamine hit of the day, the thing that makes everything better, and it happens right now! Enjoy the simultaneous sip, and don't forget the "ah" at the end.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:06.860 It's called Coffee with Scott Adams.
00:00:08.860 I'm pretty sure that science has never discovered a more awesome time or situation.
00:00:14.700 And you lucked into it.
00:00:16.420 Wow.
00:00:17.660 I guess your day's starting well.
00:00:19.740 But it could be even better, and all you need for that is a cup or a mug or a glass,
00:00:23.100 a tank or a chalice or a stein, a canteen jug or a flask, a vessel of any kind.
00:00:27.780 And fill it with your favorite liquid, no Bud Light.
00:00:33.360 And join me now for the unparalleled pleasure of the dopamine hit of the day,
00:00:37.020 the thing that makes everything better.
00:00:38.820 It's called the simultaneous sip, and it happens now.
00:00:41.820 Don't forget the ah at the end. 0.76
00:00:43.480 Go.
00:00:47.420 Ah.
00:00:49.080 Oh.
00:00:50.960 All right, let's start out with,
00:00:52.620 Did you all see the meme which put Joe Biden's face on Dylan Mulvaney?
00:01:00.280 Well, you're going to see it now.
00:01:01.620 It's with an ice-cold Bud Light, so good.
00:01:08.520 Happy Friday.
00:01:09.680 All right, everyone.
00:01:10.860 So it's Friday night.
00:01:12.000 I'm getting ready to go out.
00:01:13.540 I'm feeling kind.
00:01:15.040 But before I go out, I've got to pregame.
00:01:17.140 And the only way I can pregame is with an ice-cold Bud Light, so good.
00:01:26.100 Happy Friday.
00:01:28.000 So there are some things that AI will be good for.
00:01:32.800 So in case you were thinking AI, what good is AI?
00:01:36.980 There you go.
00:01:38.560 That's what it's good for.
00:01:40.520 2024 election is going to be so much fun.
00:01:44.420 Oh, my God.
00:01:45.080 It's going to be meme versus meme.
00:01:48.100 All right.
00:01:49.140 So in other news, there's a report of three boats have been sunk in Europe by orcas.
00:01:57.760 Is there a difference between an orca and a whale?
00:02:02.420 Why does this story say orcas?
00:02:04.400 There is a difference?
00:02:07.320 Yeah.
00:02:07.760 Okay.
00:02:09.400 An orca is not a whale.
00:02:11.800 It's a killer whale.
00:02:12.940 Well, why can't I just call it a whale?
00:02:16.280 Because it's an orca, I guess.
00:02:18.800 Well, all right.
00:02:19.520 So these orcas have sunk three boats, but that's not the amazing part.
00:02:23.460 The amazing part is that they've been spotted teaching other orcas how to do it.
00:02:29.840 That's right.
00:02:30.440 The war is on.
00:02:32.300 It turns out that the animals have decided to turn on the humans, and they're just going
00:02:36.660 to kill us after all.
00:02:37.620 And it turns out that the animals have a navy.
00:02:41.800 It's all orcas.
00:02:43.680 And the orcas are training.
00:02:47.020 I don't know if NATO is training them or what, but somebody's training these orcas to attack
00:02:53.060 naval vessels.
00:02:54.580 Keep an eye on that.
00:02:56.100 Keep an eye on that.
00:02:57.000 Did you see a video not too long ago where Joe Rogan was showing a video of a rat that
00:03:08.260 learned to use tools?
00:03:10.300 It went and got a stick to trip the rat trap so it could get the cheese without getting hurt. 0.99
00:03:15.680 Because it knew what the rat trap was, and it knew that if it used a tool, it could spring
00:03:21.760 it.
00:03:22.420 It actually used a tool, a stick.
00:03:24.660 Now we see these whales learning to attack boats.
00:03:29.420 Put it all together, people.
00:03:31.620 Put it all together.
00:03:33.680 It's not just AI that's getting smarter.
00:03:36.580 Somehow the creatures, the creatures are starting to gang up.
00:03:41.080 See, it's a total head fake.
00:03:43.560 We're all looking at the robots and the AI.
00:03:46.120 It's like, oh, look over there.
00:03:47.420 It's the robots.
00:03:48.660 We're afraid of the robots and the AI.
00:03:50.740 Meanwhile, the orcas and the rats apparently have conspired to take over everything with
00:03:58.460 their tool making and such.
00:04:00.760 All right.
00:04:00.980 We'll keep an eye on that.
00:04:02.660 Here's a story I told in the man cave last night, but you're going to have to hear it
00:04:06.060 because it's important.
00:04:08.420 It's about the simulation.
00:04:09.640 Have you ever had this situation where you worried yourself into the exact problem that
00:04:16.040 you were trying to prevent?
00:04:17.940 Does anybody have that?
00:04:19.580 That's like a real thing, right?
00:04:22.000 There's something that's never happened to you before, and then because you're thinking
00:04:26.500 of it, you just think yourself into the problem.
00:04:28.880 Well, yesterday, I went to take a ride in my e-bike, and uncharacteristically, I thought
00:04:37.180 to myself, you know what?
00:04:39.180 I'd hate to fall off my e-bike because I'm at that age now where if you get a serious sporting
00:04:45.340 injury, you know, maybe you don't come back.
00:04:49.180 It's kind of the end of your sporting life if you have a bad one.
00:04:52.380 And so I thought to myself about falling off my e-bike, and, you know, it's not like I've
00:04:57.000 never thought of it, it's just I've never dwelled on it before.
00:05:01.060 So the first time I dwell on it, take my e-bike up, and I'm going on this path that's too small
00:05:09.340 for automobiles, and it looks like the path is going to come to an end, and I need to turn
00:05:16.200 around.
00:05:17.160 But the path had narrowed to the point where turning around without getting off your bike
00:05:22.940 was going to be kind of a challenge, and there was gravel on each side of the
00:05:27.000 narrow path.
00:05:28.360 And I said to myself, I calculated, all right, to do a very low-speed turn, you know, you
00:05:36.580 can't do it if it's too narrow, you'll fall off.
00:05:40.060 And I thought, I better go off my bike and just, you know, walk it around.
00:05:44.360 And then I thought to myself, no, no, damn it.
00:05:47.660 I'm no bicycle pussy. 0.91
00:05:50.380 I'm not afraid of my bike.
00:05:52.140 I'm not going to talk myself into being afraid of a bicycle.
00:05:54.840 Well, damn it, I'm going to just make this turn.
00:05:58.940 So as I was flying through the air on my way to the hard pavement below, because as you
00:06:07.580 see, the turn didn't work out.
00:06:10.020 It was one of those situations where I really should have listened to that little voice in
00:06:13.740 my head that said, well, that looks like really dangerous.
00:06:16.620 You're not going to make it.
00:06:19.680 So as I was departing from the bicycle itself, and more airborne than actually bike-borne,
00:06:28.480 time stood still.
00:06:31.580 Have you ever had that experience where you know something bad is going to happen, and
00:06:35.640 it's only going to be a quarter of a second?
00:06:37.800 But that quarter of a second seems to just stop in time in your memory.
00:06:43.980 So as I'm flying through the air, I have to do triage on which part of my body I want
00:06:51.240 to sacrifice.
00:06:52.700 I'm thinking, arm?
00:06:54.480 No.
00:06:55.060 Don't want a broken arm.
00:06:56.920 Leg?
00:06:58.000 Knee?
00:06:58.560 No.
00:06:59.160 No, bad news.
00:06:59.940 Head?
00:07:00.340 Got a helmet.
00:07:01.500 But still, you don't want to land on your head.
00:07:03.540 Hands?
00:07:04.040 Got to protect the hands, right?
00:07:05.640 Because I work with my hands.
00:07:06.740 So I'm running through all my body parts.
00:07:10.520 I'm like, no, no, no, no.
00:07:12.960 And I came up with one body part that I thought I could get by with.
00:07:17.720 And it was this fat part of my back toward the top, you know, just past the shoulder blade
00:07:24.700 where there's like a little bit of muscle here, a little bit of padding.
00:07:28.160 And I managed to spin in the air just fast enough to tuck and roll, hitting that back part
00:07:35.100 of my back first.
00:07:36.000 And I got to say, it was quite a crash.
00:07:42.180 But I managed to pop up uninjured.
00:07:44.940 Now, that's not even the weird part.
00:07:46.640 Here's the part I didn't tell you last night in the man cave.
00:07:49.640 So have you had this situation where there's something that never happens and then it's
00:07:54.920 everywhere?
00:07:56.280 And you don't know, did you cause that?
00:07:58.640 Or was it always just there and you didn't notice it?
00:08:00.980 So today I'm looking through the news to get my little news stories for the show.
00:08:06.480 And there's a story about Simon Cowell, who happens to be just about my age, like pretty
00:08:12.900 close, and how he got seriously injured falling off his e-bike.
00:08:17.540 Now, I've only fallen off my e-bike once.
00:08:23.420 Just once.
00:08:25.220 And then I turn it and there's like a major story about a guy my age falling off an e-bike.
00:08:31.040 What are the odds of that?
00:08:33.060 Did I make that happen?
00:08:34.360 See, this is why I believe I live in the simulation.
00:08:39.280 One explanation is, oh, it's just a coincidence.
00:08:42.580 The other explanation is, well, stories about bike accidents were always out there.
00:08:47.240 You just didn't notice until you were primed for it.
00:08:50.420 Or, number three, I caused it to happen.
00:08:54.220 Not only did I cause the accident, but I caused the news story about the topic.
00:08:59.960 So that's how I experience life, like I just caused it.
00:09:04.560 I don't know if it's true, but I also don't know anything else is true.
00:09:08.620 Well, enough about me.
00:09:10.780 The Wall Street Journal is reporting that the bar exam for lawyers is going to be made easier
00:09:16.740 so that more black people can become lawyers.
00:09:23.180 So, that's actually in the news.
00:09:27.920 That's a real story.
00:09:32.200 Now, of course, in the story, they don't say black people.
00:09:36.400 You know, they have to wink at you.
00:09:38.360 Well, it's to make it fair for everybody.
00:09:42.420 You know, but you know what they're talking about.
00:09:44.940 So, what do you think about that?
00:09:48.100 Well, I'll tell you.
00:09:49.300 If I were currently a black lawyer and I had passed the hard bar exam,
00:09:54.000 I would not feel good about being presumably thrown in with the people who did not pass
00:10:01.980 the hard bar exam.
00:10:03.460 I've got a feeling that people are going to be like,
00:10:05.920 I don't know.
00:10:07.220 Did you pass the real bar exam or the new fake bar exam?
00:10:11.120 The easy one.
00:10:12.540 I'm not sure.
00:10:13.240 So, I feel like we are not considering the side effects of some of our decisions.
00:10:22.540 You know, as I've said before, historically, I've always supported, you know,
00:10:28.800 efforts to increase inclusion and equity, not equity, but equality of opportunity and all that,
00:10:36.620 including even affirmative action, historically.
00:10:44.880 But at some point, am I wrong that at some point it gives you more downside than upside?
00:10:54.440 Am I wrong about that?
00:10:56.120 That at some point, you could argue whether we're there or not,
00:10:59.000 but at some point, you need to stop doing it before you've reached total equality.
00:11:06.620 If you keep pushing it all the way to everybody's equal all the time, 0.86
00:11:10.980 there's no way that's good for everybody.
00:11:13.520 It's just going to be such a fight.
00:11:16.980 But, you know, in the early days, when the disparities are gigantic,
00:11:20.500 yeah, maybe you have to do something a little more aggressive.
00:11:23.280 But at the moment, I really need to, I think we need to rethink this stuff.
00:11:28.140 All right, I saw another tweet from somebody who's joining my opinion
00:11:31.700 that our worry about AI is bullshit, at least the current version,
00:11:37.880 and it's not nearly as useful for a lot of things as you think it will be.
00:11:44.700 So here's what AI will be good for.
00:11:48.380 Really, really good at complex searches, so it's a better search engine.
00:11:54.200 Really good at summarizing articles, which I would just call the search engine.
00:11:58.140 So I saw somebody say, oh, it's really good at summarizing the pros and cons
00:12:04.880 of all these articles about some topic, to which I say, you mean a search engine?
00:12:10.480 It's just a really good search engine.
00:12:12.560 Oh, yeah.
00:12:13.740 I mean, it's really, really good, but it's also unreliable, so you've got that.
00:12:18.880 But when it works, it works pretty awesome.
00:12:21.020 And then, you know, it's going to be good for helping people
00:12:23.840 who are already programmers code, and, you know, lawyers will be able to research
00:12:29.160 case law and stuff like that.
00:12:31.180 So there'll be a number of professions in which, if you're already an expert
00:12:36.860 in that profession, your writer or researcher or lawyer or whatever
00:12:40.600 is really going to help.
00:12:42.580 But it doesn't look like it's coming anywhere near taking jobs.
00:12:46.400 It looks like it's just going to make people work differently.
00:12:48.960 And so when I said I'm not worried about AI becoming an existential threat
00:12:56.120 in its current form, the pushback I got was, Scott, oh, wow.
00:13:03.580 If I may give you some condescending opinion about this.
00:13:07.140 Whoa.
00:13:07.880 Wow.
00:13:08.600 Boomer.
00:13:09.280 Boomer, let me explain to you.
00:13:11.660 No, we're not worried about the current version of AI, Boomer.
00:13:16.260 No, Boomer.
00:13:16.920 We're worried about where it's obviously going.
00:13:21.280 You know, yes, maybe version 1.0 is not going to destroy the world.
00:13:25.620 But obviously, as it gets smarter and smarter, the risk increases.
00:13:29.600 And Boomer, Boomer, please. 0.98
00:13:32.560 Please, Boomer. 0.98
00:13:33.460 Just maybe you shouldn't even talk about things you don't know about.
00:13:36.820 Please.
00:13:37.220 To which I say, your genius idea that a rabbit can evolve into a truck is not persuading me.
00:13:47.420 Oh, yeah.
00:13:49.440 Bad microchips do evolve into better microchips.
00:13:53.120 That's a thing.
00:13:54.520 Bad smartphones do evolve into better smartphones.
00:13:58.460 Totally.
00:13:59.480 Bad cars evolve into better cars.
00:14:03.300 Yep.
00:14:03.940 But not once in the history of the whole fucking universe has a rabbit turned into a truck.
00:14:10.520 And that's what's happening with AI.
00:14:12.420 Because the large language model is never going to be smart.
00:14:17.160 It's just going to be really good at organizing information.
00:14:19.800 And what it takes for AI to become smart has not been invented.
00:14:28.020 Right?
00:14:29.300 It hasn't been invented.
00:14:30.820 Nobody knows how that could happen.
00:14:32.940 Because we don't even have, like, a concept of what it would take to make it smart.
00:14:37.900 All we have is this old model, which just makes your search engine and your organization of data really, really cool.
00:14:46.260 And it can talk to you better.
00:14:47.900 So it's like Siri+.
00:14:49.200 It's like, you know, a better Siri, basically.
00:14:51.780 So I would like to submit that the AI that you're worried about is the one that hasn't been invented and nobody knows how to invent it.
00:15:05.880 Should you be worried about things that haven't been invented and nobody knows even the first part of how to make it?
00:15:13.780 That would be everything.
00:15:14.940 You know, I'm worried about the pen.
00:15:21.560 Because what if it evolves into a laser that shoots out of every hole?
00:15:28.040 Because, you know, right now it's innocent.
00:15:31.820 Right now there's not much danger, you know, unless it punks you.
00:15:34.900 But what if it evolved into a laser-shooting pen?
00:15:40.100 That's what the conversation about AI is like.
00:15:42.440 No, it does not evolve into a laser-shooting pen.
00:15:47.200 There might be lasers that could evolve, but a pen doesn't become a laser.
00:15:54.520 All right, enough of that.
00:15:58.580 All right.
00:16:01.900 As you know, the reason that ESG and DEI and CEI and all those racist things that are being pushed upon organizations
00:16:12.000 the main reason that they have to fold to those things is not because they think it's right, necessarily.
00:16:17.820 Some probably do.
00:16:19.180 But because there are all these ratings agencies that will destroy the reputation of your company
00:16:24.380 if you're not doing all the noble and good things.
00:16:27.280 And I thought to myself, well, why are there not alternative ratings agencies?
00:16:38.080 And why are there not ratings agencies to rate the other rating agencies?
00:16:44.460 Because wouldn't you like to know, oh, this agency says, you know, Tesla is not woke enough.
00:16:50.280 But wouldn't you like to know that there's a rating agency that rates all the rating agencies
00:16:55.720 and says, this rating agency is bullshit?
00:16:59.700 What's wrong with that?
00:17:01.020 Because my understanding is that it doesn't take much to be a rating agency.
00:17:05.340 You just have to say you are one and then see if you can convince people you are one. 0.97
00:17:10.680 Right?
00:17:10.920 It's not like it's a government, you know, licensed role.
00:17:15.640 So why not have, like, hundreds of them until they're all useless?
00:17:20.720 You know, the problem with the old search engine before AI is that if I Googled anything,
00:17:26.920 the problem wasn't that I would get no results.
00:17:29.560 The problem was I'd get too many, and they'd all be all over the place.
00:17:33.660 It would be like going to the library and looking through books.
00:17:36.620 It would just take forever.
00:17:38.540 Right?
00:17:39.360 So I feel like one way that this ratings agency thing might go
00:17:44.340 is there might be so many different agencies that they all become useless.
00:17:50.100 If you wanted to know what was the rating,
00:17:52.760 you'd look up the agencies to see what they rated Tesla,
00:17:55.780 and one would be an A, and one would be like a D+.
00:17:59.760 And then what do you do?
00:18:03.100 What do you do?
00:18:03.840 So I feel like one direction that all this rating agency stuff could go
00:18:08.880 is more of it until it dies under its own weight.
00:18:12.340 That could be one direction.
00:18:14.880 Because what I don't think is going to happen is you'll just have a few ratings agencies,
00:18:19.340 and everybody will just agree that they're the ones.
00:18:22.140 Maybe, but I feel like there's no barrier to entry, so there should be lots of them.
00:18:29.100 Then we see that one of the ratings agencies, there's a gay advocacy group.
00:18:36.260 They just stripped Anheuser-Busch of their good rating
00:18:40.600 because they didn't like how Anheuser-Busch basically caved to the pressure about Dylan Mulvaney.
00:18:47.780 So now you can't make the right happy because they didn't like how that was treated.
00:18:53.660 But now the left is unhappy because they changed their minds
00:18:57.260 because of the pressure from the right.
00:18:59.420 Do you think that big companies are learning that social advocacy is a bad idea?
00:19:04.620 How many times does a big company need to just get kneecapped
00:19:10.780 before they say, how about we just stay out of all of this?
00:19:14.520 How about you do you, we'll just make money,
00:19:18.920 and we'll just try to be a company for a change?
00:19:21.080 How about that?
00:19:21.720 So, anyway, have you noticed that when Democrats want to get re-elected,
00:19:34.780 there's something that they often do?
00:19:38.280 Tell me what it is.
00:19:39.800 What do Democrats do, and I'm not talking about hoaxes,
00:19:43.660 when they want to get re-elected?
00:19:45.880 Lie more?
00:19:46.540 Well, okay.
00:19:47.740 It wasn't act Republican.
00:19:49.040 They act Republican.
00:19:53.140 And here's the thing.
00:19:54.960 It's as if Democrats know that their own policies don't work
00:20:03.320 because when they need to get re-elected,
00:20:05.440 they just suddenly drift into Republican policies
00:20:08.380 to try to make you not realize that they never were there.
00:20:12.060 So here's a good example of that.
00:20:13.340 Governor Newsom is trying to change the approval process
00:20:24.080 for getting big projects approved in the state.
00:20:27.980 Now, a big project would be a big energy project or a big dam.
00:20:33.260 So anything with water or energy.
00:20:35.840 You can't get anything approved in California
00:20:38.360 because there's some process.
00:20:40.500 Let's see, the process is, what's it called?
00:20:47.220 There's a name for it.
00:20:48.660 The California Environmental Quality Act.
00:20:52.560 It's a law that is known as CEQA.
00:20:55.480 So apparently the Democrat governor realizes that,
00:20:59.100 I assume this came from Democrats,
00:21:01.960 that Democrats passed a law that made it impossible
00:21:04.340 to manage the state
00:21:05.960 because you couldn't get anything done
00:21:08.020 because you couldn't get it through this approval organization.
00:21:11.940 So Newsom is going to try to become Donald Trump
00:21:15.800 by reducing red tape so he can get things done.
00:21:23.420 Now, how do you justify that?
00:21:27.000 I mean, he literally has to become Trump to save the state.
00:21:32.200 Does anybody not notice that?
00:21:33.980 Are we supposed to not notice that he has to adopt
00:21:37.660 a baseline Republican policy to save the state?
00:21:42.960 Because he knows he's in trouble.
00:21:47.480 That's so embarrassing.
00:21:49.540 Now, that doesn't work the other way, does it?
00:21:52.580 It probably does, but I just don't know examples.
00:21:55.080 I'm probably being biased about this.
00:21:56.840 Does it work the other way that Republicans
00:21:58.580 try to turn into Democrats when they're running for election?
00:22:01.680 Maybe they do.
00:22:04.800 Maybe they do a little bit.
00:22:06.220 Maybe they soften.
00:22:07.600 They probably soften on abortion a little bit in some cases.
00:22:12.440 Yeah, some do.
00:22:14.320 So I guess I'm going to modify my statement
00:22:17.060 and say it works both ways.
00:22:19.420 All right, well, here's a little wild card.
00:22:22.440 Remember, the best way to predict the future,
00:22:26.000 as Elon Musk reminded us even yesterday,
00:22:28.420 is that the most entertaining outcome is the most likely,
00:22:33.400 as determined by an external observer.
00:22:38.440 And I told you, if that rule is true,
00:22:41.540 then one thing we should see before 2024
00:22:44.180 is evidence that the 2020 election was, in fact, rigged.
00:22:50.020 Now, I'm not predicting it based on evidence, right?
00:22:56.600 And I'm not saying it was rigged.
00:22:58.160 I'm not saying it was rigged.
00:23:00.040 And there's no evidence that I'm aware of.
00:23:03.920 But wouldn't it be the most entertaining?
00:23:07.180 What would be more entertaining than that
00:23:09.680 to find out it actually was rigged?
00:23:12.180 Because it would be the cherry on the cake
00:23:15.720 of all the other rigging and impropriety that we've seen.
00:23:20.980 Because every time I make this joke, people get quiet.
00:23:24.520 Do you know the joke I make about the election?
00:23:27.200 Well, we know that the following entities
00:23:29.320 are thoroughly corrupt.
00:23:31.400 The FBI, the CIA, Congress,
00:23:35.200 you go down the list, the FDA, the CDC,
00:23:38.280 all big pharma.
00:23:39.060 We know they're all corrupt.
00:23:42.180 But aren't we lucky that all 50 states
00:23:45.740 with separate voting systems work perfectly?
00:23:50.340 Everything else in the country was rigged.
00:23:53.300 Everything.
00:23:54.300 Everything that could be rigged was rigged.
00:23:56.940 But boy, are we lucky that the elections were not.
00:24:01.220 And when I say that, Democrats just fucking run away.
00:24:07.000 Because it's the one argument
00:24:08.980 that you just can't say anything about that.
00:24:12.180 Because the rejoinder is, but there's no evidence.
00:24:16.060 To which I say, absolutely.
00:24:18.920 You're so right about that.
00:24:20.460 There is no evidence.
00:24:22.500 You are right.
00:24:23.860 No argument there.
00:24:25.760 And again, aren't we lucky
00:24:27.860 that everything in the country is corrupted
00:24:30.700 and we found out for sure
00:24:32.340 but that.
00:24:34.440 So good on you for calling it right
00:24:38.140 and calling it the only thing that's not rigged.
00:24:41.120 It's the only thing.
00:24:44.000 All right.
00:24:44.820 So that's why
00:24:46.160 the most amusing outcome
00:24:48.720 would be to find out that it was.
00:24:51.320 Now, Kari Lake has apparently rested her case. 0.99
00:24:54.060 We don't know what the conclusion is yet.
00:24:55.680 But the question was
00:24:58.520 whether or not they had done
00:24:59.680 real signature verification
00:25:01.360 or fake signature verification.
00:25:05.180 Which one do you think they proved?
00:25:07.520 That real signature verification was done?
00:25:10.620 Or that fake verification was done?
00:25:13.740 Well, it turns out
00:25:15.500 according to Kari Lake's side 0.99
00:25:18.260 that they did, I don't know,
00:25:21.900 tens of thousands,
00:25:22.980 70,000 signatures
00:25:24.460 that were verified in two seconds.
00:25:27.520 Two seconds.
00:27:28.900 So one of the things that Kari Lake proved 1.00
00:25:30.740 is that the people
00:25:32.360 doing the signature verification
00:25:34.300 were way faster than you think.
00:25:38.540 Because just sort of commonsensically,
00:25:40.180 I think, 70,000.
00:25:43.760 How long would that take?
00:25:45.240 Human beings to look at each one.
00:25:47.880 And then I think to myself,
00:25:49.220 about two seconds.
00:25:51.820 That sounds about right.
00:25:53.040 Yeah, about two seconds.
00:25:55.840 So in order for Kari Lake to lose, 1.00
00:25:59.680 and I don't mean legally,
00:26:01.060 but let's say lose logically,
00:26:03.300 lose the logical argument.
00:26:04.800 The legal argument might have
00:26:06.040 some legal loophole
00:26:09.180 that I don't know about.
00:26:10.180 But in terms of making her case
00:26:12.160 that there was, in fact,
00:26:13.940 not really any signature verification,
00:26:16.820 I feel like she's going to land that.
00:26:20.660 Because the logs themselves
00:26:22.320 apparently are undisputed,
00:26:25.140 and they say they were checked
00:26:27.140 in two seconds.
00:26:29.520 Now, obviously,
00:26:31.040 there's some pushback to that,
00:26:33.320 and I haven't heard it.
00:26:34.340 So don't assume,
00:26:35.300 just because you've heard
00:26:36.080 one side of it,
00:26:37.740 that there's not another side.
00:26:39.060 Because so far,
00:26:40.660 every time we've heard
00:26:41.400 there's an election story,
00:26:43.020 there is another side,
00:26:44.500 and it takes away all the fun.
00:26:47.200 Right?
00:26:47.680 So don't assume that this is
00:26:49.280 a slam dunk.
00:26:52.400 But, boy,
00:26:53.680 the simulation is just
00:26:54.920 screaming for it,
00:26:55.940 isn't it?
00:26:57.100 The simulation
00:26:58.020 seriously wants this to happen.
00:27:00.720 So we'll see.
00:27:03.680 We'll see.
00:27:04.780 If I were to predict
00:27:05.940 it's going to happen,
00:27:06.720 it would be based entirely
00:27:08.280 on the most entertaining outcome.
00:27:10.380 Not on any evidence.
00:27:11.940 Not on any evidence.
00:27:13.660 Because I don't believe
00:27:14.500 anything on the internet.
00:27:16.500 All right,
00:27:16.680 here's another simulation alert.
00:27:18.900 As you know,
00:27:19.960 Senator Tim Scott
00:27:21.200 has announced
00:27:21.940 he's going to run
00:27:22.720 as a Republican.
00:27:23.520 And
00:27:25.580 I'm going to make
00:27:28.120 a prediction
00:27:28.600 based on the same
00:27:30.320 phenomenon.
00:27:32.740 That reality
00:27:33.780 will be the most
00:27:34.520 entertaining reality.
00:27:36.020 For me.
00:27:37.960 So this will be
00:27:38.580 just personal.
00:27:40.120 I believe that
00:27:41.040 Tim Scott
00:27:42.200 will end up being
00:27:43.060 Trump's VP choice.
00:27:45.420 Here's why.
00:27:47.560 Because it will
00:27:48.520 drive me crazy
00:27:49.440 to hear that
00:27:50.420 the ticket is
00:27:51.240 Trump-Scott.
00:27:51.920 And I will be
00:27:54.000 plagued forever
00:27:55.080 until the election
00:27:56.700 because people
00:27:57.860 will think
00:27:58.520 it's very funny
00:27:59.340 to point out
00:28:00.540 that Scott
00:28:01.300 is my name as well.
00:28:03.780 And people will say,
00:28:05.300 well,
00:28:05.480 it's about time
00:28:06.180 you were running
00:28:06.840 on the ticket.
00:28:08.360 And then they will laugh
00:28:09.660 and it will be funny.
00:28:11.840 So,
00:28:12.480 for that reason alone,
00:28:14.260 I'm going to predict
00:28:15.560 that Trump will pick
00:28:16.540 Tim Scott
00:28:17.100 as his running mate.
00:28:19.960 Because it's just
00:28:20.880 the weirdest
00:28:21.460 most simulation-like
00:28:24.140 outcome.
00:28:25.160 It's just the weirdest.
00:28:27.000 Now,
00:28:27.400 on top of that,
00:28:29.020 Tim Scott's kind of perfect.
00:28:31.160 Because you want
00:28:31.960 a vice president
00:28:32.640 who's serious
00:28:33.760 and substantial,
00:28:35.620 doesn't have
00:28:36.580 scandals,
00:28:38.120 you know,
00:28:38.280 there's no negatives
00:28:39.240 that are coming with it,
00:28:40.680 but is not as
00:28:41.720 fascinating
00:28:43.100 as the top
00:28:44.160 of the ticket.
00:28:45.620 Perfect.
00:28:46.040 Tim Scott
00:28:50.180 is a solid
00:28:52.340 person
00:28:53.260 who you wouldn't
00:28:54.060 mind having
00:28:54.580 as your president.
00:28:56.060 You know,
00:28:56.420 I suppose
00:28:57.120 your mileage
00:28:58.920 might differ
00:28:59.460 if you're a Democrat.
00:29:00.660 But you can imagine
00:29:01.700 him being president
00:29:02.520 and you can imagine
00:29:03.340 yourself being
00:29:03.940 comfortable with that.
00:29:05.420 So that's a perfect fit.
00:29:07.120 Not as much wattage
00:29:08.400 as the president.
00:29:09.740 Takes care of
00:29:10.580 some of the
00:29:11.560 biggest weakness
00:29:12.560 that Trump has.
00:29:14.280 What's Trump's
00:29:15.400 biggest weakness?
00:29:17.360 Go.
00:29:18.000 What's his biggest weakness?
00:29:19.960 Yeah,
00:29:20.220 racism.
00:29:21.300 So if he runs
00:29:22.140 with a black
00:29:23.320 vice president,
00:29:24.800 they'll still
00:29:25.300 call him a racist.
00:29:26.540 They'll call him
00:29:27.220 an Uncle Tom.
00:29:28.460 But the
00:29:29.720 attacks won't
00:29:31.360 have the same
00:29:31.920 feeling.
00:29:33.440 It will also
00:29:34.400 set up Tim Scott
00:29:35.420 as an obvious
00:29:36.400 person who runs
00:29:38.400 for president
00:29:39.060 after being
00:29:39.880 a vice president.
00:29:41.640 And if you're
00:29:42.240 black and you
00:29:43.240 just want a
00:29:43.720 black president,
00:29:44.280 which we saw
00:29:45.700 with Obama,
00:29:47.080 what do you get?
00:29:47.720 96% of black
00:29:48.920 vote or something?
00:29:51.180 There might be
00:29:52.000 some people
00:29:52.540 who say,
00:29:52.940 you know what?
00:29:54.020 That's the
00:29:54.940 shortest path
00:29:55.780 to the next
00:29:56.860 black president.
00:29:58.580 So maybe they
00:29:59.280 like that.
00:30:00.780 Maybe.
00:30:02.320 So I think
00:30:02.940 it would be a good fit.
00:30:03.760 To me he seems
00:30:07.040 like the most
00:30:07.580 obvious choice.
00:30:10.560 What do you
00:30:11.180 think of that?
00:30:11.700 It got really
00:30:12.160 quiet when I said
00:30:12.940 that.
00:30:13.160 Is there any
00:30:14.420 negative to that?
00:30:16.700 Tim Scott
00:30:17.300 as vice
00:30:17.800 president for
00:30:18.340 Trump?
00:30:18.720 Is there any
00:30:19.260 negative to
00:30:19.880 that?
00:30:21.060 It almost
00:30:21.820 feels too
00:30:22.280 obvious,
00:30:22.720 doesn't it?
00:30:23.680 It feels like
00:30:24.300 it's just
00:30:24.600 really obvious.
00:30:26.440 Yeah.
00:30:28.900 Maybe.
00:30:29.660 I could see
00:30:30.200 Kristi Noem
00:30:30.960 being a choice
00:30:32.080 as well,
00:30:32.660 but Tim Scott
00:30:33.560 seems more
00:30:34.040 obvious.
00:30:34.360 All right.
00:30:35.740 Do you know
00:30:36.300 the
00:30:36.940 Triggernometry
00:30:37.380 podcast?
00:30:39.240 It's a pretty
00:30:39.880 big deal.
00:30:41.420 It's made a lot
00:30:42.300 of news with
00:30:42.880 some big
00:30:43.440 interviews.
00:30:44.260 Apparently
00:30:44.720 they're,
00:30:45.620 so they're
00:30:46.760 British based
00:30:47.500 and their
00:30:48.620 bank just
00:30:49.420 shut them
00:30:50.300 down with
00:30:51.620 no explanation.
00:30:53.900 No
00:30:54.500 explanation.
00:30:55.500 They just
00:30:55.860 decided to
00:30:56.480 unbank them.
00:30:57.980 And they
00:30:58.280 just said,
00:30:58.800 yeah,
00:30:59.000 I'm sorry,
00:30:59.960 we can't
00:31:00.400 be your
00:31:01.200 bank anymore.
00:31:01.740 It's called
00:31:04.760 the Tide
00:31:05.840 Business is
00:31:06.560 the name
00:31:06.880 of the
00:31:07.140 bank.
00:31:07.580 Tide Bank
00:31:08.340 or something?
00:31:09.320 Tide Business
00:31:10.060 Bank.
00:31:10.560 I never
00:31:10.940 heard of
00:31:11.160 that bank,
00:31:12.800 but do
00:31:14.880 you think
00:31:15.160 that's good
00:31:15.640 for Tide's
00:31:16.220 business?
00:31:17.480 I hope
00:31:18.240 not.
00:31:19.020 Now,
00:31:19.360 maybe there's
00:31:19.820 some reason
00:31:20.320 we don't
00:31:20.700 know about,
00:31:22.260 but as
00:31:22.900 Eric Weinstein
00:31:24.860 was pointing
00:31:25.400 out,
00:31:26.600 we do
00:31:28.560 have a
00:31:28.900 problem
00:31:29.820 with choke
00:31:30.360 points.
00:31:31.740 Banks are
00:31:32.600 a choke
00:31:32.960 point.
00:31:34.460 So the
00:31:34.760 bad people
00:31:35.700 don't have
00:31:36.140 to go
00:31:36.400 after
00:31:36.680 everybody,
00:31:37.880 they just
00:31:38.240 go after
00:31:38.560 your bank.
00:31:40.180 And you
00:31:40.480 saw what
00:31:40.780 happened to
00:31:41.100 me with
00:31:41.400 the choke
00:31:41.760 point,
00:31:42.120 right?
00:31:42.960 Just go
00:31:43.400 after the
00:31:43.760 publisher?
00:31:45.200 You could
00:31:45.560 take me
00:31:45.880 out of
00:31:46.080 business
00:31:46.340 with just
00:31:46.820 one choke
00:31:47.500 point.
00:31:48.300 That's what
00:31:48.660 happened.
00:31:49.740 So choke
00:31:50.660 points are
00:31:51.100 a big
00:31:51.560 problem in
00:31:52.180 politics right
00:31:52.880 now,
00:31:53.420 if that's
00:31:54.220 what happened,
00:31:54.860 but we'll
00:31:55.120 keep an eye
00:31:55.540 on
00:31:56.460 Triggernometry,
00:31:57.140 wishing them
00:31:57.560 well.
00:31:59.140 All right,
00:31:59.460 what do you
00:31:59.700 think of the
00:32:00.920 DeSantis
00:32:01.380 versus
00:32:01.900 Trump
00:32:02.540 strategy of
00:32:04.520 saying that
00:32:05.020 the other
00:32:05.420 one was
00:32:05.940 wrong on
00:32:06.780 the pandemic?
00:32:08.880 Do you
00:32:09.560 think that's
00:32:09.980 going to
00:32:10.200 work?
00:32:12.280 So
00:32:12.860 apparently
00:32:13.220 both Trump
00:32:14.340 and DeSantis
00:32:14.940 are going
00:32:17.160 to exaggerate
00:32:18.180 what they
00:32:18.560 did and
00:32:19.340 maybe exaggerate
00:32:20.860 the other
00:32:21.160 direction what
00:32:21.700 the other
00:32:21.960 person did.
00:32:22.720 So they
00:32:23.260 both seem
00:32:23.880 to have a
00:32:24.260 claim that
00:32:25.780 the other
00:32:26.140 one got
00:32:26.540 it wrong.
00:32:26.980 And I
00:32:30.560 try to
00:32:30.920 follow the
00:32:31.400 argument.
00:32:32.620 So I'm
00:32:32.820 like,
00:32:33.040 okay,
00:32:33.340 on this
00:32:33.800 date,
00:32:35.000 DeSantis
00:32:35.440 was a
00:32:36.320 little bit
00:32:36.920 pro-lockdown,
00:32:38.760 but on
00:32:39.040 this date,
00:32:39.520 he was
00:32:39.820 anti-lockdown.
00:32:40.660 But Trump,
00:32:41.960 he was
00:32:42.400 pro-vaccination,
00:32:43.820 but then
00:32:44.160 not mandatory,
00:32:46.100 but then he
00:32:46.580 wasn't against
00:32:47.620 lockdown.
00:32:48.840 And then
00:32:49.180 what about
00:32:49.800 masks?
00:32:50.360 What did
00:32:51.580 they say
00:32:51.880 about masks?
00:32:52.980 You know,
00:32:53.400 so I
00:32:54.700 think it's
00:32:55.020 too complicated.
00:32:56.980 Now,
00:32:57.520 if it's
00:32:57.760 too complicated,
00:32:59.640 voters will
00:33:00.460 just retreat
00:33:01.920 to their
00:33:02.300 bias and
00:33:02.940 back their
00:33:03.480 candidate.
00:33:04.640 So I
00:33:05.020 don't think
00:33:05.460 it has
00:33:06.540 much punch.
00:33:07.720 What do
00:33:08.000 you think?
00:33:09.320 I think
00:33:09.860 that both
00:33:10.340 of them
00:33:10.780 are going
00:33:11.220 to be
00:33:11.440 looking at
00:33:12.580 suboptimal
00:33:14.460 performances
00:33:15.860 because
00:33:17.140 everybody
00:33:17.780 was suboptimal
00:33:18.740 in the
00:33:19.180 pandemic.
00:33:21.340 You could
00:33:22.120 argue that
00:33:22.880 DeSantis
00:33:23.480 was one
00:33:24.020 of the
00:33:24.180 better ones,
00:33:25.660 but
00:33:26.080 apparently
00:33:26.620 even
00:33:28.120 New York
00:33:28.560 State
00:33:28.820 had a
00:33:29.160 lower
00:33:29.340 death
00:33:29.620 rate.
00:33:30.620 Did you
00:33:30.880 know
00:33:31.040 that?
00:33:31.720 That's
00:33:32.200 Trump's
00:33:32.560 claim.
00:33:32.900 I don't
00:33:33.080 know if
00:33:33.300 it's
00:33:33.440 true.
00:33:34.020 So
00:33:34.160 Trump
00:33:34.480 is
00:33:34.660 claiming
00:33:34.960 that
00:33:35.380 even
00:33:36.580 New York
00:33:37.060 State
00:33:37.580 did better
00:33:38.740 on death
00:33:39.200 rates than
00:33:39.740 Florida did.
00:33:40.980 But
00:33:41.400 Florida
00:33:42.540 made a
00:33:42.960 conscious
00:33:43.260 choice
00:33:43.720 to
00:33:44.080 protect
00:33:44.460 the
00:33:44.880 elderly
00:33:45.320 and
00:33:46.000 let
00:33:46.760 everybody
00:33:47.060 else
00:33:47.300 live
00:33:47.500 their
00:33:47.660 life.
00:33:48.960 So
00:33:49.360 was that
00:33:50.180 a mistake?
00:33:51.440 Was it
00:33:51.800 a mistake
00:33:52.200 to let
00:33:52.540 people live
00:33:53.040 free,
00:33:54.320 knowing,
00:33:54.980 of course,
00:33:55.300 that the
00:33:55.680 death
00:33:55.860 rate
00:33:56.040 would be
00:33:56.280 higher?
00:33:59.140 I
00:33:59.620 wouldn't
00:33:59.820 call it
00:34:00.140 a mistake.
00:34:01.240 I'd
00:34:01.620 call it
00:34:01.960 a choice.
00:34:04.320 So
00:34:04.660 I don't
00:34:05.200 know.
00:34:05.400 I think
00:34:06.320 neither of
00:34:06.820 them are
00:34:07.080 going to
00:34:07.260 get
00:34:07.400 traction.
00:34:07.820 I think
00:34:09.880 that we're
00:34:10.220 over the
00:34:10.800 pandemic.
00:34:12.240 We
00:34:12.720 understand
00:34:13.340 that nobody
00:34:13.940 was perfect.
00:34:15.520 And
00:34:15.940 they both,
00:34:16.800 I thought
00:34:17.260 they both
00:34:17.700 did the
00:34:18.560 best they
00:34:18.940 could
00:34:19.180 under
00:34:20.420 great
00:34:20.860 uncertainty.
00:34:21.300 So
00:34:23.560 I told
00:34:24.360 you
00:34:24.500 before
00:34:24.820 at the
00:34:25.700 beginning
00:34:25.920 of the
00:34:26.160 pandemic,
00:34:26.860 I told
00:34:27.480 you that
00:34:27.780 I was
00:34:28.000 going to
00:34:28.160 be an
00:34:28.460 easy
00:34:28.860 grader
00:34:29.460 for all
00:34:30.060 the
00:34:30.260 leaders
00:34:30.600 who got
00:34:30.980 it
00:34:31.100 wrong.
00:34:31.880 Because
00:34:32.140 there'd
00:34:32.280 be a lot
00:34:32.580 of people
00:34:32.880 guessing
00:34:33.300 and they
00:34:34.300 would be
00:34:34.540 getting
00:34:34.760 it wrong.
00:34:35.780 So
00:34:35.980 from the
00:34:36.520 very start
00:34:37.140 I said
00:34:37.720 let's
00:34:38.800 be a
00:34:39.180 little
00:34:39.340 bit
00:34:39.560 generous
00:34:40.080 on
00:34:40.720 this
00:34:40.980 one.
00:34:41.980 On
00:34:42.220 this
00:34:42.460 one
00:34:42.680 people
00:34:42.940 are
00:34:43.160 genuinely
00:34:43.640 guessing.
00:34:44.900 So
00:34:45.240 you don't
00:34:45.980 want to
00:34:46.240 throw your
00:34:46.700 good leaders
00:34:47.460 under the
00:34:48.220 bus
00:34:48.560 because they
00:34:49.540 guessed
00:34:49.760 wrong.
00:34:50.140 Now
00:34:52.340 you could
00:34:52.940 argue
00:34:53.200 there are
00:34:53.480 things that
00:34:53.900 were more
00:34:54.220 objectively
00:34:54.800 true that
00:34:55.540 they got
00:34:56.040 wrong.
00:34:56.740 But
00:34:57.080 nobody was
00:34:58.380 going to
00:34:58.560 get everything
00:34:58.980 right.
00:34:59.500 That wasn't
00:35:00.000 a thing.
00:35:01.060 So I
00:35:01.520 tend to
00:35:02.100 not care
00:35:02.720 so much
00:35:03.120 about what
00:35:03.540 they got
00:35:03.820 wrong.
00:35:04.920 You know
00:35:05.140 they were
00:35:05.400 in the
00:35:05.620 ballpark.
00:35:06.740 I think
00:35:07.080 both
00:35:07.400 Trump
00:35:07.860 and
00:35:08.460 DeSantis
00:35:09.060 were in
00:35:10.260 the
00:35:10.440 ballpark
00:35:11.040 of my
00:35:11.540 preference.
00:35:13.060 Meaning
00:35:13.620 they were
00:35:14.180 struggling
00:35:14.760 to understand
00:35:15.680 as best
00:35:16.680 they could
00:35:17.140 make the
00:35:17.640 best
00:35:17.860 decisions
00:35:18.320 they could.
00:35:19.360 They were
00:35:19.880 well-meaning.
00:35:22.080 I don't know.
00:35:22.940 I have
00:35:23.140 nothing to
00:35:23.580 complain about
00:35:24.140 honestly.
00:35:26.240 Even though
00:35:26.980 it was
00:35:27.200 suboptimal.
00:35:30.460 All right.
00:35:31.400 Larry Elder
00:35:32.040 has a book
00:35:32.700 As Goes
00:35:34.900 California.
00:35:35.860 My Mission
00:35:36.400 to Rescue
00:35:36.900 the Golden
00:35:37.300 State and
00:35:37.880 Save the
00:35:38.260 Nation.
00:35:39.260 Now
00:35:39.540 available.
00:35:41.800 That feels
00:35:42.660 like an
00:35:43.260 important read
00:35:44.040 because I
00:35:45.300 think he's
00:35:45.720 right on
00:35:46.120 this As
00:35:47.380 Goes
00:35:47.700 California
00:35:48.340 because
00:35:49.300 California
00:35:50.260 is
00:35:50.860 sort of
00:35:51.120 the
00:35:51.340 canary
00:35:51.800 in the
00:35:52.100 coal mine
00:35:52.560 for the
00:35:52.900 rest of
00:35:53.160 the country.
00:35:54.060 It's like
00:35:54.300 well we're
00:35:54.760 doing this
00:35:55.160 this year
00:35:55.640 just wait
00:35:56.840 two years
00:35:57.460 you're going
00:35:58.300 to be doing
00:35:58.620 it too
00:35:59.000 whatever it
00:35:59.780 is.
00:36:02.020 All right.
00:36:02.940 I saw
00:36:03.720 Lawrence Jones
00:36:05.820 refer to
00:36:06.540 DeSantis
00:36:07.000 as Trump
00:36:07.740 light.
00:36:09.160 What do
00:36:09.740 you think
00:36:09.920 of that
00:36:10.140 framing?
00:36:11.480 DeSantis
00:36:11.920 as Trump
00:36:12.680 light.
00:36:13.100 Now
00:36:14.740 it's similar
00:36:15.300 to Bill
00:36:15.920 Maher
00:36:16.140 saying that
00:36:16.780 DeSantis
00:36:17.380 is the
00:36:17.880 tribute
00:36:18.200 band.
00:36:19.500 But
00:36:19.800 here's
00:36:20.420 what I
00:36:20.620 like about
00:36:21.040 calling
00:36:21.400 DeSantis
00:36:22.380 Trump
00:36:23.920 light
00:36:24.300 because it
00:36:25.080 reminds you
00:36:25.520 of Bud
00:36:25.860 light.
00:36:28.100 And Bud
00:36:28.740 light
00:36:29.060 is now
00:36:30.820 carrying
00:36:31.340 some bad
00:36:32.560 vibes with
00:36:33.200 it.
00:36:34.840 So I'm not sure.
00:36:35.460 The first time
00:36:38.240 I just
00:36:38.540 said that
00:36:39.520 DeSantis
00:36:39.980 is Trump
00:36:40.600 light,
00:36:41.040 quoting
00:36:41.780 Lawrence
00:36:42.240 Jones,
00:36:42.560 did you
00:36:45.000 immediately
00:36:45.420 think of
00:36:45.860 Bud
00:36:46.020 light
00:36:46.200 or no?
00:36:47.460 Did Bud
00:36:47.960 light come
00:36:48.360 into your
00:36:48.640 mind
00:36:48.880 or no?
00:36:50.980 Because it
00:36:51.300 took,
00:36:51.840 it did.
00:36:52.720 Some of
00:36:53.200 you yes,
00:36:53.620 some of
00:36:53.900 you no.
00:36:55.080 It's kind
00:36:55.620 of an
00:36:55.820 interesting
00:36:56.160 framing,
00:36:56.760 isn't it?
00:36:58.580 Because
00:36:59.060 it's as
00:37:01.300 good as
00:37:02.420 tribute
00:37:02.740 band,
00:37:04.200 although
00:37:04.480 tribute
00:37:04.840 band is
00:37:05.400 really visual
00:37:06.180 and you
00:37:07.600 can almost
00:37:07.980 hear it.
00:37:09.180 And nobody's
00:37:09.780 ever liked
00:37:10.200 the tribute
00:37:10.580 band better,
00:37:11.280 so that's
00:37:11.580 good.
00:37:11.760 But people
00:37:13.100 do like
00:37:13.560 light beer
00:37:13.980 better.
00:37:15.280 There are
00:37:15.500 people who
00:37:15.900 genuinely
00:37:16.220 like light
00:37:16.840 beer.
00:37:18.160 But I
00:37:19.680 do like
00:37:19.960 the fact
00:37:20.300 that it
00:37:20.620 associates
00:37:21.200 it with
00:37:21.580 something
00:37:21.880 that the
00:37:22.560 base is
00:37:23.600 already
00:37:23.880 biased
00:37:24.500 against.
00:37:26.620 So CNN
00:37:27.560 and maybe
00:37:28.280 some others
00:37:29.720 from the
00:37:30.100 left are
00:37:30.500 going after
00:37:31.000 Musk.
00:37:31.960 So there's
00:37:32.360 a big
00:37:32.580 opinion
00:37:32.920 piece.
00:37:33.320 I'm not
00:37:33.580 even going
00:37:33.880 to tell
00:37:34.100 you who
00:37:34.400 wrote it.
00:37:35.340 The opinion
00:37:36.040 pieces on
00:37:37.420 CNN are
00:37:38.800 just such
00:37:39.300 hack jobs.
00:37:40.020 I mean
00:37:41.060 they're
00:37:41.200 just so
00:37:41.700 unprofessional
00:37:42.460 and poorly
00:37:43.600 written.
00:37:44.380 I'm not even
00:37:44.980 going to tell
00:37:45.280 you who it
00:37:45.620 was because
00:37:46.500 it doesn't
00:37:46.840 matter.
00:37:48.080 But it's
00:37:51.620 sort of
00:37:51.920 trying to
00:37:52.500 put together
00:37:52.960 a laundry
00:37:53.500 list of
00:37:54.620 reasons why
00:37:55.360 you shouldn't
00:37:55.900 trust or
00:37:56.460 like Musk.
00:37:59.180 So here's
00:38:00.080 their laundry
00:38:00.600 list of
00:38:02.500 reasons.
00:38:03.220 Now remember
00:38:03.580 the laundry
00:38:04.180 list persuasion
00:38:05.260 means that you
00:38:07.040 don't have
00:38:07.460 anything.
00:38:07.820 The reason
00:38:08.960 you put
00:38:09.320 them in
00:38:09.540 a list
00:38:10.040 is that
00:38:10.940 individually
00:38:11.480 none of
00:38:11.920 them would
00:38:12.220 bother you.
00:38:13.400 But if you
00:38:13.820 see them
00:38:14.080 in a list
00:38:14.640 you're like
00:38:14.940 whoa that's
00:38:15.560 a lot of
00:38:15.980 smoke there.
00:38:17.020 Must be
00:38:17.480 some fire
00:38:17.980 there.
00:38:19.200 So here's
00:38:19.760 their beginning
00:38:20.160 of their
00:38:20.560 little list
00:38:21.500 bullshit
00:38:22.140 propaganda
00:38:22.840 against
00:38:23.460 Musk.
00:38:25.240 That he
00:38:26.100 apparently
00:38:28.060 agreed to
00:38:29.040 Turkey's
00:38:29.980 request to
00:38:30.940 censor a
00:38:31.740 bunch of
00:38:32.120 critics of
00:38:32.680 the government.
00:38:34.760 And I
00:38:35.200 guess the
00:38:35.500 choices were
00:38:36.160 that to
00:38:37.160 not have
00:38:38.140 Twitter in
00:38:38.900 Turkey during
00:38:39.620 the election
00:38:40.120 which would
00:38:41.280 have been
00:38:41.460 bad or
00:38:42.800 to do
00:38:43.420 what a
00:38:43.720 dictator
00:38:44.060 wanted them
00:38:44.560 to do
00:38:44.960 and censor
00:38:46.260 on their
00:38:46.580 behalf which
00:38:48.160 is bad.
00:38:49.640 So Musk
00:38:50.040 had two
00:38:50.500 choices.
00:38:52.080 One was to
00:38:52.820 stay in
00:38:53.200 business in
00:38:53.820 Turkey and
00:38:55.320 live to
00:38:56.160 fight another
00:38:56.700 day because
00:38:57.540 at least that's
00:38:58.620 some free
00:38:59.040 speech.
00:39:00.180 Or to
00:39:01.720 go hard and
00:39:03.520 say no
00:39:03.860 Turkey you
00:39:04.380 don't get
00:39:04.780 Twitter.
00:39:05.680 There will
00:39:05.960 be no
00:39:06.320 Twitter for
00:39:06.760 Turkey.
00:39:07.520 Sorry.
00:39:08.740 You don't
00:39:09.080 get any.
00:39:10.620 Now how
00:39:11.640 do you know
00:39:12.080 what the
00:39:12.400 right decision
00:39:12.980 was?
00:39:15.200 You don't
00:39:15.820 really know
00:39:16.200 the right
00:39:16.540 decision do
00:39:17.120 you?
00:39:17.820 You know
00:39:18.420 what happened
00:39:19.020 but you
00:39:19.780 don't know
00:39:20.140 if he'd
00:39:20.480 made the
00:39:20.840 other
00:39:21.040 decision a
00:39:22.580 better thing
00:39:23.060 would have
00:39:23.340 happened.
00:39:24.360 It's
00:39:24.600 completely
00:39:25.000 unknowable.
00:39:26.680 So the
00:39:26.960 first complaint
00:39:27.680 they have
00:39:28.020 about Musk
00:39:28.700 is that he
00:39:29.780 tried to
00:39:30.380 weigh the
00:39:31.200 benefit of
00:39:31.820 free speech
00:39:32.640 versus all
00:39:34.320 the other
00:39:34.660 variables and
00:39:35.980 realized that
00:39:36.700 there was a
00:39:37.080 no-win
00:39:37.480 situation.
00:39:39.120 There was
00:39:39.400 no good
00:39:39.820 way.
00:39:40.920 So he
00:39:41.860 was either
00:39:42.140 going to
00:39:42.320 be no
00:39:42.760 Twitter in
00:39:43.300 Turkey which
00:39:44.020 is bad or
00:39:45.860 to agree to
00:39:47.440 censor which
00:39:48.920 is bad.
00:39:50.060 He just had
00:39:50.520 two bad
00:39:50.920 choices.
00:39:52.300 So if he
00:39:53.180 picked one of
00:39:53.720 the two bad
00:39:54.320 choices does
00:39:55.380 that mean he
00:39:55.920 sucks?
00:39:56.280 Or was it
00:39:59.900 just an
00:40:00.240 impossible
00:40:00.620 situation and
00:40:01.940 he picked
00:40:02.280 one?
00:40:03.080 I don't know.
00:40:03.860 Was it a
00:40:04.240 good choice or
00:40:04.780 a bad choice?
00:40:05.920 I don't know.
00:40:07.060 Does it
00:40:07.600 indicate that
00:40:08.520 Elon Musk
00:40:09.500 loves dictators?
00:40:11.860 No.
00:40:13.040 Does it
00:40:13.600 indicate that
00:40:14.440 Twitter was
00:40:15.700 always going to
00:40:16.160 go along with
00:40:16.880 the dictator?
00:40:18.020 No.
00:40:18.620 Because one of
00:40:19.120 the factors is
00:40:19.900 that it was
00:40:20.260 right before the
00:40:20.880 election.
00:40:22.700 Right?
00:40:23.160 It could have
00:40:23.860 gone a different
00:40:24.380 way had there
00:40:25.400 not been an
00:40:25.860 election that
00:40:26.340 was really
00:40:26.840 critical and
00:40:28.360 on point at
00:40:29.420 that moment.
00:40:30.720 So it's easy
00:40:32.560 to be the
00:40:34.120 critic but if
00:40:35.780 you're going to
00:40:36.040 be the critic
00:40:36.520 you have to
00:40:36.980 say what you
00:40:37.500 would have
00:40:37.800 done that
00:40:38.940 was the
00:40:39.300 right answer.
00:40:40.440 What's the
00:40:40.880 right answer?
00:40:42.140 You're saying
00:40:42.640 he did the
00:40:43.160 wrong thing.
00:40:43.880 Now you tell
00:40:44.420 us what's the
00:40:45.060 right answer.
00:40:46.380 Because there
00:40:46.760 wasn't a right
00:40:47.300 answer.
00:40:48.360 But yet they
00:40:49.500 can throw that
00:40:50.040 on the laundry
00:40:50.600 list and oh
00:40:51.880 yeah well that
00:40:52.460 one doesn't
00:40:52.980 bother me so
00:40:53.640 much.
00:40:54.760 Let's see
00:40:55.120 what else is
00:40:55.580 on the
00:40:55.880 list.
00:40:56.680 Maybe when
00:40:57.140 I see it
00:40:57.560 all I'll
00:40:58.280 really be
00:40:58.680 worried.
00:40:59.540 All right, next
00:40:59.540 thing on the
00:41:00.180 list is that
00:41:02.040 Musk said
00:41:04.560 that George
00:41:05.100 Soros hates
00:41:05.720 humanity and
00:41:07.660 that gets
00:41:09.300 very close to
00:41:10.200 the line of
00:41:10.820 anti-Semitic.
00:41:14.080 It wasn't
00:41:14.980 anti-Semitic but
00:41:16.960 it reminded them
00:41:18.200 of things that
00:41:19.000 are.
00:41:20.300 Right?
00:41:20.960 So now you've
00:41:21.480 got two terrible
00:41:22.500 things that Musk
00:41:23.260 did.
00:41:23.580 He had a
00:41:25.200 no-win
00:41:25.580 situation in
00:41:27.160 which he
00:41:27.480 didn't win.
00:41:29.220 That's the
00:41:29.880 first strike
00:41:30.360 against him.
00:41:31.300 He didn't
00:41:31.820 win in a
00:41:33.120 no-win
00:41:33.520 situation with
00:41:34.360 Turkey.
00:41:35.620 And then
00:41:35.980 secondly he
00:41:37.160 made a
00:41:37.520 comment about
00:41:38.180 an individual
00:41:38.880 which reminded
00:41:40.440 other people
00:41:41.260 of anti-Semitism.
00:41:42.260 It wasn't
00:41:43.820 anti-Semitism
00:41:44.640 because it was
00:41:45.760 about one
00:41:46.260 person whose
00:41:47.020 actions seemed
00:41:49.000 to suggest
00:41:50.680 that interpretation
00:41:52.060 according to
00:41:53.820 Musk, not
00:41:53.820 according to
00:41:54.300 me.
00:41:55.320 But, all
00:41:56.740 right, so you
00:41:57.060 got two of the
00:41:57.600 sketchiest things
00:41:58.480 in the world
00:41:59.040 but there's two
00:42:00.080 of them now.
00:42:01.280 If they could
00:42:01.980 only come up
00:42:02.520 with a third
00:42:03.220 thing, then
00:42:04.640 they'd have a
00:42:05.120 proper little
00:42:05.700 list then,
00:42:06.300 wouldn't they?
00:42:09.060 Let's see,
00:42:09.600 what else?
00:42:10.120 Oh, so then
00:42:11.440 the third thing
00:42:12.160 is that when
00:42:13.740 he reinstated
00:42:14.620 thousands of
00:42:15.400 Twitter accounts,
00:42:16.120 they
00:42:17.180 were often
00:42:18.460 racist and
00:42:19.220 anti-Semitic
00:42:20.200 people, as
00:42:23.380 researchers have
00:42:24.220 documented.
00:42:26.640 So, his
00:42:28.620 problem was
00:42:29.440 that he
00:42:30.320 valued free
00:42:31.220 speech over
00:42:33.240 how offended
00:42:34.040 you would be
00:42:34.660 by it, which
00:42:36.880 is how free
00:42:38.100 speech works.
00:42:40.080 And he said
00:42:40.960 explicitly, it's
00:42:42.680 only free speech
00:42:43.720 like in a real
00:42:44.560 way if you
00:42:46.020 allow the
00:42:46.580 speech that
00:42:47.060 you don't
00:42:47.460 like.
00:42:50.100 So, they
00:42:50.800 managed to
00:42:51.360 say that
00:42:52.620 supporting free
00:42:53.520 speech, which
00:42:54.660 necessarily supports
00:42:55.980 speech you
00:42:56.440 don't like, and
00:42:57.540 then he acted
00:42:58.180 upon it by
00:42:58.940 putting it back
00:42:59.520 on Twitter.
00:43:01.600 So, now they
00:43:02.300 got these three
00:43:02.940 terrible things.
00:43:04.840 He didn't win
00:43:05.740 in a no-win
00:43:07.240 situation in
00:43:08.260 Turkey.
00:43:09.320 He said
00:43:10.040 something about
00:43:10.600 one individual
00:43:11.360 which reminded
00:43:12.260 somebody else of
00:43:13.240 something else
00:43:13.800 that was
00:43:14.160 anti-Semitic.
00:43:16.820 And he
00:43:17.960 reinstated people
00:43:19.040 under the
00:43:19.900 First Amendment
00:43:21.120 principle.
00:43:24.440 And that's
00:43:25.320 it.
00:43:27.440 That's their
00:43:28.240 laundry list.
00:43:29.860 And here's
00:43:30.300 the title of
00:43:32.220 the piece.
00:43:33.300 What happened
00:43:33.920 to Elon Musk?
00:43:36.000 What happened
00:43:36.900 to him?
00:43:38.600 The title is
00:43:39.860 what people are
00:43:40.420 going to say.
00:43:41.340 And they're
00:43:41.580 going to say,
00:43:41.900 oh, how did
00:43:42.760 he turn bad?
00:43:43.980 Well, I don't
00:43:44.440 have time to
00:43:45.020 read that
00:43:45.360 article, but
00:43:45.920 it looks like
00:43:46.420 something sketchy
00:43:48.380 is up with that
00:43:48.800 guy.
00:43:50.100 Oh, my God.
00:43:51.080 Now, is it
00:43:51.700 obvious to you
00:43:52.660 that they're
00:43:53.980 just trying to
00:43:54.960 take down his
00:43:56.100 power?
00:43:57.120 Because his
00:43:57.780 power is pretty
00:43:59.560 awesome at the
00:44:00.240 moment.
00:44:01.140 Yeah.
00:44:02.400 This is not
00:44:03.180 even close to
00:44:04.020 news.
00:44:05.300 And I realize
00:44:06.220 it was an
00:44:06.640 opinion piece.
00:44:07.760 But even as
00:44:08.740 an opinion piece,
00:44:09.500 it doesn't belong
00:44:10.080 in a news
00:44:10.700 entity.
00:44:12.080 Because it's
00:44:12.640 not news.
00:44:13.220 It's just,
00:44:13.880 and it's not
00:44:14.300 really an opinion.
00:44:16.240 This is just a
00:43:17.140 hit job to
00:44:17.860 take somebody's
00:44:18.420 power down.
00:44:19.600 Is that why
00:44:20.120 you need CNN?
00:44:21.380 To do hit
00:44:22.240 pieces on
00:44:22.780 people?
00:44:23.980 How useful
00:44:24.720 is that?
00:44:26.240 Well, let's
00:44:27.460 talk about
00:44:28.060 Ukraine.
00:44:32.140 Do you
00:44:32.660 remember when
00:44:33.220 we first started
00:44:34.140 and one of the
00:44:35.620 things that we
00:44:36.200 definitely weren't
00:44:37.020 going to do?
00:44:37.660 Oh, we're
00:44:38.980 definitely not
00:44:39.920 going to do it.
00:44:40.560 We might give
00:44:41.520 Ukraine some
00:44:42.460 help.
00:44:43.520 Oh, yeah.
00:44:44.320 Give them some
00:44:44.820 old weapons,
00:44:45.640 maybe some
00:44:46.160 ammunition we
00:44:46.900 weren't going to
00:44:47.360 use.
00:44:48.580 Yeah, give them
00:44:49.360 some MREs and
00:44:51.600 maybe some
00:44:52.480 helmets and
00:44:53.040 stuff, whatever.
00:44:54.720 But we're
00:44:55.520 definitely not
00:44:56.420 going to give
00:44:56.800 them any F-16s.
00:44:59.440 I mean, let's
00:45:01.340 not be crazy.
00:45:02.640 We're definitely
00:45:03.320 not going to
00:45:03.740 give them any
00:45:04.160 F-16s, because
00:45:05.540 if we did that,
00:45:06.400 we basically would
00:45:07.240 be in a hot
00:45:07.980 war with
00:45:08.380 Russia.
00:45:09.660 Because, you
00:45:10.320 know, Russia's
00:45:11.140 going to say,
00:45:11.840 we don't care
00:45:12.480 who's flying
00:45:13.020 them.
00:45:14.000 Those are your
00:45:14.700 F-16s.
00:45:15.660 We're in a war
00:45:16.220 with you now.
00:45:18.180 So definitely
00:45:19.160 we're not going to
00:45:19.680 do that.
00:45:20.660 No F-16s.
00:45:23.780 So today's
00:45:24.700 news is that
00:45:25.600 Biden has
00:45:28.640 approved F-16s.
00:45:30.680 So the
00:45:30.980 Ukrainians are 1.00
00:45:31.680 now learning
00:45:32.860 to fly those
00:45:33.460 F-16s.
00:45:34.260 And I
00:45:39.000 started to
00:45:41.220 think that
00:45:41.660 the Democrats
00:45:42.560 led by
00:45:45.000 Biden are
00:45:45.000 doing the
00:45:45.460 same thing
00:45:45.940 to Putin
00:45:46.420 that they
00:45:47.000 do to
00:45:47.300 the Republicans,
00:45:48.840 which is
00:45:49.840 they're just
00:45:50.200 gaslighting the
00:45:50.980 shit out of
00:45:51.460 them.
00:45:52.680 Aren't they?
00:45:55.020 The same
00:45:56.100 technique that
00:45:56.780 they use
00:45:57.180 domestically,
00:45:58.280 which is just
00:45:59.120 propaganda and
00:46:00.340 bullshit and
00:46:01.060 gaslighting and
00:46:01.900 lying and,
00:46:02.540 oh, we
00:46:03.280 definitely won't
00:46:03.940 do that before
00:46:04.720 they do it,
00:46:05.780 and blaming the
00:46:06.520 other side of
00:46:07.080 the thing they're
00:46:07.560 doing.
00:46:08.240 It's like every
00:46:08.980 trick they do
00:46:09.580 domestically against
00:46:10.580 the Republicans,
00:46:11.420 they're doing
00:46:12.020 against Putin.
00:46:14.960 And it's
00:46:15.560 working, which of
00:46:16.440 course is why
00:46:17.100 they do it.
00:46:17.820 They're totally
00:46:18.420 gaslighting Putin,
00:46:19.940 and they're doing
00:46:20.400 a good job of it.
00:46:21.420 I don't think the
00:46:22.180 Republicans are
00:46:23.500 good at that.
00:46:24.960 I think the
00:46:25.500 Republicans just
00:46:26.260 say, okay, is
00:46:27.620 there a reason for
00:46:28.340 a war?
00:46:30.260 Yes, no.
00:46:30.900 No, no, no
00:46:33.360 reason for a war?
00:46:34.180 All right, how
00:46:34.580 about let's not
00:46:35.640 have one?
00:46:37.180 Is there a reason
00:46:38.020 for a war?
00:46:39.240 Oh, well then
00:46:40.580 let's just like
00:46:41.400 win the war.
00:46:43.340 But the
00:46:43.920 Democrats are
00:46:44.560 all these mind
00:46:45.240 games.
00:46:45.800 It's just like a
00:46:46.400 complete propaganda
00:46:48.140 mind game, you
00:46:50.440 know, the whole
00:46:51.100 thing.
00:46:53.160 So, anyway, I
00:46:54.320 don't believe
00:46:54.680 anything that comes
00:46:55.360 out of Ukraine.
00:46:56.100 It's all lies,
00:46:57.360 damn lies, damn
00:46:58.840 lies.
00:46:59.280 Anything else
00:47:00.260 happening on
00:47:00.800 this quiet
00:47:01.960 Saturday?
00:47:03.060 Did I miss
00:47:03.420 any stories?
00:47:04.680 Oh, yeah, Putin
00:47:05.260 banned Obama from
00:47:06.480 traveling to
00:47:07.080 Russia.
00:47:08.080 Why in the
00:47:08.620 world would
00:47:09.040 Obama travel to
00:47:10.060 Russia?
00:47:11.980 Was that a
00:47:12.940 thing?
00:47:14.300 Are you telling
00:47:14.820 me that Obama
00:47:15.520 was planning a
00:47:16.400 trip to Russia
00:47:18.540 now?
00:47:20.640 Or was that
00:47:21.580 just preemptive?
00:47:24.740 Does anybody
00:47:25.400 know?
00:47:26.120 He wasn't
00:47:27.040 actually going
00:47:27.800 to go to
00:47:28.160 Russia, right?
00:47:29.280 During the
00:47:30.400 current situation?
00:47:32.600 It must have
00:47:33.260 been just some
00:47:33.960 generic thing.
00:47:37.180 He's just
00:47:37.820 calling attention
00:47:38.520 to Obama?
00:47:39.340 Yeah, that's
00:47:39.740 probably smart.
00:47:42.580 We provoke
00:47:43.520 this war.
00:47:44.220 That's true.
00:47:45.080 How many
00:47:45.420 people think the
00:47:46.140 United States is
00:47:47.520 primarily responsible
00:47:48.880 for the war in
00:47:49.720 Ukraine?
00:47:51.860 Primarily
00:47:52.380 responsible.
00:47:53.800 I do.
00:47:55.540 To me, that
00:47:56.100 seems obvious.
00:47:59.400 The public
00:47:59.400 information seems
00:48:00.380 to support that
00:48:01.140 narrative completely.
00:48:02.980 It looks like we
00:48:03.780 did it.
00:48:04.740 And it looks like
00:48:05.420 we didn't get any
00:48:06.100 gain for it.
00:48:07.700 Did we gain anything? Or is there anything we could gain?
00:48:11.240 I see no potential for winning. Do you?
00:48:15.480 I think we screwed the pooch on this since 2014 and earlier.
00:48:25.320 I think we've just been getting this wrong the whole way.
00:48:31.360 Yeah. And earlier.
00:48:33.540 I mean, we've been doing nothing but trying to overthrow Putin and threatening his borders. What the hell is he supposed to do?
00:48:44.820 Yeah.
00:48:46.960 I'd love to know the secret backstory with Putin.
00:48:50.840 I feel as if there's something that we don't know about Putin, or maybe what we did. Maybe it's something we don't know about the United States.
00:49:00.160 But there's something about the whole Putin story that got us to this place that didn't make sense.
00:49:08.700 And the only way to explain it I can think of is that the military-industrial complex is just doing whatever it takes to have more war.
00:49:19.980 Follow the money is the only thing that's completely working.
00:49:23.620 Everything else looks like it doesn't make sense. But if you follow the money, everything suddenly is perfectly ordered.
00:49:33.540 You can see the whole field. There's no questions left.
00:49:38.700 Yes, he considers Ukraine part of Russia.
00:49:42.760 But there are genuine reasons why he needs Ukraine as a buffer.
00:49:53.420 All right.
00:49:56.800 What if Russia is the most valuable threat?
00:50:03.860 Yeah. They're the most valuable threat. That's an interesting way to put it.
00:50:15.060 What Scott Barnes is saying?
00:50:23.740 John, are you, so somebody, who's Scott Barnes?
00:50:29.440 Are you referring to me incorrectly and also not understanding any of my positions while criticizing me? Or is there a person named Scott Barnes that you're speaking of?
00:50:38.700 Were you drinking this morning?
00:50:45.380 All right. So there's something about Robert Barnes, I guess.
00:50:48.920 All right.
00:50:55.760 Some mentalities passed down from the sources to the sons, maybe.
00:51:02.500 Google it, Scott. Google what?
00:51:09.740 That never helps.
00:51:17.180 All right.
00:51:21.680 So was that a comment about me? Did somebody think I was changing my opinion?
00:51:27.480 I'm just trying to see if somebody was drunkenly criticizing me or not.
00:51:35.460 Yeah, how can you drink all day if you don't start in the morning? That's a good point.
00:51:44.020 All right.
00:51:46.400 Yeah, I don't think I changed my opinion, so you must have been talking about Barnes. I don't know.
00:51:53.340 Barnes is an idiot, so I don't talk about him.
00:51:55.900 All right, here's John. Scott, you're taking exactly the position of the people you criticize for being RT stooges on Russia.
00:52:12.460 I'm taking exactly the position of the people? No, I'm not.
00:52:15.660 No, you're hallucinating that. Nothing like that's happening.
00:52:22.920 I have no idea what you're talking about. Now you're hallucinating.
00:52:27.740 All right, that's all for YouTube.
00:52:30.880 I will talk to you tomorrow in the morning.
00:52:35.780 Don't drink for breakfast. That's my advice to you. Your comments will be much better.