Real Coffee with Scott Adams - August 31, 2024


Episode 2583 CWSA 08/31/24


Episode Stats


Length

1 hour and 8 minutes

Words per minute

152.4

Word count

10,363

Sentence count

771

Harmful content

Misogyny

17

sentences flagged

Hate speech

23

sentences flagged


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode of Coffee with Scott Adams, host Scott Adams talks about a new invention that could completely change the lives of some of you, and why you should be grateful it's not you who needs to change your life, it's me.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 Do-do-do-do-do.
00:00:02.660 Ram-ba-bum-bum-bum-bum-bum-bum. 1.00
00:00:05.520 Ba-bum-bum-bum. 0.92
00:00:08.780 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:13.720 It's called Coffee with Scott Adams.
00:00:15.500 Would you like to take this up to a level that nobody can even understand with their tiny, shiny human brain?
00:00:22.560 Well, all you need for that is a cup or mug or a glass, a tankard, chalice, or a stein, a canteen, jug, or a flask,
00:00:28.320 a vessel of any kind. Fill it with your favorite liquid. I like coffee.
00:00:32.840 And join me now for the unparalleled pleasure, the dopamine at the end of the day,
00:00:36.500 the thing that makes everything better.
00:00:38.840 It's called the simultaneous sip.
00:00:41.600 Go.
00:00:46.320 Mmm, mmm, extra good.
00:00:50.060 Well, today's a little bit special because I'm going to completely change the lives of some of you.
00:00:57.120 That's a promise.
00:00:58.700 Now, it's sort of a statistical promise because I can't get you all.
00:01:04.040 And you don't all need any kind of an improvement.
00:01:06.640 Some of you are just right the way you are.
00:01:09.100 But for some percentage of you, I'm going to give you a reframe in a few minutes.
00:01:14.120 It's not right away.
00:01:16.020 That will completely change your life.
00:01:18.000 And some of you will say, damn, was it always that obvious?
00:01:22.680 And then you'll say, all right, I'm going to go change my life.
00:01:26.840 And then you will.
00:01:28.160 This is actually real, by the way.
00:01:30.540 You legitimately, if we get 100,000 people to watch this, which would be typical.
00:01:36.920 And of 100,000, I'll bet you I will change the lives of 5,000 to 10,000 people completely.
00:01:46.860 So that's my promise to you.
00:01:48.360 But first, let's talk about this overdue book from a Virginia library.
00:01:54.060 Huh.
00:01:54.600 Why would that be an interesting story?
00:01:56.740 Well, you'd be the judge.
00:01:58.340 But there was a book returned to the library that had been missing for 50 years.
00:02:03.720 And that was the new record for a book returned to a library.
00:02:07.260 50 years.
00:02:08.440 Do you know what the record was before that for the same library?
00:02:12.100 I know.
00:02:12.600 That's what you're wondering.
00:02:13.500 You're like, what was the record before that?
00:02:16.620 Well, the record before that was nine years.
00:02:19.140 But I'm proud to say it was my book.
00:02:20.860 So before that, the longest a book had been out from the library was my book,
00:02:28.280 The Dilbert Future, from the mid-'90s.
00:02:33.260 And it was described as a cartoon self-help and joke book.
00:02:38.660 Oh, it's so much more than that.
00:02:41.160 But yes, my book is so good that people will steal it from libraries.
00:02:47.080 Yeah.
00:02:47.980 That's better than a five-star review.
00:02:50.860 How good is it?
00:02:51.880 It's so good, I'm not even taking it back to the library.
00:02:56.080 So I felt pretty good about that.
00:03:00.040 As you know, I'm what's called a slider.
00:03:03.460 I'm one of those people who makes lights blink when I go by them,
00:03:08.300 street lights and other things.
00:03:09.760 I now have three separate lights in my house that blink continuously.
00:03:14.060 One in the kitchen, one on the far end of the house by the dog door,
00:03:21.260 and one in my master bedroom.
00:03:24.480 Different lights, different situations.
00:03:26.980 All three of them blink, blink, blink.
00:03:30.860 True story.
00:03:32.480 Can I explain it?
00:03:34.140 No.
00:03:34.480 Now, here's a question for you.
00:03:38.740 How often in your life has it been true that your car is working fine,
00:03:43.960 there's nothing wrong with it, no lights are on, don't need any service,
00:03:47.500 at the same time that all your major appliances in your kitchen are also working?
00:03:53.360 Now, for most of you, that's fairly common, isn't it?
00:03:58.960 For me, it's never.
00:04:01.660 Never in my life has it been true that my car doesn't have a light on
00:04:06.640 and that all my appliances work.
00:04:10.760 So my car has been to major service twice this month.
00:04:15.060 I've had a dishwasher repaired, a coffee maker, a refrigerator.
00:04:19.680 I'll have the third repair on it.
00:04:21.240 I woke up to another kitchen full of water.
00:04:26.600 That's my normal life.
00:04:28.240 There's nothing unusual about that.
00:04:30.420 So electronics just go around me.
00:04:33.860 I don't know why.
00:04:37.160 So here's a small-sounding story that's big to me.
00:04:40.800 So the Amazon digital assistant, whose name I shall not say,
00:04:45.280 because it will activate all of your Amazon digital assistants,
00:04:48.140 the one that starts with the letter A, L-E-X-A.
00:04:52.740 But they're going to be upgraded with AI.
00:04:56.280 So they had not had real AI until now.
00:04:59.140 But sometime this year, they'll have it.
00:05:00.940 Now, most of you are saying,
00:05:02.980 that's the smallest little unimportant story.
00:05:06.220 Not for me.
00:05:07.000 I have those devices around my house and all the major rooms,
00:05:12.340 and I talk to them all day.
00:05:14.300 The trouble is that talking to them goes like this.
00:05:18.020 Hey, name of device, what's the weather today?
00:05:23.620 I do not have control over the kitchen.
00:05:27.820 No, I said, what's the weather today?
00:05:30.220 It's 2 o'clock.
00:05:33.800 Now, what's the weather, the weather today?
00:05:38.920 The top news story is, no, shut up.
00:05:41.780 What's the weather?
00:05:43.500 So if you could actually make those devices understand what I say on the first try,
00:05:49.000 like AI does.
00:05:50.020 AI understands you really, really well.
00:05:53.540 And then maybe just have a conversation with it.
00:05:55.860 It would be completely life-changing.
00:05:58.040 This would be the Star Trek moment for me.
00:06:00.960 Do you remember in Star Trek when the crew would just talk to the ship,
00:06:05.340 and it would be some AI thing running the ship,
00:06:08.520 and they would just sort of hit their little communicator and talk to it?
00:06:12.260 That will be me.
00:06:13.940 In just a few months, I'll be just in my spaceship here talking to my computer.
00:06:20.140 Can't wait.
00:06:20.600 I really can't.
00:06:22.520 By the way, that's not a joke.
00:06:24.780 I'm actually very excited about it.
00:06:26.860 It's transformative.
00:06:30.900 There's a story coming up which would explain this better,
00:06:35.560 which is that in Japan,
00:06:38.440 I think there were over 30,000 people who died alone at home in the past year.
00:06:44.680 I saw that number somewhere.
00:06:45.900 Imagine that.
00:06:50.620 In Japan, over 30,000 people died alone in their homes in one year
00:06:56.940 because they had a massive loneliness problem.
00:07:00.820 Imagine what will happen to your loneliness problem
00:07:03.500 if you can just talk out loud in any room of your house
00:07:08.160 and there's some intelligent entity talking back to you.
00:07:11.720 It's completely transformative.
00:07:14.600 Trust me, it's a big, big deal.
00:07:16.360 It just doesn't seem like it yet.
00:07:19.180 Well, speaking of AI and robots,
00:07:21.180 there's this new robot getting a lot of attention today.
00:07:24.080 It must be close to release.
00:07:25.680 It's called the NEO, N-E-O,
00:07:29.380 and I guess it's a beta version.
00:07:32.820 So all of the video calls it the NEO beta.
00:07:36.820 And here's the funny part.
00:07:38.540 The robot is actually designed to act like a beta male
00:07:41.740 that does whatever you want.
00:07:44.640 Oh, now may I grab your purse for you, miss? 0.99
00:07:47.700 And it shows what looks to be a single woman with her beta male robot 0.96
00:07:52.320 that just does whatever she wants. 1.00
00:07:54.520 And it even calls her over for a hug.
00:07:58.880 And I'm thinking,
00:08:00.180 did they really have to put the beta part on there?
00:08:03.940 I mean, that's a little bit too on the nose.
00:08:06.720 So if you're wondering which humans would be replaced first,
00:08:10.580 it's beta males.
00:08:12.120 Beta males will be replaced first. 1.00
00:08:14.400 And is that a surprise?
00:08:16.460 No, no.
00:08:17.960 Because if you were to rank the importance of human beings,
00:08:21.940 that would be a horrible thing to do.
00:08:24.520 Wouldn't it?
00:08:25.220 But we could all do it.
00:08:26.880 Meaning that if you were in the lifeboat
00:08:29.520 and somebody had to be thrown overboard,
00:08:32.520 we could all make the same decision about who goes first,
00:08:36.040 if you know what I mean.
00:08:38.460 So, you know,
00:08:39.460 your rank is usually children are first.
00:08:42.560 Although if you're a Democrat,
00:08:44.100 it might be women are first and children are second. 1.00
00:08:46.560 But, you know,
00:08:47.080 men are going to be lower on the list. 0.60
00:08:48.860 And then the beta men,
00:08:51.220 the beta males,
00:08:52.480 literally have the lowest value in society.
00:08:55.800 So the beta males got replaced first.
00:08:59.140 Surprise.
00:09:00.400 Who needs a simp when you've got a robot?
00:09:03.440 So I'm sure they could have made them
00:09:05.480 so they didn't act exactly like beta males.
00:09:09.760 But if you're going to make them look like that,
00:09:11.520 don't call it the neo beta.
00:09:13.180 That's all I'm saying.
00:09:14.160 I know they're using beta in a different way.
00:09:16.100 It's just funny.
00:09:16.700 Well, two more stories about major breakthroughs
00:09:21.940 in battery development every single day.
00:09:24.140 These are different stories.
00:09:26.080 I'm not telling you the same battery breakthrough story every day,
00:09:29.640 but there are two more.
00:09:32.320 This is a big, big deal
00:09:33.720 because you need your batteries for your robots.
00:09:37.700 You're not going to have a world of robots
00:09:40.020 unless the battery technology gets a lot better
00:09:43.160 and it's all happening.
00:09:44.140 Now, do you ever ask yourself,
00:09:46.780 why is it all happening at exactly the right time?
00:09:50.180 Isn't that weird?
00:09:51.880 That exactly the time
00:09:53.420 we need really, really big advancements in batteries,
00:09:57.660 it's happening.
00:09:59.700 When do we need it for the robots?
00:10:02.220 Isn't that weird?
00:10:03.740 Anyway, that feels like a simulation situation.
00:10:06.940 There's a company, Atio Systems.
00:10:10.180 They've got a new thing
00:10:11.240 that will get rid of all your forever chemicals in your battery.
00:10:15.920 That's a big deal.
00:10:17.020 You know, the ones that you can't get rid of otherwise.
00:10:21.100 And 20% reduction in cost
00:10:23.220 and 50% increase in energy density,
00:10:26.700 82% decrease in energy consumption
00:10:28.980 to make the batteries themselves.
00:10:30.800 There's another one, from Tech Xplore,
00:10:32.500 another story about researchers.
00:10:34.520 They figured out that if you have a lithium battery,
00:10:40.060 how you charge it the first time
00:10:42.620 can make a gigantic difference in how long it lasts.
00:10:47.600 So just knowing that little bit of information
00:10:49.760 could double the length of time your battery lasts.
00:10:53.240 These changes in the battery technology are all gigantic.
00:10:58.940 This is one little tweak on existing technology,
00:11:02.140 simply knowing that your first charge is the important one,
00:11:05.380 that presumably they could do it at the factory
00:11:07.400 so it gets done right.
00:11:10.320 And it doubles.
00:11:11.680 It doubles the life of it.
00:11:13.280 Doubles its value.
00:11:14.900 These are gigantic changes.
00:11:17.160 Civilization changing technology.
00:11:19.380 All right, the University of Surrey did some research,
00:11:23.560 and they found out that humans are,
00:11:27.940 well, they're very influenced by the repetition of messages
00:11:30.780 and that you can make something that's untrue
00:11:33.620 seem true to people simply by repeating it
00:11:36.600 over and over again.
00:11:38.200 Now, what do you think I'm going to say about that?
00:11:41.120 Huh, they did research to find out
00:11:44.080 that the more somebody repeats a message,
00:11:46.340 the more likely you'll act like it's true.
00:11:49.960 Well, they could have just saved a little bit of money
00:11:52.800 and asked me because it's the most basic knowledge
00:11:55.720 that every hypnotist knows.
00:11:57.660 Every advertiser knows it.
00:11:59.540 Everybody in marketing knows it.
00:12:01.480 Everybody, everyone in politics knows it.
00:12:04.300 You didn't even have to ask me.
00:12:07.020 Tell me, could they have asked you?
00:12:09.760 Could they have skipped all this science
00:12:11.580 and just said, oh, I'll just use the phone book
00:12:14.280 if that existed and I'll randomly call somebody?
00:12:17.260 Hello? Hi.
00:12:18.480 Oh, you don't know me.
00:12:19.840 It's a random phone call.
00:12:21.340 I'm trying to save some money on some research.
00:12:23.600 Do you think you could help me out?
00:12:25.300 Sure.
00:12:26.420 Well, we were going to research whether showing somebody
00:12:29.260 the same piece of information over and over again
00:12:31.520 would have any influence on what they believe to be true.
00:12:36.400 Yeah, that's obviously the case.
00:12:39.760 So you feel confident about that?
00:12:41.840 Yeah, totally.
00:12:42.640 You can skip all the research.
00:12:43.980 Save that money.
00:12:45.240 Thank you very much, stranger.
00:12:46.920 See, that's how they should have handled it.
00:12:48.800 Just randomly call a stranger.
00:12:51.800 Now, remember I told you I was going to change your lives?
00:12:55.200 Not all of you,
00:12:56.700 but a solid 5% to 10% of you who watch this,
00:12:59.860 either now or later,
00:13:01.180 your lives will be completely changed
00:13:03.320 by the reframe I'm about to give you.
00:13:05.740 You ready for it?
00:13:07.120 A lot of priming.
00:13:08.080 And it has to do with this obvious science
00:13:11.940 that the more you see something,
00:13:13.720 the more it changes what you think is true.
00:13:16.280 Here's the reframe.
00:13:19.440 Each of us spends a certain percentage of our day
00:13:22.400 thinking about sometimes the past
00:13:25.180 and thinking about sometimes the future.
00:13:28.960 Would you agree that so far you all agree?
00:13:32.060 Now, I'm not saying what's right or wrong.
00:13:34.020 I'm just saying that we all spend some amount of time
00:14:37.300 thinking about the past,
00:14:39.520 and some amount thinking about what we'd like the future to be.
00:13:42.520 Here's the reframe that will change your life forever.
00:13:45.720 If you think about the past too much,
00:13:47.660 it'll make you depressed
00:13:48.680 and maybe fill you with a regret.
00:13:51.760 And you're also going to be thinking about somebody
00:13:53.800 that you no longer are.
00:13:55.920 You're not the past.
00:13:57.800 So the more time you spend thinking about the past
00:14:00.800 and the problems and the things you should have done
00:14:02.980 and the things that didn't work out,
00:14:04.720 the more you're guaranteed to be depressed.
00:14:07.520 Because what is true is what you think about the most.
00:14:11.320 So if you think about the past the most,
00:14:13.540 even though the past doesn't even exist, it's gone.
00:14:16.860 You can't grab a piece of the past.
00:14:19.340 It's gone.
00:14:20.180 But if you think about it the most,
00:14:21.780 it becomes your reality.
00:14:23.660 Now take it the other way.
00:14:25.060 If you think about the future and what it could be,
00:14:28.160 if you make the right moves,
00:14:29.900 you become like Arnold Schwarzenegger.
00:14:33.540 He's someone who talks about visualizing the future.
00:14:37.280 You've probably heard other people,
00:14:39.640 like probably Tony Robbins,
00:14:41.480 maybe you've listened to me.
00:14:43.600 And Jim Carrey says the same thing.
00:14:46.640 So you'll find tons and tons of super successful people
00:14:49.960 who will consistently tell you the same thing,
00:14:53.260 that they think mostly about the future
00:14:56.080 and then they imagine the future they want.
00:14:59.540 Arnold Schwarzenegger's got a great documentary
00:15:01.420 on Netflix right now,
00:15:02.740 the story of his life.
00:15:04.240 I recommend it.
00:15:05.100 I haven't finished it,
00:15:05.940 but I totally recommend it.
00:15:08.080 For the mental part.
00:15:10.400 So just as it is true
00:15:12.040 that if a message is given to you over and over,
00:15:14.320 you'll think it's true,
00:15:15.180 whether it is or not,
00:15:16.700 when you think about your past,
00:15:18.180 that becomes your reality.
00:15:20.080 And it's optional.
00:15:20.980 You can change the percentage of time
00:15:24.120 you think about your ideal future,
00:15:27.080 you know, an ideal future
00:15:28.200 where you have some steps
00:15:29.420 where you could actually get there.
00:15:31.260 And that will completely transform
00:15:33.940 your experience of life
00:15:35.360 as well as your success.
00:15:37.540 So the simplest way to remember this
00:15:39.540 is that the more time you spend
00:15:41.220 thinking about your preferred
00:15:43.520 realistic path in the future,
00:15:46.020 and the less time you think about where you were,
00:15:48.760 the happier you will be
00:15:51.240 and the more successful you'll be.
00:15:53.740 Now, half of you just said,
00:15:56.120 I already do that, Scott.
00:15:57.340 I know.
00:15:58.540 Remember, I'm going after 5% to 10% of you. 0.99
00:16:01.400 So some of you are just along for the ride,
00:16:03.580 and you're part of the productive change
00:16:06.740 for the 5% or 10%.
00:16:08.020 5% or 10% of you just said,
00:16:10.780 wait a minute,
00:16:11.440 you just pulled together
00:16:12.520 everything I was thinking about,
00:16:14.100 but I didn't quite see it that cleanly.
00:16:16.320 Because you all knew that affirmations are good.
00:16:19.720 You all knew that positive thinking is good.
00:16:23.320 Right?
00:16:24.160 You all knew that thinking about the future
00:16:26.000 and having a plan is good.
00:16:28.020 But I just simplified it in a way
00:16:29.820 that will actually activate it for you.
00:16:32.200 If you're thinking about the past,
00:16:34.120 just say, get out.
00:16:35.300 Get out of your head.
00:16:36.620 See, now I'm combining two techniques.
00:16:38.800 The get out.
00:16:39.800 If you're locked into your thoughts
00:16:41.280 of something that happened,
00:16:43.000 just say, get out.
00:16:44.360 Get out.
00:16:45.180 Think of the future.
00:16:46.640 Think of what you want to happen tomorrow.
00:16:48.320 Think of what you want to happen in five years.
00:16:50.300 You can do it.
00:16:51.360 All you have to do is
00:16:52.300 build yourself a little habit.
00:16:54.620 Completely changes your life.
00:16:58.020 All right.
00:16:58.580 Well, the complaints about air travel
00:17:01.020 are up 12% since last year.
00:17:03.680 Tons of complaints.
00:17:04.520 This is more evidence of what I call
00:17:06.680 the incompetence crisis.
00:17:10.080 We have a general incompetence crisis
00:17:12.680 where we expect that nothing
00:17:15.140 that used to be simple
00:17:16.500 will ever work.
00:17:18.100 I mean, even the most basic transactions
00:17:20.600 that you used to think would work,
00:17:22.740 I don't really think they'll even work
00:17:24.620 before I try them now.
00:17:25.760 So, I don't know what's behind it,
00:17:29.080 but we got a lot of incompetence happening.
00:17:31.240 I think a lot of it has to do
00:17:32.860 with everything's more complicated.
00:17:35.200 You know, humans didn't get smarter,
00:17:36.940 but everything that we have to do
00:17:39.500 got more complicated.
00:17:40.900 So, that would get you
00:17:42.060 to an incompetence crisis.
00:17:43.740 And then add the DEI programs,
00:17:45.700 which (and this has nothing to do with the people,
00:17:47.580 it has to do with the program itself)
00:17:48.700 would artificially require
00:17:51.440 more demand than there is supply.
00:17:55.760 Short term.
00:17:58.360 Anyway, seven,
00:18:00.080 or some people say six,
00:18:02.320 in the news,
00:18:03.240 U.S. service members
00:18:04.100 were injured in a raid in Iraq
00:18:05.820 that went after a bunch of ISIS militants
00:18:09.040 and killed them. 0.67
00:18:10.660 So, we're still fighting in Iraq.
00:18:13.660 We're still fighting ISIS.
00:18:15.740 I hope that this is just,
00:18:18.100 I think they call it mowing the lawn.
00:18:21.340 You know, every once in a while,
00:18:22.260 you have to go out and mow the lawn
00:18:23.420 and just make sure that you get
00:18:25.180 the new militants that are popping up.
00:18:28.660 So, I hope that's not any kind
00:18:29.880 of a big trend that's coming.
00:18:32.340 I hope that was just mowing the lawn,
00:18:34.300 as they say,
00:18:34.900 which is a terribly diminishing thing
00:18:37.040 to say about human life,
00:18:38.300 even ISIS.
00:18:39.020 All right.
00:18:40.420 Searchlight Pictures presents
00:18:43.260 The Roses,
00:18:44.180 only in theaters August 29th.
00:18:46.200 From the director of Meet the Parents
00:18:47.920 and the writer of Poor Things
00:18:49.520 comes The Roses,
00:18:50.980 starring Academy Award winner
00:18:52.480 Olivia Colman,
00:18:53.680 Academy Award nominee
00:18:54.800 Benedict Cumberbatch,
00:18:56.180 Andy Samberg,
00:18:57.140 Kate McKinnon,
00:18:57.900 and Allison Janney.
00:18:59.060 A hilarious new comedy
00:19:00.280 filled with drama,
00:19:01.620 excitement,
00:19:02.260 and a little bit of hatred,
00:19:03.760 proving that marriage
00:19:04.700 isn't always a bed of roses.
00:19:06.580 See The Roses,
00:19:08.040 only in theaters August 29th.
00:19:09.760 Get tickets now.
00:19:12.300 According to The Wall Street Journal,
00:19:14.620 American consumers are more optimistic.
00:19:17.120 So, they asked 1,500 voters
00:19:19.000 in late August,
00:19:20.440 and a pretty big change.
00:19:22.460 So, they asked about their feelings
00:19:23.540 about the economy.
00:19:24.380 34% said it's improving
00:19:26.920 compared to 26 in early July.
00:19:29.660 That's a pretty big move,
00:19:31.120 but 34% is still only a third
00:19:34.840 of the country thinks things
00:19:35.960 are moving in the right direction.
00:21:38.080 And then the share who thought
00:21:39.180 the economy was worse
00:21:40.120 fell to 48% from 54%.
00:19:42.680 So, of course,
00:19:43.880 that would be an equivalent change.
00:19:46.620 Now, so what do you make
00:19:50.180 of the fact that now that you know
00:19:53.760 that the more you hear something,
00:19:55.700 the more true it is,
00:19:57.260 even if it's not true?
00:19:59.000 So, we just talked about that.
00:20:01.120 So, then you look to
00:20:01.920 The Wall Street Journal
00:20:02.700 and you try to pull together
00:20:03.760 these concepts.
00:20:04.560 The Wall Street Journal says
00:20:06.620 the economy might be improving
00:20:09.080 according to consumers.
00:20:11.000 And if consumer opinions go up,
00:20:13.240 then probably the economy
00:20:15.520 will follow the consumer sentiment
00:20:17.340 because the sentiment
00:20:18.660 makes them buy stuff,
00:20:19.840 buying stuff is good
00:20:20.660 for the economy, etc.
00:20:22.500 So, how do you read this story then?
00:20:25.560 Do you read this story
00:20:26.400 as a true and useful piece of news?
00:20:30.780 I don't.
00:20:33.840 I see all the news
00:20:35.580 within the umbrella
00:20:36.400 of brainwashing and propaganda,
00:20:39.280 that everybody's got an agenda.
00:20:41.460 And when you get this close
00:20:42.920 to the election
00:20:43.580 and a major publication
00:20:45.020 tells you that people
00:20:46.840 are thinking better
00:20:47.780 about the economy,
00:20:49.220 how do you take that?
00:20:51.240 Well, if you saw enough stories
00:20:53.400 that says,
00:20:53.940 I think this economy
00:20:54.900 is turning around,
00:20:55.880 what would the public think?
00:20:57.380 they would think
00:20:58.580 the economy is turning around.
00:21:00.360 And then they would vote
00:21:01.240 for whoever did that.
00:21:03.560 So, I don't really even see
00:21:05.640 stories like this as news.
00:21:07.840 To me, they're newsy.
00:21:09.880 I mean, it's probably
00:21:11.520 a true survey.
00:21:13.220 I'm not saying it's false.
00:21:15.140 I'm saying that
00:21:16.020 what they decide to talk about
00:21:18.380 is a decision
00:21:21.480 about how much you see
00:21:22.800 of every message.
00:21:24.220 If the news can decide
00:21:25.720 how much you see
00:21:26.820 of each message,
00:21:28.400 then even what is true
00:21:30.040 and what is false
00:21:30.740 stops mattering.
00:21:32.860 Because remember,
00:21:34.020 what you see the most
00:21:35.480 is your truth.
00:21:37.400 That's what you lock in.
00:21:39.160 So, if they keep telling you
00:21:40.560 things are getting better,
00:21:41.660 well, maybe that's
00:21:42.500 an indication
00:21:43.520 that whoever's writing
00:21:44.600 these articles
00:21:45.320 might want a little bit
00:21:47.620 less of Trump
00:21:48.340 and a little bit more
00:21:49.180 of Harris.
00:21:50.500 Maybe.
00:21:51.340 You don't know that.
00:21:52.640 I'm not a mind reader.
00:21:54.220 I'm just saying
00:21:54.640 that's what you'd expect
00:21:55.540 the more you see it.
00:21:58.460 All right.
00:22:00.880 Here's what
00:22:01.460 Nicole Shanahan,
00:22:02.860 RFK Jr.'s running mate,
00:22:06.040 said on Fox News.
00:22:08.160 Democrats spend millions
00:22:09.420 on lawsuits
00:22:09.980 to keep us off the ballots.
00:22:12.500 Now that we've suspended
00:22:13.520 our campaign,
00:22:14.200 they're scrambling
00:22:14.900 to keep us on the ballot.
00:22:17.420 You know,
00:22:18.180 so you know
00:22:19.080 the background of that, right?
00:22:20.620 So,
00:22:21.700 the Democrats
00:22:23.480 were trying to make sure
00:22:24.420 RFK Jr. couldn't run
00:22:25.840 so they used all their
00:22:26.660 lawfare that they could.
00:22:28.440 That cost the campaign
00:22:29.980 millions of dollars
00:22:30.940 to defend against
00:22:31.800 all that stuff.
00:22:32.820 And then as soon as he said,
00:22:34.260 but you know,
00:22:34.620 I'm going to take myself
00:22:35.400 off the ballot
00:22:36.100 in some of these
00:22:37.100 battleground zones
00:22:38.640 so that Trump
00:22:39.700 has a better chance
00:22:40.360 of winning.
00:22:41.320 Now they're fighting
00:22:42.200 the opposite fight.
00:22:43.140 Oh,
00:22:43.960 you're on the ballot now.
00:22:45.440 You're definitely
00:22:46.420 staying on the ballot now.
00:22:48.060 Oh,
00:22:48.300 we're not taking you
00:22:49.080 off the ballot.
00:22:50.280 Nice try.
00:22:50.940 We're keeping you
00:22:51.520 on the ballot.
00:22:52.840 So,
00:22:53.400 I've never seen
00:22:54.180 a more naked
00:22:55.080 abuse of power.
00:22:57.680 But here's my question.
00:22:59.760 How many Democrats
00:23:00.760 are even aware of this?
00:23:03.300 It's on Fox News.
00:23:05.220 I'll bet every one
00:23:06.000 of you were aware of it
00:23:07.020 because if you're
00:23:07.860 watching this,
00:23:08.440 you're probably news
00:23:09.180 junkies like I am.
00:23:11.360 I'll bet every one
00:23:12.080 of you knew it,
00:23:13.060 that the Democrats
00:23:13.880 went from trying
00:23:15.160 to keep them
00:23:15.640 off the ballots
00:23:16.280 to trying to keep
00:23:17.160 them on the ballots.
00:23:18.680 And none of this
00:23:19.340 has anything to do
00:23:20.120 with democracy
00:23:20.940 or freedom
00:23:21.640 or anything.
00:23:23.700 It's just
00:23:24.040 a naked power move.
00:23:26.140 Now,
00:23:26.360 I'm not saying
00:23:26.880 that maybe Republicans
00:23:27.960 would never do
00:23:28.780 any naked power moves.
00:23:30.500 This isn't something
00:23:31.360 that's limited
00:23:31.900 to one side,
00:23:32.580 of course.
00:23:33.460 But I guess
00:23:35.100 the thing that shocks me
00:23:36.860 is that it could
00:23:38.220 be so blatant.
00:23:39.720 And the reason
00:23:40.420 it could be so blatant
00:23:41.400 is that nobody
00:23:42.660 who's voting Democrat
00:23:44.920 will ever even
00:23:46.100 see this more than once.
00:23:48.440 They might have
00:23:49.240 seen it once,
00:23:50.240 but if you only
00:23:51.520 see it once,
00:23:52.240 it doesn't become
00:23:52.960 a thing.
00:23:53.840 Even though you know
00:23:54.700 it's true,
00:23:55.500 if you've only heard
00:23:56.840 it once,
00:23:58.040 it will never
00:23:58.580 activate any action.
00:24:00.780 So the news
00:24:01.600 can actually report
00:24:02.580 this accurately
00:24:03.300 and say,
00:24:03.940 well,
00:24:04.700 they were doing this
00:24:05.560 but now they're
00:24:06.080 doing this
00:24:06.600 or they could have
00:24:07.780 Scott Jennings come on
00:24:09.140 and every once in a while
00:24:10.100 he'll say,
00:24:10.800 but, you know,
00:24:11.380 remind you they did this
00:24:12.640 and now they're doing that.
00:24:14.100 As long as it doesn't
00:24:15.320 come up much,
00:24:16.980 it won't have any effect
00:24:18.560 on Democrats.
00:24:21.060 It would have to come up
00:24:22.160 over and over again
00:24:23.020 before anything sinks in.
00:24:24.620 That's what we just learned.
00:24:27.940 So it's such a naked power play,
00:24:29.840 but they can get away with it.
00:24:31.900 So this is
00:24:32.880 Trump's new framing
00:24:34.120 of Harris.
00:24:36.060 He said,
00:24:36.820 quote,
00:24:37.120 Harris didn't do
00:24:38.100 a lot of interviews
00:24:38.820 because she's not good at it, 0.95
00:24:41.120 I guess,
00:24:41.700 which I think
00:24:43.160 is everybody's bottom line.
00:24:45.160 It looks like
00:24:45.660 she's not good at it. 1.00
00:24:46.940 Yeah,
00:24:47.220 she's not good at it. 1.00
00:24:47.960 We'll talk more about that.
00:24:49.320 And he says,
00:24:49.980 quote,
00:24:50.400 I think she would have
00:24:51.420 been better off
00:24:52.280 if she just did interviews,
00:24:55.120 even if they weren't great.
00:24:56.740 It would have been better
00:24:57.660 because now everyone's watching
00:24:59.060 and now we see
00:25:00.300 she's defective. 1.00
00:25:01.200 she's a defective person. 1.00
00:25:04.440 And we don't need
00:25:05.420 another defective person
00:25:06.620 as President of the United States.
00:25:08.060 We just had that.
00:25:14.820 He is so good
00:25:16.040 with framing and words.
00:25:19.440 I never really thought
00:25:20.740 of that word,
00:25:21.700 defective.
00:25:23.380 Not once have I ever thought
00:25:25.280 that would be the right word
00:25:26.380 to apply to any of the situation.
00:25:28.460 But as soon as you hear it,
00:25:31.060 you say to yourself,
00:25:32.240 oh, wow,
00:25:33.340 he found a way
00:25:34.140 to tie dementia
00:25:35.520 to whatever problems
00:25:38.860 Kamala Harris has
00:25:39.920 talking in public.
00:25:41.160 We don't know
00:25:41.680 what drives that.
00:25:43.140 But they're both defective.
00:25:45.840 So he doesn't have to be
00:25:47.040 any more specific than that
00:25:48.780 because as soon as you hear
00:25:49.980 that word,
00:25:50.540 you're like,
00:25:51.140 defective.
00:25:52.500 Yeah.
00:25:52.900 Yeah.
00:25:53.860 Yeah.
00:25:54.620 It's not like
00:25:55.640 two regular politicians
00:25:58.160 in a contest.
00:26:00.620 Kamala Harris does actually
00:26:02.060 look defective.
00:26:03.780 Now,
00:26:04.580 to be fair,
00:26:06.000 the Democrats would say,
00:26:07.200 but that Trump
00:26:08.040 is defective
00:26:08.840 in so many ways,
00:26:10.080 but they don't use that word.
00:26:11.400 So it's just funny
00:26:12.100 that he picked that word first.
00:26:13.640 They probably will,
00:26:15.020 you know,
00:26:15.200 now that he's used it.
00:26:16.240 If it makes a dent,
00:26:17.820 so far it hasn't.
00:26:18.780 But if it makes a dent,
00:26:21.000 they're going to pick it up
00:26:22.640 and say,
00:26:23.120 you're defective. 0.66
00:26:24.260 Oh, yeah?
00:26:24.920 You're defective.
00:26:25.920 But not until it works.
00:26:27.540 So it's not working yet.
00:26:29.220 But I love that choice of words
00:26:30.500 just tying
00:26:31.100 defective Biden
00:26:33.020 to defective Harris.
00:26:34.320 It does feel like
00:26:35.260 if you were looking at them
00:26:36.720 as a product,
00:26:38.340 well,
00:26:38.540 let's put it this way.
00:26:40.360 All right.
00:26:40.580 Here's the best way
00:26:41.260 to look at it.
00:26:43.280 If robots were
00:26:44.560 a little bit better
00:26:45.360 and a robot
00:26:46.720 could run for office,
00:26:47.920 and there was
00:26:49.680 a robot Harris
00:26:50.640 and there was
00:26:51.140 a robot Biden
00:26:52.000 and they acted
00:26:53.140 just like the real
human beings act 0.77
00:26:55.460 but they're robots.
00:26:56.660 What would you say
00:26:57.520 about the robots?
00:26:59.900 You would say
00:27:00.660 they're defective.
00:27:02.280 Wouldn't you?
00:27:03.300 If you built a robot
00:27:04.500 and it couldn't handle
00:27:05.440 an interview,
00:27:06.400 you'd say,
00:27:06.860 ah,
00:27:07.440 what did we do wrong
00:27:08.940 with a robot?
00:27:09.760 It couldn't handle
00:27:10.480 the interview.
00:27:11.500 If you met a robot
00:27:12.600 that acted exactly
00:27:13.600 like Joe Biden,
00:27:15.280 you'd say,
00:27:15.980 oh, man,
00:27:16.700 back to the drawing board.
00:27:17.920 This robot
00:27:18.540 couldn't even do
00:27:19.220 a debate
00:27:19.820 without looking
00:27:20.440 like it was glitching out.
00:27:22.700 So the fact
00:27:23.620 that he uses
00:27:24.380 like a robot name
00:27:25.600 to describe
00:27:26.600 both of them
00:27:27.140 as defective,
00:27:28.680 I don't know.
00:27:29.660 There's just something
00:27:30.280 perfect about it.
00:27:31.720 The timing of it
00:27:32.680 is right.
00:27:34.980 Fox News
00:27:35.700 had a body language
00:27:36.920 expert
00:27:37.760 talking about
00:27:38.540 Kamala Harris
00:27:39.420 and I didn't see
00:27:40.880 on the clip
00:27:41.380 the name of the expert
00:27:42.500 so I'm not leaving
00:27:43.720 it out
00:27:44.180 by neglect.
00:27:46.400 It just wasn't
00:27:47.260 easily found.
00:27:48.620 So,
00:27:49.300 I apologize
00:27:49.940 to the expert
00:27:50.720 but
00:27:51.440 there's a body language
00:27:53.240 expert who said
00:27:54.120 that Kamala Harris
00:27:55.320 lacked confidence
00:27:56.700 and presidential
00:27:57.620 appearance,
00:27:59.600 that she was
00:28:00.160 bobbing and waffling
00:28:01.340 showing that she
00:28:02.120 didn't have confidence
00:28:02.980 or she was unprepared
00:28:04.060 and that
00:28:05.600 it's an indication
00:28:06.740 says the body language
00:28:07.820 expert
00:28:08.180 that her words
00:28:09.100 were not matching
00:28:09.920 her internal feelings.
00:28:11.320 her words
00:28:13.980 were not matching
00:28:14.660 her internal feelings.
00:28:16.120 I felt that.
00:28:17.640 Now,
00:28:18.020 I'm no body language
00:28:19.240 expert.
00:28:20.280 Well,
00:28:20.700 a little bit.
00:28:21.780 I mean,
00:28:22.120 I do spend a lot
00:28:23.060 of time looking
00:28:23.600 into it.
00:28:24.140 It's within
00:28:24.900 the persuasion
00:28:25.660 domain
00:28:26.100 but I wouldn't
00:28:27.520 call myself
00:28:28.020 a body language
00:28:28.720 expert.
00:28:29.460 I just
00:28:29.800 pay attention
00:28:30.760 to it more
00:28:31.260 than other people
00:28:31.880 do.
00:28:33.960 But he said
00:28:34.500 she would
00:28:36.460 break her gaze.
00:28:38.540 If you watched
00:28:39.180 it,
00:28:39.460 did you notice
00:28:39.840 how often
00:28:40.260 she would look
00:28:40.740 down
00:28:41.080 instead of
00:28:41.620 looking at
00:28:41.960 the person
00:28:42.320 she was
00:28:42.660 talking to?
00:28:44.240 Who looks
00:28:44.980 away from
00:28:45.480 the person
00:28:45.900 they're talking
00:28:46.380 to?
00:28:47.660 I've done
00:28:48.220 a lot of
00:28:48.700 interviews
00:28:49.080 with a lot
00:28:49.820 of,
00:28:50.060 you know,
00:28:50.260 hundreds and
00:28:50.860 hundreds of
00:28:51.320 interviews.
00:28:51.900 I don't believe
00:28:52.860 I've ever
00:28:53.560 not looked at
00:28:55.420 them when I was
00:28:55.960 talking to them
00:28:56.680 if it was just
00:28:57.340 the two of us
00:28:57.900 on camera.
00:28:59.220 And imagine
00:29:00.220 the person
00:29:00.660 sitting right
00:29:01.160 next to you,
00:29:02.900 you know,
00:29:03.120 like right
00:29:03.760 to your left
00:29:04.580 instead of
00:29:05.680 looking at
00:29:06.160 them the whole
00:29:06.540 time and
00:29:06.900 talking,
you spend a good deal of time like
00:29:09.620 looking down
00:29:10.460 to form
00:29:11.860 your thoughts.
00:29:12.920 That's a
00:29:13.560 real bad
00:29:14.100 look.
00:29:15.340 Yeah,
00:29:15.520 that's a
00:29:15.920 bad,
00:29:16.340 bad look.
00:29:17.140 It was like
00:29:17.580 she couldn't
00:29:18.060 make eye
00:29:18.540 contact and
00:29:19.320 also keep
00:29:19.900 her thoughts
00:29:20.360 straight.
00:29:21.380 So that's
00:29:22.120 what you want
00:29:22.560 negotiating with
00:29:23.340 Putin,
00:29:24.340 someone who
00:29:24.840 can't make
00:29:25.480 eye contact
00:29:26.100 and also
00:29:27.260 think at the
00:29:27.960 same time.
It's a
00:29:30.080 problem.
00:29:30.880 It's a
00:29:31.140 big problem.
00:29:34.000 Byron
00:29:34.480 Donalds was
00:29:35.220 making similar
00:29:36.040 comments about
00:29:36.680 the body
00:29:36.980 language on
00:29:38.280 Fox News
00:29:40.520 and he said
00:29:41.040 that when
00:29:42.820 Kamala was
00:29:43.340 asked about
00:29:43.720 her family,
00:29:44.840 she looks
00:29:45.200 relaxed and
00:29:46.040 she makes 0.98
00:29:46.400 eye contact.
00:29:48.320 And I
00:29:48.520 thought,
00:29:48.820 ooh,
00:29:49.340 Byron
00:29:49.700 Donalds,
00:29:50.460 good observation.
00:29:51.740 That is
00:29:52.080 exactly correct.
00:29:53.780 When she's
00:29:54.300 not lying and
00:29:55.600 you know it
00:29:56.240 because she's
00:29:56.760 just talking
00:29:57.480 about her
00:29:57.800 feelings or
00:29:58.420 her family
00:29:58.800 or something,
00:29:59.220 there's no
00:29:59.520 reason to
00:29:59.920 lie.
00:30:01.100 She looks
00:30:01.800 relaxed and
00:30:02.580 like a
00:30:02.920 normal person.
00:30:04.580 But as
00:30:05.240 soon as she
00:30:05.760 had to talk
00:30:06.200 about policies,
00:30:08.400 because she's
00:30:08.980 got all that
00:30:09.540 flip-flopping and
00:30:10.620 maybe she's not
00:30:11.260 the best at
00:30:11.860 explaining her
00:30:12.560 point of view,
she looked less confident and
00:30:15.460 she had to
00:30:15.860 look away and
00:30:16.560 look at her
00:30:16.880 hand and
00:30:17.400 look at the
00:30:17.760 table and
00:30:18.340 look at
00:30:18.600 everything else.
00:30:21.060 And Byron
00:30:21.920 says,
00:30:22.500 why does she 0.56
00:30:23.180 have to do
00:30:23.500 that?
00:30:23.740 Because she
00:30:24.120 doesn't believe
00:30:24.780 these policies.
00:30:26.240 And she's
00:30:26.560 trying to
00:30:26.900 remember what
00:30:27.440 staffers
00:30:27.940 coached her
00:30:28.480 to say.
00:30:29.240 Now, I
00:30:29.760 can't read
00:30:30.240 minds, so
00:30:32.000 I'm not going
00:30:32.920 to agree with
00:30:34.140 Byron that
00:30:36.260 that's what I
00:30:36.720 can see in
00:30:37.220 her mind.
00:30:38.340 But is it a
00:30:39.820 reasonable
00:30:40.460 perspective that
00:30:42.660 it certainly
00:30:43.160 looks like that's
00:30:44.060 what's happening?
00:30:44.860 Yes.
00:30:45.860 Yes.
00:30:46.240 From the
00:30:46.660 perspective of
00:30:47.480 that's what it
00:30:48.040 looks like,
00:30:49.000 absolutely.
00:30:50.200 But remember,
00:30:51.480 we can't read
00:30:52.080 minds, so don't
00:30:53.120 go too far with
00:30:53.940 it.
00:30:54.760 You know, Byron's
00:30:55.360 in a political
00:30:55.980 realm, so
00:30:57.540 going a little
00:30:58.200 extra far, sort
00:30:59.740 of normal
00:31:00.200 politics, but
00:31:01.220 he can't read
00:31:01.780 her mind.
00:31:03.200 He's just
00:31:03.620 really good at
00:31:04.200 communicating.
00:31:07.260 All right,
00:31:07.760 Bill Maher's
00:31:08.800 back from his
00:31:09.740 summer break,
00:31:11.920 and he
joked, talking about Harris, I don't know why we ever thought she was as bad as people thought she was.
00:31:24.600 He said
00:31:26.520 that Biden
00:31:26.920 had one
00:31:27.360 bad night,
00:31:28.040 but she
00:31:28.260 had a bad
00:31:29.600 three years.
00:31:31.600 She did,
00:31:32.440 but she's 0.97
00:31:32.800 fine.
00:31:33.460 So even
00:31:33.820 Bill Maher
00:31:34.500 can see that
00:31:36.340 Biden was
00:31:36.860 defective,
00:31:38.060 you know,
00:31:38.300 there's no
00:31:38.760 question about
00:31:39.380 that anymore,
00:31:40.200 but more
00:31:40.900 than that,
00:31:41.660 that Harris
00:31:42.880 has been
00:31:43.300 basically defective
00:31:44.400 for the entire
00:31:45.300 time she's
00:31:45.800 been in office.
00:31:47.360 Now, here's
00:31:47.820 my question.
00:31:51.800 Nothing
00:31:52.520 fascinates
00:31:53.580 me more
00:31:54.260 than watching
00:31:56.140 Bill Maher
00:31:56.800 navigate the
00:31:57.760 situation in
00:31:59.200 which he is
00:32:00.180 very clearly
00:32:01.140 aware at
00:32:02.040 this point
00:32:02.640 that he's
00:32:03.800 on the bad
00:32:04.380 side.
00:32:05.900 Now, he
00:32:06.280 didn't start
00:32:06.780 that way,
00:32:08.000 and indeed,
00:32:09.180 you know,
00:32:09.500 as you know,
00:32:10.460 I've had
00:32:11.660 more Democrat
00:32:12.680 voting and
00:32:14.860 support over
my life. I just
00:32:18.440 don't have
00:32:18.780 it at the
00:32:19.120 moment.
00:32:20.240 So if I
00:32:21.280 look at the
00:32:21.760 two parties,
00:32:22.400 I say to
00:32:22.800 myself,
00:32:23.400 one of those
00:32:24.000 parties looks
00:32:24.560 like a pure
00:32:25.240 criminal party.
00:32:26.900 One of them
00:32:27.420 looks like they
00:32:27.940 still like the
00:32:28.520 Constitution.
They don't even look a little bit similar to me.
00:32:32.920 There was a
00:32:33.520 time when we
00:32:34.120 used to say
00:32:34.560 that parties
00:32:35.060 are the same.
00:32:36.220 You know,
00:32:36.400 Al Gore is
00:32:37.060 going to give
00:32:37.400 you the same
00:32:37.820 thing as
00:32:38.240 George Bush.
00:32:39.660 And that's
00:32:40.560 not true,
00:32:41.780 but it's
00:32:43.840 way more true
00:32:44.640 that the
00:32:45.020 difference is
00:32:45.620 extreme at this
00:32:46.540 point.
00:32:48.060 So if
00:32:48.720 you're Bill
00:32:49.020 Maher and
00:32:50.260 you obviously
00:32:51.820 you're swimming
00:32:52.520 in politics,
00:32:53.160 it's your
00:32:53.660 job, how
00:32:56.260 do you not
00:32:56.800 notice that
00:32:58.000 your team is
00:32:58.640 the anti-democratic
00:32:59.680 team?
00:33:00.860 Absolutely.
00:33:02.060 You know,
00:33:02.320 keeping people
00:33:02.960 off the
00:33:03.380 ballots.
00:33:03.980 And he
00:33:04.220 mentioned that,
00:33:04.840 by the way,
00:33:05.260 to his credit.
00:33:06.500 He's saying it 0.89
00:33:07.320 directly that his
00:33:08.320 team is keeping
00:33:09.060 people off the
00:33:09.740 ballots.
00:33:10.240 He had Nancy
00:33:10.760 Pelosi on.
00:33:12.580 He went kind
00:33:13.940 of hard at
00:33:14.440 her, but 0.94
00:33:14.900 also jokingly,
00:33:16.420 which is a
00:33:17.620 perfectly good
00:33:18.260 approach.
00:33:19.460 And I
00:33:22.140 think he's
00:33:22.600 very close,
00:33:23.960 not to
00:33:25.280 voting for
00:33:25.820 Trump, but
00:33:27.340 just totally
00:33:28.940 treating his
00:33:29.660 team like they
00:33:30.440 failed.
00:33:31.960 And, you
00:33:32.740 know, he's
00:33:33.160 hinting around
00:33:34.200 the corners,
00:33:34.940 certainly leaning
00:33:35.920 in that direction.
00:33:36.780 But we can all
00:33:37.680 see that.
00:33:38.700 It's a
00:33:39.300 separate question
00:33:40.220 whether Trump
00:33:41.820 is your best
00:33:42.540 choice as
00:33:43.180 president.
00:33:44.360 And I get
00:33:45.420 that reasonable
00:33:46.120 people could
00:33:46.780 have a different
00:33:47.260 opinion on
00:33:47.840 that.
00:33:48.380 I wouldn't
00:33:48.900 say anybody's
00:33:49.580 crazy if
00:33:51.000 they'd prefer
00:33:51.540 a different
00:33:51.980 president.
00:33:52.800 But if you're
00:33:53.480 looking at what
00:33:53.940 the Democrat
00:33:54.600 Party is and
00:33:56.760 how they act,
00:33:58.220 it looks purely
00:33:59.140 criminal to me.
00:34:00.940 Like, really
00:34:01.840 criminal.
00:34:02.680 Like, super,
00:34:03.660 super,
00:34:04.220 frickin' 0.93
00:34:04.640 criminal.
00:34:05.860 And how do
00:34:06.880 you not notice
00:34:07.800 that?
00:34:08.260 It feels
00:34:10.580 like, I
00:34:12.140 think, Bill
00:34:12.580 Maher does
00:34:13.180 notice it.
00:34:14.560 Because he's
00:34:15.380 talked around
00:34:16.140 it and about
00:34:16.760 it, you
00:34:17.420 know, about
00:34:17.700 the details of
00:34:18.560 it, enough
00:34:18.940 that it's
00:34:19.280 obvious he's
00:34:19.900 immersed and
00:34:21.620 is understanding
00:34:22.580 what's going
00:34:23.140 on.
00:34:23.800 I think he
00:34:24.340 probably, I
00:34:25.440 can't read
00:34:25.820 minds, probably
00:34:27.420 thinks that it's
00:34:28.340 correctable and
00:34:29.360 maybe it's
00:34:29.900 temporary and,
00:34:31.200 you know, the
00:34:31.460 party will
00:34:32.060 normalize.
00:34:33.880 But I'm not
00:34:35.040 really seeing an
00:34:35.780 indication of
00:34:36.400 that.
00:34:36.620 So at
00:34:38.080 some point
00:34:38.620 you're going
00:34:38.940 to have to
00:34:39.280 break with
00:34:41.200 your team
00:34:41.780 because you're
00:34:42.940 just a criminal
00:34:43.720 if you're
00:34:44.480 participating with
00:34:45.380 it.
00:34:45.540 At some point
00:34:46.200 you just
00:34:47.180 become a
00:34:47.640 criminal.
00:34:48.580 Now, I'm
00:34:49.040 not, I don't
00:34:50.480 want to call
00:34:50.860 Democrats
00:34:51.400 deplorables.
00:34:52.480 That's not
00:34:52.960 where I'm
00:34:53.220 going with
00:34:53.660 this.
00:34:54.220 I'm not
00:34:54.660 saying there's
00:34:55.040 anything wrong
00:34:55.500 with the
00:34:55.740 people.
00:34:56.820 I'm saying
00:34:57.580 that the
00:34:57.920 people are
00:34:58.360 in a
00:34:58.600 situation.
00:34:59.960 They're in
00:35:00.540 a brainwashing
00:35:01.340 situation and
00:35:02.860 the, let's
00:35:05.400 say the
00:35:05.880 signals that
00:35:06.980 they've been
00:35:07.360 brainwashed are
00:35:08.200 so clear at
00:35:09.060 this point that
00:35:09.740 people are
00:35:10.140 noticing.
00:35:10.840 The smart
00:35:11.300 people first.
00:35:12.840 It's the smart
00:35:13.760 people who are
00:35:14.200 noticing first and
00:35:15.140 they're leaving
00:35:15.560 first.
00:35:16.780 And I think
00:35:17.920 the rest are
00:35:18.520 going to start
00:35:19.020 noticing that
00:35:20.760 they're in
00:35:21.220 essentially a
00:35:23.280 criminal enterprise.
00:35:24.760 And you could
00:35:25.600 hate everything
00:35:26.320 that the
00:35:26.660 Republicans do
00:35:27.480 and they could
00:35:27.960 be political
00:35:28.600 and they could
00:35:29.180 do naked
00:35:29.620 political things
00:35:30.440 like everybody
00:35:30.960 else.
00:35:32.060 But it
00:35:32.740 doesn't look
00:35:33.480 like they're
00:35:33.860 trying to be
00:35:34.480 criminals.
00:35:36.120 It doesn't
00:35:36.820 look it to
00:35:37.220 me.
00:35:38.680 Anyway.
00:36:05.060 Big difference.
00:36:06.320 Reid Hoffman,
00:36:07.040 as you know,
00:36:07.620 one of the
00:36:07.900 biggest donors
00:36:08.620 to the
00:36:09.020 Democrats,
00:36:10.260 was on the
00:36:11.520 all-in pod.
00:36:12.980 Now that's
00:36:13.420 extra interesting
00:36:14.260 because they
00:36:14.920 would be people
00:36:15.920 who have lived
00:36:17.100 and dealt in
00:36:17.720 the real world.
I'm sure they
00:36:18.840 all knew
00:36:19.100 each other.
00:36:20.660 Sacks and
00:36:21.340 Hoffman had
00:36:22.100 been part of
00:36:22.620 the PayPal
00:36:23.120 group, so
00:36:25.440 they've known
00:36:25.800 each other a
00:36:26.280 long time, I
00:36:26.800 assume.
00:36:29.620 So here's, I
00:36:31.460 didn't see the
00:36:32.020 entire pod, I
00:36:32.800 just saw clips
00:36:33.460 from it, and
00:36:34.100 I've got a few
00:36:34.800 suggestions.
00:36:36.300 My first
00:36:36.680 suggestion is for
00:36:37.520 Reid Hoffman,
how to do an interview on video.
00:36:42.740 Now, if
00:36:44.740 you're looking
00:36:45.100 at me on
00:36:45.600 video, the
00:36:47.140 framing that
00:36:47.800 you see in
00:36:48.400 this video is
00:36:49.340 roughly an
00:36:50.680 ideal framing.
00:36:52.580 So I'm a
00:36:53.920 certain age and
00:36:54.680 I'm not wearing
00:36:55.540 makeup, which
00:36:56.980 is, you know,
00:36:57.380 typical on
00:36:58.080 video stuff.
00:36:59.460 These days it's
00:37:00.280 typical.
So you don't want to get too close; it's creepy if you do.
00:37:06.820 Now, I would
00:37:07.620 like to give
00:37:08.040 you my impression
00:37:08.720 of Reid Hoffman
00:37:09.480 on the all-in
00:37:10.540 pod.
00:37:10.920 This is going
00:37:11.980 to be very
00:37:12.740 disturbing to
00:37:13.460 many of you.
00:37:13.980 You might
00:37:14.240 want to look
00:37:14.640 away.
So, let me tell you about all the things I'm doing.
00:37:21.300 This is
00:37:23.080 terrible.
00:37:24.320 Don't let
00:37:24.940 this be what
00:37:25.640 you look like
00:37:26.260 on video.
00:37:27.100 Nobody wants
00:37:27.720 to see this.
00:37:28.880 If you're male
00:37:29.720 and you're a
00:37:30.240 certain age,
00:37:30.980 nobody wants
00:37:31.700 to look at
00:37:32.040 your damn 0.93
00:37:32.460 face at
00:37:33.380 all.
00:37:34.060 Certainly not
00:37:34.660 like this.
00:37:35.620 This is the
00:37:36.140 most disturbing
00:37:36.760 thing that
00:37:37.200 could ever
00:37:37.500 happen.
00:37:38.420 Don't do
00:37:38.800 this again,
00:37:39.480 Reid.
00:37:40.920 That's my
00:37:41.540 advice.
00:37:42.960 But also,
00:37:43.780 you don't
00:37:44.060 want to do
00:37:44.460 a Dershowitz.
00:37:45.640 Have you
00:37:45.960 seen Dershowitz?
00:37:47.240 The other
00:37:47.840 big mistake
00:37:48.480 is you put
00:37:48.980 your laptop
00:37:49.680 that has
00:37:51.180 your camera
00:37:51.580 on it too
00:37:52.060 low.
00:37:53.140 So, here's
00:37:53.660 the other
00:37:53.920 way to go.
00:37:56.940 This is
00:37:57.480 the Dershowitz.
00:37:59.220 Don't do
00:37:59.760 the Dershowitz
00:38:00.460 where you're
00:38:01.700 looking down
00:38:02.160 at your
00:38:02.420 computer.
00:38:04.060 Terrible
00:38:04.460 look.
00:38:05.700 No,
00:38:06.060 terrible
00:38:06.340 look.
00:38:06.640 Oh, let
00:38:08.780 me show
00:38:09.020 you what
00:38:09.220 I have
00:38:09.580 here.
00:38:09.860 I'll turn
00:38:10.320 it around
00:38:10.760 if you
00:38:11.020 can see
00:38:11.340 it.
00:38:12.100 Can you
00:38:12.380 see this
00:38:12.760 little table?
00:38:15.380 These little
00:38:15.840 tables are
00:38:16.620 perfect.
00:38:17.920 It's like
00:38:18.380 six inches
00:38:21.280 tall or
00:38:21.800 something.
00:38:22.940 And if your
00:38:24.020 chair is
00:38:24.460 right and
00:38:25.700 you set it
00:38:26.580 down on
00:38:28.280 the little
00:38:28.560 table,
00:38:30.000 you get
00:38:30.900 perfect video
00:38:34.320 just like
00:38:34.760 this.
00:38:35.560 By the
00:38:35.940 way,
00:38:36.620 this seems
00:38:37.100 obvious.
00:38:37.760 It took
00:38:38.060 me forever
00:38:38.760 to figure
00:38:39.680 out all
00:38:40.140 these little
00:38:40.580 tips.
00:38:42.340 So, if you
00:38:44.060 don't do it
00:38:44.500 every day, I
00:38:45.260 can understand
00:38:45.860 why it's
00:38:46.520 not as
00:38:46.860 obvious.
00:38:47.860 So, don't
00:38:48.200 do that.
00:38:50.360 And then
00:38:50.800 Reid was
00:38:51.760 challenged on
00:38:52.580 giving money
00:38:53.260 to groups
00:38:53.940 that are
00:38:54.980 doing some
00:38:55.820 anti-democratic
00:38:56.840 stuff, such
00:38:57.760 as trying to
00:38:58.320 keep RFK
00:38:59.060 Jr.
00:38:59.520 off the
00:38:59.840 ballot and
00:39:00.240 stuff.
00:39:00.920 And Reid
00:39:01.260 Hoffman's
00:39:01.700 answer was
that, as with a startup, he might donate money, but
00:39:09.140 he doesn't
00:39:09.520 have control
00:39:10.120 of what all
00:39:10.820 the employees
00:39:11.360 do moment
00:39:12.180 to moment
00:39:12.640 or what
00:39:13.040 the CEO
00:39:13.480 does.
00:39:14.860 And so,
00:39:15.380 similarly, he
00:39:16.360 was saying
00:39:16.720 he gives
00:39:17.100 money to
00:39:17.580 these lots
00:39:19.920 of different
00:39:20.280 groups,
00:39:20.700 apparently.
00:39:21.460 So, he
00:39:22.000 doesn't fund
00:39:22.600 one group.
00:39:23.260 He funds
00:39:23.720 apparently
00:39:24.640 numerous
00:39:25.260 groups.
00:39:26.620 And I
00:39:28.340 think he
00:39:28.680 was acknowledging
00:39:29.340 that some
00:39:30.320 of them might
00:39:30.760 be doing
00:39:31.100 some things
00:39:31.640 he didn't
00:39:31.940 like.
00:39:33.400 Does that
00:39:34.080 sound like
00:39:34.540 a credible
00:39:35.220 answer to
00:39:35.920 you?
00:39:38.200 Here's what
00:39:38.840 would be a
00:39:39.300 credible
00:39:39.640 answer to
00:39:40.280 me.
00:39:41.980 Can you
00:39:42.640 tell me the
00:39:43.120 names of
00:39:43.660 those places
00:39:44.760 that did
00:39:45.340 things wrong
00:39:45.980 so I can
00:39:47.700 never give
00:39:48.140 them money
00:39:48.500 again because
00:39:49.080 that's
00:39:49.300 terrible?
00:39:50.920 Now, that
00:39:51.640 would be
00:39:52.060 somebody who
00:39:52.740 really didn't
00:39:53.480 want his
00:39:53.880 money to
00:39:54.280 be going
00:39:54.660 to non-democratic
00:39:56.160 stuff.
00:39:56.860 That's sort
00:39:57.300 of what you'd
00:39:57.680 expect.
00:39:58.500 It's like,
00:39:58.880 oh my god,
00:39:59.580 yeah, you're
00:40:00.060 right.
00:40:00.780 Some of that
00:40:01.280 went to some
00:40:01.820 people who
00:40:02.180 did some
00:40:02.500 bad stuff.
00:40:03.720 We have
00:40:04.260 the names
00:40:04.620 of those
00:40:04.940 groups.
00:40:05.460 I'll make
00:40:05.760 sure my
00:40:06.120 money never
00:40:06.660 gets to
00:40:07.040 them.
00:40:08.580 Now, he
00:40:09.100 did say
00:40:09.580 that he
00:40:10.420 had always
00:40:10.760 been very
00:40:11.180 clear that
00:40:12.600 nobody should
00:40:13.180 be doing
00:40:13.580 any non-democratic
00:40:14.840 stuff with
00:40:15.800 his money.
00:40:16.260 But, you
00:40:18.040 know, it's
00:40:18.640 not exactly
00:40:19.240 stopping
00:40:19.620 anybody.
00:40:21.140 So, he
00:40:24.520 also said
00:40:25.300 that he
00:40:25.580 believed what
00:40:27.140 I'd call the 0.82
00:40:27.740 fine votes
00:40:28.620 hoax.
00:40:31.020 So, Reid
00:40:31.640 Hoffman
00:40:32.060 believed that
00:40:33.180 when Trump
00:40:35.040 called
Raffensperger in
00:40:37.680 Georgia after
00:40:38.360 the 2020
00:40:38.960 election, he
00:40:40.100 said we only
00:40:40.640 need to find
00:40:41.560 so many votes,
00:40:42.920 that that was a
00:40:43.620 clear indication
00:40:44.380 he was asking
00:40:45.120 him to cheat.
00:40:46.980 Now, how
00:40:48.720 do you process
00:40:50.220 that?
00:40:51.560 Because we
00:40:53.440 know Reid
00:40:53.900 Hoffman is
00:40:54.540 unusually smart,
00:40:56.300 but nobody
00:40:58.120 smart believes
00:40:58.800 this.
00:41:00.140 So, how do
00:41:00.540 you reconcile
00:41:01.200 unusually smart
00:41:02.760 with believing
00:41:04.460 something that
00:41:05.200 only a dumb
00:41:05.820 person would
00:41:06.300 believe?
00:41:08.460 Is that
00:41:09.180 brainwashing?
00:41:10.580 Or is he
00:41:11.360 lying?
Or did he just not know the facts?
00:41:15.120 Now, hold
00:41:16.180 that thought.
00:41:17.100 Hold the
00:41:17.740 thought that
00:41:18.200 he thought
00:41:18.720 that with
00:41:19.820 lots of
00:41:20.240 people listening,
00:41:21.400 because Trump
00:41:22.200 knew that
00:41:22.580 there were
00:41:22.980 people on
00:41:23.840 the call,
00:41:24.800 right?
00:41:25.500 So, he
00:41:26.320 believes that
00:41:27.080 Trump, with
00:41:28.320 lots of
00:41:28.680 people listening,
00:41:30.260 would ask
00:41:31.040 somebody to
00:41:32.160 cheat on
00:41:33.140 the election
00:41:33.560 so he
00:41:33.940 could win
00:41:34.220 the election.
00:41:36.600 That is
00:41:37.380 an absurd
00:41:37.820 belief.
00:41:39.440 A reasonable
00:41:40.180 belief is
00:41:40.880 that Trump
00:41:41.340 really believed
00:41:42.400 he won,
00:41:43.480 really believed
00:41:44.340 that it
00:41:44.640 would be
00:41:44.920 obvious the
00:41:45.620 election was
00:41:46.320 stolen, and
00:41:48.020 that it
00:41:48.320 wouldn't be
00:41:48.720 that hard to
00:41:49.420 figure out how
00:41:50.020 stolen it
00:41:50.600 was if you
00:41:51.380 were close to
00:41:51.960 it like
00:41:52.280 Raffensperger is.
00:41:53.720 So, if they
00:41:54.220 just, you
00:41:55.060 know, maybe
00:41:55.380 looked a little
00:41:55.940 harder, they
00:41:57.080 might find that
00:41:57.660 some of the
00:41:58.000 votes are
00:41:58.480 illegally cast,
00:42:00.820 something like
00:42:01.580 that.
00:42:02.340 Now, the way I
00:42:03.140 heard it,
00:42:04.240 because I'm
00:42:04.700 not, well,
00:42:07.140 maybe I am.
00:42:08.280 I was going to
00:42:08.860 say I'm not
00:42:09.380 brainwashed, but
00:42:11.140 when you're
00:42:11.460 brainwashed, you
00:42:12.120 always think
00:42:12.920 you're not, so
00:42:13.640 maybe I am.
00:42:15.640 But the way it
00:42:16.340 looks to me is
00:42:17.240 that if you
00:42:20.820 really believe
00:42:21.520 that he was
00:42:22.060 trying to
00:42:22.460 overthrow the
00:42:23.060 country by
00:42:23.760 telling somebody
00:42:24.480 that he just
00:42:26.580 needs to find
00:42:27.240 X number of
00:42:27.900 votes, I
00:42:28.880 don't know how
00:42:29.360 to process
00:42:29.800 that.
00:42:30.980 But I'm
00:42:31.360 going to give
00:42:31.680 you one more
00:42:32.300 hint that
00:42:33.420 will help you
00:42:34.100 maybe come up
00:42:34.860 with a hypothesis
00:42:35.660 of why somebody
00:42:37.280 this smart and
00:42:39.120 this connected
00:42:39.740 to politics,
00:42:40.580 because he
00:42:41.320 has an
00:42:41.640 interest in
00:42:42.120 politics, he's
00:42:42.740 not a casual
00:42:43.460 observer, why
00:42:44.600 would he think
00:42:45.120 something so
00:42:46.620 absurd?
00:42:48.120 Well, you
00:42:49.080 might say he's
00:42:49.960 brainwashed because
00:42:51.080 a lot of
00:42:51.460 Democrats think
00:42:52.340 the same absurd
00:42:53.200 thing, and they
00:42:54.420 also believe a
00:42:55.180 whole bunch of
00:42:55.720 hoaxes.
00:42:56.800 So here's one
00:42:57.820 way we could
00:42:58.440 tell if he is
00:43:00.260 lying intentionally
00:43:01.440 or he has been
00:43:03.360 hypnotized by his
00:43:04.320 own team.
00:43:06.080 One way would be
00:43:07.140 if we knew that
00:43:07.940 he believed in
00:43:08.580 another hoax,
00:43:09.340 like a real
00:43:10.500 obvious one?
00:43:11.920 If you knew
00:43:13.100 that he believed
00:43:13.700 some other
00:43:14.420 real obvious
00:43:15.260 hoax, and you
00:43:16.080 were sure that
00:43:16.600 he actually
00:43:17.040 believed it,
00:43:18.260 like not just
00:43:18.920 saying he
00:43:19.280 believed it,
00:43:20.440 wouldn't that
00:43:21.040 suggest that
00:43:22.420 maybe the
00:43:22.880 problem is
00:43:23.380 brainwashing and
00:43:24.560 not an
00:43:24.980 intentionality?
00:43:26.600 Well, at the
00:43:27.420 same podcast,
00:43:28.840 he said that
00:43:29.940 the protesters
00:43:30.800 on January 6th
00:43:32.080 quote, killed
00:43:32.760 police officers.
00:43:35.700 Killed police
00:43:36.580 officers.
00:43:37.040 Now, you
00:43:39.040 know that
00:43:39.340 didn't happen,
00:43:40.020 right?
00:43:40.640 And David
00:43:41.180 Sachs said,
00:43:41.880 whoa, I
00:43:43.460 can't let that
00:43:44.140 go.
00:43:44.940 I got to
00:43:45.280 fact check
00:43:45.720 that.
00:43:46.080 There were no
00:43:46.420 police officers
00:43:47.060 killed on
00:43:47.580 January 6th.
00:43:48.920 And then he
00:43:49.600 sort of
00:43:50.000 retreated to,
00:43:50.820 well, you
00:43:51.240 know, the
00:43:51.860 one died of
00:43:52.960 something soon
00:43:53.940 after, but it
00:43:55.460 wasn't connected.
00:43:56.780 At least nobody's
00:43:57.580 made a connection.
00:43:59.480 And one of
00:44:00.200 them may have
00:44:01.200 taken his own
00:44:01.820 life, but
00:44:02.820 again, there's
00:44:05.360 no suggestion
00:44:06.060 it's connected.
00:44:08.080 So that was
00:44:09.580 a case where
00:44:11.980 I think that
00:44:12.640 news has been
00:44:13.380 told a million
00:44:14.020 times.
00:44:15.180 Like, how do
00:44:16.000 you not know
00:44:16.740 that the
00:44:17.080 protesters didn't
00:44:18.000 kill any
00:44:18.400 police?
00:44:19.640 That's really
00:44:20.500 important.
00:44:21.680 If the
00:44:22.340 protesters had
00:44:23.120 killed police,
00:44:25.000 I mean, even
00:44:25.480 I would feel
00:44:26.020 differently about
00:44:26.800 it.
00:44:28.280 So what does
00:44:30.520 that tell you
00:44:31.000 about the
00:44:31.320 other thing?
00:44:32.660 So he
00:44:32.960 believed that
00:44:33.480 the find
00:44:34.060 votes thing
00:44:34.760 was an
00:44:36.740 attempt to
00:44:37.220 overthrow a
00:44:38.260 proper election
00:44:39.120 and he
00:44:40.160 believed that
00:44:40.620 police officers
00:44:41.400 were killed
00:44:42.040 on January
00:44:43.320 6th.
00:44:44.300 Or did he
00:44:45.140 just say it
00:44:46.020 and not
00:44:48.420 think he
00:44:49.040 wouldn't get
00:44:49.540 fact-checked?
00:44:50.400 Because of
00:44:51.120 course he
00:44:51.520 would be
00:44:51.820 fact-checked.
00:44:53.240 So I think
00:44:54.420 he wouldn't
00:44:54.840 have said it
00:44:55.640 unless he
00:44:56.920 thought it
00:44:57.260 was true.
00:44:57.660 So it
00:44:59.880 does suggest
00:45:00.700 that brain
00:45:02.420 washing might 0.97
00:45:03.140 be more
00:45:03.520 active than
00:45:04.300 some other
00:45:04.900 motive.
00:45:05.900 But we
00:45:06.200 don't know.
00:45:06.760 Can't read
00:45:07.320 his mind.
00:45:08.640 So I
00:45:09.960 don't know.
00:45:12.500 It's confusing.
00:45:13.920 I'm going to
00:45:14.340 say he
00:45:14.800 couldn't.
00:45:15.520 There's no
00:45:16.060 way that the
00:45:17.040 problem is
00:45:17.660 he's dumb. 0.91
00:45:19.360 Would you
00:45:19.960 agree we
00:45:20.500 can eliminate
00:45:21.740 dumb as
00:45:22.820 one of the
00:45:23.200 explanations?
00:45:24.180 There's just
00:45:24.540 no way he's
00:45:25.060 dumb.
00:45:26.020 That's just
00:45:26.560 not a thing.
00:45:27.660 He's very
00:45:28.520 smart.
00:45:29.680 So if you
00:45:30.440 eliminate
00:45:30.760 dumb, then
00:45:32.960 there's
00:45:33.320 misinformed.
00:45:35.520 But it
00:45:36.140 didn't sound
00:45:36.600 like he was
00:45:37.100 exactly
00:45:37.700 misinformed.
00:45:39.260 It looked
00:45:39.740 like he
00:45:40.300 knew stuff,
00:45:41.720 but he was
00:45:42.420 interpreting it
00:45:43.240 through an
00:45:43.600 absurd filter.
00:45:45.260 And that's
00:45:45.600 a tell for
00:45:46.160 brainwashing.
00:45:48.180 So I'm
00:45:48.880 going to go
00:45:49.220 with he's
00:45:50.000 actually
00:45:50.280 brainwashed.
00:45:52.280 Now he
00:45:53.000 may have
00:45:53.460 other things
00:45:54.060 going on as
00:45:54.640 well.
00:45:55.740 But I'll
00:45:57.560 just say
00:45:58.100 that Elon
00:45:58.680 Musk made 1.00
00:45:59.440 a public
00:46:01.160 prediction
00:46:02.140 here that
00:46:02.900 I have
00:46:04.620 no evidence
00:46:05.460 whatsoever
00:46:06.540 that this
00:46:08.200 would be
00:46:08.500 an accurate
00:46:08.940 statement.
00:46:10.160 So I'm
00:46:10.600 not backing
00:46:11.160 this.
00:46:11.700 I'm just
00:46:12.000 telling you
00:46:12.540 what Elon
00:46:13.680 Musk said
00:46:14.280 yesterday,
00:46:14.960 I think,
00:46:16.240 based on
00:46:16.680 the, because
00:46:17.360 of the
00:46:17.560 All-In pod.
00:46:19.900 Elon Musk
00:46:20.520 posted on
00:46:21.440 X, he
00:46:21.860 said,
00:46:22.200 Reid is
00:46:22.860 terrified of
00:46:23.560 Trump winning
00:46:24.220 and being
00:46:25.920 prosecuted for
00:46:26.800 being one
00:46:27.200 of Epstein's
00:46:27.880 top clients.
00:46:30.200 Yikes.
00:46:32.500 Yikes.
00:46:35.800 Would Elon
00:46:36.600 Musk say 0.99
00:46:37.440 that if he
00:46:39.160 didn't have
00:46:39.640 any extra
00:46:40.420 information,
00:46:41.820 you know,
00:46:42.080 beyond what
00:46:42.560 you and I
00:46:42.960 know?
00:46:44.260 Would he?
00:46:45.140 I don't know.
00:46:46.180 I mean,
00:46:46.420 actually, that's
00:46:47.260 an open question.
00:46:48.200 Would he do
00:46:48.920 it without
00:46:51.020 any extra
00:46:51.960 information?
00:46:52.560 Like, what
00:46:53.980 he knows is
00:46:54.620 exactly what
00:46:55.400 you know,
00:46:56.040 that, and
00:46:57.380 Reid has
00:46:57.840 admitted he's
00:46:58.480 used the, he
00:46:59.740 had used the
00:47:00.220 Epstein plane,
00:47:01.240 but there's
00:47:02.420 no indication
00:47:03.160 whatsoever in
00:47:04.200 the public
00:47:04.600 domain that
00:47:06.220 he did
00:47:06.500 anything illegal
00:47:07.280 or inappropriate.
00:47:09.460 All right.
00:47:09.680 I want to say
00:47:10.080 that as clearly
00:47:10.680 as possible.
00:47:11.620 I'm not aware
00:47:12.440 of anything.
00:47:14.200 Just being on
00:47:15.300 the plane is
00:47:16.360 not enough for
00:47:17.000 me, right?
00:47:18.180 Because the
00:47:18.480 plane was used
00:47:19.400 for a variety
00:47:20.440 of things,
00:47:20.880 and obviously
00:47:21.920 Epstein made
00:47:22.580 his plane
00:47:22.940 available as
00:47:24.180 part of his
00:47:24.720 networking and
00:47:25.700 doing favors
00:47:27.340 for people who
00:47:27.980 might do favors
00:47:28.640 for him later
00:47:29.200 and stuff like
00:47:29.720 that.
00:47:30.120 But none of
00:47:30.540 that's illegal
00:47:31.100 and probably
00:47:32.740 not that uncommon
00:47:33.660 for people who
00:47:34.500 have private
00:47:35.040 planes, and
00:47:36.420 more than
00:47:36.700 one of them.
00:47:39.220 So that's
00:47:40.220 quite the
00:47:40.900 accusation, and
00:47:42.200 it makes me
00:47:42.740 wonder if
00:47:43.780 Elon knows
00:47:44.800 something that
00:47:45.340 we don't
00:47:45.700 know.
00:47:46.760 But we
00:47:48.120 don't know,
00:47:49.000 so I'm not
00:47:49.720 going to assume
00:47:50.200 he knows more
00:47:50.740 than we know.
00:47:51.780 It's just an
00:47:52.480 interesting question.
00:48:55.000 Well, Mike Benz is
00:48:56.580 trying to wake up the
00:48:59.300 country on the question
00:49:00.320 of what Brazil is doing.
00:49:02.920 And here's what he said
00:49:04.240 today in all caps.
00:49:05.980 And he doesn't post in
00:49:08.060 all caps.
00:49:09.980 I think he's getting a
00:49:11.220 little frustrated that
00:49:12.960 he's clearly trying to
00:49:14.440 tell us something really
00:49:15.820 important and we're not
00:49:18.280 hearing it and the
00:49:19.140 government's acting like
00:49:20.100 it's not happening.
00:49:21.280 I think he's in some
00:49:22.780 kind of personal hell
00:49:23.800 where he can so clearly
00:49:25.800 see what's happening
00:49:26.760 behind the scenes, but
00:49:28.820 describing it to the
00:49:29.900 rest of us is almost
00:49:30.920 impossible, which I said
00:49:33.640 today.
00:49:34.160 So let me give you,
00:49:35.260 here's what he said, in
00:49:37.500 all caps.
00:49:38.580 If the Biden State
00:49:39.620 Department, USAID,
00:49:41.620 NED, whoever that is,
00:49:43.100 and a thousand USG
00:49:44.920 funded gongos, that 0.98
00:49:47.040 must be government
00:49:48.240 organized non-governmental
00:49:50.960 organizations.
00:49:53.060 What's a gongo?
00:49:54.520 In Brazil, if they do not
00:49:59.040 do the below actions on
00:50:00.460 Brazil banning acts, the
00:50:02.120 Republican House of
00:50:03.560 Representatives can zero 0.99
00:50:04.560 out all of their programs
00:50:05.660 in the budget.
00:50:07.120 All right.
00:50:08.100 Does anybody understand
00:50:09.320 what any of that means?
00:50:10.140 What percentage of the
00:50:13.860 general population of
00:50:15.120 America could read that
00:50:17.020 statement from Mike
00:50:18.020 Benz, who is a perfect
00:50:19.600 communicator, by the way.
00:50:20.700 He's one of the best
00:50:21.300 communicators you'll ever
00:50:22.280 see.
00:50:22.860 It's the topic.
00:50:24.860 The topic has too many
00:50:26.100 parts.
00:50:28.780 State Department.
00:50:29.760 Okay.
00:50:30.120 Okay.
00:50:30.720 Well, what's that got to
00:50:31.560 do with USAID?
00:50:32.520 Who are they?
00:50:33.160 Wait, what?
00:50:34.100 Who's NED?
00:50:35.160 Never heard of them.
00:50:36.740 Wait, what?
00:50:37.160 There's a thousand USG
00:50:38.520 funded gongos?
00:50:39.140 What's a gongo?
00:50:40.960 What's it got to do with
00:50:41.840 Brazil?
00:50:42.760 Wait a minute.
00:50:43.480 Is this the Democrats
00:50:44.340 doing something?
00:50:45.340 Or is Brazil doing something?
00:50:47.140 Is it Bill Gates?
00:50:48.760 Is it the WHO?
00:50:51.440 Like, this story is
00:50:52.840 impossible to tell because
00:50:55.720 we don't have enough
00:50:57.940 background and we don't
00:50:59.920 understand what
00:51:00.820 the parts are.
00:51:03.140 So here's my thing.
00:51:04.700 I think 1% of Americans
00:51:06.180 are able to understand what
00:51:08.020 Mike Benz is saying
00:51:09.060 is a huge problem
00:51:10.640 and I'll describe it
00:51:12.820 better.
00:51:13.600 So here's my best
00:51:14.560 summary of it.
00:51:16.820 The State Department
00:51:18.360 of the United States
00:51:19.220 wants to control all
00:51:20.820 information in every
00:51:22.320 country, including the
00:51:23.940 United States, because
00:51:26.140 doing so is critical to
00:51:27.980 its goal of controlling
00:51:29.640 everything.
00:51:31.480 Controlling other countries
00:51:33.260 so we can use the
00:51:34.260 resources and they don't
00:51:35.600 become enemies and we
00:51:36.800 can control the
00:51:37.480 government, controlling
00:51:38.880 citizens in America so 0.57
00:51:40.620 that they vote the way
00:51:41.480 the State Department would
00:51:42.480 like them in their view
00:51:43.580 what's good for the
00:51:44.480 country, I suppose.
00:51:45.860 But part of the State
00:51:46.740 Department is massively
00:51:48.100 funding these
00:51:49.680 non-government
00:51:50.760 groups, many of which
00:51:53.400 are active in Brazil,
00:51:55.760 to collectively support
00:51:57.680 this idea of banning
00:51:59.020 X in Brazil.
00:52:00.400 There's one judge in
00:52:01.560 particular who seems to
00:52:03.260 be the Hannibal Lecter of
00:52:06.160 judges.
00:52:07.320 He just seems like this
00:52:08.760 totally evil guy.
00:52:10.580 So I think the bottom
00:52:13.680 line is that Mike Benz is
00:52:15.620 trying to tell the
00:52:17.080 Republicans in the House,
00:52:19.020 because they still have the
00:52:19.840 House, that they can cut
00:52:21.440 the funding of these
00:52:23.800 organizations that are
00:52:24.920 operating against the
00:52:26.080 interests of free speech.
00:52:27.300 But it's complicated.
00:52:31.180 So, does that make
00:52:33.780 sense?
00:52:35.680 So let me just say it
00:52:36.720 one more time.
00:52:38.520 The U.S.
00:52:39.560 State Department has
00:52:40.840 massive funding for
00:52:42.060 various entities around
00:52:43.460 the world, not just
00:52:44.740 American entities, but
00:52:46.460 since they fund them,
00:52:47.380 they control them.
00:52:48.660 And those entities are
00:52:50.000 collectively putting
00:52:51.620 pressure on various
00:52:52.720 government and private
00:52:55.720 entities to censor.
00:52:57.300 And so basically, the
00:52:59.900 State Department is
00:53:01.140 operating a massive
00:53:02.180 censorship campaign,
00:53:04.860 which has the effect of
00:53:07.120 censoring U.S.
00:53:09.180 platforms like X in
00:53:11.000 other countries.
00:53:12.620 And of course, those
00:53:13.520 platforms need other
00:53:14.980 countries to survive,
00:53:16.320 right?
00:53:16.660 They're not just a one
00:53:18.180 country business.
00:53:19.220 They need Europe.
00:53:20.300 They need South America,
00:53:21.660 et cetera.
00:53:22.440 So you can force
00:53:24.140 American companies to
00:53:26.220 censor, or you can make
00:53:28.220 them get censored in
00:53:29.320 other countries through
00:53:30.560 all this, you know,
00:53:31.800 various organized
00:53:32.800 entities.
00:53:34.900 So, and the House would
00:53:36.420 have the ability to cut
00:53:38.020 those budgets if they
00:53:39.980 don't cut it out and
00:53:41.160 start acting like we'd
00:53:42.280 like them to.
00:53:44.440 So, having explained all
00:53:46.080 that, I believe that I
00:53:47.380 could, if it were really
00:53:48.960 well explained, instead of
00:53:50.600 1% of Americans
00:53:51.680 understanding the risk
00:53:52.920 involved here, I could
00:53:54.660 get that up to 2%.
00:53:55.860 I just don't know what
00:53:57.600 you do with this.
00:53:58.740 I mean, I'm actually,
00:53:59.840 I'm baffled.
00:54:02.220 Mike Benz understands.
00:54:04.240 He explains great.
00:54:06.320 It's just really
00:54:07.120 complicated.
00:54:08.700 So, humans are not able
00:54:10.920 to act on complicated
00:54:12.120 stuff.
00:54:13.040 We just can't do it.
00:54:14.540 Because we say, ah, I
00:54:16.120 don't even know what
00:54:16.820 lever to push.
00:54:17.880 What are you saying?
00:54:18.760 I'm going to go work on
00:54:20.880 things I understand.
00:54:22.180 We're always going to be
00:54:23.500 driven to things we
00:54:24.280 understand over things we
00:54:25.720 don't understand.
00:54:26.540 It's a huge, you know,
00:54:28.920 personal impulse.
00:54:31.940 Meanwhile, Laura Loomer
00:54:33.760 has a scoop.
00:54:36.000 I guess Tim Walz's
00:54:37.480 brother, Jeff Walz, had
00:54:39.240 not talked to his brother
00:54:40.220 in eight years.
00:54:41.140 We don't know what that
00:54:41.800 was about.
00:54:42.860 But Jeff appears to be
00:54:44.520 either a Republican or at
00:54:46.300 least anti his brother.
00:54:48.240 And he says, I'm 100%
00:54:49.900 opposed to all his
00:54:50.920 ideology.
00:54:52.460 He said that on a
00:54:53.500 Facebook post recently.
00:54:56.380 He said, I've thought
00:54:57.700 hard about doing
00:54:58.500 something like that.
00:55:02.260 He says, the stories I
00:55:03.920 could tell.
00:55:04.880 So, this is Walz's
00:55:05.900 brother.
00:55:06.580 The stories I could tell.
00:55:08.020 Not the type of character
00:55:09.200 you want making decisions
00:55:10.420 about your future.
00:55:14.400 That's his brother.
00:55:15.480 If your brother says
00:55:18.360 you're not qualified for
00:55:19.640 your job, I think I
00:55:23.660 would listen to that.
00:55:25.260 What does the brother
00:55:26.140 know?
00:55:27.460 My God.
00:55:29.780 Anyway, the Trump
00:55:31.200 campaign has a video
00:55:33.360 ad mocking Kamala Harris
00:55:35.240 for saying, my values
00:55:36.260 have not changed.
00:55:37.220 Something she said in the
00:55:38.580 Dana Bash interview.
00:55:40.340 And so, it starts with
00:55:43.700 my values have not
00:55:44.660 changed.
00:55:45.260 And then it's a whole
00:55:45.780 bunch of cuts of her
00:55:46.760 saying things that she's
00:55:48.280 in fact changed her
00:55:49.240 mind on.
00:55:50.460 I saw Raheem Kassam,
00:55:53.320 I hope I said that
00:55:54.160 right, pointing out the
00:55:55.760 same thing I was going
00:55:56.460 to point out, which is
00:55:57.300 it'd be better if you
00:55:58.560 put her saying my values
00:56:01.280 haven't changed in
00:56:02.420 between each of the
00:56:03.360 stories of the thing
00:56:04.200 that changed.
00:56:04.820 Because you lose a
00:56:06.960 little bit of the fact
00:56:07.780 that her lie is not a
00:56:09.960 little lie.
00:56:10.840 It's gigantic.
00:56:13.440 So, if you want to tie
00:56:14.800 her to her values have
00:56:16.180 not changed, I would
00:56:17.700 really repeat it.
00:56:20.640 Remember, repetition is
00:56:22.620 persuasion.
00:56:23.660 That's the lesson you
00:56:24.680 learned today.
00:56:25.500 So, they need to repeat
00:56:26.860 the my values have not
00:56:28.000 changed a little bit
00:56:29.580 more in that ad.
00:56:30.360 I would re-edit it and
00:56:31.400 re-issue it.
00:56:31.940 Tim Walz has some
00:56:36.100 kind of political
00:56:36.820 origin story.
00:56:38.880 Again, it's just one of
00:56:40.000 these complicated stories
00:56:41.420 so I'll just kind of
00:56:42.440 brush on it.
00:56:43.660 The Washington
00:56:44.160 Examiner has the
00:56:45.040 details.
00:56:46.300 So, apparently he's
00:56:47.820 long told this story
00:56:49.020 from back in 2004.
00:56:52.180 He was a folksy high
00:56:54.260 school teacher and he
00:56:56.160 took two of his
00:56:56.760 students to attend a
00:56:57.740 campaign rally for
00:56:58.720 George Bush as an
00:57:00.760 educational experience.
00:57:02.520 And he says all three
00:57:03.720 of them were denied
00:57:04.400 entry because one of
00:57:06.520 them had a John Kerry
00:57:07.640 sticker on a wallet.
00:57:11.400 Blah, blah, blah.
00:57:12.780 And then they were
00:57:15.080 interrogated and treated
00:57:16.660 poorly, etc.
00:57:18.340 Now, the reporting is
00:57:20.000 that none of that
00:57:20.640 happened.
00:57:21.500 That, in fact, they
00:57:22.300 actually got into it
00:57:23.440 and that they were not
00:57:24.760 students of his.
00:57:26.900 So, they were not his
00:57:28.000 students.
00:57:28.440 They were, they had
00:57:32.200 some, I think they were
00:57:33.820 associated through some
00:57:34.800 political group.
00:57:36.760 So, they weren't
00:57:37.380 students but they, you
00:57:38.860 know, they had some
00:57:39.420 legitimate reason to be
00:57:40.440 there, I guess, with
00:57:41.080 him.
00:57:41.240 So, I'm not sure I care
00:57:44.860 too much about his
00:57:46.240 folksy story of how he
00:57:48.240 got weaponized to
00:57:49.780 run for office.
00:57:51.520 But, it does seem to be
00:57:53.280 kind of a pattern that he
00:57:54.640 lies about everything.
00:57:56.920 He seems to lie about
00:57:58.220 everything.
00:57:59.660 So, is he that
00:58:01.540 different than the other
00:58:02.340 candidates?
00:58:03.320 Well, I haven't seen J.D.
00:58:06.520 Vance lie about anything.
00:58:07.420 So, he must be different
00:58:10.100 than J.D.
00:58:10.600 Vance.
00:58:10.960 I haven't seen when
00:58:12.720 Vivek was running.
00:58:14.320 I don't recall him lying
00:58:15.400 about anything.
00:58:16.840 When RFK Jr. was more
00:58:18.960 actively in the race, I
00:58:21.460 don't think I've heard
00:58:22.640 anything that sounded
00:58:24.260 like a lie.
00:58:25.680 You know, maybe he was
00:58:26.400 wrong about some stuff
00:58:27.400 that would be normal.
00:58:29.080 But lie?
00:58:30.620 I didn't hear any.
00:58:32.060 Nothing that I would
00:58:32.920 identify as a lie.
00:58:34.900 So, how about
00:58:37.440 Nicole Shanahan?
00:58:39.480 So, running with
00:58:40.480 RFK Jr.
00:58:41.480 Did she ever tell a lie?
00:58:43.420 I don't think so.
00:58:44.940 I don't even think she's
00:58:45.780 been accused of it.
00:58:47.420 So, it turns out that
00:58:48.620 we've, by the way,
00:58:50.900 if you think about it,
00:58:51.660 this is one of the most
00:58:52.320 optimistic things you
00:58:53.480 could ever hear.
00:58:54.360 I just told you the names
00:58:55.760 of several candidates
00:58:57.180 for president, you know,
00:58:58.780 most of them not running
00:58:59.680 at the moment, who,
00:59:01.880 as far as I could tell,
00:59:02.720 weren't lying.
00:59:03.900 About anything.
00:59:06.300 Just think about that.
00:59:08.140 Now, Trump is Trump,
00:59:09.540 and he's going to use
00:59:11.020 his hyperbole, and the
00:59:12.340 fact checkers are going
00:59:13.240 to be all over it.
00:59:14.600 So, he's a singular 0.53
00:59:15.960 salesman, bullshitter
00:59:18.300 kind of personality.
00:59:20.040 But the fact is, we had
00:59:22.580 several major, serious,
00:59:24.860 highly qualified candidates
00:59:26.240 for president in the past
00:59:27.800 year who absolutely
00:59:29.640 didn't tell lies.
00:59:31.420 As far as I can tell.
00:59:32.720 I mean, maybe somebody
00:59:33.680 else has a different
00:59:34.340 take on that, but I
00:59:35.460 didn't see any.
00:59:36.800 And, you know, I could
00:59:37.620 identify lies on both
00:59:38.820 sides most of the time,
00:59:40.180 but I didn't see any.
00:59:42.020 So, that's amazing.
00:59:44.140 That's just amazing.
00:59:45.640 Anyway, here's
00:59:47.880 Michael Shellenberger's
00:59:49.840 take on this Brazil
00:59:51.500 situation.
00:59:52.720 He says,
00:59:53.280 today's 1984 type
00:59:55.220 totalitarianism is more
00:59:56.780 dangerous than the
00:59:57.840 tanks and torture type
00:59:59.420 of totalitarianism.
01:00:00.700 There's no need to
01:00:01.520 rig an election or
01:00:02.740 overthrow a government
01:00:03.740 if the ruling party of
01:00:05.780 the media and state
01:00:06.440 sponsored NGOs.
01:00:08.160 So, these are those,
01:00:09.180 you know, NGOs again,
01:00:11.340 control the information
01:00:12.560 environment.
01:00:13.400 There we go.
01:00:14.700 Now, we're in my domain.
01:00:16.660 That's true.
01:00:17.840 You do not need to force
01:00:19.440 people to do things you
01:00:20.580 can brainwash them into
01:00:21.680 wanting to do.
01:00:23.280 No force needed.
01:00:25.200 So, we now have a
01:00:26.960 situation in which our
01:00:29.040 government, working with
01:00:30.980 the media, has enough
01:00:32.560 technique that they can
01:00:33.980 brainwash the public to
01:00:35.420 believe anything.
01:00:37.540 Now, governments could
01:00:39.660 always brainwash the
01:00:40.960 public to believe
01:00:41.620 anything, but they're
01:00:43.140 better at it now.
01:00:44.820 They're way better at it
01:00:46.100 now.
01:00:47.040 So, when you look at
01:00:47.900 the brainwashing
01:00:50.340 skills, we're now into
01:00:52.860 really dangerous
01:00:54.040 territory.
01:00:56.820 So, Michael
01:00:58.500 Shellenberger goes on,
01:00:59.740 Brazil's Supreme Court
01:01:00.800 just banned X and
01:01:02.080 announced an $8,900-per-day
01:01:03.240 penalty for those 0.64
01:01:05.220 who use a VPN to
01:01:06.900 evade it.
01:01:08.240 So, if you use a VPN
01:01:09.620 and they catch you,
01:01:12.420 they're going to fine
01:01:13.820 you this enormous amount.
01:01:14.880 It's not an outlier.
01:01:20.260 This is now the normal
01:01:21.600 way that the world is
01:01:22.540 working.
01:01:23.820 So, Shellenberger's all 0.97
01:01:25.640 over it.
01:01:26.140 And the end
01:01:26.960 wokeness account points
01:01:27.960 out that
01:01:28.680 Kamala probably supports
01:01:30.820 what Brazil just did.
01:01:33.100 And how do we know
01:01:34.140 that?
01:01:34.500 there are videos of
01:01:38.280 her talking about how
01:01:39.380 you can't have all
01:01:41.080 that free speech.
01:01:42.520 She doesn't say it
01:01:43.280 that way, but she 0.98
01:01:44.460 says, oh, yeah, you
01:01:45.260 got to have laws, you
01:01:47.020 know, controlling the
01:01:48.120 speech on the
01:01:48.760 platforms.
01:01:49.520 So, she's in favor of
01:01:50.780 that.
01:02:08.120 So, Rasmussen
01:02:09.380 reports, my favorite
01:02:11.460 source for all
01:02:12.680 allegations of
01:02:13.920 election shenanigans,
01:02:16.240 points out that
01:02:17.200 a cyber expert
01:02:20.560 had recently found
01:02:21.560 that the
01:02:22.520 compilers installed
01:02:23.820 on some of the
01:02:25.120 Maricopa
01:02:26.000 election-related
01:02:28.060 machines
01:02:28.740 could modify and
01:02:30.740 create
01:02:31.000 executable files
01:02:33.340 and drivers
01:02:33.960 potentially altering
01:02:35.400 election results
01:02:36.220 undetected.
01:02:39.320 Let me say that
01:02:40.260 again.
01:02:41.140 So, a cyber
01:02:41.720 expert looked at
01:02:43.180 some of the
01:02:43.760 machines.
01:02:44.220 I don't know if
01:02:44.600 these are tabulators
01:02:45.480 or voting machines.
01:02:46.620 It might be a, I
01:02:48.060 don't know, what is
01:02:48.560 an EMS?
01:02:50.160 But it's something in
01:02:51.080 the process.
01:02:51.720 It's either the
01:02:52.200 machine or the
01:02:52.820 tabulator, I guess.
01:02:53.640 But it has some
01:02:56.840 code on it that
01:02:58.040 somebody from the
01:02:58.700 outside could modify
01:02:59.960 to change the
01:03:01.380 election without
01:03:02.480 being detected.
01:03:04.260 Now, this was
01:03:04.760 found a few years
01:03:05.640 ago.
01:03:06.800 Here's the update.
01:03:08.400 Still there.
01:03:11.760 Now, is this
01:03:12.720 story true?
01:03:13.460 Could it possibly be
01:03:16.200 true that a cyber
01:03:17.160 expert found some
01:03:18.760 code on, you know,
01:03:20.400 a key piece of
01:03:21.380 machinery in
01:03:22.040 Maricopa that
01:03:23.560 very clearly would
01:03:24.620 allow a bad person
01:03:25.800 to get in and
01:03:26.440 change the results
01:03:27.240 without getting
01:03:27.780 caught and that
01:03:29.440 it's still there?
01:03:31.800 That doesn't even
01:03:32.700 sound like it could
01:03:33.420 be true.
01:03:34.980 So, I'm going to
01:03:36.140 say maybe there's
01:03:37.440 more to the story
01:03:38.000 than we know.
01:03:39.060 But the number
01:03:40.620 of, you know,
01:03:42.000 smoking-looking
01:03:43.220 guns is just off
01:03:45.040 the chart right now.
01:03:46.500 But they all just
01:03:47.400 sort of look like
01:03:48.940 you're almost there
01:03:50.460 to prove a kraken
01:03:51.760 but not quite.
01:03:54.280 So, I will
01:03:56.200 triple down on my
01:03:57.620 prediction for the
01:03:58.560 election.
01:03:59.240 I believe that it's
01:04:00.700 going to be a
01:04:01.340 coin flip election
01:04:02.680 50-50 by the time
01:04:04.480 we get, yes, by the
01:04:06.320 time we get to the
01:04:07.000 election and that
01:04:08.620 the coin flip will
01:04:10.000 land on the edge.
01:04:12.480 We will not have a
01:04:13.760 president after the
01:04:14.720 election.
01:04:15.880 We'll eventually get
01:04:17.060 something done.
01:04:18.140 I mean, there'll be
01:04:18.580 some leader
01:04:19.080 eventually.
01:04:19.860 But I don't think
01:04:20.720 we're going to have
01:04:21.160 one within a week
01:04:21.960 of the election.
01:04:23.380 Probably not two
01:04:24.280 weeks.
01:04:25.460 I think for at
01:04:26.540 least a month we
01:04:27.740 will be leaderless.
01:04:30.740 And it's because
01:04:31.620 we'll be fighting
01:04:32.300 about the election.
01:04:33.620 There will be all
01:04:34.400 kinds of claims.
01:04:36.580 All kinds of
01:04:37.340 claims.
01:04:37.680 The most likely
01:04:38.420 outcome, based on
01:04:40.000 everything I know,
01:04:41.140 is if you start
01:04:41.840 with the assumption
01:04:42.420 that much of the
01:04:44.000 country believes that
01:04:45.020 Trump would be an
01:04:45.760 existential threat and
01:04:47.540 that everything's on
01:04:48.520 the table to stop
01:04:49.360 him.
01:04:50.440 That guarantees
01:04:51.560 cheating.
01:04:52.920 Would you agree?
01:04:54.340 It guarantees it.
01:04:55.900 You don't have to
01:04:56.740 wonder.
01:04:57.780 It guarantees it.
01:04:59.200 Now, the part we
01:05:00.020 don't know is if it
01:05:02.120 would be of any
01:05:03.020 scale that would
01:05:03.800 matter.
01:05:05.120 That part we don't
01:05:06.020 know.
01:05:07.060 But you can guarantee
01:05:08.820 that in a big old
01:05:10.420 country with lots of
01:05:11.400 people who think
01:05:11.960 Hitler might come to 0.97
01:05:12.880 power, somebody's
01:05:14.620 going to fill out two
01:05:15.380 ballots when they only
01:05:16.320 should have done
01:05:16.800 one, right?
01:05:18.160 It might be trivial,
01:05:19.300 but you can guarantee
01:05:20.840 it's going to happen.
01:05:22.380 Now, if you can
01:05:22.980 guarantee that there
01:05:23.860 will be irregularities,
01:05:26.640 you know, be they
01:05:27.200 small or be they
01:05:28.400 large, then you can
01:05:30.200 also guarantee that
01:05:31.300 the Republicans will
01:05:32.360 be dead set on not
01:05:34.340 accepting an election
01:05:36.080 when there are so
01:05:37.360 many claims.
01:05:38.980 So the only thing
01:05:39.880 we'll know for the
01:05:40.480 first month or so is
01:05:41.880 that there'll be all
01:05:42.680 kinds of claims of
01:05:44.160 illegality and
01:05:45.960 probably either way
01:05:46.860 it goes, you know,
01:05:47.940 if it goes for or
01:05:49.980 against Trump, there
01:05:50.740 will be claims on the
01:05:51.580 other side or his
01:05:52.960 side.
01:05:54.240 And there is no time
01:05:55.860 to adjudicate all
01:05:57.660 those claims.
01:05:58.820 So what are you going
01:05:59.680 to do?
01:06:00.640 Are you just going to
01:06:01.460 certify it when there
01:06:02.600 are like hundreds of
01:06:04.180 legitimate sounding
01:06:05.220 claims with actual
01:06:06.340 whistleblowers and
01:06:07.420 witnesses and
01:06:08.240 documents?
01:06:09.900 Well, maybe, but
01:06:12.040 that's not a stable
01:06:13.120 situation.
01:06:15.580 So I think the most
01:06:16.680 likely outcome is
01:06:18.140 they're going to find
01:06:18.880 at least small
01:06:19.740 incidents of election
01:06:21.980 irregularities and it
01:06:23.600 will be enough to put
01:06:26.120 everything off the
01:06:27.240 rails.
01:06:28.700 Eventually it will be
01:06:29.660 okay.
01:06:31.900 I'll say again, the
01:06:32.980 only real thing I worry
01:06:34.140 about in this country
01:06:34.980 besides the growing
01:06:37.240 DEI stuff is the
01:06:41.300 debt.
01:06:42.040 the rest of the
01:06:44.660 stuff I'm pretty sure
01:06:45.740 we can work through,
01:06:46.700 but the debt, I
01:06:48.980 don't know how that
01:06:49.600 gets worked out.
01:06:51.280 So anyway, that's all
01:06:52.260 I've got to say for
01:06:53.240 today.
01:06:57.600 In a few minutes, I'm
01:07:00.100 going to create a
01:07:00.660 video of opening up
01:07:02.060 the first look at the
01:07:03.380 Dilbert calendar.
01:07:04.580 It's not available.
01:07:05.760 You can't buy it yet.
01:07:06.760 I'll tell you when you
01:07:07.540 can and where to get
01:07:08.680 it.
01:07:08.880 It'll only be on one
01:07:09.740 website.
01:07:10.200 But when it's
01:07:12.400 available, which will
01:07:13.200 be in the next few
01:07:13.740 days, I'll tell you
01:07:15.500 more about it.
01:07:16.940 And kind of exciting
01:07:18.680 because the Dilbert
01:07:20.680 calendar had to be,
01:07:21.700 had to take a year
01:07:23.380 off.
01:07:24.100 You may have heard
01:07:24.760 why, but it's back.
01:07:26.960 And this time, it's
01:07:28.820 made in America.
01:07:29.700 That's why it took so
01:07:32.080 long.
01:07:32.640 It would have been
01:07:33.060 easier if I just sort
01:07:34.280 of did it the old
01:07:34.920 way.
01:07:35.800 But it took a lot of,
01:07:38.180 let's say, took some
01:07:39.380 somersaults to figure out
01:07:42.020 a way to make it in
01:07:42.780 America and get it 0.97
01:07:44.480 within the price that
01:07:46.320 is reasonable.
01:07:47.580 I think we succeeded.
01:07:48.900 I'll tell you all about
01:07:49.640 that later.
01:07:50.520 All right.
01:07:50.860 That's all I got for
01:07:51.560 today.
01:07:51.880 I'm going to talk to
01:07:52.460 the locals, people
01:07:53.340 privately.
01:07:54.740 Thanks for joining on
01:07:55.580 X and Rumble and
01:07:57.060 YouTube.
01:07:57.400 I will see you
01:07:58.260 tomorrow, same time,
01:07:59.160 same place.
01:08:00.160 You're awesome.