Real Coffee with Scott Adams - July 27, 2023


Episode 2182 Scott Adams: UFOs, Hunter, Trump Indictments, Lots More Fun


Episode Stats

Length

1 hour and 1 minute

Words per Minute

148.8

Word Count

9,120

Sentence Count

731

Misogynist Sentences

11

Hate Speech Sentences

9


Summary

In this episode of Real Coffee with Scott Adams, Scott Adams talks about the recent economic growth numbers, a new photo of a man who may or may not be alive, and the question of whether this summer is the warmest ever.


Transcript

00:00:00.000 Fix a little technical problem over in the locals platform.
00:00:05.200 But it looks like the locals platform is deader than a doornail today.
00:00:09.640 I'll try it one more time.
00:00:14.660 Hey, it's looking better.
00:00:19.040 Not so much.
00:00:21.700 Well, we're having a terrible time here, technically.
00:00:25.540 Normally, I'd be halfway through the simultaneous sip.
00:00:29.140 But it looks like we've got nothing today.
00:00:32.040 So we'll be closing up locals and opening it up on my phone, which should work.
00:00:44.080 But we'll see.
00:00:47.640 Watch me scramble to make all this work while you wait, wishing there was actually a show going on.
00:00:56.960 Yeah, it looks like locals is dead.
00:00:59.140 All right.
00:01:02.960 Today, it's all you.
00:01:05.660 It's all you today.
00:01:07.580 Would you like to start from the beginning?
00:01:14.760 Good morning.
00:01:15.500 Would you like to take this experience up to levels that you don't even believe are possible?
00:01:21.100 Yes, you do.
00:01:23.280 And all you need for that is a cup or a mug or a glass, a tankard or chalice, a stein, a canteen, a jug or a flask, a vessel of any kind.
00:01:29.360 Fill it with your favorite liquid.
00:01:30.480 I like coffee.
00:01:32.100 And join me now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better.
00:01:37.780 It's called the simultaneous sip.
00:01:40.620 It happens now.
00:01:41.900 Go.
00:01:47.080 That's good stuff.
00:01:47.940 All right, ladies and gentlemen.
00:01:52.140 Have you seen any news today?
00:01:54.680 Well, there's news all over the place.
00:01:56.600 It's like news happening like you can't even believe it.
00:01:58.920 But let's go through it.
00:02:03.120 So the Wall Street Journal is saying that the economy grew by 2.4%.
00:02:07.660 We'll talk about Hunter and UFOs and stuff, but it'll take a minute for all the locals people to give up and come over here.
00:02:15.940 Sorry.
00:02:18.480 But what I thought was interesting about this is it's good news that the economy is doing well.
00:02:23.880 But the headline was, U.S. economic growth defies slowdown expectations.
00:02:31.900 Whose expectations?
00:02:34.340 Whose expectations?
00:02:36.140 Because this was my prediction.
00:02:38.180 So I would like to add to my most accurate political predictions and most accurate pandemic predictions and most accurate Ukraine predictions, the most accurate economic prediction.
00:02:50.100 Because I actually said we would not go into a recession.
00:02:53.060 There we go.
00:02:53.880 Remember I told you I was all excited preliminarily yesterday about this story that researchers had made a room temperature superconductor?
00:03:06.740 Do you remember that story?
00:03:09.180 Took about 10 minutes for people to crap all over it and say, well, let's wait to see if anybody can reproduce this.
00:03:17.300 And there's something sketchy going on.
00:03:18.980 And so I would say that all of my optimism about superconductivity, completely gone, completely gone.
00:03:27.040 Yeah, it doesn't sound like it's real, but I would love to be surprised.
00:03:30.640 So don't wait for any room temperature superconductivity.
00:03:35.480 It might be as likely as UFOs.
00:03:40.120 Well, there's a new photo of Wagner leader Prigozhin, allegedly in Russia, and he's just shaking hands with some African guy who's visiting.
00:03:51.000 So that's true, right?
00:03:52.300 So that's true, right?
00:03:54.740 Prigozhin's fine.
00:03:55.640 He's just hanging out.
00:03:56.740 Hanging out in Russia.
00:03:59.420 Sure.
00:04:00.880 Yeah, there was a...
00:04:03.000 Even CNN reported it.
00:04:05.320 They reported that there was a picture of him.
00:04:07.180 They didn't report he's alive or anything.
00:04:08.940 They just said there's a picture of him, which is fair.
00:04:10.900 And does anybody want to think that he's still free and just doing his own thing?
00:04:18.880 He's just driving his RV around St. Petersburg?
00:04:24.560 No.
00:04:25.660 No.
00:04:26.160 He's not driving his RV around St. Petersburg.
00:04:29.400 He's certainly in...
00:04:30.680 If he's alive, he's certainly controlled by Putin at this point.
00:04:37.500 Did you read it in the news that it was the warmest summer?
00:04:40.900 How many of you think we had the warmest summer because of climate change?
00:04:47.440 The news said so.
00:04:49.780 Well, it may be.
00:04:52.060 It might actually be the warmest summer.
00:04:54.280 But I'd like to give you a counterpoint, which doesn't...
00:04:58.520 It doesn't necessarily refute everything you've heard, but it gives you context.
00:05:02.040 So here's some better context.
00:05:03.820 From Alex Epstein.
00:05:06.420 He said, anyone commentating responsibly on summer temperatures must acknowledge four facts.
00:05:13.960 All right, see if you agree with these four facts.
00:05:16.700 Cold-related deaths are much greater than heat-related deaths.
00:05:20.960 Did you know that?
00:05:22.640 That the number of people who die because it's too cold is way higher than the number of people who die because it's warm.
00:05:29.120 So in theory, you can imagine that depending on where it got warmer, it would save lives, right?
00:05:39.340 So if the cold places got warmer, but the places that are already hot didn't get warmer or didn't get much warmer, we'd be better off, wouldn't we?
00:05:47.900 Well, and then the reverse would be true.
00:05:51.180 If the hot places got hotter and the cold places got colder, that probably would be worse.
00:05:56.360 But according to Alex, Earth is warming slowly and less in warm places.
00:06:04.940 Oh, so that's good.
00:06:06.620 So the warming is more concentrated in the cold places, which should save lives.
00:06:11.780 Fossil fuels, number three, fossil fuels make us safer from dangerous temperatures.
00:06:19.600 Yeah, what would you do if you didn't have fossil fuels?
00:06:23.140 Wouldn't you be more exposed to what the weather can do if you can't sit in your car or heat your house?
00:06:30.440 And anti-fossil fuel policies increase the danger from cold and heat.
00:06:37.280 Well, that makes sense.
00:06:38.580 If fossil fuels protect you from temperature extremes, then not having them would make you less safe.
00:06:45.820 So I think a lot of this turns on these two points.
00:06:49.680 It's better to be warmer than cold.
00:06:53.080 And it's not warming as much in the places that are already warm.
00:06:59.100 Now, this is my problem with averages.
00:07:02.700 If somebody told you that the average temperature of the Earth was going up, that's pretty scary, right?
00:07:10.640 But what if they told you that it's warming up in the places that are too cold, and that nobody lives in the places that are already hot?
00:07:18.520 Because it's a desert anyway, for example.
00:07:22.400 Now, that's the extreme.
00:07:23.740 I'm exaggerating the effect here.
00:07:25.800 But that's your counter-argument.
00:07:30.240 So I'm not saying that the warming is not a problem.
00:07:34.780 I don't know.
00:07:36.340 Might be.
00:07:37.340 But we're pretty good at dealing with that kind of problem.
00:07:41.420 All right.
00:07:41.760 Here's a quiz for you.
00:07:43.700 And the answer is not 25%.
00:07:45.200 All right.
00:07:46.200 I'm going to see if you can get this right.
00:07:49.060 Put on your thinking caps.
00:07:53.140 All right.
00:07:53.300 There was an article by Katie Mogg in the Wall Street Journal.
00:07:57.660 And apparently there's a trend called the hashtag lazy girl job.
00:08:02.720 And a lazy girl job would be defined as a job.
00:08:07.040 Often you could work at home.
00:08:09.000 You don't put in too many hours.
00:08:10.500 It might be an online kind of thing.
00:08:12.480 You don't make that much money, but you make enough, especially if you live with your boyfriend or something.
00:08:17.200 So apparently it's this big old trend to have a job that you can just sort of pay the rent, but you're not looking to kill it.
00:08:29.100 This would be the opposite of leaning in.
00:08:32.140 This would be like sleeping in.
00:08:34.580 So apparently leaning in has turned into sleeping in.
00:08:38.280 But that's not the question.
00:08:40.140 Here's the question.
00:08:40.880 This phenomenon has been growing lately, and it's racked up close to 18 million views on a particular social media network.
00:08:53.020 Go.
00:08:53.900 Which social media network?
00:08:56.480 Well, just to hypothesize.
00:08:57.560 What would be a social media network that would encourage working age humans in the United States to not work hard,
00:09:06.000 but maybe not send that same message to, let's say, China?
00:09:11.420 What would be a platform that would do that?
00:09:16.480 I mean, who would tell Americans to be lazier because it's awesome,
00:09:20.020 while not telling Chinese citizens to be lazy because it's not awesome?
00:09:24.020 Who would do that?
00:09:26.240 The answer is TikTok.
00:09:30.040 Are you surprised?
00:09:31.040 Oh, big surprise.
00:09:34.140 A trend that's really bad for Americans seems to be running on TikTok, which does not run in the same form in China.
00:09:43.600 So why are people talked into being trans if they never thought of it in their life before?
00:09:50.280 Oh, TikTok.
00:09:51.440 But not in China.
00:09:52.880 They don't run it there.
00:09:54.480 And why are people told to not work so hard?
00:09:58.280 Well, it doesn't happen in China.
00:09:59.640 And what's Congress looking at this week?
00:10:05.680 The fact that it's obvious that we have a brainwashing technology that's been inserted in the phone of every young American.
00:10:16.300 Are we having a hearing on that today?
00:10:18.480 No.
00:10:19.120 No.
00:10:19.400 It's about aliens.
00:10:20.740 UFOs.
00:10:21.680 Because it's summer, and you don't talk about real stuff in the summer.
00:10:25.360 You fool.
00:10:26.540 Let's talk about UFOs.
00:10:27.960 As you know, the Congress had some hearings about whether the government has secret UFOs that they're keeping from us.
00:10:37.300 And I don't know about you, but after listening to some of the testimony about the UFOs, I'm left with more questions than answers.
00:10:47.180 More questions than answers.
00:10:48.580 Here are some of the questions I have.
00:10:49.840 How much money did Hunter get from the UFOs?
00:10:55.720 I assume he shook him down.
00:10:57.260 I just don't know how much money he made.
00:10:58.800 There's no reporting on that.
00:11:00.380 Number two.
00:11:01.220 Are we using this advanced alien technology that we have to make video cameras for the White House that can't see Hunter?
00:11:10.200 Yeah.
00:11:10.500 Special technology.
00:11:11.420 The security cameras can see everybody, except Hunter.
00:11:15.660 He's just invisible on camera.
00:11:17.700 And most importantly, number three, how many aliens are in Trump's boxes?
00:11:24.900 Must be a few.
00:11:26.400 Because one thing we know is that we can't be told what's in the boxes.
00:11:31.480 And I can't think of anything else that they wouldn't tell us.
00:11:33.720 Because anything else they could tell us, am I right?
00:11:37.700 They could tell us in a general way.
00:11:40.080 Let me give you an example.
00:11:41.620 If Trump's boxes of secrets had attack plans for Iran, we would hear that.
00:11:48.720 They just wouldn't tell us the details.
00:11:50.460 If there were nuclear secrets, we would hear that.
00:11:54.180 They just wouldn't tell us the actual secrets.
00:11:56.700 Right?
00:11:56.880 If he had boxes that were full of, let's say, sensitive conversations with a foreign adversary, or a foreign ally, or maybe something that we had found about a foreign adversary, we would have heard that.
00:12:12.500 We just wouldn't hear the details of it.
00:12:14.460 Right?
00:12:15.200 What's the one thing that could be in the boxes that even the government wouldn't even say anything about?
00:12:22.620 Aliens.
00:12:23.700 Aliens.
00:12:24.160 There have to be aliens in the boxes.
00:12:25.820 Because it's the only thing they wouldn't want to say anything about.
00:12:28.740 Otherwise, they would just say, well, yes, he has them.
00:12:30.600 We just can't tell you the details.
00:12:32.240 But they can't tell you that they do have aliens.
00:12:35.020 They just don't want to tell you which ones.
00:12:37.340 See, they can't tell you that.
00:12:38.980 Okay?
00:12:39.220 I'm joking if you can't tell.
00:12:40.460 If you can't tell, I'm joking.
00:12:41.560 I'm joking.
00:12:42.460 All right.
00:12:45.700 Elon Musk weighed in on an Andrew Tate tweet.
00:12:51.100 And Tate was talking about, you know, men who sleep with a lot of women.
00:12:55.820 And vice versa.
00:12:57.500 And Musk tweeted to that.
00:12:59.380 He said, to sleep with women endlessly without love is a cursed and hollow life.
00:13:05.000 Well, if you were not already hating Elon Musk for being the richest person in the world,
00:13:17.680 can you hate him now for apparently knowing the answer to this question,
00:13:22.760 how you feel when you have slept with endless women without love?
00:13:26.220 It didn't sound like he was guessing.
00:13:31.180 I feel like if you're the richest man in the world and you're unmarried,
00:13:35.180 you have experienced sleeping with endless women without love.
00:13:41.320 I think that's probably what a Tuesday looks like for him.
00:13:45.900 Send in number three.
00:13:47.940 Make sure there's no love.
00:13:49.140 So we hate him for knowing the answer to that question.
00:13:53.380 What does it feel like to have sex with endless women?
00:14:00.300 You know, but I'm glad that he did it so we don't have to do it.
00:14:04.260 Am I right?
00:14:05.640 Guys, are you glad that Elon Musk told us what it feels like to sleep with endless women without love?
00:14:11.720 Because I was going to go out and do it.
00:14:14.120 I had plans for the week.
00:14:16.280 I was like, you know what I'm going to do this week?
00:14:18.980 I haven't tried this before.
00:14:21.020 I'm going to try having endless sex with women who don't love me.
00:14:25.740 See what it's all about.
00:14:27.200 But then I would have found out it was cursed and a hollow life.
00:14:29.980 And so I'm glad that Musk did it so I don't have to.
00:14:34.140 But there is something left out of his analysis.
00:14:37.000 That sleeping with women endlessly without love is a cursed and hollow life.
00:14:41.720 How does it compare to being married to one person?
00:14:47.600 I mean, I think that's the valid comparison.
00:14:50.780 I would agree with him that maybe it isn't, you know, a cursed and hollow life.
00:14:55.740 But how does it compare to the alternative?
00:15:00.440 You know what is the worst advice anybody ever gave?
00:15:03.840 The worst advice is follow your passion.
00:15:06.140 You know what the second worst advice is?
00:15:08.200 Find a good mate.
00:15:14.320 It's the worst advice anybody's ever given.
00:15:17.080 Find a good mate.
00:15:18.400 You know, one that will make your life better.
00:15:20.860 Do you know why that's the worst advice in the world?
00:15:23.020 Because I'm no expert.
00:15:29.400 I'm not a marriage expert.
00:15:30.760 I'm just going to put this out there.
00:15:32.860 See, you can do with it what you will.
00:15:35.620 I'm almost positive that 100% of people who get married think they picked the right one.
00:15:42.940 Do you have a different feeling about that?
00:15:46.560 Are there a lot of people getting married saying, you know, this mate I picked is total shit, but I just feel like getting married?
00:15:55.440 No.
00:15:56.740 I believe that everybody enters an illusion in which they figure that whatever flaws the other person has, they'll work it out.
00:16:04.380 And there are no new flaws.
00:16:06.880 There won't be any new ones.
00:16:08.320 You've seen everything there is to see.
00:16:10.960 And so that, therefore, a reasonable and a smart person, using good judgment, can go out and find a good solid mate and have a solid life because of it.
00:16:21.700 Now, some of you say, but, God, you fucking idiot.
00:16:27.320 You idiot.
00:16:28.600 There are plenty of examples.
00:16:30.360 I mean, maybe it's not over 50% of the population, but there are plenty and plenty of examples of people who looked and they found the right person.
00:16:39.560 Are you going to tell me that?
00:16:41.080 Would you like to tell me that there are many examples?
00:16:43.740 You've seen it yourself.
00:16:44.640 Many of you are the example where the correct mate, which you wisely and with your good judgment and your free will, you chose that good mate.
00:16:55.360 And because of your good choice, things are better for you.
00:16:58.840 How many of you, raise your hand if you're in that category.
00:17:01.740 You wisely picked the right mate and it totally worked out for you.
00:17:06.100 Go.
00:17:06.840 How many of you?
00:17:09.000 And therefore would be a good technique, right?
00:17:12.540 Yeah.
00:17:14.640 All right.
00:17:15.800 Does anybody see what's wrong with your analysis?
00:17:17.940 Do I have to be the first one to tell you?
00:17:22.060 You see what's wrong with the analysis?
00:17:25.920 All right.
00:17:26.420 You take a million people and you randomly pair them with each other.
00:17:29.760 Just randomly.
00:17:31.220 Would some of them have happy marriages?
00:17:34.360 Just randomly paired.
00:17:36.800 What do you say?
00:17:38.720 Of course they would.
00:17:40.200 Of course they would.
00:17:41.240 Yeah.
00:17:41.960 Suppose you had AI match people.
00:17:44.640 Would, you know, if you had a million of them, would some of them have amazing marriages?
00:17:49.880 I think so.
00:17:51.380 I think so.
00:17:52.280 Yeah.
00:17:52.800 Does that, does that indicate that you can choose the right mate?
00:17:56.860 Because lots of people have done it.
00:17:58.320 Alexa, cancel.
00:17:59.320 I don't know what that was all about.
00:18:05.960 But, so, so would you say that the evidence is, there are plenty of examples of people who consciously,
00:18:13.780 and this is the important part, they were consciously looking for a good mate.
00:18:17.900 They found one.
00:18:19.000 They lived their whole life happy that they found one, and that's proof that it's a good
00:18:24.260 idea.
00:18:24.680 Would you agree with that statement?
00:18:27.320 It's basically solid, well, maybe not proof in a scientific sense, but it's very solid,
00:18:33.020 solid evidence because you know lots of people.
00:18:36.720 People, you personally know lots of people who put the effort in, found somebody, and it
00:18:42.420 worked, and it worked their whole life, and they were really happy about it, right?
00:18:47.320 That's not, that's not really thinking what you're doing there.
00:18:52.020 There's no, there's no sense of reason or logic to that whatsoever.
00:18:55.760 It's just the law of big numbers.
00:18:58.060 If you have a lot of people doing something, somebody's going to win.
00:19:00.960 That's it, that's the whole story.
00:19:04.880 If a lot of people are doing a thing, some of them are going to get lucky.
00:19:10.280 Every time.
00:19:11.500 Not sometimes, every time.
00:19:13.920 So what would you imagine would be the rate of people getting lucky that would indicate
00:19:18.240 it's luck versus something that would indicate it's a solid plan?
00:19:23.380 Well, let me give you an example.
00:19:25.500 Another solid plan would be, if you wanted a good life, you would stay out of jail, you'd
00:19:30.940 learn some skills, you know, you'd basically stay off drugs, do some things.
00:19:39.260 How often does that keep you from being poor?
00:19:42.360 If you do those few simple things that anybody can do.
00:19:45.700 It's hard, but anybody can do it.
00:19:47.460 And the answer is, almost no poor people in that group.
00:19:51.760 If you, you know, build skills, go to school, stay out of jail, do the basics, pretty much all
00:19:58.220 of you are successful.
00:19:59.060 What's the success rate for picking a spouse?
00:20:03.600 Now remember, you've got to, you're not just counting the people who get divorced.
00:20:08.020 You have to include the people who stayed married but kind of wish they hadn't.
00:20:12.500 Right?
00:20:13.480 So marriage is more of a, maybe I'd say a 25% success rate.
00:20:20.060 If you talk about your whole life.
00:20:21.700 25%.
00:20:23.040 Does 25% sound like chance or the result of people who knew exactly what they were doing,
00:20:33.140 they knew exactly what a good mate would look like, and they went out and tried to get one.
00:20:38.560 Does it look like that?
00:20:40.460 Have you ever noticed that people tend to marry the people they work with?
00:20:43.680 What are the odds you met your soulmate at work?
00:20:49.160 It's mostly where people meet people.
00:20:51.940 Right?
00:20:53.540 The thing we know about people is that they can fall in love with whoever's around.
00:20:59.280 Would you agree with that statement?
00:21:00.660 We're not looking for our soulmate among the 8 billion people on Earth.
00:21:06.140 We easily fall in love with whoever's nearby.
00:21:09.860 Just proximity seems to be enough.
00:21:12.720 So how many of the people who just fell in love because they happen to be in the proximity
00:21:16.620 of another person who was willing to, you know, say yes,
00:21:20.340 how many of them are a good match?
00:21:22.600 It would be kind of weird if they were.
00:21:23.980 So I would say that 25% success rate of picking the right mate and making it last a lifetime.
00:21:34.120 Could be a little more, could be a little less.
00:21:36.380 It's definitely not over 50%.
00:21:38.280 So what kind of advice is it to give somebody advice that could not work more than half of the time?
00:21:45.480 Is that solid advice?
00:21:47.380 Advice that will not work at least half of the time.
00:21:51.780 I would call that bad advice.
00:21:53.980 Because you do not have a mechanism or the capability to pick a good mate.
00:22:00.220 Now, I suppose there's some really super obvious stuff like, you know,
00:22:05.660 somebody who's been in and out of jail their entire life and has no intention of stopping crime.
00:22:12.100 I wouldn't marry that person, right?
00:22:15.300 But I don't think that's what anybody's talking about.
00:22:17.480 I think most people are looking at average-looking people and saying,
00:22:22.360 I think that's my person.
00:22:23.300 I think that's the one that will work.
00:22:25.120 But it's not because they're so smart or they knew how to pick a good person.
00:22:29.440 Yeah.
00:22:29.980 No, I think it's pure magical thinking that you can pick the right person.
00:22:33.980 That said, that said, you should try as hard as you can to pick the right person.
00:22:40.560 I'm just saying it's magical thinking to think that's some kind of formula for success.
00:22:44.700 It's not.
00:22:46.280 It's something you should try to do, but you really don't know how to do it.
00:22:49.660 Nobody does.
00:22:51.340 Because it's mostly luck.
00:22:53.400 All right.
00:22:53.660 There's yet another announcement about AI movies, and now this Gen 2.
00:23:03.760 I don't know if it's HeyGen or Gen 2 or whatever it is.
00:23:05.980 But there is some new AI that can make movies, make an entire movie just from some prompts.
00:23:13.200 Here's my prediction about movies made by AI.
00:23:17.280 Do you know why I don't watch television right now, besides the fact that content is bad?
00:23:24.680 It's because there are so many streaming services, and each one takes a lot of effort to make
00:23:29.920 it work on any given day.
00:23:31.340 Like, I've got streaming services that work on some devices, but not others.
00:23:37.020 Oh, I could probably fix that.
00:23:39.320 But sometimes I just want to watch a show.
00:23:42.040 So I'll just go to the other streaming device.
00:23:45.280 So I've got all these streaming devices, and as I've said many times, instead of watching content,
00:23:51.180 which is what I used to do, now I just look for content, and I'm sure that there's something
00:23:56.400 better I haven't found yet, and then I never watch any content.
00:23:58.940 So the amount of content made watching content impractical.
00:24:06.160 Would you agree?
00:24:07.300 You understand what I'm saying, right?
00:24:08.720 It's like going to the Cheesecake Factory, and they've got the 50-page menu, and you're
00:24:13.840 sitting there with somebody who's not good at making decisions about food.
00:24:18.120 Don't go to the Cheesecake Factory.
00:24:20.900 DoorDash, do not go there with somebody who's not good with decisions.
00:24:25.060 You're going to be there a long fucking time.
00:24:27.200 Would you like to see my impression of going to the Cheesecake Factory?
00:24:34.560 Vegetarian page.
00:24:35.820 Boop, boop, boop.
00:24:36.820 All right, so you fish, vegetarian.
00:24:38.620 I'll do the same thing I did last time.
00:24:40.340 Good.
00:24:41.140 And decision made.
00:24:43.000 And here's me for the rest of the 15 minutes.
00:24:45.200 That's me eating at the Cheesecake Factory with anybody.
00:25:01.760 So where was I going on that?
00:25:03.440 Oh, AI movies.
00:25:04.660 I think when AI can make movies, everybody's going to make an AI movie.
00:25:08.260 Would you agree?
00:25:08.880 The moment you can make a movie just by typing in some text, there's going to be so many AI movies.
00:25:17.800 How are you ever going to watch anything?
00:25:20.420 Do you think any of these AI movies are going to be good?
00:25:24.540 Probably not.
00:25:25.860 I mean, it's based on human patterns, and only one out of, you know, a thousand movies are good.
00:25:30.280 So it's not going to be better than one in a thousand, but you know what's going to be different?
00:25:35.320 There will be a billion of them, right?
00:25:38.440 When you go to look for a movie, there will be a billion.
00:25:42.280 A billion movies.
00:25:44.760 Like, actually, a billion.
00:25:47.080 And how many of those will be good?
00:25:49.240 Maybe none.
00:25:51.120 Maybe one in a thousand.
00:25:53.220 How long are you going to look for the good one?
00:25:56.500 How much time are you going to spend looking for the good one?
00:25:58.780 Now, here's even, like, a deeper analysis of this.
00:26:03.880 Years ago, I made the mistake of trying to become a script writer for movies.
00:26:09.280 So I was going to make a Dilbert movie.
00:26:11.340 And I, you know, read some books and studied up on the structure of scripts.
00:26:16.800 Unfortunately, that process ruined movies for me forever.
00:26:20.840 Because once you know how a script has to be written in order to make it onto the screen,
00:26:26.280 you realize it's a formula.
00:26:27.900 And if you don't use the formula, you're not going to get it made.
00:26:31.400 Nobody's going to fund it.
00:26:32.440 Because they need the formula.
00:26:33.760 That's what works.
00:26:35.080 But once you see the formula, and you're like, oh, it's a three-act play.
00:26:38.660 I get it.
00:26:39.220 This is the first act.
00:26:40.520 We have to go through, somebody got hurt.
00:26:43.480 Something bad happened.
00:26:44.400 All right.
00:26:45.620 Then third act.
00:26:46.680 So, and then you know that the B plot interferes with the A plot.
00:26:51.020 You know that anything you see that's called out in the first act has to be important in the last act,
00:26:57.460 or else they wouldn't call it out in the first act.
00:26:59.380 And once you see the whole process and the structure, it doesn't look like art anymore.
00:27:04.820 And when it stops looking like art, it loses all of its punch.
00:27:11.100 So I would say that, and other people have had the same experience I've talked to them,
00:27:15.520 the moment you can see the gears, you lose all your love of the movies.
00:27:21.500 I don't know how people make movies and watch them, because they just should know too much.
00:27:28.460 Maybe they watch them for a different reason, to see how well they're made or something.
00:27:31.600 But once there are tons of AI movies, and they're all using the same formulas that humans use,
00:27:38.960 and there's billions of them, I think this will destroy movies as an art form.
00:27:44.380 I just don't think movies will be a thing.
00:27:47.660 Reels might be a thing, but not movies.
00:27:52.100 And of course, when I say that, it's more like radio.
00:27:55.060 You know, radio lasted forever, even though television was supposed to get rid of it.
00:27:58.800 So there will be probably some AI movies and other movies, but, you know,
00:28:03.980 movies as a major cultural phenomenon should shrink to a niche.
00:28:11.380 All right.
00:28:13.160 Democrats really don't like free speech.
00:28:16.120 You know, everybody says it, but when you see these numbers, it's shocking.
00:28:18.840 So Pew Research, or the Pew Group, or whoever there, found that Democrats and Democrat-leaning independents
00:28:25.820 are much more likely than Republicans and Republican-leaners to support the U.S. government
00:28:31.260 taking steps to restrict false information.
00:28:34.080 Now, this isn't just free speech.
00:28:35.940 It's specifically about false information.
00:28:38.680 So 70% of Democrats and leaning Democrats think the government should restrict what is,
00:28:45.300 in their opinion, false information.
00:28:47.020 And 39% of Republicans say the same.
00:28:53.460 So before any of you Republicans get all feeling good about yourself,
00:28:58.240 39% of Republicans think the government should restrict bad information.
00:29:05.920 Really?
00:29:07.020 How did we get to this place?
00:29:09.140 You see, the problem is that this requires somebody to know what information is good.
00:29:13.280 As soon as you say somebody is going to be the judge of what good information is and what's
00:29:18.720 bad information, you're dead.
00:29:21.800 You are so dead if you let somebody decide for you what's true.
00:29:26.940 You cannot live in that world.
00:29:29.000 That is the end.
00:29:30.680 That's the end of the republic.
00:29:32.340 How do 70% of Democrats and 39% of Republicans not know that allowing the government to tell
00:29:43.780 you what's true would be the end of the republic?
00:29:47.120 How do they not know that?
00:29:49.620 What's going wrong here?
00:29:51.020 And my best guess is that, at least with the Democrats, is that they've been so propagandized
00:29:57.720 that Republicans are putting out bad information that they can't see that their own information
00:30:05.920 is bad as well.
00:30:07.560 Is that what's happening?
00:30:08.420 Would you say that the propaganda makes them think that the only bad information is coming
00:30:14.420 from their enemies?
00:30:16.040 They don't understand that the bad information is coming from inside the house.
00:30:20.520 And they still don't know it, even after the pandemic.
00:30:23.780 Think about that.
00:30:25.140 Even after the pandemic, when we've learned almost everything the government told us was
00:30:30.000 a known lie.
00:30:31.760 They weren't just wrong.
00:30:33.080 They knew they were lying.
00:30:34.220 And yet, the Democrats, 70% of them, somehow are not influenced by the fact that we know
00:30:43.820 the government lied to us massively.
00:30:46.260 Not only about that, but everything from the laptop to you name it.
00:30:50.520 And that doesn't have an influence.
00:30:54.220 70% of the Democrat leaners and Democrats are not influenced by recent events.
00:31:00.760 They still want somebody to tell them what's true.
00:31:02.840 Or, more importantly, to tell the other side what's true.
00:31:06.140 Here's my take on this.
00:31:07.640 I think this is bullshit.
00:31:10.380 I think that it's just a team play response.
00:31:14.820 I think that when people answered the question, they were answering as Democrats.
00:31:19.680 Well, I would like my enemies not to talk as much.
00:31:22.600 I think that's what it is.
00:31:24.300 But if you ask them, I think if you took any of these 70% and sat them down,
00:31:29.520 just the two of you talking, not a poll, and said, all right, you know that both
00:31:35.040 sides have put out bad information.
00:31:37.380 Well, yes, that's true.
00:31:38.860 Right?
00:31:39.540 And you know that if somebody in particular is in charge of telling you what's true, you
00:31:45.020 know that you're in big trouble, right?
00:31:47.160 Yeah, that's true.
00:31:47.860 I mean, I don't think anybody actually holds this opinion.
00:31:51.640 Isn't that weird?
00:31:53.080 People are giving opinions they don't hold because they think there will be a strategic
00:31:58.560 benefit, I think.
00:32:00.420 So I think people aren't really answering what should be a philosophical question.
00:32:04.640 I think they're answering it strategically. That's partly what's happening.
00:32:10.080 All right.
00:32:10.440 But is it more than just false information?
00:32:16.100 Wouldn't you say that there's also a suppression by the left of opinion?
00:32:23.260 Don't you think that it's more than just what's true or what's false, but that opinion is also
00:32:28.840 being suppressed?
00:32:30.340 You know, stuff that's clearly opinion.
00:32:33.680 Because some of the opinion is about what's true and what's not true.
00:32:36.680 So you end up blocking opinion when you try to block fact.
00:32:42.640 All right.
00:32:45.880 Now, it might be just because there are so many Democrats who are unable to speak in
00:32:50.340 coherent sentences.
00:32:51.540 You got your Kamala Harris, your Joe Biden, your Fetterman, your Senator Feinstein, who actually
00:33:00.000 can't communicate anymore.
00:33:02.280 But now we can add to that Mitch McConnell.
00:33:04.100 If you saw Mitch McConnell's tragic press announcement and he stood there and actually
00:33:10.560 couldn't speak and he looked like he was having some kind of an event in his head.
00:33:16.380 Apparently he can speak because he did come back, but he's in bad shape.
00:33:21.840 So by my count, that is five major members of our government who can't form sentences.
00:33:29.740 Is that right?
00:33:30.860 Tell me I'm wrong.
00:33:31.920 There are five elected people who are really prominent.
00:33:36.220 These are not minor people.
00:33:38.180 These are prominent politicians.
00:33:40.260 Kamala Harris, Joe Biden, Fetterman, Feinstein, and now Mitch McConnell.
00:33:46.140 How is this okay?
00:33:49.640 How is this okay?
00:33:53.560 I mean, really?
00:33:54.720 There's nobody who can tell Mitch McConnell it's time.
00:34:01.680 I'm pretty disappointed in Republicans.
00:34:05.860 You know, let me just say this.
00:34:07.380 If you're a Republican or a conservative and you've been saying that Joe Biden needs to
00:34:12.040 step down because of age, but you're not saying it about Mitch McConnell,
00:34:17.100 then we can't take you seriously.
00:34:21.000 You're not a real serious person.
00:34:22.680 Both of them need to step down, like really soon.
00:34:28.260 Yeah, Grassley actually is weirdly, Grassley seems pretty capable, doesn't he, relatively speaking?
00:34:35.960 But, you know, like you said, he's close.
00:34:37.940 So, yeah, I mean, the team play here is just disgusting because none of these people would be in power if people were trying to pick good people.
00:34:53.440 So picking a good mate is like picking a good politician.
00:34:57.220 You have every good intention of doing it, but then, you know, there are reasons.
00:35:01.320 There are reasons why you've got to pick these bad ones.
00:35:03.940 And we're going to use those reasons.
00:35:06.540 Speaking of bad ones, Adam Schiff is so entertaining because I don't know what's in his head.
00:35:18.160 But it seems that he must be aware that he can say anything and the news won't check him on his side.
00:35:27.980 Only the Republicans will fact-check him, and he doesn't care about them because his base and his voters will never see it.
00:35:34.400 So he said, quote, the Republican desire to impeach someone, anyone, no matter if there's any evidence, just shows how they have descended into chaos.
00:35:46.820 Right.
00:35:47.600 So the Republicans are crazy because they want to impeach the person who's in charge of the border who's just letting everybody in.
00:35:55.760 That's crazy.
00:35:56.740 It's crazy to want to impeach somebody who's just not doing the job on a vital thing.
00:36:02.740 And of course, the Biden stuff is just beyond the pale at this point.
00:36:07.740 So at the same time, there might be a Trump indictment today.
00:36:11.640 Has there been any news yet on a Trump indictment?
00:36:14.300 Don't they usually happen in the morning?
00:36:17.360 Can we say there will be no Trump indictment today?
00:36:20.020 Am I wrong that they always happen in the morning?
00:36:23.700 Or is that just me imagining it?
00:36:28.960 Maybe I'm imagining that.
00:36:31.040 All right.
00:36:31.920 But we'll keep waiting then.
00:36:33.900 So I tweeted, because I thought it was funny, that the news today has UFOs and a possible Trump indictment.
00:36:42.120 That's got to tell you that Hunter's in trouble.
00:36:45.480 But then somebody pointed out, to ruin my fun, that the UFO hearings are all a Republican process.
00:36:50.640 So the Republicans are not trying to cover up any Hunter stories.
00:36:55.300 But it is funny that we get these two flashy stories at the same time Hunter's in trouble.
00:37:02.020 All right.
00:37:02.800 Coincidence.
00:37:03.900 Anyway, you all heard by now that Hunter lost his sweetheart plea deal.
00:37:08.120 Now, I'm no lawyer, so I'm going to give you the idiot's version of what I understand about this.
00:37:13.140 So Hunter had this sweetheart deal worked out in which he would plead guilty to some minor stuff among a large bag of potential stuff that could have been even worse.
00:37:25.200 Now, that's unusual.
00:37:27.260 Normally, you would plead guilty to, well, let me take another tack on this.
00:37:33.140 So the idea was, at least these smart people are saying, that it seems to have been crafted explicitly to get Hunter off, but more importantly, to prevent anybody from ever finding out more information.
00:37:48.600 And the way they would do that is have a deal in which there's no trial, so you don't have the discovery of the trial.
00:37:54.940 But moreover, if the press asked questions, they would still be able to say the investigations are ongoing.
00:38:02.760 So somehow they were going to say, we've made a deal and all the investigations are closed, while also saying they're still open so we can't talk about it.
00:38:10.920 It was some weird situation that was ridiculous on its surface.
00:38:17.080 And it was really transparent, even to the judge, who overturned it, because it seemed obvious that this was so non-standard, it could only be put together for the purpose of protecting Hunter.
00:38:30.580 Could not have been done, in any reasonable person's opinion, for any kind of justice or normal process that another person would go through.
00:38:39.900 So, here's my big question.
00:38:42.940 Is this going to unravel everything?
00:38:44.720 Will everything come to light now?
00:38:49.680 Because if Hunter can be prosecuted, doesn't that open up the ability to investigate everything related to it, which would necessarily include the entire Biden criminal family organization?
00:39:03.760 And by the way, I have no problem saying it's a Biden crime family.
00:39:08.160 Because I think the evidence in the news is now overwhelming.
00:39:12.420 And it seems to me that the 20 shell companies and the six banks and all that stuff, you could do away with all the alleged audio tapes.
00:39:28.300 You know, let's say we never hear any of these alleged Ukrainian audio tapes.
00:39:31.680 I don't think we ever will.
00:39:32.620 You could say that all the documents, like that 1023, you could say those are all fake.
00:39:41.040 But all you need is to know that a huge amount of money flowed from other countries into a labyrinth of shell companies and banks that has no purpose other than to hide your activity.
00:39:53.380 And that's the whole story.
00:39:57.100 You don't need any documentation.
00:39:59.380 All you have to do is show this flow of money and the structure it flowed into and then where it ended up.
00:40:06.000 Now, we do have direct witnesses, right?
00:40:08.400 We've got Devon Archer, etc.
00:40:11.540 Tony Bobulinski.
00:40:12.780 So we have direct eyewitnesses under oath.
00:40:18.880 And we've got the bank records and the money and where it went.
00:40:24.180 Right?
00:40:25.380 Now, although I would...
00:40:28.280 You make a good point.
00:40:29.900 I always say people are innocent until proven guilty.
00:40:32.680 But if I watch somebody shoot somebody in my living room with my own eyes and I'm standing right next to them, I'm not going to say that that person is innocent.
00:40:44.080 You know, I don't need the court to validate it because I can observe it directly.
00:40:48.520 This is one of those cases where you can observe it directly.
00:40:52.620 Because Hunter's defense has offered no explanation for why you would have all these shell companies and banks and even what they did for the money.
00:41:01.380 You know, what was the product they gave them for the money.
00:41:04.620 Without that, I would say this is as obvious as watching somebody murder somebody right in front of you.
00:41:11.540 And that would be a real...
00:41:13.260 It's a very rare exception to innocent until proven guilty.
00:41:17.600 Now, somebody who definitely should be innocent until proven guilty would be Andrew Tate.
00:41:23.700 Because we don't really know anything about that, do we?
00:41:26.780 We know these allegations, but nothing really sounds right about any of it.
00:41:32.480 And it's Romania, you don't know if there's corruption.
00:41:36.020 So Andrew Tate is the perfect example of innocent until proven guilty.
00:41:40.320 No matter what the truth is.
00:41:42.840 Whereas this Hunter situation was so well understood that I think you can go beyond the court opinion to have your own opinion about it.
00:41:52.680 Now, I would say that's different than the Trump boxes.
00:41:58.060 Because the Trump box case really relies on a lot of legal theory and who did what exactly when.
00:42:07.020 And I would say that's innocent until proven guilty.
00:42:10.300 I mean, it looks like maybe there were some technical violations that nobody should care about.
00:42:14.860 But, yeah.
00:42:16.320 Anyway, so we'll see what comes to that.
00:42:20.940 But it does look like...
00:42:22.400 I would say there's no chance of Joe Biden being the next president.
00:42:30.860 Do you think there's any chance of that now?
00:42:34.540 Because I think the evidence that we have already would guarantee he can't win against any Republican who's sane.
00:42:40.840 And you're going to have strong Republicans going into the race, whoever it is.
00:42:46.520 You know, it looks like Trump at the moment.
00:42:48.580 But no matter who it is, you know, any of the top four is going to just destroy Biden.
00:42:54.700 I mean, all you have to do, imagine a debate.
00:42:58.120 Imagine a debate in which Trump or somebody says,
00:43:01.020 look, we've discovered 20 shell companies, six banks, and whatever the number is,
00:43:07.760 $17 million that flowed into the family,
00:43:10.280 and no explanation of what services were provided for that.
00:43:14.400 And no explanation of why you needed to hide that money.
00:43:18.200 Now, ladies and gentlemen of America, I submit to you
00:43:21.540 that if they offered a defense, you should consider it.
00:43:26.360 They've not offered a defense.
00:43:27.960 And the facts are no longer under dispute.
00:43:33.160 Look with your own eyes.
00:43:34.880 See it with your own eyes.
00:43:36.280 This is the swamp.
00:43:38.000 This is what I'm trying to get rid of.
00:43:40.380 And I'm trying to get rid of the head of the swamp first.
00:43:44.020 Let me take the head of the snake,
00:43:45.540 and we'll worry about the rest of the snake when I get elected.
00:43:48.520 But I think all of you can see what I see.
00:43:51.820 You know, although I agree that people are innocent until proven guilty,
00:43:55.040 there is no other explanation for this money flowing from other countries
00:43:59.060 and from the countries that were exactly the portfolios that Biden was controlling
00:44:03.720 in terms of policy.
00:44:05.960 So let's not get caught up into the legal details.
00:44:09.500 You can all see the stuff that nobody's arguing about.
00:44:12.640 You can all see for yourself.
00:44:14.380 It's been proven.
00:44:15.300 We have the documents.
00:44:16.280 The money flowed from other countries.
00:44:17.700 There was no product or service provided.
00:44:22.220 This is the swamp.
00:44:24.220 This is what you need to get rid of.
00:44:27.860 Everybody wins in that case.
00:44:31.060 Now, I believe that the Democrats are unwilling to rig the election for Biden,
00:44:37.820 even if they could.
00:44:39.580 And there's no evidence that they ever did.
00:44:42.040 But if they could, hypothetically, I don't think they would.
00:44:45.340 Because I think they don't want him to run.
00:44:48.860 It's too much trouble.
00:44:51.320 So even CNN is starting to run some anti-Biden commentary.
00:45:00.360 So, yeah, I think Biden's done.
00:45:03.400 That's my prediction.
00:45:04.640 My prediction is that Biden can't win.
00:45:09.000 So I'm going to go with it doesn't matter who does win.
00:45:12.080 It matters, but that's not part of the prediction.
00:45:14.600 The prediction is Biden will not be president.
00:45:17.760 Whether he gets the primary or not,
00:45:20.920 he doesn't have, I'd say, any chance of winning at this point.
00:45:31.720 This will be a good test.
00:45:34.640 Have you heard anybody else say that Biden can't win,
00:45:38.120 based on the Hunter stuff especially?
00:45:39.880 Has anybody in the pundit space said?
00:45:45.980 Because I think most people are still on the page.
00:45:48.100 The Democrats are thinking he can win.
00:45:51.580 And Republicans are thinking he can win because something will happen with the election, right?
00:45:56.360 Or the Democrats don't care if he's sentient or not.
00:45:59.160 So I don't know how many people have predicted this, if any.
00:46:05.160 I haven't heard it from anybody.
00:46:06.580 But I'm going to say there isn't any chance he can be president again.
00:46:10.880 So between his health and the Hunter stuff and the fact that the Democrats don't want to put up with it again,
00:46:17.380 and even CNN has started to change its tune a little bit on the reporting.
00:46:20.440 Now, remember, you're saying that the cheating can make him president.
00:46:26.900 I'm saying that they wouldn't cheat in this case.
00:46:30.140 Because they don't want him to be president.
00:46:32.420 So if there were going to be cheating,
00:46:34.520 it would make far more sense to put in a better candidate ahead of time so you don't have to cheat.
00:46:40.560 See where I'm going?
00:46:41.500 If you imagine there's such a thing as some, you know, Democrat conspiracy to cheat,
00:46:49.400 they would be pretty well organized.
00:46:51.360 Would you agree?
00:46:52.620 If that existed, and there's no evidence of it,
00:46:55.000 but if it existed, they'd be well organized.
00:46:57.560 But they would certainly be well organized enough to make sure they were not in that position,
00:47:02.720 to make sure that Biden was not running.
00:47:04.860 And I think they could make that happen.
00:47:06.580 They, the imaginary people in charge.
00:47:08.900 So that's all just imagining that such people exist.
00:47:13.440 Maybe they do.
00:47:14.420 Maybe don't.
00:47:15.340 I don't know.
00:47:17.940 All right.
00:47:20.820 How bad should we feel about this?
00:47:23.640 I think we found something we can all come together on.
00:47:27.780 And that's rare.
00:47:29.420 The left, the right, black, white, LGBTQ.
00:47:33.880 There's one thing we can come together on.
00:47:35.400 I think we can come together as a nation and universally condemn those things that nobody ever said.
00:47:44.020 Can we agree?
00:47:45.640 Would you join with me to condemn the terrible things that nobody's ever said?
00:47:53.040 For example, nobody's ever said that slaves had a good deal.
00:47:56.400 Nobody ever said that slaves were benefiting, you know, in a way that made slavery sort of a good deal by having skills.
00:48:06.680 Nobody's ever thought it.
00:48:08.720 Nobody's ever said it.
00:48:10.520 But the entire country is deciding to argue over it.
00:48:13.860 The thing that nobody said and nobody believes.
00:48:16.160 Am I wrong about that?
00:48:19.620 Nobody said it.
00:48:21.340 Nobody believes it.
00:48:23.120 And we're having a big argument about it.
00:48:25.080 The thing that nobody said and nobody believes.
00:48:30.900 So that's happening.
00:48:33.520 But at least we can come together on condemning the things that never happened and nobody ever said.
00:48:38.020 And the funniest part about that story, it's, you know, the topic is history teaching in Florida and whether it should say that the slaves were acquiring skills that had some benefit to them later on.
00:48:56.400 And it turns out that the people who are really mad about the DeSantis plan for history lessons want to replace it with a more generic one that they're happy with.
00:49:09.660 Do you know what the difference is between the generic one that the unhappy people would like to replace and the one that DeSantis wants?
00:49:18.440 Do you know what the difference is?
00:49:20.400 On this question, none.
00:49:23.480 It says the same thing.
00:49:25.540 Same thing.
00:49:26.400 That slaves learn skills, which they could take with them later, both, you know, during slavery and after.
00:49:33.900 And it would be better to have a skill than not to have a skill.
00:49:36.900 They both say that.
00:49:39.020 Not only is the entire debate about something that nobody said, nobody said the part that the slaves had a good deal.
00:49:48.780 Everybody said the part that they learned skills.
00:49:51.980 And everybody agrees that learning skills is better than not learning skills.
00:49:55.240 So this is like a double, triple absurdity situation.
00:50:02.120 This is summer squared.
00:50:05.600 Stories that aren't real.
00:50:08.120 But we're, you know, we have something to talk about.
00:50:10.140 All right, I would like to give you the best part of my presentation today, which is how to deal with the Democrats, mostly the Biden camp, who are calling Republicans extreme MAGA.
00:50:24.120 Does it seem to you they're getting some traction on this extreme MAGA situation?
00:50:30.400 Does it feel like it's working?
00:50:32.260 Because it feels like other people are picking it up.
00:50:34.520 But I'm going to teach you a persuasion trick for removing extreme MAGA from the attack.
00:50:42.880 You ready?
00:50:43.380 Now, it's not much different from other techniques.
00:50:48.180 So you'll recognize the technique as an embrace and amplify.
00:50:54.100 But I'm tweaking embrace and amplify a little bit.
00:50:57.780 Because embrace and amplify means you just embrace somebody's dumbass idea.
00:51:01.960 And then you pretend you agree with it.
00:51:04.340 And the pretending you agree surfaces all the problem with the idea.
00:51:08.580 This is a little bit different, but it's a close cousin to that.
00:51:13.820 All right.
00:51:14.240 Imagine a video skit.
00:51:16.840 So it would be a viral video using actors.
00:51:21.220 All right.
00:51:21.760 The actors are a couple of very Republican, ordinary looking people, just like the most ordinary Republicans you could have.
00:51:30.960 But let's say they're a little bit overweight.
00:51:33.260 Just so you can imagine this.
00:51:34.600 I'm going to draw you a picture in your head.
00:51:36.540 They're a little bit overweight.
00:51:37.480 They're very middle class.
00:51:39.340 So you see them in their living room sitting on the couch.
00:51:43.000 Maybe mom is knitting or something.
00:51:45.980 And dad's there with a man's mirror, not a Bud Light.
00:51:49.920 And they're watching the news.
00:51:51.420 And you hear Joe Biden say something about extreme MAGA.
00:51:55.300 Extreme MAGA.
00:51:56.320 Extreme MAGA.
00:51:57.420 And you see the two looking at each other.
00:51:59.080 And they're like, well, we're regular MAGA.
00:52:03.720 Was that an upgrade?
00:52:05.360 Is that like a promotion?
00:52:07.480 Like, I'd like to be more extreme.
00:52:09.520 And so you see the two people trying to figure out how they can get to extreme.
00:52:16.640 And so they talk to their friends.
00:52:18.520 You know, it all happens really quickly because it's like a quick video.
00:52:21.420 They talk to their friends and they're like, we'd like to be more extreme.
00:52:25.620 And they're like, well, you could wear this, you know, this hat that has embossed lettering on it.
00:52:33.660 And they're like, is that extreme?
00:52:35.040 Well, compared to the regular hat that has no embossed lettering, that's way more extreme.
00:52:39.920 So you would do like little skits of regular Republicans trying to become or get upgraded to the excitement of being an extreme MAGA.
00:52:50.780 But they can't figure out what is included.
00:52:53.600 So they're trying to figure out how to do it.
00:52:55.420 What's the process?
00:52:56.760 How do I get to be more extreme?
00:52:58.740 And of course, the point of it is it's a ridiculous concept.
00:53:01.860 There's no such thing as an extreme MAGA.
00:53:04.440 It's all made up.
00:53:07.180 So you want to embrace it so that you can mock it.
00:53:12.360 That's different than embrace and amplify.
00:53:15.380 Because when you embrace and amplify, the mocking is sort of embedded in your actions.
00:53:20.060 But you're not actively mocking it.
00:53:22.140 You're pretending like you're just playing it straight.
00:53:24.580 In this case, you would be actively mocking it.
00:53:27.460 But you're embracing it before you actively mock it.
00:53:31.860 So close cousin of embrace and amplify.
00:53:35.440 Now, in my opinion, with the silliness of it, you can immediately take extreme MAGA away from what it sounds like now. Extreme MAGA sounds racist, right?
00:53:45.540 You know, crazy racist.
00:53:48.380 But you can make extreme MAGA just seem silly and cute and ridiculous.
00:53:53.420 Just by having more attention to the counterpoint.
00:53:59.980 What do you think?
00:54:01.860 Would it work?
00:54:03.500 Lots of videos of the extreme MAGA.
00:54:05.540 And then you also get, you'd also want your Republicans to make fun of it.
00:54:10.640 You know, can you imagine Vivek or, you know, somebody who's got a personality saying, you know,
00:54:15.620 are you extreme MAGA?
00:54:17.180 Do you agree with extreme MAGA?
00:54:19.120 There's an extreme MAGA?
00:54:21.180 Wow.
00:54:21.500 Is there any kind of a sign-up for that?
00:54:24.880 Or is it more of a self-identification?
00:54:28.280 How do I become more extreme MAGA?
00:54:31.440 You know, just mock the thing by smiling and laughing at it.
00:54:36.920 You know, I don't think I'm extreme MAGA enough.
00:54:39.860 I'm going to, you know, or how about this?
00:54:42.420 All right, here's another idea.
00:54:43.280 You show somebody with a Make America Great Again hat on, you know, the red hat, and then
00:54:48.700 they're trying to figure out how to become extreme, and then one of them has an idea.
00:54:52.700 And they turn the hat around backwards, and everybody's like, I think you did it.
00:55:01.840 That's like crazy.
00:55:03.600 Oh, my God.
00:55:04.500 My God.
00:55:05.140 Could we do that, too?
00:55:06.400 How did you do that?
00:55:07.900 And then they're all like, huh, huh, huh, huh, and they turn their hats around, and they're
00:55:14.700 like, and suddenly they're all excited.
00:55:17.580 We've done it.
00:55:18.520 We've done it.
00:55:19.240 And they go to have a drink, but it's only Bud Light.
00:55:22.900 They're like, huh, huh, huh.
00:55:25.860 It'd be hilarious.
00:55:27.560 You should make 50 of these.
00:55:30.060 We should use that AI.
00:55:32.720 That's what you do.
00:55:34.520 Since now we've been told, and I'm sure it's true, because people said so, that you can just
00:55:40.780 write text to create a whole movie.
00:55:43.120 So you can just write some text to create this little play of the MAGA people trying
00:55:49.180 to be extreme MAGA, and it'll be totally watchable because AI made it, right?
00:55:54.220 No, it won't.
00:55:55.900 If you want to show yourself that AI can't make a movie, no matter how many posts you see
00:56:01.600 about it, at least now, it can't make a movie, try to do that.
00:56:06.400 Try to do just that little short film, a two-minute clip with a very clear message, and see, you tell
00:56:14.540 your AI to make that, and watch how much you don't want to watch it when it's done.
00:56:18.820 But humans could make the hell out of that.
00:56:25.700 AI can't do that.
00:56:29.460 Yeah, you need humans editing, period, because the machine can't tell what's working and what
00:56:34.700 isn't.
00:56:35.480 That's a human job.
00:56:37.840 Make your bed.
00:56:40.580 What would be the funniest thing a MAGA person could do that in the humorous world would look
00:56:47.460 extreme?
00:56:48.820 But it would be silly.
00:56:51.780 Now, I was also imagining what would be extreme MAGA sporting events.
00:56:59.420 You know, if you had, like, an...
00:57:00.440 Because extreme MAGA sounds like extreme sports.
00:57:03.720 So I was imagining, suppose all the extreme MAGAs had some kind of Olympics just for extreme
00:57:09.100 MAGAs.
00:57:09.560 What events would be there?
00:57:11.720 And I thought, you know, the real Olympics already has the skiing and shooting.
00:57:16.100 I have this feeling that a Republican extreme MAGA events would just be regular sports, but
00:57:26.400 also shooting.
00:57:27.320 So it would be like tennis, but you could shoot.
00:57:33.020 You know, you could hit the ball or you could shoot it in the air.
00:57:35.420 It would be like basketball, but instead of playing defense, you could just shoot the
00:57:39.580 ball when it's in the air.
00:57:41.100 So I think you just take regular sports and you add shooting, and you got your extreme MAGA games.
00:57:49.560 Extreme MAGA!
00:57:51.080 We've got basketball, we've got soccer, we've got soccer, we've got curling, we've got curling, we've got curling, we've got curling, that's it.
00:58:05.960 All right.
00:58:06.880 We've got, we've got racing, we've got shooting, we've got lawn darts.
00:58:13.960 Well, the lawn darts stand alone.
00:58:16.240 That's the only one with no shooting.
00:58:18.340 We've just got lawn darts.
00:58:20.140 But you can fire them at each other.
00:58:22.360 It's less target practice and more team against team.
00:58:28.360 And cornhole.
00:58:29.060 It's cornhole, but we're shooting.
00:58:34.680 And no bud light.
00:58:38.960 Yeah.
00:58:39.840 Maybe it's mowing your lawn and not drinking a Bud Light.
00:58:44.520 I've been mowing, I'm so thirsty.
00:58:46.680 Yeah, that would be the competition.
00:58:48.480 You have to mow your lawn on a hot day, and there has to be a cold Bud Light sitting in front of you.
00:58:55.580 And you see who can go the longest without taking a sip of the Bud Light.
00:59:00.020 You have to just keep mowing in the sun.
00:59:02.480 And you can't use sunscreen.
00:59:05.260 You're just getting redder and redder.
00:59:06.720 Like, ah, ah, ah, ah.
00:59:11.000 Yeah, and then whoever, whoever can last the longest wins that competition.
00:59:17.100 Yeah, the mowing and resisting the Bud Light would be a good extreme MAGA event.
00:59:22.240 What else?
00:59:22.980 What would be extreme?
00:59:32.440 Chess boxing.
00:59:34.080 Chess boxing.
00:59:40.220 That's more mocking Republicans.
00:59:42.540 You can kind of see a chess board, and then two Republicans wearing boxing gloves.
00:59:47.800 And they don't know the rules of the chess game, so they just punch each other instead.
00:59:51.500 That would be pretty funny.
00:59:58.040 NASCAR.
00:59:59.260 We're shooting.
01:00:02.760 All right.
01:00:03.760 Ladies and gentlemen, I think I've done what I need to do, and we've accomplished everything except getting the locals platform to work.
01:00:12.080 So I guess I'll work on that when I'm done.
01:00:13.960 Well, yeah, the wet t-shirt contest.
01:00:18.640 Except it's all men.
01:00:26.720 Cornhole with grenades.
01:00:29.020 Now you're talking.
01:00:30.700 Jousting with baguettes, okay.
01:00:32.380 Yeah, Sinead O'Connor passed away.
01:00:39.440 I didn't hear of what.
01:00:41.520 Chess boxing is a thing?
01:00:44.080 Is it really?
01:00:48.640 The wife?
01:00:49.380 All right.
01:00:54.760 Competitive, get off my lawn.
01:00:56.680 Get off my porch.
01:01:01.840 All right.
01:01:02.320 I think we've done it.
01:01:03.460 Thank you for joining, and I will see you tomorrow, ladies and gentlemen.
01:01:08.080 And if you are a member of the local subscription service, I will probably see you tonight.
01:01:13.040 But I don't know how many are listening here.
01:01:17.120 Bye for now.