Real Coffee with Scott Adams - February 08, 2023


Episode 2013 Scott Adams: Climate Hoax Explained, State Of The Union Graded, New Printer Destroyed


Episode Stats

Length

1 hour and 14 minutes

Words per Minute

144.3

Word Count

10,825

Sentence Count

830

Misogynist Sentences

4

Hate Speech Sentences

16


Summary

Scott describes what he calls a civilization problem: a category of small problems, like updating a credit card or setting up a new printer, that would each take about an hour to fix. They are too big for a five-minute fix but too small to justify the hour, so they never get solved at all.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of civilization.
00:00:09.680 Although things might get a little brutal a little bit later on this live stream,
00:00:14.000 I've got a little problem with my printer.
00:00:16.780 You may have heard what happened to the last printer.
00:00:19.640 The last printer was destroyed on the floor of my office for non-performance.
00:00:24.760 But I've replaced it with a brand new HP laser printer, which has no option for finding Wi-Fi that I can determine.
00:00:34.760 So I'll probably decide by the end of this live stream whether I should destroy it on my floor for your entertainment.
00:00:44.600 I think $500 is not too expensive to entertain you.
00:00:49.180 But if you'd like to take this experience up to another level, all you need is a cup or mug or a glass,
00:00:54.580 a tank or chalice or stein, a canteen jug or a flask, a vessel of any kind,
00:00:58.920 fill it with your favorite liquid.
00:01:01.600 I like coffee.
00:01:03.200 And join me now for the unparalleled pleasure of the dopamine hit of the day,
00:01:06.560 the thing that makes everything better.
00:01:07.880 It's called a simultaneous sip, and it happens now.
00:01:12.580 Go.
00:01:12.880 Now, I'd like to speak to the boomers out there.
00:01:22.680 Boomers, I know we get a bad reputation for not being able to handle technology very well.
00:01:31.240 And, you know, maybe that's deserved.
00:01:33.900 Maybe it is.
00:01:35.240 It's possible.
00:01:35.860 But the other possibility is that we're busy, and there's a category of problems that I find I can't solve.
00:01:45.360 And I want to see if anybody has the same problem.
00:01:48.200 If I have a big problem, I will dedicate a lot of time to solving it, and then I do.
00:01:54.520 So I find that I can solve my big problems quite efficiently.
00:01:58.640 If I have a small problem that I can fix in maybe a minute or five minutes, I'm really good at fixing those.
00:02:08.500 If it's a five-minute problem, I'm on it.
00:02:11.040 If it's a big multi-day or even multi-week problem, oh, I can solve that.
00:02:16.260 But there's a category of problem that I can't solve.
00:02:19.680 And I think I'll be the first person to call this out as a civilization problem.
00:02:29.200 This is a really, really big problem.
00:02:32.280 And it goes like this.
00:02:34.240 I have a number of small problems.
00:02:37.640 They're small problems, but they might take more than an hour to fix.
00:02:42.880 For example, I had some credit card problems with a lost credit card.
00:02:48.500 So everything that's connected to my credit card starts throwing up, and then I have to go add a new credit card.
00:02:55.000 So I had a subscription to the Hulu service, and it told me, oh, your credit card's gone, so you can't keep watching.
00:03:03.860 So I said to myself, well, I'm a boomer, but one thing I can do is add a new credit card to an app.
00:03:12.000 I mean, how hard is that, right?
00:03:13.680 That's pretty simple.
00:03:14.340 So I opened the app, and the app says, oh, not here.
00:03:18.600 You don't add that credit card in the app.
00:03:20.780 Go to our website.
00:03:22.500 So I go to the website.
00:03:24.280 Now I'm approaching five minutes in.
00:03:27.420 Do you see the problem?
00:03:29.440 Because I'm starting to hit my five-minute limit, and I'm not going to be anywhere near getting ready.
00:03:35.280 So I go to the website, and I'm thinking, well, if it's just a website, and I just put in my email, it should be fine.
00:03:43.160 It should be five minutes.
00:03:44.320 And the website says, use iTunes.
00:03:47.820 What?
00:03:48.920 So I go to the app, to the website, to iTunes, and I don't have iTunes on my computer.
00:03:53.500 So am I going to download iTunes, when the reason I don't have it is because it's an enormous program?
00:04:02.620 I mean, I don't want it on my computer.
00:04:04.760 And if I put it on there, would I really be able to figure out within iTunes how to update my credit card?
00:04:11.480 Because I don't even know what that has to do with my credit card in Hulu.
00:04:14.920 What the hell does iTunes have to do with Hulu?
00:04:18.120 So what do I do?
00:04:19.100 I've made the decision to never watch Hulu for the rest of my life.
00:04:26.700 Now, I'm not blaming Hulu.
00:04:29.040 I'm not blaming myself.
00:04:30.620 I'm just saying that if it's a small problem, because watching Hulu doesn't change my life, right?
00:04:36.780 I can watch something else.
00:04:38.920 So Hulu can never be solved because I will never put an hour of time into it.
00:04:43.820 And if I did, I could totally solve it.
00:04:46.300 Totally.
00:04:46.660 Totally.
00:04:47.220 Now, if I were 25, what would I do in this situation?
00:04:51.660 Because I was 25 once, I know what I would have done.
00:04:54.380 What would you do if you were 25?
00:04:56.420 You'd spend an hour, right?
00:04:58.480 You'd spend an hour because you have an hour, and you really want to watch Hulu, and you don't have much else going on.
00:05:04.900 So I'd spend an hour, and I'd fix it.
00:05:06.640 I'm not sure this has to do with being a boomer.
00:05:10.680 I think it has to do with what I think my time is worth.
00:05:14.840 And I'm not going to spend an hour so I can watch more Hulu.
00:05:18.460 So the other day, I noticed I started getting ads on YouTube, which you don't get if you've paid for the, what's it called, RedTube or something?
00:05:27.720 If you paid for the subscription on YouTube, you don't see ads.
00:05:31.100 But the same credit card problem took down my RedTube, so I started getting ads.
00:05:36.800 Now, I'm unwilling to watch any service with ads.
00:05:39.960 I don't watch network TV with ads.
00:05:43.080 I won't watch anything with ads.
00:05:44.860 So I say to myself, well, I'm going to have to go put in my credit card.
00:05:50.180 I cannot figure out where in the world I would update my credit card.
00:05:56.140 It might be on my phone.
00:05:58.040 Is it an Apple credit card that's somehow interacted with YouTube?
00:06:01.880 Is it somehow connected to Gmail and Chrome because it's all Google?
00:06:07.240 Well, it's not on my app.
00:06:11.180 So I spent five minutes looking for how to update my credit card.
00:06:19.380 I couldn't find it in five minutes.
00:06:22.600 Or not RedTube.
00:06:23.480 I guess RedTube's porn, isn't it?
00:06:26.340 What's the name of the YouTube service where you don't see subscriptions?
00:06:33.140 All right.
00:06:33.800 YouTube what?
00:06:35.200 Premium.
00:06:35.820 YouTube Premium.
00:06:36.800 All right.
00:06:37.000 Whatever it is.
00:06:38.740 I think that was too much information.
00:06:41.620 So now my current decision is to never watch YouTube again or to spend an hour trying to
00:06:48.280 unfuck that.
00:06:49.900 I think I'm never going to watch YouTube again.
00:06:53.280 Why is it so hard to figure out where my credit card is when I'm trying to update it?
00:06:59.000 Now, take my printer.
00:07:00.200 I got this new HP laser printer because I destroyed my other one on the floor for being defective.
00:07:08.000 And the laser printer comes up.
00:07:11.120 And the only thing it says for setup is go to your computer and use the HP software that you download.
00:07:17.140 So I go to my computer and it says, find your printer on the network.
00:07:22.780 But there was nothing to put the printer on the network.
00:07:26.900 So I only have one interface and the interface doesn't say anything about a network.
00:07:34.820 All it says is it should be on the network, but no way to do it.
00:07:41.040 There's no separate menu.
00:07:42.200 There's no settings.
00:07:43.440 At least I can't find it.
00:07:44.460 Now, if I spend an hour, could I make my printer work?
00:07:51.520 What do you think?
00:07:52.900 If I spend one hour, I think I could.
00:07:56.600 Do you think I will spend one hour?
00:07:58.300 Or do you think I will spend five minutes destroying it on my floor in front of you
00:08:02.200 and then order a different model that might work the first time?
00:08:07.320 Because if I order a new one, my time is fairly valuable.
00:08:11.660 So if I just destroy the HP for not having a user interface that works for me
00:08:18.760 and I just buy another one, the new one will come
00:08:22.560 and I might be able to set it up in five minutes.
00:08:26.420 So what do I do?
00:08:28.580 What would you do?
00:08:30.820 Now, somebody says return it.
00:08:35.200 I basically don't return anything because I'm not going to spend the extra time.
00:08:39.500 So do I destroy it for your entertainment or do I spend an hour trying to fix it?
00:08:49.800 All right.
00:08:50.100 Well, think about that while we talk about the rest of this.
00:08:52.980 I saw Michael Shellenberger tweeting that somebody has figured out that Navy divers,
00:08:59.600 the U.S. Navy divers blew up that Nord Stream pipeline that people were wondering who blew it up.
00:09:06.900 Remember the United States blamed Russia for blowing up their own pipeline?
00:09:12.200 Well, is there anybody who can admit being embarrassed for ever or ever imagining that Russia blew up its own pipeline?
00:09:21.440 Is there anybody who would be willing to say, okay, I'm embarrassed I ever thought that?
00:09:26.440 You should have been embarrassed if you ever thought that Russia blew up its own pipeline.
00:09:32.260 There's not a chance in the world that that happened.
00:09:35.140 Now, I'm not sure I believe this story.
00:09:37.600 I don't know how anybody knows that there was a mid-summer NATO exercise
00:09:41.580 and Navy divers surreptitiously planted these explosives months in advance.
00:09:47.500 Eh, maybe, possibly, but who knows?
00:09:53.720 So I don't believe anything that's, you know, got sources like that.
00:09:58.980 President Biden's spokesperson, Karine Jean-Pierre, I love Greg Gutfeld's new name for her,
00:10:12.740 Cringe Jean-Pierre.
00:10:14.820 Cringe.
00:10:16.380 She's hard to watch without cringing.
00:10:19.160 And here's what I find interesting about watching her.
00:10:23.320 There's yet another clip of Karine Jean-Pierre trying to speak in public, and it just goes all wrong.
00:10:32.580 And I thought to myself, there's a lot of people in that room.
00:10:37.020 Do you think there was even one other person in that whole room of reporters and other politicians
00:10:42.620 or whoever's there, I don't know, other staffers,
00:10:45.100 do you think there was even one person in that room who would not be able to form a sentence?
00:10:50.320 The only person in the room who couldn't form a coherent sentence
00:10:55.240 was the spokesperson for the United States President.
00:11:00.920 And we're kind of okay with that.
00:11:05.160 What would it take?
00:11:07.000 What would it take to get fired?
00:11:09.560 It seems like there's nothing that's bad enough for the Democrats to admit,
00:11:18.560 all right, all right, maybe we ought to make a change here.
00:11:21.980 I mean, we literally, without any hyperbole,
00:11:27.640 have found the least capable communicator in the entire country
00:11:31.900 and put her in charge of communicating for the president,
00:11:35.660 who is the second least good, the second worst communicator in the world.
00:11:44.660 I mean, how did we get to this point
00:11:46.980 where the president and the spokesperson for the president can't speak?
00:11:54.380 Well, anyway, here we are.
00:11:55.680 So, Chef, do you follow Chef Andrew Gruel on Twitter?
00:12:06.300 He's a well-known chef whose last name is Gruel, G-R-U-E-L.
00:12:14.020 That's like the dentist who's Dr. Chu
00:12:15.900 and the baker whose name is Baker.
00:12:19.120 But Chef Gruel, he reports that there's lots of automation
00:12:25.120 happening in restaurants now.
00:12:26.360 So there's a Korean barbecue joint that uses robots to deliver food.
00:12:32.040 What do you think of that?
00:12:33.760 Robots to deliver your food.
00:12:36.080 Well, I plan to open someday a Dilbert Diner
00:12:38.560 where it's a restaurant for lonely introverts.
00:12:44.560 So instead of two tops and four tops,
00:12:46.680 the normal tables, you just have one tops.
00:12:49.880 Everything's just for you and your laptop.
00:12:52.620 Just one chair, a table, and a place to plug in your laptop.
00:12:57.660 And you'd order on your app, and the robot would bring it to you,
00:13:01.520 and you wouldn't have to deal with any humans whatsoever.
00:13:04.800 The Dilbert Diner.
00:13:06.380 Who would go?
00:13:07.280 Oh, and also the kitchen would be primarily robots.
00:13:12.240 Mostly a big robot that makes everything.
00:13:16.680 You would go to that.
00:13:18.220 Yeah, and Chef Gruel reports that the robot restaurant is packed.
00:13:25.560 So people love it.
00:13:27.340 Now, at some point, it would no longer be a novelty,
00:13:30.680 so it's not going to work that way.
00:13:32.040 But I think the Dilbert Diner is just begging to be made.
00:13:36.900 What do you think?
00:13:38.420 All right, Elon Musk has teased that he's going to be introducing, sometime real soon, Master Plan 3: The Path to a Fully Sustainable Energy Future for Earth. It will be presented on March 1st.
00:13:53.060 And he says, the future is bright.
00:13:55.660 Now, I could not be more interested in this.
00:14:02.040 All right, here's what I want.
00:14:04.420 Because you probably know that there are a number of people, Alex Epstein being among them, who say that the sort of all-electric world just can't work, like the math can't work.
00:14:19.560 But Elon Musk says it can. Elon Musk says it can.
00:14:25.100 So, don't you want to see that conversation take place in public?
00:14:29.640 Wouldn't you like to see Elon Musk defend his opinion that we can get there with batteries and solar panels and maybe some windmills, and eventually get rid of our carbon stuff?
00:14:44.580 Wouldn't you love to see that conversation?
00:14:47.360 Because here's my problem. I don't actually know who's right.
00:14:53.020 I guess that's always my problem.
00:14:54.960 But I'm not sure it is possible. But I'm also not sure it's impossible.
00:15:01.300 It feels like it's within the realm of something we could get to, but maybe we don't know how yet.
00:15:07.740 So there might be a difference in assumption of whether we can innovate our way to a better efficiency.
00:15:15.540 And maybe that's the only difference.
00:15:18.040 Yeah. It could come down to the only difference.
00:15:20.600 Yes, I think Elon Musk is pro-nuclear as well. So let's throw in nuclear.
00:15:26.040 All right.
00:15:26.360 A weird thing happened to me as I was flipping through channels last night looking for reactions to the State of the Union, which we'll talk about.
00:15:35.460 And I came across The Young Turks with Cenk.
00:15:42.740 And here's what I was expecting. I was expecting, wow, that Biden is killing it. He's so good. Donald Trump was a horrible person. And aren't we glad that Trump is gone and Biden's in.
00:15:58.720 And instead, it was closer to the opposite.
00:16:02.820 And I didn't watch much of it, but the brief part I saw appeared to be Cenk Uygur.
00:16:11.820 I have to apologize to him publicly for not being able to say his name correctly. That's on me.
00:16:20.180 So it looked like Cenk was turning pro-Trump, and it looked like somebody waking up.
00:16:33.420 And I'm wondering if anybody's noticed it.
00:16:39.180 Now, he didn't go so far as to say, I like Trump, he was better than Biden. I mean, that's way farther than he went.
00:16:45.340 But it looked like the spell was breaking.
00:16:50.000 And I watched it with fascination because he was saying things that were just sort of objectively true without any spin.
00:16:59.160 And the objectively true stuff was not leading him to his old opinions. The objectively true stuff was sort of leading him into the promised land of actually understanding what's going on here.
00:17:10.760 I don't know. Just give it a look and get back to me, okay?
00:17:15.360 Just look at his... just anything. Look at his reaction to the State of the Union and get back to me and tell me he sees it, right? Like he sees that Biden's not quite all there.
00:17:28.640 I think he does. I think he does.
00:17:31.340 So something interesting might be happening there.
00:17:33.420 All right.
00:17:35.380 Here's some more info on the Biden crime family.
00:17:38.320 I'm going to call them that from now on, because normally I would not accuse people of crimes unless I was sure that they were really crimes. But I'm sure they were really crimes.
00:17:50.200 Part of that is because the Marco Polo group has a 630-page report with over 2,000 citations and 459 alleged crimes committed by the Biden family and their business associates.
00:18:11.340 And that report has been sent to the House and the Senate and the U.S. attorneys.
00:18:17.540 459 crimes were identified, I think mostly from the laptops, or the laptop.
00:18:24.880 Now, obviously, 459 crimes is, I assume, stuff like he smoked crack once. That's one crime.
00:18:36.280 Then he smoked crack again. That's the second crime. Then he smoked crack again. That's the third crime.
00:18:40.500 So probably, you know, 443 of these are just smoking crack.
00:18:48.400 So, crack is not a crime. Well, I don't know.
00:18:55.500 So, I don't know how many actual crimes there were, but the financial ones are the ones I worry about.
00:19:01.880 So, let's talk about the State of the Union speech. Here are some of the funnier reactions.
00:19:09.520 And before I get to them, I'd just like to say something.
00:19:13.940 Because on this live stream, we like to take the high road. I'm all about the high road. Right?
00:19:20.520 I don't like to get down in the weeds and blame people and stuff. That's small. That's small. We're above that.
00:19:29.260 And one of the things that is my new calling, I guess, is to try to get people to stop judging an entire group by the worst members of the group. Right?
00:19:44.220 I mean, that's the lowest thing you can do. I mean, that's literally racist, sexist. It's literally the worst person you could be.
00:19:52.220 So, do not judge the entire group by any individuals who are bad. We don't do that.
00:20:00.760 So, that's why I'm not judging Democrats by the fact that their leader is a demented criminal.
00:20:08.920 Do not judge all Democrats that way. It's only the leader that they voted for and look like they'll vote for again.
00:20:15.960 So, just because they vote for somebody who's a demented criminal and a liar, that says nothing about them.
00:20:25.700 So, do not take the fact that they voted for him by an overwhelming majority and knew exactly what they were getting; there's no reflection on them.
00:20:34.820 So, we're not that kind of people.
00:20:37.040 All right, Thomas Massie had an interesting tweet about the State of the Union and Biden's performance.
00:20:43.120 He said the tone began as, quote, animatronic father at Disney Carousel of Progress, and quickly switched to, quote, man shouting at kids to get off his yard.
00:20:57.000 That's pretty funny.
00:20:59.720 A Twitter user named Mr. Joshua said this. This is my favorite one about Biden.
00:21:07.680 Strange anger outbursts that didn't match the message, like watching an old Chinese fight scene dubbed in English.
00:21:17.860 That's exactly what I saw. Because the yelling didn't match the message.
00:21:23.960 You know, it's like, where's my pencil? Where's my pencil? Where's my pencil? Where's the pencil?
00:21:33.420 That's what it felt like to me.
00:21:36.580 And then Geraldo, Geraldo tweeted.
00:21:40.060 I honestly can't tell if Geraldo is pranking us.
00:21:48.680 So I'm going to read Geraldo's tweet, and I don't think he's pranking. I don't think he is.
00:21:56.880 But see if you can tell the difference between a real tweet and something that's obviously a joke.
00:22:03.240 All right, I'll just read his tweet. So this is Geraldo.
00:22:05.800 Perhaps we were surprised because of the steady stream of naysayer critics who think Biden's a stuttering, babbling old fool. But that antique didn't show up at the State of the Union. The president we saw was robust, confident, and in control. He won the nomination for re-election last night.
00:22:29.140 Parody or reality?
00:22:33.460 It's hard to tell, isn't it? It's sort of hard to tell.
00:22:39.080 He had another tweet praising Biden, which makes me think he's real.
00:22:45.800 Did he actually see that? Like, that was his actual honest opinion when he watched it?
00:22:51.540 Because it could have been. It could have been.
00:22:54.000 Now, I happen to think that Geraldo's a straight shooter.
00:22:58.320 Whether you like his opinions or not, I have great respect for him.
00:23:02.440 Both for his careers and the fact that, as far as I know, he's brave, he's patriotic, and he tells the truth.
00:23:09.960 You might not like his opinions, but as an American citizen, he's one of the best, in my opinion.
00:23:17.720 And I think he's great on Fox, by the way. He does his job on Fox really well.
00:23:21.940 But here's what I saw.
00:23:26.880 If you're giving a State of the Union address, and you tell a whopper of a lie, like some Republicans want to sunset Medicare and Social Security, which means basically end them.
00:23:40.280 Now, he did specify that it was some Republicans. And he was very clear, once they started yelling at him, that he wasn't saying all Republicans.
00:23:51.780 But do you think that the State of the Union was where he should have been saying that Republicans want to end Social Security and Medicare?
00:24:04.260 Now, it turns out that there are a few Republicans who are suggesting that. But there's no way it's going to happen, because the vast majority of Republicans would never do that. So it's not a real thing.
00:24:15.360 But bringing it up is just grotesquely misleading, political, divisive bullshit.
00:24:25.620 And I loved the Republican reaction to it. The Republican reaction was they just yelled, liar.
00:24:34.080 They just yelled at him while he was speaking. And he had to stop and fight his way out of it.
00:24:39.980 Now, I'm totally happy with the Republican response. Are you?
00:24:46.500 You know, I wouldn't have minded that if it happened to a Republican either.
00:24:52.520 Because if you say something so overtly divisive and overtly misleading, people should stop you in the middle of your speech and just start shouting at you to get out of town.
00:25:05.920 That's exactly the right thing to do. And again, I would have supported it no matter what side the president was on.
00:25:11.680 That was so over the line. So inappropriate. And the Republicans called them on that. That was pretty funny.
00:25:18.640 My other favorite part was when... This is so stupid.
00:25:25.760 This is so stupid that the Republicans laughed out loud, and it was just an honest response to how stupid he was.
00:25:34.840 He said that people are complaining that the oil companies don't want to invest, and he's not stopping them from investing. He's just telling them that we only need oil for another 10 years.
00:25:47.680 So why don't they invest? Because, you know, they have 10 years to make money.
00:25:51.800 And the Republicans literally laughed out loud.
00:25:55.960 Now, I hope I don't need to explain why that's so dumb. Does anybody need that?
00:26:02.920 Nobody needs the explanation of why the oil industry is not interested in massive investment for a business that's only going to last 10 years.
00:26:12.300 Does anybody need the explanation? I don't think so.
00:26:17.080 Literally nobody would invest in that kind of a business. Nobody.
00:26:21.400 And Biden's up there in front of the country representing America, the capitalist ideal of the planet.
00:26:31.300 And he's up there saying, oh, I don't know why they don't invest for that 10-year game. Well, I'm trying to put them out of business as fast as I can. And if I can make that 10 years, 5 years, I'll do it as fast as I could.
00:26:43.080 Yeah, why don't you want to get into that business?
00:26:46.580 That's just stupid. Am I wrong? How else would you describe that?
00:26:54.120 Yeah, I mean, I don't even think it's senile.
00:26:57.260 Because, correct me if I'm wrong, it was on the teleprompter. Right?
00:27:02.020 If it's on the teleprompter, it's not really a senility problem.
00:27:05.800 But wow. Wow.
00:27:12.380 And then, you know, I tuned into MSNBC to see what the crazy people are saying.
00:27:18.060 I hear Lawrence O'Donnell's voice saying, this was the worst possible night for Kevin McCarthy.
00:27:26.040 What? What? What State of the Union was he watching? What?
00:27:35.800 And he went on, Lawrence O'Donnell, this is over for McCarthy. The year is over.
00:27:44.400 So I guess Geraldo and at least one person on MSNBC saw a commanding performance by Biden, which guaranteed him the nomination and will make the Republicans sad and ineffective for the rest of their time on this earth.
00:28:02.240 The other funny thing
00:28:03.420 he said was,
00:28:04.760 name me a world leader
00:28:05.740 who would change places
00:28:06.920 with Chairman Xi
00:28:09.060 as head of China.
00:28:11.420 His context was
00:28:13.140 that he's so hard
00:28:14.260 on China
00:28:14.840 and China's now
00:28:16.260 in such a bad position
00:28:17.380 because of how hard
00:28:18.340 Biden is on him
00:28:19.260 that there's no leader
00:28:21.400 in the world
00:28:21.980 who would want to be
00:28:23.140 the head of the
00:28:24.720 second largest economy
00:28:27.160 in the world.
00:28:27.800 Because little Kim,
00:28:31.180 he likes running
00:28:32.300 his hermit kingdom
00:28:33.280 and he would certainly
00:28:34.960 turn down a promotion
00:28:36.020 to be the boss
00:28:38.580 of China for life.
00:28:42.480 So,
00:28:43.120 that was stupid.
00:28:47.180 That was literally
00:28:48.480 just stupid.
00:28:49.900 Am I wrong about that?
00:28:54.860 I mean,
00:28:55.160 how else?
00:28:56.440 Now,
00:28:56.760 and keep in mind,
00:28:57.680 I think that was
00:28:58.400 on the teleprompter.
00:29:00.020 I think it was
00:29:00.580 on the teleprompter.
00:29:02.340 How in the world
00:29:03.300 did so many people
00:29:04.260 allow that
00:29:05.000 to get on the teleprompter?
00:29:06.760 How in the world?
00:29:09.640 Because this has
00:29:10.920 nothing to do
00:29:11.480 with politics, right?
00:29:12.640 This is not even
00:29:13.560 a political statement.
00:29:15.240 It's just stupid.
00:29:17.580 It's just literally stupid.
00:29:19.600 I don't know
00:29:20.000 what else to say about it.
00:29:22.740 Sarah Huckabee
00:29:23.580 did a good job
00:29:24.980 of, you know,
00:29:25.820 the counter-programming
00:29:27.540 after the State of the Union.
00:29:29.240 But here's the part
00:29:30.460 I like the best.
00:29:32.220 She said,
00:29:32.800 quote,
00:29:33.040 giving every child
00:29:33.880 access to a quality education
00:29:35.660 regardless of their race
00:29:37.600 or income
00:29:38.140 is the civil rights
00:29:39.340 issue of our day.
00:29:41.800 A plus.
00:29:42.720 A plus, plus, plus.
00:29:48.240 How many times
00:29:49.280 have I told you
00:29:50.100 that if Trump
00:29:52.360 or really
00:29:53.000 any leading Republican
00:29:54.860 simply stated
00:29:56.920 their policies
00:29:57.820 in the smartest frame,
00:30:00.680 they would be unbeatable.
00:30:02.440 But they don't do it
00:30:03.740 for whatever reason.
00:30:06.060 There is just
00:30:07.760 complete
00:30:08.740 easy, free money
00:30:11.400 laying there.
00:30:12.980 And
00:30:13.320 Sarah Huckabee Sanders
00:30:15.980 picked it up.
00:30:18.660 This is exactly
00:30:19.800 the right frame.
00:30:21.720 This
00:30:22.060 is everything.
00:30:24.920 Number one,
00:30:25.760 she understood
00:30:26.960 the priority
00:30:27.760 best.
00:30:29.100 Who was it
00:30:29.760 recently saying
00:30:30.440 that the biggest risk
00:30:31.320 to the United States
00:30:32.220 was the
00:30:33.840 teachers' union?
00:30:37.180 Somebody said
00:30:39.480 that was a bigger...
00:30:40.520 Somebody,
00:30:41.500 a pundit said that?
00:30:43.520 Matt Gaetz?
00:30:44.560 Well, I've said it,
00:30:45.660 but somebody else
00:30:46.360 said it.
00:30:48.320 Somebody famous
00:30:49.180 said it recently.
00:30:50.620 And, yeah,
00:30:51.260 I've said it,
00:30:51.860 of course.
00:30:52.300 Oh, it was Pompeo.
00:30:53.500 Yes, I'm sorry.
00:30:54.380 Mike Pompeo.
00:30:56.820 So,
00:30:57.500 have I ever
00:30:58.520 mentioned how smart
00:30:59.420 Mike Pompeo is?
00:31:00.600 Do you have any idea
00:31:02.460 how smart that guy is?
00:31:03.580 He's not
00:31:05.620 ordinary smart.
00:31:06.820 He's like
00:31:07.300 sort of
00:31:08.560 off-the-chart smart.
00:31:10.240 Now,
00:31:10.700 I don't
00:31:11.080 love him for president
00:31:12.680 because I've got
00:31:13.240 some problems
00:31:13.740 with his policy preferences,
00:31:15.680 but
00:31:15.940 he is really smart.
00:31:19.380 You can't take
00:31:20.220 that away from him.
00:31:23.380 Anyway,
00:31:23.920 so
00:31:24.000 here are
00:31:27.060 two Republicans
00:31:27.840 who have
00:31:28.780 correctly identified
00:31:29.900 the biggest
00:31:31.120 risk to the
00:31:32.680 republic.
00:31:33.580 That's good.
00:31:35.180 That's good.
00:31:36.020 But
00:31:36.380 they're getting
00:31:37.160 very close
00:31:37.940 to just a
00:31:38.840 perfect framing
00:31:39.760 of the education
00:31:40.760 thing.
00:31:41.820 And I think
00:31:42.260 you could even
00:31:43.160 go a little
00:31:43.560 further.
00:31:45.660 It might be
00:31:46.380 hard for a
00:31:47.220 national politician
00:31:48.740 to say what
00:31:49.300 I'm going to say,
00:31:50.360 but here's
00:31:50.760 something I've
00:31:51.120 said before,
00:31:52.260 and I would
00:31:52.640 love to see
00:31:53.080 him say it.
00:31:54.300 We're never
00:31:54.960 going to agree
00:31:55.500 on reparations.
00:31:57.540 So let's do
00:31:58.320 what we can do.
00:32:00.180 Let's do
00:32:00.760 what's doable.
00:32:02.140 What's doable
00:32:02.800 is fixing
00:32:03.900 the biggest
00:32:04.460 part of
00:32:05.060 systemic racism,
00:32:06.540 which is
00:32:06.940 the school
00:32:07.320 system.
00:32:08.540 But you have
00:32:09.240 to fix it
00:32:09.760 for the poor
00:32:10.280 white people,
00:32:11.060 the poor
00:32:11.380 Asian people,
00:32:12.580 Asian American
00:32:13.160 people,
00:32:13.660 the poor
00:32:14.040 Hispanic American
00:32:14.940 people.
00:32:15.620 You have to
00:32:16.100 fix it for
00:32:16.580 everybody.
00:32:17.680 And if you
00:32:18.260 do,
00:32:19.120 this would be
00:32:19.700 speaking to
00:32:20.300 the black
00:32:21.020 American community,
00:32:22.240 if you can
00:32:23.420 join with us
00:32:24.260 and just fix it
00:32:25.580 for everybody
00:32:26.200 who is poor,
00:32:26.900 you will have
00:32:29.600 done the
00:32:30.140 greatest service
00:32:31.020 for the
00:32:32.180 United States
00:32:33.080 ever.
00:32:35.880 It would be
00:32:36.640 bigger than
00:32:37.240 anything.
00:32:38.320 And that
00:32:38.760 would be
00:32:39.240 something that
00:32:39.920 the black
00:32:40.400 American community
00:32:41.540 could own.
00:32:42.820 They could own
00:32:43.400 that.
00:32:44.240 One of the
00:32:44.620 biggest
00:32:44.840 accomplishments
00:32:45.440 that our
00:32:46.340 country could
00:32:46.820 ever hope
00:32:47.780 to have,
00:32:48.560 fixing education.
00:32:50.600 And if that's
00:32:52.460 not worth
00:32:52.940 anything to you,
00:32:53.880 to have that
00:32:55.140 sort of in
00:32:56.520 your history,
00:32:58.880 we teach
00:32:59.740 Black History
00:33:00.460 Month,
00:33:00.840 right?
00:33:02.260 So there's
00:33:03.140 a genuine
00:33:04.540 sort of
00:33:05.180 universal
00:33:05.760 interest in
00:33:07.500 making sure
00:33:08.020 that the
00:33:08.480 black American
00:33:09.260 community is
00:33:11.160 completely
00:33:11.740 credited for
00:33:13.060 historical
00:33:14.620 accomplishments,
00:33:15.480 et cetera,
00:33:15.920 contributions.
00:33:17.160 This could be
00:33:17.840 bigger than
00:33:18.260 all of that.
00:33:19.840 Because this
00:33:20.520 gets to
00:33:21.720 America,
00:33:25.100 basically.
00:33:26.540 Right?
00:33:26.980 The thing
00:33:27.480 that made
00:33:27.740 America America
00:33:28.460 is that the
00:33:29.480 various disparate
00:33:32.120 groups could
00:33:32.680 get along
00:33:33.140 somehow.
00:33:34.080 That we had
00:33:34.460 a system that
00:33:35.120 would make
00:33:35.420 all these
00:33:35.720 different people
00:33:36.480 be on the
00:33:38.240 same team.
00:33:40.020 So this
00:33:41.060 is the
00:33:41.620 biggest one.
00:33:42.780 If you make
00:33:43.240 this work,
00:33:44.540 I would say
00:33:45.260 you could add
00:33:45.780 another month
00:33:46.400 to Black
00:33:47.060 History Month.
00:33:47.720 Make it two
00:33:48.400 months.
00:33:49.460 Because I
00:33:50.360 think we
00:33:50.700 would all be
00:33:51.240 grateful to
00:33:52.700 be on the
00:33:53.060 same side
00:33:53.640 for something
00:33:54.260 that big.
00:33:57.380 Here's
00:33:57.900 something I
00:33:58.340 learned today
00:33:58.880 from Congressman
00:33:59.960 Troy Nehls.
00:34:02.000 I didn't
00:34:02.500 know that the
00:34:02.980 cartels have
00:34:03.580 an air force.
00:34:05.360 Did you
00:34:05.680 know that?
00:34:07.000 That's not
00:34:07.720 very big.
00:34:08.860 But apparently
00:34:09.380 there have
00:34:09.800 been 10,000
00:34:10.800 cartel drone
00:34:11.860 detections.
00:34:13.160 Not 10,000
00:34:14.000 drones, but
00:34:14.680 10,000
00:34:15.280 detections.
00:34:16.560 So I guess
00:34:17.040 the cartels
00:34:17.660 used the drones
00:34:18.500 to survey
00:34:19.380 for law
00:34:21.320 enforcement
00:34:21.860 and for
00:34:22.780 other stuff
00:34:23.520 that they
00:34:23.840 might need.
00:34:25.680 They have
00:34:26.360 an air force.
00:34:27.740 Now, do
00:34:29.440 you think
00:34:29.800 if we just
00:34:30.460 keep waiting
00:34:31.220 it'll be
00:34:32.740 easier to
00:34:33.420 take them
00:34:33.820 out?
00:34:35.160 I have a
00:34:35.940 feeling we
00:34:36.340 ought to
00:34:36.860 maybe get
00:34:37.520 on this.
00:34:38.800 Because they've
00:34:39.700 got a little
00:34:40.080 bit of a
00:34:40.460 navy and a
00:34:41.060 little bit of
00:34:41.420 an air force
00:34:41.960 already.
00:34:42.340 I don't think
00:34:44.320 it's impossible
00:34:45.140 that they could
00:34:45.780 get missiles.
00:34:49.500 I believe that if
00:34:53.080 they're not
00:34:53.460 already dropping
00:34:54.260 bombs from
00:34:54.980 the air with
00:34:55.920 the drones
00:34:56.340 that they
00:34:56.660 have, that
00:34:57.880 it's going to
00:34:58.280 happen really
00:34:58.760 fast.
00:35:00.220 I think the
00:35:00.900 cartels are
00:35:01.540 going to be
00:35:01.880 bombing their
00:35:02.740 enemies with
00:35:04.360 drones.
00:35:05.940 If not
00:35:06.680 already, they
00:35:07.240 might already
00:35:07.600 be doing it.
00:35:09.600 So maybe we
00:35:10.580 should get on
00:35:11.360 that.
00:35:12.700 Part of the
00:35:13.520 State of the
00:35:14.780 Union was
00:35:15.300 talking about
00:35:16.260 the talk.
00:35:18.080 Apparently
00:35:18.460 there's
00:35:18.740 something called
00:35:19.360 the talk
00:35:20.240 that black
00:35:22.380 families often
00:35:23.280 have with
00:35:23.740 their kids
00:35:24.120 when they
00:35:24.480 become driving
00:35:25.560 age, maybe
00:35:27.040 before.
00:35:28.300 And the
00:35:30.440 talk is if
00:35:32.180 you're stopped
00:35:32.780 by a police
00:35:34.080 officer to
00:35:35.740 treat the
00:35:36.380 police officer
00:35:37.080 respectfully and
00:35:38.340 make sure that
00:35:39.280 your hands are
00:35:39.860 shown.
00:35:40.580 Basically, the
00:35:42.020 talk is telling
00:35:43.880 people how to
00:35:44.500 handle the
00:35:45.280 authority figures
00:35:46.260 without getting
00:35:47.360 killed.
00:35:49.380 Now, Matt
00:35:50.120 Walsh pointed
00:35:50.740 out that it's
00:35:52.340 not just black
00:35:53.600 and brown
00:35:54.040 families.
00:35:55.220 Like, who
00:35:55.720 thinks that
00:35:56.340 only black and
00:35:57.040 brown families
00:35:57.600 have that talk?
00:35:58.720 So, Matt
00:35:59.320 Walsh said,
00:36:00.380 when I started
00:36:00.980 driving, my
00:36:01.640 parents talked
00:36:02.240 to me about
00:36:02.740 what to do
00:36:03.180 if I'm pulled
00:36:03.700 over and
00:36:04.340 explained that I
00:36:05.120 should be
00:36:05.440 respectful to
00:36:06.240 the officer.
00:36:07.480 This idea that
00:36:08.360 only black
00:36:08.860 families need to
00:36:09.640 have this
00:36:10.000 conversation is
00:36:11.140 just utterly
00:36:11.640 ridiculous
00:36:12.160 nonsense.
00:36:14.220 Now, my
00:36:16.100 parents never
00:36:16.700 had this
00:36:17.100 conversation with
00:36:17.900 me.
00:36:19.760 And I'm not
00:36:21.660 sure why, but
00:36:22.420 I'm not a
00:36:23.040 mind reader, but
00:36:23.720 I speculate.
00:36:24.960 Here's my
00:36:25.420 speculation.
00:36:26.400 I think my
00:36:27.100 parents never
00:36:28.540 sat me down
00:36:29.240 and told me
00:36:29.920 that I should
00:36:31.160 avoid resisting
00:36:32.280 a police officer
00:36:33.500 with a gun.
00:36:34.320 They may
00:36:37.420 have, and
00:36:37.920 I'm just
00:36:38.220 speculating,
00:36:38.880 they may
00:36:39.600 have thought
00:36:40.420 that I'm
00:36:41.780 not a
00:36:42.100 fucking idiot.
00:36:44.060 They may
00:36:44.760 have.
00:36:45.940 Because if I
00:36:46.920 were an
00:36:48.540 idiot, I
00:36:49.800 wouldn't know
00:36:50.440 not to start a
00:36:51.340 fight with a
00:36:51.760 guy with a
00:36:52.140 gun.
00:36:53.660 But I was
00:36:54.320 trying to
00:36:54.620 think at
00:36:55.060 what age I
00:36:56.760 had independently
00:36:57.660 figured out not
00:36:58.700 to start a
00:36:59.260 fight with
00:36:59.640 somebody with a
00:37:00.260 gun.
00:37:01.520 I'm thinking
00:37:01.960 it was age
00:37:02.440 three.
00:37:04.320 Age three.
00:37:06.420 Well, how
00:37:07.020 old were you
00:37:07.520 when you
00:37:07.780 realized that
00:37:08.340 starting a
00:37:08.820 fight with
00:37:09.540 somebody who
00:37:09.960 had a gun
00:37:10.560 and a reason
00:37:12.140 to use it?
00:37:14.120 How old were
00:37:14.840 you when you
00:37:15.140 realized that
00:37:15.540 was a bad
00:37:15.940 idea?
00:37:17.420 About three,
00:37:18.260 right?
00:37:19.160 You were
00:37:19.560 probably watching
00:37:20.120 TV, and you
00:37:21.720 thought, wow,
00:37:22.300 if one of them
00:37:22.800 has a gun, and
00:37:25.060 that other one
00:37:25.540 doesn't have a
00:37:26.100 gun, two and
00:37:28.500 two, yeah,
00:37:29.660 yeah, you
00:37:30.420 don't want to
00:37:30.800 start a fight
00:37:31.280 with somebody
00:37:31.660 who has a
00:37:31.980 gun.
00:37:32.140 So, do
00:37:35.200 you really
00:37:35.520 need to have
00:37:40.760 that talk?
00:37:42.060 All right, I
00:37:42.820 said something
00:37:43.380 highly provocative
00:37:44.340 today.
00:37:45.460 It's a question,
00:37:46.860 but I actually
00:37:47.600 wonder, has
00:37:48.920 anybody who
00:37:49.680 was sober,
00:37:51.000 that includes
00:37:51.740 drugs, right,
00:37:52.640 sober, has
00:37:53.640 there ever been
00:37:54.160 someone who
00:37:54.560 was sober and
00:37:56.080 had an IQ
00:37:56.740 over 110,
00:37:58.820 who has
00:37:59.620 ever been
00:38:00.040 killed in a
00:38:00.780 police stop in
00:38:01.620 a vehicle?
00:38:04.000 Like ever?
00:38:05.900 I don't think
00:38:06.720 so.
00:38:07.760 So, I think the
00:38:08.720 thing that we're
00:38:09.180 avoiding is it's
00:38:10.020 a stupid drunk
00:38:11.480 problem, or a
00:38:13.140 stupid inebriated
00:38:14.080 problem.
00:38:15.020 We keep asking
00:38:15.960 like it's some
00:38:16.600 kind of police
00:38:17.160 problem, or some
00:38:18.240 kind of a race
00:38:19.200 problem, it's
00:38:19.860 just a stupid
00:38:20.540 problem.
00:38:21.000 It's a stupid
00:38:21.620 problem.
00:38:22.800 Stupid and
00:38:23.460 drunk, right?
00:38:24.920 I don't believe
00:38:25.880 any black
00:38:28.520 person who,
00:38:29.740 you know, was
00:38:31.900 totally sober
00:38:33.020 and, let's
00:38:36.120 say, was
00:38:36.660 smart enough
00:38:37.120 to have a
00:38:37.440 job, wasn't
00:38:39.740 a criminal, had
00:38:40.740 a job, I
00:38:42.720 don't think any
00:38:43.160 of them have
00:38:43.540 ever been
00:38:43.900 killed.
00:38:45.300 And if they
00:38:46.120 were, it was
00:38:46.940 because something
00:38:47.540 weird happened.
00:38:48.460 Now, I tweeted
00:38:49.000 this, and I
00:38:49.520 expected a bunch
00:38:50.240 of people to
00:38:50.720 say what you
00:38:52.360 said.
00:38:52.940 So, here I'm
00:38:54.400 seeing the
00:38:54.800 comments, OMG,
00:38:55.760 seriously, I
00:38:58.540 was looking for,
00:38:59.280 I thought people
00:38:59.860 would give me
00:39:00.320 examples of the
00:39:01.120 opposite.
00:39:02.360 But they gave
00:39:03.080 me examples of
00:39:04.060 people who were
00:39:04.620 killed not in
00:39:05.760 car stops, people
00:39:08.220 who were drunk.
00:39:11.740 But are we at a
00:39:13.620 point where we
00:39:14.080 can just say
00:39:14.540 this?
00:39:15.720 Yeah.
00:39:16.180 No, this has
00:39:16.760 nothing to do
00:39:17.260 with race.
00:39:18.600 I'm not bringing
00:39:19.420 race into it at
00:39:20.140 all.
00:39:20.300 Well, I'm
00:39:21.240 saying that no
00:39:21.920 matter who you
00:39:22.480 are, if you're
00:39:23.940 sober, and you're
00:39:25.980 an adult, and you
00:39:27.060 have an IQ over
00:39:27.840 110, I'll bet
00:39:29.740 nobody's ever been
00:39:30.560 killed by police.
00:39:32.520 Unless it was
00:39:33.180 just the weirdest
00:39:33.780 situation of, you
00:39:35.060 know, mistaken
00:39:36.420 something, some
00:39:37.540 kind of mistaken
00:39:38.140 identity thing.
00:39:39.740 But that would
00:39:40.440 happen, you know,
00:39:41.160 no matter what you
00:39:41.740 did, if it was
00:39:42.320 mistaken identity.
00:39:47.480 No, that's not
00:39:48.300 true.
00:39:49.000 That is a racist
00:39:49.860 statement.
00:39:50.400 If you think
00:39:50.940 that speaking
00:39:51.600 about IQ in
00:39:52.760 this context is
00:39:53.920 racial, it's
00:39:55.240 not.
00:39:55.960 That's on you.
00:39:57.140 You're making
00:39:57.740 that connection.
00:39:58.760 I'm just saying
00:39:59.340 that if you're
00:40:00.020 smart and sober,
00:40:01.800 you don't start a
00:40:02.820 fight with somebody
00:40:03.480 who has a gun.
00:40:04.840 Do you?
00:40:06.260 And if you do,
00:40:08.100 I'm not sure you
00:40:08.920 have an IQ of
00:40:09.540 110.
00:40:11.160 You know, maybe
00:40:11.520 you have some
00:40:11.840 anger problem.
00:40:12.880 But the talk
00:40:13.540 isn't going to
00:40:14.020 help you.
00:40:15.040 How much is the
00:40:15.780 talk going to
00:40:16.500 help you?
00:40:16.880 All right.
00:40:21.180 Anyway, those
00:40:22.260 are just some
00:40:22.620 questions.
00:40:23.400 I'll probably
00:40:23.760 get canceled for
00:40:24.460 saying that.
00:40:26.260 And that's
00:40:26.860 fine.
00:40:27.760 There's a new
00:40:28.500 piece of software
00:40:29.220 that should scare
00:40:30.300 everybody in the
00:40:31.760 creative industry,
00:40:33.860 especially if you
00:40:34.740 make movies.
00:40:35.400 So I saw this
00:40:36.460 from Machiavelli's
00:40:38.220 account on
00:40:40.900 Twitter.
00:40:41.380 And the new
00:40:41.900 software is called
00:40:42.680 Runway.
00:40:44.240 And it will be
00:40:45.020 hard for me to
00:40:45.620 describe exactly
00:40:46.440 what it can do,
00:40:47.660 but I'll give you
00:40:48.140 some examples.
00:40:50.540 You can take any
00:40:52.100 existing video,
00:40:55.500 and you can
00:40:56.980 combine it with
00:40:58.400 some other video,
00:40:59.880 and you can say
00:41:00.540 use the style of
00:41:01.700 the other video,
00:41:02.420 and just make a
00:41:04.560 whole complete
00:41:05.620 video with that
00:41:06.420 style.
00:41:07.480 Now, that doesn't
00:41:08.400 quite explain what
00:41:09.280 it is until I give
00:41:10.000 you the example.
00:41:11.360 So the example
00:41:12.180 showed a video of
00:41:13.920 somebody walking on
00:41:14.900 the moon, you
00:41:16.000 know, a moon
00:41:16.360 walk, and then
00:41:18.000 separately, somebody
00:41:19.860 took a video of
00:41:20.780 themselves walking
00:41:21.780 in the snow.
00:41:23.600 And then they
00:41:24.440 said, combine these
00:41:25.420 two, but use the
00:41:26.260 style of the
00:41:27.280 spaceman.
00:41:29.060 And so the
00:41:29.860 person walking
00:41:30.560 in the snow
00:41:31.340 turned into a
00:41:33.180 moonscape with a
00:41:34.780 person in a space
00:41:35.960 outfit walking
00:41:36.700 on the moon.
00:41:38.660 They basically
00:41:39.760 created a whole
00:41:40.940 movie with a
00:41:42.320 style, costumes,
00:41:43.800 and everything
00:41:44.120 else, just by
00:41:45.800 one sentence.
00:41:47.860 Use the style of
00:41:48.800 this video.
00:41:50.680 That's it.
00:41:51.880 And then as the
00:41:52.640 person who made
00:41:53.520 their own video
00:41:54.080 moved around,
00:41:54.900 they were just
00:41:55.340 moving around,
00:41:56.440 but wearing a
00:41:57.120 space suit.
00:41:57.920 And you can do
00:41:58.700 that with faces,
00:42:00.360 backgrounds,
00:42:01.340 people, you
00:42:02.500 can create an
00:42:03.460 entire movie just
00:42:05.500 by walking around
00:42:06.600 in your own
00:42:07.240 clothes, and
00:42:08.760 then later, you
00:42:09.780 know, bringing a
00:42:10.340 style over, and
00:42:11.740 the entire movie
00:42:13.440 will be populated.
00:42:14.620 So you don't even
00:42:15.500 have to go frame
00:42:16.280 to frame, say,
00:42:17.700 all right, in this
00:42:18.220 frame, replace this
00:42:19.720 with this.
00:42:20.900 It does it all
00:42:22.000 instantly and
00:42:24.140 perfectly.
00:42:25.380 Now add that to
00:42:26.400 the fact that you
00:42:26.980 can reproduce any
00:42:27.780 voice now with AI,
00:42:29.820 and you
00:42:31.740 can type in any
00:42:32.580 dialogue, and you
00:42:34.060 might not even have
00:42:34.940 to give it dialogue.
00:42:36.940 I think we're
00:42:37.760 pretty close to the
00:42:38.740 point with a few
00:42:39.480 different AIs.
00:42:40.660 It's not all in one
00:42:41.480 AI, but a few
00:42:42.960 different.
00:42:43.480 I think you could
00:42:44.060 write the script and
00:42:45.860 have it create the
00:42:46.740 movie in about 10
00:42:48.960 minutes.
00:42:50.420 Here's how you do
00:42:51.200 it.
00:42:51.980 You say, hey, AI,
00:42:53.380 I'm not sure which
00:42:54.060 one you use for this,
00:42:54.880 but one of them does
00:42:55.520 this.
00:42:55.740 Hey, AI, write me
00:42:58.280 a movie that has a
00:42:59.840 three-act structure and
00:43:02.160 is in the style of,
00:43:03.720 and then you name a
00:43:04.500 famous movie writer,
00:43:07.080 or something in the
00:43:08.340 style of Tom Cruise's
00:43:11.520 Mission Impossible.
00:43:13.740 But it's a new movie,
00:43:16.000 and just use that
00:43:16.740 style.
00:43:17.480 And by the way, you
00:43:18.200 should read this book
00:43:19.200 called Save the Cat
00:43:20.500 that tells you how to
00:43:21.580 write good movie
00:43:22.660 structure.
00:43:23.320 The AI will write
00:43:25.800 the entire movie in
00:43:26.740 about a second, and
00:43:28.560 then you take that
00:43:29.160 whole script and you
00:43:30.960 say, use this style,
00:43:34.380 use the Tom Cruise
00:43:35.500 style of, you know,
00:43:36.700 what the Mission
00:43:37.740 Impossible movies look
00:43:38.800 like, and then you
00:43:40.520 walk outside and, you
00:43:42.000 know, you walk around
00:43:42.780 and maybe you're even
00:43:44.160 in your own office
00:43:46.100 doing stuff, or maybe
00:43:47.680 you don't even need
00:43:48.680 to.
00:43:48.960 Maybe it just creates
00:43:49.840 you, puts you in the
00:43:51.660 movie, and has you do
00:43:52.440 all the scenes.
00:43:54.140 So I think you might
00:43:55.220 be able to create an
00:43:55.860 entire movie in maybe
00:43:57.940 two minutes.
00:44:01.060 And that's already
00:44:03.520 here.
00:44:05.440 Already here.
00:44:06.540 So everybody who kept
00:44:07.760 saying, Scott, Scott,
00:44:09.040 Scott, this AI you keep
00:44:11.040 talking about, it's
00:44:12.280 always in the future.
00:44:13.200 It's the flying car.
00:44:14.840 Oh, the flying car is
00:44:16.040 going to be here
00:44:16.580 tomorrow.
00:44:17.540 Where's my flying car?
00:44:18.940 And I've been saying,
00:44:21.600 no, it's software.
00:44:23.280 When it gets here, it's
00:44:24.580 going to be here really
00:44:25.460 fast.
00:44:26.460 It's going to be not
00:44:27.280 here, not here, not
00:44:28.140 here.
00:44:28.460 Oh, shit, everything
00:44:29.180 changed.
00:44:30.480 And that's where we are.
00:44:31.800 We're at a point where the next year
00:44:33.840 cannot be predicted.
00:44:36.060 You cannot predict next
00:44:38.920 year.
00:44:39.940 And this is the first
00:44:40.960 time in history that
00:44:41.780 that's been this true.
00:44:43.820 I mean, you can't
00:44:44.360 predict pandemics and
00:44:45.620 stuff.
00:44:46.140 But nothing's been less
00:44:47.440 predictable than what's
00:44:49.300 going to happen in the
00:44:50.020 next year.
00:44:50.540 It could be anything.
00:44:52.340 I don't think it'll be
00:44:53.120 bad necessarily, but it
00:44:54.160 could be anything.
00:44:55.960 So that's where we're
00:44:56.840 going with that.
00:45:00.040 I recommend a YouTube
00:45:02.600 interview with Dr.
00:45:04.580 Jordan Peterson and Dr.
00:45:06.180 Richard Lindzen, which
00:45:08.820 you should just Google.
00:45:09.840 You can find it.
00:45:10.420 So search for Dr.
00:45:12.200 Jordan Peterson.
00:45:13.380 And then climate
00:45:14.560 change or Dr.
00:45:16.480 Richard Lindzen, L-I-N-D-Z-E-N.
00:45:21.020 Now, the first part of
00:45:23.320 the video is a whole
00:45:25.160 bunch about the
00:45:26.060 credentials of Dr.
00:45:28.140 Lindzen, which are
00:45:29.540 extreme.
00:45:30.820 And the reason that
00:45:32.000 Peterson spends so much
00:45:33.380 time on his credentials
00:45:34.440 is that he's going to say
00:45:36.420 something that will blow
00:45:37.300 your mind.
00:45:38.940 And unless you know how
00:45:40.260 serious a scientist this
00:45:42.480 guy is, it's just not
00:45:44.220 going to work.
00:45:45.120 Like the communication
00:45:46.420 can't even work until you
00:45:48.440 know how much of a, you
00:46:50.400 know, a stud he is
00:45:51.380 scientifically.
00:45:52.580 All right.
00:45:52.860 So he's right at the top.
00:45:54.420 You know, best schools,
00:45:55.940 best experience, right place.
00:45:58.160 And he's at an age where he
00:46:00.720 saw the entire climate
00:46:02.480 change narrative created.
00:46:06.220 And so he explained how the
00:46:09.320 narrative was created, which
00:46:10.840 I'm going to call tentatively
00:46:12.060 a hoax.
00:46:13.080 Hoax isn't exactly the right
00:46:14.360 word for this, but it gets
00:46:16.360 you close to it.
00:46:17.480 He explains how it
00:46:18.920 arose, and in a way that is
00:46:21.640 completely convincing.
00:46:23.480 Now, do you remember the
00:46:24.680 documentary effect?
00:46:26.800 You have to really beware
00:46:28.200 here.
00:46:28.860 The documentary effect, as I've
00:46:31.180 described on Twitter, is when
00:46:33.120 you watch one documentary,
00:46:35.040 you think it's totally true
00:46:36.420 because they will leave out
00:46:37.920 the other argument.
00:46:39.500 If you never see the other
00:46:41.020 argument, you're going to be
00:46:42.500 really, really convinced by
00:46:43.720 the one you see.
00:46:44.800 So when you watch this, be
00:46:46.200 careful because it's super
00:46:48.780 persuasive, super persuasive,
00:46:52.520 but you're not seeing the
00:46:53.980 other side.
00:46:54.640 So just keep that in mind,
00:46:55.820 right?
00:46:56.120 So keep a little bit of
00:46:57.260 skepticism alive, but man,
00:46:59.380 it's persuasive.
00:47:00.560 And one of the
00:47:01.980 things that Dr. Lindzen does
00:47:03.460 that's persuasive, since I
00:47:06.000 talk about persuasion a lot,
00:47:07.720 is he ends on the best down
00:47:10.340 notes I've ever heard.
00:47:12.780 Now, here's what I mean.
00:47:14.920 You would sound unconfident in
00:47:17.100 your own opinion if you end on
00:47:18.780 an up note.
00:47:19.880 So if I said, I think the moon
00:47:22.820 is made of cheese, I think
00:47:25.560 there's climate change, so that
00:47:27.460 sounds unconfident.
00:47:28.460 And I've taught you before
00:47:30.120 that if you end on a lower
00:47:32.400 note, you sound like you know
00:47:34.180 what you're talking about.
00:47:35.440 Well, we've got climate change.
00:47:38.200 Climate change is caused by
00:47:40.120 the CO2, right?
00:47:42.320 So if you end on the low note,
00:47:43.680 you sound like a serious person
00:47:45.040 who knows what you're talking
00:47:45.840 about.
00:47:47.120 Dr. Lindzen ends on the best low
00:47:50.300 note you've ever heard, because
00:47:52.280 he actually drags it out a little
00:47:53.780 bit.
00:47:54.020 So he'll say something like, I'm
00:47:57.120 just making this one up, but he'll
00:47:59.320 say something like, the climate
00:48:05.160 change is caused by a variety of
00:48:08.840 things, but I'm not so sure it's
00:48:10.620 entirely caused by the CO2.
00:48:13.340 And he just milks that low note, so he
00:48:19.240 sounds so smart, and he actually
00:48:22.000 is, so it's compatible with his
00:48:25.100 actual knowledge.
00:48:26.140 All right, here is what I learned
00:48:28.220 from that.
00:48:30.800 Number one, global warming is not the
00:48:35.220 first time that the energy business
00:48:38.220 has been attacked.
00:48:39.080 And there was some speculation, but
00:48:42.620 why is it that, you know, first
00:48:44.700 there's global cooling, so don't
00:48:46.560 use this.
00:48:47.400 Then there's
00:48:48.320 pollution, so get rid of your
00:48:50.040 energy.
00:48:50.480 Now there's CO2, so get rid of your
00:48:52.240 energy.
00:48:52.860 It seemed like there were lots of
00:48:54.140 reasons, historically, and this is
00:48:57.160 the historical perspective that Dr.
00:48:59.240 Linzen gives.
00:49:00.980 And you wonder why.
00:49:03.900 I think this is sort of my take on
00:49:06.480 this.
00:49:06.660 But one of the things I hadn't
00:49:08.880 considered is that there's some
00:49:10.740 group of people who are more about
00:49:12.580 income inequality, meaning that
00:49:15.700 everybody in the energy business and
00:49:18.740 anybody who uses it is probably
00:49:20.820 wealthy.
00:49:22.740 People who don't use a lot of energy,
00:49:25.200 probably less so.
00:49:26.720 And the people who think everybody
00:49:28.400 should be a little more equal, a
00:49:29.960 little bit more socialist, are maybe
00:49:32.580 not so happy that people are getting
00:49:34.280 rich in the energy business.
00:49:35.680 They just like a little less of it, a
00:49:37.900 little more green stuff, so we're all
00:49:39.480 about the same, you know, everybody's
00:49:40.920 eating granola.
00:49:42.260 Now, of course, this is a mind-reading,
00:49:45.460 speculative thing, right?
00:49:46.500 We can't know what people are
00:49:47.560 thinking.
00:49:48.800 But, I don't know, what do you
00:49:50.020 think?
00:49:50.640 Do you think there's any group of
00:49:51.940 people who are just sort of jealous of
00:49:55.680 the prosperity of others?
00:49:58.060 Do you think that's the thing?
00:49:59.140 Because jealousy is such a universal
00:50:03.120 thing, you'd expect there's some of it.
00:50:05.540 But I don't know, the part
00:50:07.360 that I'm not completely on board with
00:50:09.500 is, does that jealousy
00:50:11.780 translate into wanting to take out
00:50:14.860 the big, rich energy business?
00:50:17.720 Probably.
00:50:19.040 Probably a little bit.
00:50:20.640 Then you've also got the media alarmists
00:50:22.720 who can make money by going after the
00:50:25.020 energy business.
00:50:25.800 You've got the green opportunists
00:50:27.460 and entrepreneurs who can make money
00:50:29.140 because the alternatives are, you
00:50:31.740 know, being funded, so people are
00:50:33.240 happy to, you know, be against the
00:50:35.080 energy business.
00:50:36.320 You've got brainwashed kids who are
00:50:38.580 against it, not for reasons, but
00:50:40.560 because they're kids and because
00:50:41.800 they've been brainwashed, which is how
00:50:44.300 all kids learn everything, basically.
00:50:46.560 Of course, there's China who would like
00:50:48.280 the energy business to go out of
00:50:49.920 business because that would make us
00:50:51.300 less competitive.
00:50:52.640 There are posers who just want to
00:50:54.380 look good and say, oh, big, big energy
00:50:57.300 is bad.
00:50:58.280 So there are people who are just
00:50:59.160 basically, it's about their identity.
00:51:01.220 And then you've got these weird
00:51:02.440 people, the anti-humanists, who
00:51:04.620 believe that the planet is better
00:51:05.940 without people.
00:51:07.780 You know, for a while I thought they
00:51:08.840 didn't exist, but there actually are
00:51:10.660 some of them.
00:51:11.480 And they're kind of noisy.
00:51:13.640 There aren't that many of them, but
00:51:15.840 they can be prominent.
00:51:17.320 So there are a whole bunch of reasons
00:51:20.920 historically that the energy business
00:51:23.200 has been attacked.
00:51:24.600 But Dr. Lindzen points out that the
00:51:26.620 CO2 attack is the smartest one.
00:51:30.200 Do you know why?
00:51:32.220 Because there's always going to be
00:51:33.400 CO2 in the air.
00:51:34.840 So you always have something to
00:51:36.420 complain about.
00:51:37.540 Because CO2 will just always be there.
00:51:39.920 It doesn't matter what you do.
00:51:41.420 So you just always have something to
00:51:43.060 complain about.
00:51:43.620 All right, so I would say that this
00:51:47.320 part is more speculative than what
00:51:50.020 I'm going to talk about next.
00:51:51.780 The next part blew my mind.
00:51:54.280 I'd never heard this.
00:51:57.040 And, of course, this is more opinion.
00:52:00.780 This is opinion, not fact.
00:52:03.320 But this is Dr. Lindzen's description
00:52:07.200 of how we got here.
00:52:08.980 How we got here is fascinating.
00:52:11.860 If it's true, right?
00:52:14.980 So here's his take.
00:52:16.740 It started out this way.
00:52:18.100 A scientist, now this is, you know,
00:52:19.840 generically, this is how it works.
00:52:21.860 Generically speaking, a scientist
00:52:23.860 will do a paper and say, hey, it
00:52:25.860 looks like CO2 might make us one
00:52:29.220 degree warmer by some date.
00:52:31.940 Now what's missing from that is
00:52:34.240 therefore the Earth will be
00:52:35.540 destroyed.
00:52:37.440 Scientists didn't say that.
00:52:39.440 Scientists just said, hey, I think
00:52:40.680 temperature might go up a little bit.
00:52:43.340 But that turns into a panic by the
00:52:46.320 time it gets to the media and it
00:52:47.660 gets to the advocates.
00:52:49.220 So the advocates turn this non-scary
00:52:52.460 statement of, well, maybe temperature
00:52:54.840 up a degree.
00:52:56.520 They turn this into the world will end,
00:52:59.180 which was never what the scientists
00:53:00.500 said.
00:53:01.640 And then the politicians say, oh, my God,
00:53:04.880 my God, my God, we have to do
00:53:06.220 something about this because the media's
00:53:07.700 got all our voters whipped up.
00:53:09.660 So they say, we will put a massive
00:53:11.440 amount of money into this to fix it.
00:53:13.680 And that money goes back to the
00:53:15.240 scientist who wants to say, oh, I
00:53:21.020 didn't say it's an emergency.
00:53:23.140 I just said the temperature might go up
00:53:24.560 a little.
00:53:24.780 But if they say it might be an
00:53:29.840 emergency, well, you know, you don't
00:53:31.760 know if it's an emergency.
00:53:33.160 Maybe we should study it.
00:53:35.620 So you very quickly created a situation
00:53:37.800 where the only way you could get
00:53:39.200 funding was to say there was a panic
00:53:42.400 and we better do something about it.
00:53:43.880 And once that dynamic started, Dr.
00:53:48.300 Lindzen talked about two times that he
00:53:51.040 got a paper published in a scientific
00:53:53.940 publication.
00:53:55.840 And both times the editor was fired
00:53:58.320 immediately because his papers didn't
00:54:01.060 agree with the narrative.
00:54:03.580 Apparently you would be fired
00:54:05.040 immediately if you tried to stop the
00:54:08.460 money train.
00:54:09.860 Because think what would happen if you
00:54:11.200 published a paper that said, oh, oh, you
00:54:13.960 can stop sending all this money.
00:54:16.280 You can just save it.
00:54:17.800 Don't send any money to scientists.
00:54:20.040 My paper has shown that climate
00:54:22.100 change is no problem.
00:54:24.560 You couldn't get that published because
00:54:26.980 the money train was too big.
00:54:29.140 Right?
00:54:29.560 So where we are is we're just in this
00:54:31.960 weird situation where a lot of people
00:54:33.980 have probably convinced themselves it's
00:54:35.860 true, but it was never based on
00:54:38.540 science.
00:54:39.040 Because none of this came from the
00:54:41.860 scientists.
00:54:42.640 The scientists were just responding to
00:54:44.380 the money and then, of course, they
00:54:46.980 found confirmation.
00:54:49.220 Of course they did.
00:54:50.620 Because that's where the money was.
00:54:52.480 The money will get you anything you
00:54:53.800 want.
00:54:54.880 If Congress funded science to find
00:54:57.980 ghosts, like actual ghosts, how many
00:55:03.320 ghosts would the scientists find?
00:55:05.020 A lot.
00:55:08.840 Yeah.
00:55:09.640 The answer is a lot.
00:55:11.460 They would find all the ghosts they
00:55:13.220 could find.
00:55:14.100 They would find indirect evidence of
00:55:16.440 ghosts.
00:55:17.140 They would find a video of a ghost.
00:55:18.960 They'd find ghosts talking to you.
00:55:20.900 They'd have, oh, there would be ghost,
00:55:23.020 ghost, ghost papers.
00:55:25.240 All right.
00:55:25.860 So then there were a few other things
00:55:27.300 that Dr. Lindzen confirmed that I had
00:55:30.900 always suspected.
00:55:31.620 Do you know what the value of a peer
00:55:34.720 reviewed study is?
00:55:37.740 Do you know the peer reviewer is just
00:55:39.580 checking to see if you did the obvious
00:55:41.580 math right?
00:55:43.020 That's all they do.
00:55:44.700 They're just looking for, is the paper
00:55:47.100 interesting?
00:55:48.400 So it has to be like different enough
00:55:50.140 that it's worthy of attention.
00:55:51.900 Is it interesting?
00:55:53.680 And is there any obvious math problem?
00:55:56.960 That's it.
00:55:58.340 And then the public thinks that it's
00:56:00.760 been reviewed by another scientist,
00:56:03.280 and since two scientists have looked
00:56:04.920 at it and agreed, it must be pretty
00:56:07.200 good, must be solid.
00:56:09.000 Nothing like that's happening.
00:56:11.160 Peer review doesn't have any value,
00:56:14.260 except, you know, checking the basic
00:56:15.840 math, I guess.
00:56:17.240 None.
00:56:18.080 In fact, peer review prevents papers that
00:56:22.480 are against the narrative.
00:56:23.360 So peer review actually hides the truth.
00:56:29.680 Because one part is, oh, here's a paper
00:56:31.900 from Dr. Lindzen.
00:56:33.140 Nope.
00:56:33.860 Sorry.
00:56:36.460 Peer review reduces the truth.
00:56:42.140 It does not reveal it.
00:56:43.360 It reduces it.
00:56:44.560 By its nature.
00:56:45.800 By design.
00:56:47.440 Not design, but by its nature.
00:56:49.100 And then the funniest thing was
00:56:51.600 listening to Dr. Lindzen describe how
00:56:54.320 the prediction models for climate
00:56:56.300 science have been created.
00:56:58.220 Now, I can't summarize it in a way
00:57:02.260 that would do credit to it.
00:57:04.320 But suffice to say that if you imagine
00:57:07.340 that Dilbert is sort of a universal thing
00:57:11.500 in all business, science too.
00:57:14.140 So, apparently the models are as ridiculous
00:57:17.200 as you knew.
00:57:19.220 They're ridiculous.
00:57:21.120 And it doesn't take, you know,
00:57:22.920 more than a few minutes of him talking
00:57:24.620 about them before you're actually laughing.
00:57:28.400 So I'll just give you one example.
00:57:30.480 This is just one example.
00:57:31.520 When a climate model doesn't work,
00:57:36.440 like it wildly goes too high or too low
00:57:39.120 and you'd know it's wrong,
00:57:41.060 they'll put in a, what do you call it,
00:57:44.600 like a moderating variable.
00:57:48.360 They'll just put in a variable that stops it
00:57:50.540 from being wildly wrong.
00:57:52.600 But the variable is just something they make up.
00:57:55.680 A damper, right?
00:57:56.800 They put in a damper.
00:57:58.560 A dampening variable.
00:57:59.880 A dampening variable is just a variable
00:58:02.900 they make up in their head
00:58:04.080 to make it come out the way
00:58:05.900 they wanted it to come out.
00:58:07.580 It's literally not even close to science.
00:58:10.920 Not even close.
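The damping trick described above can be sketched as a toy calculation. Everything here is invented for illustration (the function, the "sensitivity", the numbers); it is not any real climate model, just the shape of the complaint: a free parameter tuned after the fact to make the output match expectations.

```python
# Toy illustration only: not any real climate model. All numbers invented.
def raw_projection(years, sensitivity=3.0):
    # Pretend warming grows linearly with a guessed "sensitivity".
    return [sensitivity * (y / 100) for y in years]

years = [10, 20, 30, 40]
observed = [0.1, 0.2, 0.3, 0.4]   # pretend observations

raw = raw_projection(years)       # wildly too high: ~0.3 up to ~1.2

# The "damper": a multiplier chosen after the fact, purely so the
# output matches what was expected. No physics involved.
damper = observed[-1] / raw[-1]
damped = [x * damper for x in raw]

print(damped)  # now the model "agrees" with the observations
```

The point being illustrated: once a free parameter is tuned to the answer, agreement with past data says nothing about predictive skill.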
00:58:12.780 And so we're spending, you know,
00:58:14.740 gazillion dollars based on these predictions
00:58:19.200 that no reasonable person thinks are credible.
00:58:23.300 Nobody.
00:58:24.220 Nobody involved thinks are credible.
00:58:26.680 Unless they're making money
00:58:27.920 by selling them, of course.
00:58:29.880 So that's just a, you know, a taste of it.
00:58:34.120 So, oh, and here is my favorite part.
00:58:38.540 Everything that you thought was true
00:58:40.460 turns out to be true, you know,
00:58:42.940 or at least confirmed by somebody
00:58:44.400 who was in the middle of it.
00:58:46.260 And one was, do you remember the other day I said,
00:58:48.880 do you really believe that scientists
00:58:50.360 can measure the temperature of the Earth?
00:58:53.260 Do you remember me saying that?
00:58:54.700 And just laughing?
00:58:57.400 Did anybody ever believe that was the case?
00:59:00.360 If you ever believe that scientists
00:59:02.180 could measure the temperature of the Earth
00:59:04.880 and then compare it to what it was before,
00:59:09.480 there's something wrong with you.
00:59:10.980 Like, nobody who's lived five minutes in the real world
00:59:14.740 thinks that's possible.
00:59:16.180 And then I'm listening to Dr. Lindzen talk about it.
00:59:18.920 It's like, no, it's not possible.
00:59:20.540 And no, they haven't done it.
00:59:22.400 They've produced a number.
00:59:24.240 They have produced numbers.
00:59:25.640 But no, you can't measure the temperature of the Earth.
00:59:28.140 That's not really a thing we can do.
00:59:30.080 And there's an obvious reason.
00:59:33.540 You can measure a whole bunch of places,
00:59:37.100 but every place you didn't measure
00:59:39.320 is a place that could be storing some extra energy
00:59:43.080 that you didn't know about.
00:59:44.980 So you're not really measuring anything.
00:59:47.660 You're only measuring those locations,
00:59:50.400 and even those locations,
00:59:52.020 and he had a number of other criticisms
00:59:53.920 that would convince you beyond a doubt
00:59:56.680 that it's all garbage.
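The sampling objection can be shown with a toy example. The "planet", the temperatures, and the two station networks below are all made up for illustration; the point is only that a sparse average depends on which spots you happened to measure.

```python
# Hypothetical numbers: a tiny "planet" of eight cells, two of them warm.
true_field = [10.0, 10.2, 9.8, 15.0, 9.9, 10.1, 14.8, 10.0]

true_mean = sum(true_field) / len(true_field)

# Station network A misses both warm cells; network B catches one.
network_a = [true_field[i] for i in (0, 1, 2, 4, 5)]
network_b = [true_field[i] for i in (0, 3, 5, 7)]

mean_a = sum(network_a) / len(network_a)
mean_b = sum(network_b) / len(network_b)

# Three different "global temperatures" from the same planet.
print(true_mean, mean_a, mean_b)
```

Any cell left out of a network could be holding heat the average never sees, which is the criticism being summarized.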
00:59:58.720 Now, given all that,
01:00:01.680 if you imagine that his take on it is true,
01:00:05.340 could you call climate change a hoax?
01:00:08.660 Does that word fit?
01:00:12.220 Now, keep in mind that he says
01:00:14.160 that 97% of scientists, including himself,
01:00:17.960 would agree with the statement
01:00:19.200 that CO2 is a greenhouse gas
01:00:22.220 and that it would increase warming to some amount.
01:00:25.540 So he's not arguing with the basic idea
01:00:28.400 that CO2 could cause warming.
01:00:30.660 He's just saying
01:00:31.360 it's not an important amount of warming
01:00:33.820 compared to all the other variables.
01:00:39.160 Yeah.
01:00:39.840 All right.
01:00:40.940 So we have a possibility,
01:00:44.860 I suppose it's a small one,
01:00:47.780 that Trump will be considered right about this as well
01:00:51.120 when he called it a Chinese hoax.
01:00:54.480 Now, that's like hyperbole.
01:00:56.980 China's one part of the story,
01:00:58.820 and hoax isn't exactly right
01:01:00.420 because I'm not sure that these people were lying.
01:01:04.160 They may have just had motivated reasoning.
01:01:06.140 Oh, if you're going to give me money,
01:01:08.440 I believe this now.
01:01:09.740 People will come to believe
01:01:11.000 whatever is consistent with their thoughts
01:01:12.760 that they're good people.
01:01:14.280 If they think they're good people,
01:01:16.140 they're not going to imagine a world
01:01:18.320 in which they're lying to get money.
01:01:20.500 They'll imagine that they actually believe
01:01:22.460 that their studies are useful
01:01:23.820 and telling you something.
01:01:26.240 So I'm not sure hoax is the right word,
01:01:27.920 but definitely I'm of the current working opinion
01:01:31.760 that climate change is ridiculously absurd
01:01:37.180 in terms of its risk
01:01:38.740 and that there's no real long-term risk.
01:01:42.660 We'll probably be safer in the future,
01:01:43.980 not in more danger.
01:01:46.300 So that's my current view,
01:01:48.020 and it's not too far from my view before
01:01:50.020 except more confirmation of things
01:01:51.940 that I thought were obvious.
01:01:53.600 The things I thought were obvious
01:01:54.980 from my own experience,
01:01:56.260 there's no way you could make
01:01:58.040 a climate model that predicts.
01:02:00.920 I did not need Dr. Lindzen to tell me that
01:02:03.580 because I've worked on a lot of prediction models,
01:02:07.280 and you know that it's the assumptions
01:02:09.060 that drive them.
01:02:10.220 It's not the data.
01:02:11.800 The data doesn't drive them.
01:02:13.160 It's your assumptions,
01:02:14.300 and your assumptions are guesses.
01:02:16.500 So, I mean, I knew that
01:02:17.800 because I've done data models.
01:02:19.520 But if you've never done them,
01:02:20.960 you think, oh, scientists are doing them.
01:02:22.580 They might have some science in them.
01:02:24.660 There's no science in them.
01:02:26.820 There's no science in those models.
01:02:29.240 So I knew that.
01:02:30.880 I also knew just from living in the real world
01:02:33.880 that you couldn't measure the temperature of the Earth
01:02:36.500 and know that you had something useful across time
01:02:40.800 and all that stuff.
01:02:41.320 I knew that.
01:02:42.760 Just common sense.
01:02:44.000 If you lived in the world, you knew it.
01:02:46.440 Now, those two things alone don't
01:02:50.500 mean that we had no risk.
01:02:52.500 But when I hear Dr. Lindzen talk about it in context,
01:02:57.120 and you realize the whole history
01:02:58.800 of how the energy business has always been under attack,
01:03:01.800 and when one attack doesn't work,
01:03:03.700 they just change it to a new attack through history.
01:03:06.880 Oh, here's some hallucination.
01:03:13.280 So somebody says to me, Craig Baker says,
01:03:16.260 you, in capitals, stated that the models are good
01:03:19.680 and improving in relationship to Tony Heller.
01:03:22.080 That is something I never said.
01:03:25.900 Craig, you are hallucinating.
01:03:28.120 If you believe I ever said the models are good, really?
01:03:32.100 I don't know what drugs you're doing,
01:03:35.040 but that's literally the closest to the opposite
01:03:39.520 of what I've ever said.
01:03:42.340 Oh, and Craig is yelling in caps.
01:03:44.760 Yeah, Scott, you did say the models are good.
01:03:48.960 You see what I deal with every day?
01:03:51.540 There's like an actual living person
01:03:53.320 who is smart enough to get on here and make a comment
01:03:56.340 who believes that what I've said consistently forever
01:04:00.880 was the opposite.
01:04:03.880 And Craig believes he saw it.
01:04:06.640 He actually believes he saw it and lived through it.
01:04:11.060 Well, let's see.
01:04:14.960 What else we got going on here today?
01:04:18.600 Oh, did anybody see the war room
01:04:21.720 with Steve Bannon talking about me?
01:04:24.260 Did anybody see that?
01:04:25.000 So on Monday, I guess, he had a guest on
01:04:28.980 and they were talking about my long COVID opinions, etc.
01:04:35.060 And so he brings on the guy
01:04:37.020 who has a different opinion from me.
01:04:39.260 I'll call him my critic just for shorthand.
01:04:42.560 And the critics complained because I said
01:04:45.740 that their heuristics beat my analytics.
01:04:52.820 And he wanted to correct that.
01:04:55.000 Oh, no, Scott.
01:04:56.960 We used analytics.
01:04:59.040 We used analytics.
01:05:00.400 You're not the only one who used analytics.
01:05:02.260 You're making it sound like you analyzed
01:05:04.260 and we guessed.
01:05:07.340 But here's my clarification.
01:05:09.660 When I said your heuristics beat my analytics,
01:05:12.980 the context was that I guessed.
01:05:15.660 My analytics were guessing,
01:05:20.760 which I've said as clearly as possible.
01:05:23.300 So analytics was not a real word in this context.
01:05:26.020 I was saying my analytics because I was guessing.
01:05:28.640 So I analyzed it, but there was nothing to analyze,
01:05:31.000 so I guessed.
01:05:32.640 So their biggest complaint is about my use of a word,
01:05:35.460 taken to mean the opposite of how I used it.
01:05:37.180 So that was the biggest complaint.
01:05:38.360 Second biggest complaint was,
01:05:40.800 you know, he says something about long COVID,
01:05:43.220 blah, blah, blah, blah.
01:05:44.500 And then after noting that my big question
01:05:50.620 was how could anybody know the risk of long COVID?
01:05:54.820 He never got back to that.
01:05:59.500 I only have one question for all my critics.
01:06:03.560 How did you calculate the risk of long COVID?
01:06:07.580 And watch how they can't answer the question.
01:06:09.880 They actually have to change the subject.
01:06:12.180 All of them.
01:06:13.320 Every one of them will change the subject.
01:06:15.500 And he did the same.
01:06:17.060 So it's, once you see the seven tells
01:06:20.960 for cognitive dissonance,
01:06:22.280 once you know the seven,
01:06:25.480 it's really obvious every time you see them.
01:06:28.720 So I'll tell you the seven tells again.
01:06:32.660 Changes the topic, ad hominems,
01:06:35.820 mind reading, word salad,
01:06:37.780 using an analogy in place of reason.
01:06:40.040 You can use analogies,
01:06:41.040 but don't use them in place of reasons.
01:06:44.200 Insist it's complicated and can't be summarized.
01:06:46.700 And then there's the "so" tell, where you say,
01:06:48.240 so you're saying that water is dry,
01:06:52.280 where they add something that you didn't say.
01:06:56.360 So I think you see that in every case.
01:06:59.600 So that if you want to see some,
01:07:01.780 another example of that,
01:07:02.960 just watch the war room with Steve Bannon.
01:07:11.560 Now, some of you say,
01:07:13.520 and I guess I have to reiterate that,
01:07:16.240 that you didn't calculate long COVID
01:07:18.620 because nobody knew what the risk was.
01:07:22.280 I didn't know what the risk was.
01:07:25.680 So that's why I guessed.
01:07:27.980 That's why you guess.
01:07:29.440 Because you didn't know what the risk was.
01:07:34.260 All right.
01:07:34.800 I have a cold now.
01:07:45.080 As far as you know.
01:07:46.240 All right.
01:07:46.460 Looking at your comments.
01:07:51.340 You gave long COVID too much weight and still do.
01:07:56.240 Does that,
01:07:56.860 did that comment make sense?
01:07:58.340 So in order for me to have given long COVID too much weight,
01:08:03.340 that suggests that you know the risk of long COVID.
01:08:06.760 Please tell me how.
01:08:08.920 Please explain how you knew that.
01:08:10.520 Because I don't know it.
01:08:12.800 Now, here's another factor that I've never said out loud.
01:08:16.680 Directly.
01:08:17.740 If I were 25 years old,
01:08:20.220 would I be worried about long COVID?
01:08:22.360 Let's say long COVID would take you out for a month.
01:08:26.340 Nope.
01:08:27.320 Nope.
01:08:27.920 Because if I lost a month of my life to long COVID at age 25,
01:08:32.860 it's such a small percentage of the rest of my life
01:08:35.220 that that would be a reasonable trade-off.
01:08:39.500 Now, at my current age,
01:08:41.260 how many good years do you think I have left
01:08:43.420 where I'm healthy and I can do what I want?
01:08:45.860 Maybe two.
01:08:47.740 Maybe two.
01:08:48.380 Because I'm at the age where people younger than me
01:08:51.640 are dying from old age.
01:08:53.980 People younger than me are like suddenly dying.
01:08:57.100 And even if I don't die,
01:08:58.920 I'm in a very thin window
01:09:01.980 where I could still do physically active things.
01:09:06.680 You know, I could still play tennis if I want to, whatever.
01:09:09.420 So if I got my ass kicked for, let's say, a few months,
01:09:14.600 that's actually a pretty big percentage
01:09:16.900 of the two years that I feel confident I might have.
01:09:21.920 So long COVID to somebody who only has a few years left
01:09:26.060 is not the same calculation as if you're 25.
01:09:30.460 And a lot of people made the mistake of assuming
01:09:33.300 that everybody's the same age, so everything,
01:09:35.980 all our calculations are the same.
01:09:37.900 My calculation wasn't anything like your calculation.
01:09:40.260 I literally thought, maybe I have two years,
01:09:44.240 so long COVID is a big percentage of that.
01:09:48.200 So whoever said,
01:09:49.680 I'm giving too much attention to long COVID,
01:09:53.700 maybe it's too much for a 25-year-old,
01:09:56.860 maybe too much for a 40-year-old.
01:09:59.660 But I don't think it was too much for me.
01:10:01.420 And the only decision I've ever talked about was my own.
01:10:06.720 I've never recommended anybody do anything.
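The age-based trade-off above is simple arithmetic, and can be sketched as below. The function name and all the figures (50 good years for a 25-year-old, roughly two for the speaker, one versus three months sick) are illustrative assumptions pulled from the passage, not medical or actuarial data.

```python
# Sketch of the age-dependent trade-off described above.
# Illustrative numbers only, not advice or data.
def fraction_of_good_time_lost(months_sick, good_years_left):
    """Share of expected healthy time a setback would eat."""
    return months_sick / (good_years_left * 12)

# A 25-year-old with, say, 50 good years ahead vs. someone expecting ~2.
young = fraction_of_good_time_lost(1, 50)   # 1 month out of 600
older = fraction_of_good_time_lost(3, 2)    # 3 months out of 24

print(f"{young:.1%} vs {older:.1%}")        # roughly 0.2% vs 12.5%
```

Same illness, very different stakes, which is why one risk calculation doesn't transfer across ages.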
01:10:11.940 I'm 65.
01:10:15.900 All right.
01:10:17.800 Printer get fixed? No.
01:10:18.900 Please arrange a conversation with Dark Horse.
01:10:26.120 What do I have to say?
01:10:28.140 I mean, my whole argument is,
01:10:30.080 you were guessing on long COVID,
01:10:31.860 therefore you were guessing.
01:10:34.500 Do you know how the conversation would go?
01:10:37.200 Let's not talk about Dark Horse.
01:10:39.600 Let's say I did an interview with somebody
01:10:42.280 who thinks I got everything wrong.
01:10:44.180 Here's how the conversation would go.
01:10:45.680 Scott, how did you calculate long COVID?
01:10:50.420 Well, word salad, word salad, word salad.
01:10:53.520 Okay, but how did you calculate it?
01:10:56.480 Oh, word salad, didn't matter, didn't matter.
01:10:59.080 Assumed it away.
01:11:00.860 Well, okay.
01:11:02.500 If you assumed it away,
01:11:03.640 then you calculated it as a low number.
01:11:06.380 How'd you do that?
01:11:07.760 Well, word salad, word salad, word salad.
01:11:10.820 All right.
01:11:11.260 It would be useless.
01:11:14.320 Useless.
01:11:14.760 Do you think there's anybody on the other side
01:11:19.660 who would listen to that and say,
01:11:20.980 oh, you know what?
01:11:24.000 Wow, yeah, in your case,
01:11:26.000 in your specific case,
01:11:28.100 long COVID would be a big variable.
01:11:30.000 And, you know, now that you pointed out,
01:11:32.560 nobody can calculate what that risk is.
01:11:35.220 Yeah, you're right, Scott.
01:11:36.240 I had not thought of it that way.
01:11:37.780 Do you think it would go that way?
01:11:40.940 Does anybody imagine it would go that way?
01:11:42.940 No, no.
01:11:45.940 So you're living in, like, a dreamland
01:11:49.200 where reasonable people can talk about reasonable things
01:11:53.520 and then somebody will change their opinion.
01:11:56.540 I've never seen that.
01:11:58.720 I've never seen that at all.
01:12:00.160 Now, I was also accused of mind reading
01:12:10.560 for thinking that other people had not, you know,
01:12:14.840 considered everything in their decision.
01:12:18.800 To which I say, I don't have to read your mind.
01:12:21.140 I just have to look at long COVID
01:12:23.940 and know that you didn't talk about it
01:12:27.740 because you just told me you didn't consider it.
01:12:31.200 There's no mind reading.
01:12:32.580 It's right there.
01:12:33.900 You told me you didn't consider it.
01:12:35.900 That's what I'm saying.
01:12:37.280 We're on the same page.
01:12:38.340 Oh, and here was the assumption that was built into this.
01:12:46.860 The assumption is that the sicker you got with COVID,
01:12:52.940 logically, the worse your long COVID risk would be.
01:12:57.220 Now, I don't have confirmation of that.
01:12:59.980 But that's my point.
01:13:02.260 My point is I don't know.
01:13:03.660 But it does make sense to imagine
01:13:05.620 that the sicker you are, you know,
01:13:07.580 the worse your symptoms could be.
01:13:09.820 So that was built into the assumption of things I don't know.
01:13:15.220 Same as any illness, correct?
01:13:17.940 All right, that's all I got for now.
01:13:22.760 Lance, you didn't understand my point,
01:13:25.240 so let me explain it to you.
01:13:26.480 So Lance says, with a so,
01:13:28.620 he starts with a so.
01:13:29.980 That was literally one of my seven tells for cognitive dissonance,
01:13:35.040 but okay.
01:13:37.040 He says, so you don't think Brett and Heather
01:13:39.260 would understand your point?
01:13:41.520 Is that what I said?
01:13:43.340 Did I say that Brett and Heather wouldn't understand my point?
01:13:47.260 Nope.
01:13:48.160 Nope.
01:13:49.520 Nope.
01:13:49.820 It's easy to understand.
01:13:52.280 I'm saying that they would be triggered into cognitive dissonance
01:13:56.360 because they understood the point, right?
01:13:59.560 The cognitive dissonance is because they would understand the point.
01:14:03.340 If they didn't understand the point,
01:14:05.140 then they wouldn't be triggered.
01:14:06.820 It's understanding the point that triggers the cognitive dissonance.
01:14:11.720 So no,
01:14:12.560 I don't believe there's any realistic way I could talk to my critics
01:14:17.100 because they would go into crazy town right away,
01:14:20.860 and it would just be a shit show.
01:14:22.940 See you, Dr. Funk Juice.
01:14:27.920 Everybody say,
01:14:28.820 say see you later to DJ Dr. Funk Juice,
01:14:32.680 one of my favorite,
01:14:33.580 one of my favorite,
01:14:35.200 uh,
01:14:36.060 livestream followers and Twitter followers.
01:14:39.440 Always appreciate you.
01:14:43.380 The banker's beast.
01:14:44.900 What's that?
01:14:45.300 All right,
01:14:47.120 that's enough for now.
01:14:48.020 I'm going to go talk to the,
01:14:49.320 uh,
01:14:50.580 locals people after I make them totally private.
01:14:54.240 And,
01:14:54.840 uh,
01:14:55.300 I'll talk to you YouTubers tomorrow.
01:14:57.220 Bye for now.
01:14:58.980 Best livestream.