Episode 1853 Scott Adams: All The News Is Funny, Fake And Interesting Today. Come Enjoy It With Me
Episode Stats
Length
1 hour and 28 minutes
Words per Minute
141.3
Summary
In this episode: California votes overwhelmingly to save the Diablo Canyon nuclear power plant, a fake evacuation order goes out in Los Angeles, and the price of gas goes down. Plus, Scott Adams explains why he thinks climate change is going to kill us.
Transcript
00:00:00.000
Oh, what a good day, and congratulations for making it to one of the best moments of your
00:00:11.520
life. Was it your wedding? Was it the birth of your children? No. It's coffee with Scott
00:00:17.000
Adams. The best thing that's ever happened to you? Probably. Probably. And if you want
00:00:22.740
to take it up a notch, all you need is a cup or mug, a glass, a tank, chalice or stein, a canteen,
00:00:25.800
jug or flask, a vessel of any kind. Fill it with your favorite liquid. I like coffee. And join me now
00:00:31.780
for the unparalleled pleasure of the dopamine hit of the day. It's
00:00:42.480
called the Simultaneous Sip, and it happens now. Go.
00:00:45.620
Now, I don't know, that felt incomplete. It felt, there's something missing. It feels
00:01:01.640
sort of semi-fascist. Semi-fascist. Let's see if I can go full fascist.
00:01:09.100
Yup. Full fascist. Two sips. That's all it took. I didn't realize I was that close. Wow. Shocker.
00:01:26.000
That close. Well, LA woke up to a fake evacuation order today. I love this story. I'm just trying
00:01:38.480
to imagine what it would be like to wake up to this. So apparently, there was an emergency
00:01:44.440
alert system that interrupted regular TV programming in LA on Wednesday, telling the entire LA County
00:01:52.660
and the eastern North Pacific Ocean area to evacuate due to a fire. But they say it was
00:01:59.700
an error. They accidentally almost evacuated Los Angeles.
00:02:09.660
Oh, come on. That's funny. Oops. Can you imagine being the programmer who made that error? And imagine
00:02:25.540
the performance review. Well, Bob, let's see. Look at your performance for the year. You got
00:02:35.000
some good things done. You finished Project Zebra right on time. Seems you completed all
00:02:42.060
your mandatory training. Good. Good. And you accidentally evacuated Los Angeles. Ooh. I'm going to have to ding you for that.
00:02:55.500
All right. I don't know why that's so funny. For some reason, that's just funny to me.
00:03:06.960
Ah. Well, the California legislature just voted overwhelmingly. Overwhelmingly, I say, which is
00:03:15.960
actually important, the overwhelmingly part, to save Diablo Canyon nuclear power plant. Now, if you don't
00:03:22.960
know California politics, the Diablo Canyon nuclear power plant has been a huge political issue. The person I
00:03:32.140
believe is most associated with advocacy to keep it open was Michael Shellenberger. So I believe this is one of the
00:03:40.000
greatest advocacy successes maybe you've ever seen. I mean, it's hard to think of another one that would be this
00:03:51.420
substantial. Basically, a small group of people just went balls to the wall for years to get this victory, and they got it.
00:04:01.240
So congratulations to everyone involved. But here's the part that's amazing, the overwhelming part.
00:04:13.220
Why is it overwhelmingly popular now? Is it just because the danger makes everybody think a little bit more
00:04:22.600
clearly? Because the danger was always here. It's just the fact that we're actually going to have
00:04:29.000
to turn off the lights and take turns having electricity. I guess that gives you a little religion,
00:04:35.840
doesn't it? So California's wised up. Oil prices have been dropping for a month. Why is that happening?
00:04:50.900
How is it possible that oil prices would be dropping? Is somebody pumping more, or is the demand going down?
00:04:57.560
Is that, is that a recession kicking in? End of summer? Lower demand? Yeah, maybe just the price was so high the demand
00:05:07.400
went down, huh? Something like that. People probably changed, probably a lot of people just started commuting, or I'm
00:05:14.760
sorry, telecommuting. Let me ask a quick poll. How many of you, because of gas prices
00:05:25.580
alone, or let's say primarily because of gas prices, are working at home more
00:05:32.700
days? Might be one more day a week or something? Anybody work? Yeah, I'm seeing yeses. Some yeses
00:05:40.660
coming in. No change. Most people would be no change, I would think. Yeah. Okay, so a little bit of yes.
00:05:51.500
So that might be part of it. People adjusting their habits. All right, here's my question. We have all seen
00:06:00.600
alarmist climate graphs showing that the temperature is going to be rising, and that the sea level would
00:06:08.280
rise, and people would die. Am I right? We've got very alarming 80-year predictions about climate change.
00:06:15.300
But where are my graphs showing how many people will be killed by climate overreaction?
00:06:23.280
Don't you think that needs to be a graph? And I think we're at that hockey stick juncture right now.
00:06:31.240
Here's my guess. Did I ever tell you I'm good at guessing things that I don't know anything about?
00:06:36.340
It's like an actual skill that you learn if you do a lot of numbers stuff for a living. I did this for
00:06:43.440
years. Did a lot of financial analysis and estimates and business cases and things where you had to guess
00:06:50.060
what something would cost in the future. And you end up getting good at it. Even though there's no
00:06:55.940
reason you should be good at it, you're guessing things you have no experience in. But here's my current
00:07:00.960
guess for how many people will be killed by climate alarmism. So not the climate, but decisions we made
00:07:09.840
because we were alarmed about the climate. One to 10 million. One to 10 million. I'm going to put the
00:07:16.800
probable death rate from making bad energy decisions collectively, not one decision, but bad energy
00:07:25.960
decisions collectively. They're based on, let's say, irrational thinking. So just the decisions that were based on
00:07:35.880
irrational thinking, like nuclear decisions. They were rational at one point. Now they're irrational. So I think the
00:07:45.180
current estimate would be one to 10 million. Would anybody disagree with that? Now, if you're saying it's low, I would say,
00:07:55.180
maybe later it would be expanded. But I think one to 10 million has to be a number that we need to start kicking
00:08:02.020
around. Now, first of all, if I were to do a detailed analysis of this, do you think you would trust it? If I did more
00:08:11.360
rigor than just guessing, do you think that that would be more convincing? It shouldn't be. Because no matter how much
00:08:17.900
rigor I put into it, I'd be kind of still guessing. So that's my, and I'm going to say that that's an expert
00:08:26.680
guess. An expert in this context doesn't mean an expert on climate or science. It means an expert at
00:08:33.380
guessing stuff. I'm really good at guessing stuff that is, you know, numbers-related. Now, I'm not going to say
00:08:41.980
that you should depend on that guess. That's different. I'm just saying that I do have a track
00:08:46.240
record of guessing things that are hard to guess. I don't know why. I think it's because of experience
00:08:52.960
doing that kind of work. Because other people who did that work could do the same thing.
00:08:59.560
All right. Well, I would like to see somebody produce an actual estimate of that with an actual
00:09:05.680
graph. Because I think at this point, we have enough that you could, you could make an argument
00:09:11.860
to support a number. I don't think that an estimate of that type would be accurate. But you could come
00:09:18.840
up with something that was as useful as the climate graphs. Now, I've argued that the climate graphs
00:09:26.400
that say, you know, we're all doomed because the temperature's going up. I've argued two things that
00:09:32.460
sound like opposites. One, they're totally inaccurate. Two, they're very useful. Do you see
00:09:42.460
how those both could be true? The climate, you know, the predictions for the 80 years in the future,
00:09:50.260
completely inaccurate. And at the same time, completely useful. Because their intention is to
00:09:57.580
make you not have that future. Right? They're used for persuasion. So, persuasion-wise,
00:10:04.140
they're big and scary. They're not accurate. Nobody can make an estimate for 80 years. That's not a
00:10:10.160
thing. So they're not accurate. But if they make you act differently in the moment, and that's what
00:10:15.280
they tried to do, if that's what they tried to do, then it's successful. Now, if what it did was
00:10:21.800
cause you to overreact, then it was the biggest failure ever. What did it do? It caused us to
00:10:29.620
overreact. You could argue it was the biggest failure of all time, you know, in terms of scientific
00:10:35.560
communication. But we'll see. We'll see how that plays out. But it needs to be turned into something
00:10:42.840
visual the way the hockey stick climate graphs are. If you want to get more rationality into the
00:10:51.960
climate decisions, you need to have a graph that shows the cost. Now, there's a long Twitter thread
00:11:02.640
that I just retweeted from Epstein, who's got his book, Fossil Future. And check that out. And he does
00:11:13.980
a great job of explaining how the critics will ignore some variables and only look at others. And he's
00:11:20.980
got specific examples of that. Once you see it, right, it's a half-analysis. Once you see how
00:11:32.540
the people on one side are continuously ignoring half of the argument, you understand how we got
00:11:39.620
here. So, you know, you know how weird my life is, right? I tell you this all the time.
00:11:52.100
You've, those of you, especially on Locals, if you've been following me for a while,
00:11:55.920
you know how much time I spend trying to save the world, right? Yeah, Alex Epstein is the author of
00:12:04.500
Fossil Future, not the other Epstein. Just clarifying that. So, you know how much time I've,
00:12:14.000
I've put into trying to save the world, right? In a variety of ways. So, I tried to be active in the
00:12:20.340
nuclear energy stuff, and a variety of things. So, this is the weird thing about trying to save the
00:12:27.660
world. You know, no good deed goes unpunished. You've heard that? That is so true.
00:12:37.080
Hey, there we go. We already got a chart, somebody on climate, on Locals already made one.
00:12:42.580
But when I'm not trying to save the world, I'm being criticized by people who hate me
00:12:49.760
for things that they imagine I'm doing. So, the trouble is, if you try to make the world a better
00:12:55.780
place, people will imagine you're destroying it. So, it makes it really hard to be useful.
00:13:03.900
Let me give you an example. I just saw this today. So, I saw an article on Slate. Do you remember Slate?
00:13:10.420
That's still a thing, I guess. And they were debunking Alex Epstein's book, Fossil Future.
00:13:17.740
And part of their debunking is that he had associated with me on Twitter. So, because the
00:13:26.840
author had associated with me on Twitter, that's a reason not to buy the book. What?
00:13:35.740
And so, I was referred to as a conspiracy theorist. And I thought, conspiracy theorist? I'm not
00:13:45.080
even sure what the example of that would be. But you can see how my reputation is just completely
00:13:52.100
up for grabs. Like, anybody can write anything. Just write anything. So, at the very same time,
00:14:00.400
I was literally putting my, you know, time and risking my life, literally, to make the
00:14:08.280
world a better place. I'm literally risking my life. Because a lot of the things I say
00:14:12.960
make you targets for whoever's on the other side, right? It's a crazy world. So, if you're
00:14:18.800
a public figure and you say things that have any impact on the world at all, somebody's going
00:14:23.820
to try to kill you. So, I'm literally risking my life to make the world a better place. Don't
00:14:29.340
really need the money, right? Like, why else am I doing it? And there's some asshole at Slate
00:14:35.700
who's using me as an example of why some other person's book would be less credible because
00:14:41.720
I've tweeted him. Good Lord. Could you be less useful to society than whoever wrote this
00:14:50.380
on Slate? Anyway. All right. The biggest story in the world by far is AI. And again, I'm going
00:15:00.460
to try to save the world in a small way. You really need to be warned about what's coming
00:15:05.420
because it's coming really fast now. We've now entered the "things are changing faster than
00:15:11.640
you can keep up" phase of it. Let me just give you an example of what's new this week in AI
00:15:19.460
because the rate of change in AI is going to go from, oh my God, they keep saying AI is
00:15:25.840
coming. Where's my AI? All right. Well, that's a little bit of AI, but that's so not impressive.
00:15:31.980
Where's my AI? Years are going by. You keep saying there'll be AI. They'll never be AI. Where's
00:15:37.100
my AI? The thing people don't understand is that when it takes off, it's going to happen
00:15:45.420
really fast. And that's even before the singularity. The singularity is what's called
00:15:51.880
that point where AI can learn on its own. I would argue it's already there, but there's another level
00:16:00.160
of that it hasn't quite reached. But way before it reaches the singularity, and that's where
00:16:07.060
we are now, it's doing stuff that you didn't think it could do faster than you thought it could do it.
00:16:12.080
For example, AI already took first place in a human art competition.
00:16:21.460
Just let that sink in. AI already, already, you're not waiting for it. This isn't the future.
00:16:29.560
Art, it already took first place in an art contest. Now, do you remember me telling you two weeks ago,
00:16:40.640
maybe, that in my opinion, AI art was already better than humans? Does anybody remember me saying
00:16:47.580
that two weeks ago? Yeah. Now, do you know why I could see it, and maybe other people couldn't see
00:16:52.520
it as easily? Do you know why? Why could I see it a little bit faster, not much faster, but a little
00:16:59.960
bit faster than you could? Do you know why? Because I make art for a living. I'm not good at it,
00:17:06.300
but I work in the field. If you make art for a living, do you know what you know that other people
00:17:10.960
don't know? It's a formula. Hate to tell you, art's pretty much a formula. It really is. And if you
00:17:23.780
follow the formula, you can make good art. Now, AI knows what colors people like, it knows what
00:17:30.320
paintings were popular before. It's just finding the patterns and putting them together. Do you know
00:17:35.740
what freaked me the fuck out? Think about this. Do you remember early on, when I asked people to do AI art
00:17:44.480
about me? Or people were doing it anyway, I guess. So AI created a bunch of photos of me in which my
00:17:52.400
face was distorted in a variety of ways. Sometimes my face was combined with other objects or just
00:17:59.380
somehow skewed or whatever. Now, here's what scared the fuck out of me when I saw that.
00:18:05.740
AI knows what a face looks like. AI knows that faces are mostly symmetrical, you know, within
00:18:14.480
reason. AI knew exactly what I look like. It chose to paint it artistically. The reason it didn't
00:18:27.200
look exactly like me is that it was putting artistic layers on top of it. When I looked at those
00:18:35.120
pictures of me distorted, I had two thoughts. Hey, why doesn't that look, you know, just like me?
00:18:42.160
Because it knows how to do that. And it chose not to. It actually chose not to show me the way I look.
00:18:50.720
It chose a different artistic frame on it. And I don't know how it did that. Because I don't believe
00:18:57.940
that any human told it to make faces non-symmetrical. That looked like an intelligent choice.
00:19:06.660
I mean, I don't know what intelligence means in this specific instance. But now just consider this.
00:19:15.140
Consider my assumptions. Don't you believe that the AI that was advanced enough to do this art
00:19:21.040
certainly knows that a human face is symmetrical? And it can find thousands of pictures of my actual
00:19:29.200
face online. So it knows exactly what I look like. And yet it did not choose to represent me that way.
00:19:36.500
It was already an artist. As soon as I saw that, that's when I knew we were done.
00:19:43.440
As soon as I knew that it could already give me an artistic interpretation that wasn't fed into it,
00:19:51.200
it's already thinking. It's already conscious. It's already sentient. In a sense. In the minor sense.
00:19:59.380
But it's already sentient. In my opinion. Right? There'll always be a different opinion.
00:20:04.440
All right. So was that freaky enough? I've got a book called The Religion War that came out
00:20:09.400
years ago that never had an audiobook associated with it. By the way, all of this is coming through
00:20:14.580
Machiavelli's Underbelly account. So he's doing all the interesting things that we're learning about
00:20:21.020
and tweeting about. So Machiavelli's Underbelly ran at least one part of my book, The Religion War,
00:20:30.860
and used AI to create an audiobook from it. And the first thing you say is, well, they've been doing
00:20:39.400
that forever. That's not new. To have the machine read words. That's not new. Here's what's new.
00:20:51.480
That's new. It was my voice. It read my book in my voice. Are you hearing this? It read my fucking
00:21:06.740
book in my voice. And it wasn't me. Now, you can still tell it was a computer, because it wouldn't
00:21:15.960
hit the accents right, right? It would punch words a little differently than I would. How long
00:21:20.800
do you think it would take for it to learn how to punch the words correctly, given that I have
00:21:25.660
a billion hours of audio of my actual voice talking? You don't think you could just listen
00:21:32.160
to a billion hours of me talking, go back to that audiobook and just fix it? Of course it could.
00:21:45.400
Yeah, so at this point, you could get a phone call from somebody you think you know, and it won't be.
00:21:50.800
I imagine you could change your voice in real time, right? Is there any reason I couldn't
00:21:56.160
speak into my phone and have it come out the other end as somebody else's voice? There's
00:22:02.140
no reason that can't happen. That's here. Within a tweak. We're just a tweak away from that
00:22:09.660
being completely operational. Maybe it is in some other version. Likewise, also from Machiavelli's
00:22:17.560
Underbelly, there's a video of me giving a speech at Berkeley years ago. And some people asked if it
00:22:27.980
was a real video of me, because I was wearing a button-down shirt. I promise you, I do own
00:22:34.160
one button-down shirt. It's my TV shirt. So yes, it was me. Actually, I own a few. But
00:22:41.680
in this case, I was rapping Eminem's famous raps, but it was me. So here I am, like it looks
00:22:52.040
like I'm talking, but I'm rapping Eminem. But I think I was rapping in my voice, not Eminem's
00:23:02.500
voice. So it's like, what this can do is crazy. All right. We now know that the kids who were
00:23:12.100
in Zoom school were not in any school at all, they were just being tortured. Can we say
00:23:18.200
that? So at this point, we can just say, what you thought was Zoom school wasn't any kind
00:23:22.920
of school at all. It was just some kind of child abuse that we all participated in, because
00:23:31.320
it was easier than fixing it, right? We all did that. So congratulations to all of us. We
00:23:38.340
all fucked up our youth intentionally and didn't do a thing about it. So good for you and me,
00:23:46.460
by the way. If this looks like I'm criticizing you, nope. This is on me. Just as much. And
00:23:55.420
I'd make the same argument as when people were saying, you know, that the rank and file FBI are
00:24:02.280
awesome, to which I say, hmm, it doesn't work that way. Now, if you're in the FBI, and your
00:24:09.840
leadership is embarrassing you, it's sort of on you. It is a little bit on you. It's not maybe the
00:24:16.460
primary thing on you, but it's a little bit on you. Likewise, if I criticize the government for
00:24:25.440
Zoom school, that's a little bit on me, isn't it? Because I let it happen, didn't I? Right? All of
00:24:31.800
us did. So that's not exactly on the government. Yeah, I think you have to be a little bit, a
00:24:39.660
little bit flexible on this one. The government definitely did something that hurt children.
00:24:45.280
But we let that happen. We let that happen. Because we didn't have a better idea either,
00:24:50.900
frankly. So that's on us and them. But it's also on the teachers unions, because the teachers
00:25:01.440
unions were probably the, I would say, one of the main drivers of Zoom school. And we know
00:25:10.500
now that minorities got further behind than ever, presumably because white kids had more resources,
00:25:20.660
and Asian kids maybe had more resources. So teachers unions, again, are the largest cause of systemic
00:25:29.980
racism. Now, there were lots of causes, but the largest one was the teachers unions. Again,
00:25:36.800
the teachers unions are consistently the source of systemic racism. Because they, you know, their
00:25:44.920
influence is what allows a lack of competition in schools, if you haven't heard that argument.
00:25:50.040
So if there's a lack of competition, that explains everything. All right. There were two tweets
00:25:55.460
today, yesterday, that seem to indicate ivermectin works for COVID. Hold on, hold on. I'm not saying
00:26:04.860
it works for COVID. Did you hear that? I didn't say that. I said there were two tweets or two claims.
00:26:10.800
Two claims. Two claims. One of them is that the NIH, the National Institutes of Health, is now listing
00:26:18.980
ivermectin as a potential treatment for COVID. So if the National Institutes of Health puts it on their
00:26:28.140
page as a potential, potential treatment for COVID, I guess then everything you thought about COVID is
00:26:36.980
wrong, right? I guess maybe it does work, huh? Is that what you conclude?
00:26:43.980
No. For those of you who danced on my grave because you saw that tweet, maybe you should have read it.
00:26:52.340
Because here's what the National Institutes of Health says. On the very page that is being retweeted,
00:26:58.220
the panel recommends against the use of ivermectin for the treatment of COVID, except in clinical trials.
00:27:09.940
Were they ever not in favor of clinical trials?
00:27:20.020
I mean, if somebody is willing to pay for a clinical trial of anything, mostly we're in favor of it, right?
00:27:28.980
If the people who are in the trial have full disclosure, they know what they're doing,
00:27:33.540
somebody else is paying for it, I'm not paying for it, would you ever be opposed to it?
00:27:42.120
There's no news about ivermectin being more promoted, and there's no new acceptance of it.
00:27:52.920
It's just an acknowledgement that there are trials.
00:27:59.480
Did you see a tweet yesterday that there was a Brazilian trial that showed that ivermectin taken prophylactically,
00:28:06.280
meaning before you get the infection, reduced COVID deaths in Brazil by 92 percent?
00:28:16.580
So I guess it's one of the biggest ivermectin trials, or the biggest.
00:28:44.120
It's one of the biggest studies of ivermectin.
00:28:46.900
It suggests it not only works, but it works great.
00:28:57.680
So I tweeted it and said, hey, debunkers, take a look at this.
00:29:01.360
How long did it take the debunkers to debunk that study?
00:29:13.760
Well, it wasn't, it was not a randomized controlled trial.
00:29:36.200
The mortality rate among non-users was 5.3%. This came from Dave Explains.
00:29:51.480
And he says the second problem is that the mortality rate among non-users was 5 times higher than in the U.S.
00:29:58.100
So basically the study implies that just living in the U.S. reduces your risk by 80%.
00:30:08.840
The numbers that they produce are sort of ridiculous on their surface, right?
00:30:15.360
And then the claim that there was a 90% improvement, do you think that if ivermectin, even prophylactically, could improve things by 92%, do you think nobody would have noticed that?
00:30:31.700
That's the sort of claim that on the surface you should say, nah, no, no, come on.
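To make the arithmetic in that debunk concrete, here is a minimal back-of-envelope sketch in Python. The 5.3% non-user mortality, the "5 times higher than the U.S." comparison, and the 92% claim are the figures quoted above; the U.S. baseline is not sourced, it's simply inferred by dividing 5.3% by 5.

    # Back-of-envelope check, assuming only the figures quoted above.
    brazil_nonuser_mortality = 0.053                     # study's non-user COVID mortality (5.3%)
    implied_us_mortality = brazil_nonuser_mortality / 5  # "5x the U.S." implies ~1.06% (inferred, not sourced)

    # Relative risk reduction implied by simply comparing the two baselines:
    rrr_living_in_us = 1 - implied_us_mortality / brazil_nonuser_mortality
    print(f"Implied reduction from living in the U.S.: {rrr_living_in_us:.0%}")  # 80%

    # The study's headline claim: prophylactic ivermectin cut deaths by 92%,
    # which would put user mortality at 8% of the non-user rate:
    implied_user_mortality = brazil_nonuser_mortality * (1 - 0.92)
    print(f"Implied mortality among users: {implied_user_mortality:.2%}")        # ~0.42%

The point of the comparison is that when the control group's death rate is five times a normal baseline, the headline 92% number is telling you more about who ended up in each group than about the drug.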
00:30:42.500
Are you waiting for the best part of the debunk?
00:30:45.360
The people who ran the study, they neglected to mention some of their conflicts of interest.
00:30:53.700
Do you think they had any conflicts of interest?
00:30:57.660
They worked for an ivermectin manufacturer.
00:31:08.720
Have you heard of Pierre Kory, banned from Twitter?
00:31:15.360
So the people who did not disclose their conflicts literally worked for an ivermectin manufacturer?
00:31:22.780
I would say that that qualifies them as frauds.
00:31:35.940
If somebody publishes a paper on something that is, you know, their own company, the company that pays them, anyway, not their company, but the company pays them, and they don't disclose that, is that just a lack of disclosure, or is that just full fraud?
00:31:57.940
Now, I don't know if it's fraud in the legal sense, but I would say that it is an intentional effort to mislead people.
00:32:06.940
If you have a connection to the manufacturer of ivermectin, and you don't disclose that when you print your results, that feels to me like you intended people to be misled.
00:32:29.460
But ethically, as the word fraud gets used in common use, it's fraudulent.
00:32:35.500
So, before you dance on my prediction grave about ivermectin, just know that the news about it is fake, fake news.
00:32:49.320
So there was somebody at the United Nations High Commission for Human Rights who, just minutes before the end of her term, released this damning report on China's treatment of the Uyghurs.
00:33:02.420
Do you know why this was released just minutes before the end of her term?
00:33:08.320
Because China's pressure was so high that she couldn't do it while she was still on the job.
00:33:13.860
She had to quit at the same time that she published her findings.
00:33:20.740
I don't know if it had been announced or whatever.
00:33:22.260
But basically, she couldn't even publish unless she was also leaving the job at the same time.
00:33:29.200
That was, let's see, it's a damning report of how they're treating them.
00:33:38.640
And there were up to 2 million, blah, blah, blah.
00:33:43.740
You know all those videos you saw of the so-called training facilities that look like basically prison camps?
00:33:54.920
There were alleged prison camps that the Chinese would say are allegedly training facilities.
00:34:11.240
So they acknowledged at least, you know, the facility existed.
00:34:22.920
Because the training camps were too big and visual.
00:34:26.900
And China might have said, oh, we can't have a big visual, you know, prison camp.
00:34:31.600
So they may have figured out some way to distribute the pain in some way.
00:34:37.820
It's possible that those facilities got closed.
00:34:42.440
It doesn't mean that they started treating the Uyghurs well.
00:34:51.620
We can't look at those camps directly and see if there's activity.
00:34:55.600
If the United States can't tell you whether these are still there or not,
00:35:00.440
you know, seriously, our satellites can't see them.
00:35:04.500
So there's something going on in terms of what information we're getting.
00:35:19.700
You've heard that the Biden supporters are trying to do what the Republicans are good at,
00:35:29.560
You know, so "Let's go Brandon" was the joke about Biden.
00:35:33.380
So now they've got these memes called Dark Brandon.
00:35:37.720
So he's got aviator glasses on, and he's sort of like a, you know, a cool, evil superhero or something.
00:35:50.860
But here's my pro-suggestion to the persuaders in the Democratic Party.
00:35:55.640
If your leader is the person most associated with running out of electricity and the lights
00:36:06.200
going out, do you really want to call him Dark Brandon? Is it dark because we can't keep the lights on?
00:36:10.800
Because I feel like that's the dark that's suggested by that.
00:36:20.020
I think calling him Dark Brandon is a little bit too much cultural appropriation.
00:36:31.980
But I'm a little more sensitive to these microaggressions than you are.
00:36:36.800
And I feel it's totally inappropriate that he would...
00:36:39.660
I mean, it's basically, it's the meme version of blackface.
00:36:45.300
And I'm totally offended on behalf of other people.
00:36:51.040
If I've taught you nothing else, never be offended yourself, because that would make you look weak.
00:36:56.980
But you should always be white knighting for people you've never met.
00:37:00.140
There are other people totally offended by this.
00:37:02.560
So on behalf of other people I don't know and will probably never talk to on this topic,
00:37:07.020
I'm deeply offended that they're calling him Dark Brandon and appropriating somebody's cultural property.
00:37:14.740
Well, Kyle Bass, who I love for being an outspoken critic of China, is also a critic of ESG.
00:37:24.880
And he says these policies that are ESG-driven are going to end up starving the poor children of the world and killing many of them.
00:37:33.580
And he says, I can't believe it's not on the front page of every paper every day.
00:37:41.520
But I can tell you that in a month or two, Dilbert's company will be involved with ESG,
00:37:48.160
and they will destroy the entire nation of Elbonia.
00:37:53.840
So everybody in Elbonia will die because of ESG in the coming months.
00:38:00.700
I do have a series on ESG coming out in the Dilbert comic in September.
00:38:05.500
So when that comes out, and by the way, I don't know if they'll get published,
00:38:10.720
because they're a little bit edgier than what I usually do.
00:38:14.400
So I've promised that I would try to kill ESG by the end of the year by mocking it out of existence.
00:38:28.200
So what about this business of Biden calling the MAGA people semi-fascists?
00:38:36.180
And then Jean-Pierre, whatever her name is, his spokesperson, sort of went with it
00:38:44.940
and just sort of confirmed that's how he feels.
00:38:52.960
All right, remember, if this were Trump, I would rate it for whether it's effective persuasion-wise
00:39:00.480
and separate from whether he should do it, right?
00:39:03.160
So I'm looking at the skill level separate from the ethics or, you know, whether you should do it or not.
00:39:08.940
And I'm going to give Biden an A-plus for persuasion.
00:39:13.660
I'm wondering if somebody's advising him, because all this stuff that people are complaining about,
00:39:26.520
So what Biden has done is he's created a confirmation bias trap
00:39:32.740
that the Republicans are just going to walk into.
00:39:35.500
And the trap is he's created this conversation about whether Republicans are indeed,
00:39:42.060
or any of them, the MAGA ones, whether or not they are semi-fascist.
00:39:48.140
Now, you say to yourself, that's not even a fair question.
00:39:57.380
But has he successfully put that in the public mind?
00:40:01.620
Yes, so the first part of persuasion is he got your attention successfully,
00:40:06.800
and he created a, let's say, a structure that everybody has to react to, right?
00:40:12.660
So he's created the story, and the rest of us just have to react to it.
00:40:20.500
Now, that's not saying how we react, because that could still go wrong.
00:40:25.100
But he's put it out there, he's created a structure, and we're reacting to it.
00:40:29.560
If Trump had done this, I'd say, look at Trump controlling the narrative.
00:40:42.600
If you think this is old man doddering, I don't think so.
00:40:51.340
It looks like they had some testing of this or something.
00:40:54.500
And the reason that this is so brilliant is that it depends on future things.
00:41:02.220
That's where I think the professionals are involved.
00:41:06.480
Because it's a trap that isn't so important today.
00:41:11.160
It will become important when future events are wrapped into the narrative.
00:41:17.640
So now anything that the Republicans say or do can get wrapped back into that narrative.
00:41:24.300
Is that something that Republicans have ever done to Democrats?
00:41:29.200
Can you think of a case where anybody on, let's say, the Trump side had done a similar...
00:41:41.080
Because once you've started this narrative that Democrats are groomers,
00:41:45.700
then suddenly there's a whole bunch of stories that coincidentally fit into it, right?
00:41:59.020
And I think I talked about it at the time, if I recall.
00:42:01.320
Once you say Crooked Hillary, then anything that comes up that sounds a little sketchy confirms the label.
00:42:33.120
That's exactly the same persuasion trick as semi-fascists.
00:42:38.420
When I said Republicans will be hunted, I'm specifically talking about the future.
00:42:44.280
Did I put that thought in a lot of people's heads?
00:42:49.580
Did I put a narrative out there that people have to react to?
00:42:56.820
In fact, even today, somebody sent it to me and said,
00:43:01.660
I'm making people debate whether Republicans are being hunted.
00:43:11.740
It's the fact that I created a narrative that would be favorable for future events.
00:43:20.120
So you should recognize the technique, whether it comes from me or it comes from a politician.
00:43:32.140
And it's the same thing that Biden is using now.
00:43:36.040
And I think Crooked Hillary, either consciously or unconsciously, was the same strategy.
00:43:43.440
When somebody proposes a narrative that you have to react to, they're building a confirmation bias trap.
00:43:50.320
And everything will get sucked into that narrative.
00:44:01.040
And I even saw Harold Ford excusing it. So if you don't know, Harold Ford appears on The Five quite often.
00:44:10.620
And I guess his brand, if you could put it that way, is reasonable Democrat.
00:44:18.840
So Harold Ford generally will agree with Republicans when they have a good point.
00:44:25.240
And he will disagree with them when they have a crazy point.
00:44:28.480
So in other words, he's trying to establish himself as the only reasonable person in the room.
00:44:34.460
And he does a good job of it, as a matter of fact.
00:44:36.600
And I would also give him a compliment that I thought when he started on The Five, he was a little stiff.
00:44:44.320
But I think largely with Greg Gutfeld's help, and probably Jesse's as well, he's loosened up.
00:44:52.560
So now he's like a much more interesting character on the thing.
00:44:57.220
So to Harold Ford, I'd say congratulations on an evolution to a strong player.
00:45:09.600
But even he excused the semi-fascist language as just campaign talk.
00:45:15.620
But he was reminded that this is not like regular campaign talk.
00:45:21.280
Normally Trump, for example, would go after sometimes a celebrity who went after him or a public figure.
00:45:29.320
But he didn't go after entire categories of citizens.
00:45:39.180
That's really, really out of line for a leader.
00:45:45.120
But it's definitely, you know, it's not the country I want to live in, right?
00:45:51.360
So to me, that degrades our experience a little bit.
00:45:56.260
But I suppose everything's been degrading lately, so it's not that different.
00:46:03.640
Anyway, and the most amazing thing I saw is, you know, I keep tracking CNN's evolution.
00:46:10.620
They have stated their intent to be more middle of the road.
00:46:17.660
And the thing I've been watching to see if they're actually serious about that is their most vocal Republican slash Trump critics.
00:46:26.940
And one of them is Collinson, Stephen Collinson.
00:46:30.900
He usually does, well, he's one of like three or four people who do opinion pieces on CNN's website.
00:46:36.680
And usually those opinion pieces are just like, you know, crazy anti-Trump stuff.
00:46:42.980
And he actually said on CNN that Biden went too far with the semi-fascist thing.
00:46:51.760
Now, do you think you would have seen that six months ago?
00:46:55.920
Do you think that the most anti-Trump person would have said that the semi-fascist thing was a political mistake?
00:47:07.640
This actually looks like CNN is making a legitimate attempt to find the middle.
00:47:15.020
Now, are you going to be mad at me if I compliment them for that change?
00:47:26.880
So if you say it's too soon, I agree with you, it's too soon.
00:47:40.340
Remember, finding the middle is largely impossible for anybody.
00:47:52.880
We'll know if they're trying to find the middle.
00:47:55.640
If they try to find the middle and you can see the effort, but in your opinion, they're not finding the middle, that's a different criticism.
00:48:03.220
Because nobody's going to agree what the middle is.
00:48:04.960
So I'm going to actually throw some support toward their new leader, the CEO.
00:48:27.500
I respect that something real seems to be happening there.
00:48:49.780
People are telling me different pronunciations for the CEO of CNN, so I guess it's an open question.
00:49:01.340
And I always like to look for the thing that we're not being told.
00:49:05.440
So what is the main thing we're not being told about those boxes?
00:49:36.800
So I guess the federal prosecutors said they're going to wait until after the November election
00:49:41.100
to announce any charges against Trump if they determined he broke the laws.
00:49:58.340
The news is Trump has not been charged with anything.
00:50:02.140
And if you asked experts, they'd say you probably won't be.
00:50:12.440
Federal prosecutors are likely to wait until after the November election to announce any charges against Trump.
00:50:19.880
Doesn't that make you think like they have the charges and it's likely to happen?
00:50:25.040
And then it goes, if they determined he broke laws.
00:50:32.560
How about putting that at the start of the sentence?
00:50:36.040
How about, "If they find he broke any laws, we wouldn't know about it until November,
00:50:42.040
but no laws have been found to be broken," or something like that.
00:50:45.600
Although you could argue that they have found broken laws.
00:50:48.860
So we don't know enough to know if they found anything.
00:50:55.040
Who was the GSA dealing with on the Trump staff who was not giving them what they wanted?
00:51:05.520
So we learned that, for example, Trump lawyer Christina Bobb signed a document certifying
00:51:13.920
on behalf of Trump's office that all of the documents had been returned, though that was not true.
00:51:18.780
Now, what happens if Trump's lawyer signed something saying all the documents had been returned, but they hadn't been?
00:51:31.580
Does Trump go to jail if the lawyer said they had been returned and Trump was silent on the question?
00:51:39.800
If your lawyer says something's been done legally, you don't know one way or the other because you're not down there at the boxes.
00:51:47.680
Now, do you think that Trump could have asked the lawyer to lie and then the lawyer would lie on this question?
00:51:57.400
Do you think you could get a lawyer who could work at this level?
00:52:00.920
I mean, an ex-president's lawyer's got to have some credentials.
00:52:06.980
I don't believe you could get a lawyer to lie about something so easily disproven.
00:52:11.860
You might be able to get a lawyer to lie about something that can't be proven one way or the other.
00:52:19.280
But this is something that could have easily been determined to be a lie.
00:52:39.640
Lawyers who are being asked to lie by their client, when they know the lie will almost certainly be detected, tend to say no.
00:52:54.500
Now, yeah, granted, they're professional liars,
00:52:56.400
but they lie when they won't get caught, or when it's perfectly legal.
00:53:04.260
I don't think they lie when they know they're going to get caught, do they?
00:53:17.920
But to me, this looks like a glaring signal of a bureaucratic problem,
00:53:24.440
meaning that not everybody knew what was happening.
00:53:28.400
Was the lawyer personally going through the boxes?
00:53:45.260
Whose hands and eyes were primarily handling the boxes?
00:53:49.800
Because whoever's hands and eyes were primarily handling the boxes, I suspect
00:54:00.500
we're going to find out somebody was just mistaken.
00:54:20.920
So these are several assumptions building a story.
00:55:42.220
So whenever you look at this Trump boxes stuff,
00:55:54.520
And the narrative that we're being given about this
00:56:09.660
liked to sometimes play with them in his office
00:56:17.200
Completely knew that he was doing something wrong
00:56:36.940
what about the ones that were in Trump's office?
00:57:03.800
and it should have been in a more secure place?