ManoWhisper
Real Coffee with Scott Adams
- August 01, 2022
Episode 1822 Scott Adams: Most Of The News Today Is Fake And Kind Of Funny
Episode Stats
Length
58 minutes
Words per Minute
144.0
Word Count
8,392
Sentence Count
407
Misogynist Sentences
4
Hate Speech Sentences
20
Summary
Summaries are generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.
Transcript
Transcript is generated with Whisper (turbo).
Misogyny classification is done with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classification is done with facebook/roberta-hate-speech-dynabench-r4-target.
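A minimal sketch, assuming the Hugging Face transformers library, of how sentences could be scored with the two classifiers named above; the sample sentences, loop, and printed fields are illustrative assumptions, not the site's actual pipeline:

# Minimal sketch (assumption, not ManoWhisper's actual code): score transcript
# sentences with the classifiers named above via the Hugging Face pipeline API.
from transformers import pipeline

misogyny_clf = pipeline("text-classification",
                        model="MilaNLProc/bert-base-uncased-ear-misogyny")
hate_clf = pipeline("text-classification",
                    model="facebook/roberta-hate-speech-dynabench-r4-target")

# Illustrative sentences; in practice each transcript sentence would be scored.
sentences = [
    "Well, once again, let me inform you, I do not have monkeypox.",
    "Good morning, everybody.",
]

for sent in sentences:
    m = misogyny_clf(sent)[0]  # dict with "label" and "score"
    h = hate_clf(sent)[0]
    print(f"{sent[:40]!r}: misogyny={m['label']} ({m['score']:.2f}), "
          f"hate={h['label']} ({h['score']:.2f})")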
00:00:00.220
Well, once again, let me inform you, I do not have monkeypox, but if you want to know
00:00:06.460
what bad luck looks like, bad luck looks like injuring your lip during a monkeypox.
00:00:15.100
I know, it looks like I put my lip in the wrong place, doesn't it?
00:00:18.820
But no, no, just a burn.
00:00:21.980
All right, you seem like you're in Texas, somebody says.
00:00:25.760
Good morning, everybody. How would you like to enjoy today's show more than any show you've ever enjoyed?
00:00:35.920
I know, pretty good stuff coming.
00:00:38.960
And all you need is a cup or mug or a glass, a canteen jug or a glass.
00:00:43.000
Hold on, canteen jug or flask.
00:00:49.420
I'm going to get this right. I'm trying to remember my own stuff.
00:00:52.520
And all you need, all you need is a cup or a mug or a glass, a canteen jug or a flask.
00:01:00.340
A tank or chalice or stein, a vessel of any kind.
00:01:04.840
Fill it with your favorite liquid.
00:01:07.340
I like coffee.
00:01:09.440
And join me now for the unparalleled pleasure,
00:01:12.220
the dopamine hit of the day, the thing that makes everything better.
00:01:16.120
It's called the Simultaneous Sip, and it happens now.
00:01:19.840
Go.
00:01:22.520
I apologize for my dainty cup.
00:01:27.440
This is no way to drink coffee.
00:01:29.900
No, coffee does not go in a dainty cup unless you're visiting the Queen.
00:01:34.320
And even then, you should be drinking tea.
00:01:36.400
So there's no right time for a little teacup.
00:01:40.200
Well, I have a question about the Democrat strategy.
00:01:44.020
I feel as if the strategy is not to remove Biden from office for incompetence,
00:01:52.080
but rather to make him slowly disappear.
00:01:55.740
I feel like that's the strategy.
00:01:59.400
For example, first he got COVID, and then he had to quarantine.
00:02:05.060
So that was good.
00:02:06.260
But then he got out, and then they said, well, we're going to have to give him COVID again.
00:02:11.440
So he gets COVID back to back.
00:02:13.320
But what I'm expecting is somewhere around the end of this next quarantine period,
00:02:19.340
I think we're going to hear that he has something called serial COVID.
00:02:24.420
Serial COVID.
00:02:25.740
It just keeps coming back.
00:02:27.180
And he's probably got a comorbidity of TDS.
00:02:31.580
So if you add the comorbidity of TDS to the serial repeating COVID,
00:02:38.420
well, Joe Biden is just going to have to hide in his basement forever.
00:02:42.260
Now, I'm not sure about this next part, but does it seem to you he's getting thinner?
00:02:48.460
Has anybody noticed he seems to be getting thinner?
00:02:50.660
He was always quite fit.
00:02:52.680
He seems to be getting thinner.
00:02:53.540
I think they're feeding him less.
00:02:54.880
Yeah, I think they're slowly trying to make him disappear.
00:02:59.380
Like they'll feed him a little less, and he'll just get a little smaller.
00:03:02.560
And one day you'll see a video of him in his basement.
00:03:05.960
He'll be like 85 pounds.
00:03:08.380
And, you know, it's not until you get down to like 65 pounds that you can, you know,
00:03:12.140
say he's actually gone.
00:03:13.520
But I think the cat is on the roof.
00:03:15.940
They're slowly trying to disappear him.
00:03:18.740
You know, first his public persona, and then they'll shrink him with lack of food
00:03:22.940
until he's a little ball about this big, and they'll have, you know, just keep him alive
00:03:27.680
with tubes and stuff.
00:03:28.920
I feel like that's the strategy.
00:03:31.040
I don't know.
00:03:31.440
I'm just guessing.
00:03:33.820
Well, what do you think would happen in a hypothetical matchup for president between Biden and Trump
00:03:39.060
if it were to happen today?
00:03:41.060
What do you think a Rasmussen poll showed?
00:03:43.280
Let's see if you're smart.
00:03:45.620
Who do you think would win as of today?
00:03:50.260
Well, according to Rasmussen, if the election were held today, 40% of likely voters would
00:03:55.440
vote for Biden, and 46% would vote for Trump.
00:03:59.680
But another 10% would choose some other candidate if it were just those two and somebody else.
00:04:06.900
10% for another candidate.
00:04:08.580
So what do you think of Yang's new political party?
00:04:15.860
Who is he going to take votes away from?
00:04:21.120
Yeah, I think I heard somebody say he would take votes away from Republicans, but it was
00:04:27.100
the Republicans who were going to vote for a Democrat anyway.
00:04:30.580
I don't know.
00:04:31.260
Maybe.
00:04:34.160
Yang will yank votes.
00:04:35.500
Yeah, I don't...
00:04:39.520
It's hard to see how it's helpful, but maybe.
00:04:43.140
You know, if Yang can control 10% of the vote, he's a kingmaker, right?
00:04:49.400
So it's not a bad strategy.
00:04:51.540
If, you know, you've got all these sort of in-between confused people.
00:04:55.980
What if?
00:04:56.760
Let's just do a what if.
00:04:58.040
What if Yang's real play is to only try to control the sensible people?
00:05:06.120
Because there aren't that many of them.
00:05:07.700
I'd say about 10%, maybe.
00:05:09.920
If 10% of the country identify with a political party, but they're willing to listen to a better
00:05:16.820
argument, it's about 10%.
00:05:20.420
He could control the whole country like Joe Manchin.
00:05:24.740
So everybody who says this Yang thing is just, you know, a sideshow of no importance, it's
00:05:29.720
not a sideshow.
00:05:30.800
It's an insurrection.
00:05:34.040
Legally.
00:05:34.960
But it's an insurrection in concept.
00:05:37.120
In the sense that if he succeeds, he will be effectively the president, just by having
00:05:43.200
a third party that actually makes sense.
00:05:46.080
And here's something that I think Andrew Yang could do that so far nobody else was able to do.
00:05:53.820
Be reasonable.
00:05:56.180
Nobody tried that yet, right?
00:05:58.920
Nobody really tried being reasonable yet.
00:06:02.100
There have been third party attempts, but they always seem, you know, sort of,
00:06:06.120
ideologically driven.
00:06:09.020
So what would happen if you had a third party where the only thing they're trying to do is
00:06:13.680
figure out what makes sense?
00:06:16.120
You know, follow the science.
00:06:17.900
What if you had, how about this?
00:06:19.820
What if you had to follow the science party?
00:06:22.940
You know, I guess the Democrats think they are that, but evidence suggests maybe not so much.
00:06:29.320
I don't know.
00:06:30.680
So I think if you ignore this Yang situation, you do it at your peril.
00:06:34.900
Because if he gets just 10% of people on board, and it's the reasonable people who could switch
00:06:40.220
sides in a situation, that's real power.
00:06:45.120
That's real power.
00:06:46.260
Define science.
00:06:47.440
Yeah.
00:06:48.200
Good question.
00:06:51.000
Well, how would you like a story about how I'm so right?
00:06:55.880
That's your favorite story, isn't it?
00:06:57.220
No, your favorite story is when I'm wrong, and I admit it.
00:07:01.960
I know.
00:07:02.600
I know.
00:07:03.240
I know how you are.
00:07:04.800
You'd rather me be wrong.
00:07:06.440
But I'm sorry.
00:07:07.820
I'm as right as right can be.
00:07:09.260
Because it turns out that there's a new Harvard University study, and they tried to look into
00:07:14.640
the motivations of the Capitol rioters.
00:07:18.660
Now, they used an interesting term here, rioters.
00:07:23.120
So I don't know if that excludes people that they thought were just protesting.
00:07:27.880
So I have a little question about how they defined that.
00:07:31.720
But they found out that only 8% of the people who they called rioters believed they were there
00:07:37.520
for an insurrection.
00:07:38.460
And 92% of them believed they were there to save the democracy.
00:07:44.280
Save the republic.
00:07:45.260
Now, is there anybody you know who said, hey, I think that J-6 committee is making us think
00:07:54.100
past the sale?
00:07:55.820
The sale is what did the people who were at the event believe?
00:07:59.540
Because they started with they want an insurrection, and then let's look at all the facts.
00:08:04.140
No, you don't start with they want an insurrection.
00:08:07.380
You have to demonstrate that.
00:08:09.480
And when anybody did, well, I guess this is probably the first one.
00:08:13.300
I don't think there's been another.
00:08:15.260
Correct me if I'm wrong.
00:08:17.380
But I don't think there's been another study, and certainly not from Harvard University,
00:08:21.920
which at least the left would be inclined to believe, that says that basically 92% of
00:08:27.560
their people believed Trump, that there was some sketchiness to the election, and that
00:08:34.040
they were there to fix the sketchiness.
00:08:36.400
They were not there to overthrow the government.
00:08:39.000
They were there to fix it.
00:08:40.020
Now, what does the January 6th committee and all of the mainstream news do with the fact
00:08:46.220
that the very foundation, the most important part of their narrative, has collapsed, and
00:08:53.400
it was destroyed by their own side?
00:08:55.220
If you call anything coming out of Harvard left-leaning, and I think you could make that assumption.
00:09:02.020
Now, even the person involved in the study said he was quite surprised.
00:09:09.800
He was surprised.
00:09:11.100
And the results were that, quote, many believed they were defending democracy from, quote, imminent
00:09:19.480
existential danger, unquote, unquote.
00:09:22.700
How does the news handle this?
00:09:32.760
The only way the mainstream news can handle this new information is by completely disappearing
00:09:38.360
it.
00:09:39.520
It will be disappeared right in front of you.
00:09:43.080
Do you think there's any way that if Harvard University did a study, and it came out showing
00:09:48.640
that 92% of the attendees wanted a literal insurrection to install a dictator?
00:09:55.920
Suppose it had gone that way.
00:09:58.320
Do you think the news would cover it?
00:10:01.620
I think so.
00:10:03.660
Yeah.
00:10:04.460
I think they might.
00:10:05.780
I think they might cover it if the people said they were there for an insurrection.
00:10:09.400
But what if the people say that they were not there for an insurrection, they were there
00:10:13.560
to fix the republic, exactly what it looked like?
00:10:16.720
To me, I mean, that's exactly what it looked like to me.
00:10:20.700
Is there anybody who even doubted that, who's watching this?
00:10:24.240
Is there anybody here who didn't have the same impression that they were there for good
00:10:30.920
intentions, even if they had bad information?
00:10:33.860
We don't know if they had bad information, but if they did.
00:10:39.740
So this is fascinating.
00:10:42.160
The degree to which the fake news no longer needs to pretend is sort of stunning, isn't
00:10:48.220
it?
00:10:49.740
Are you surprised that the fake news doesn't need to pretend anymore?
00:10:53.800
They're really not trying to hide this.
00:10:57.280
This is not hidden at all.
00:10:59.000
Well, it's hidden in the sense that you'd have to go to Breitbart in this case.
00:11:03.380
So I found out about the Harvard study by looking at Joel Pollak's article in Breitbart.
00:11:12.360
And Jonathan Turley's been on this point as well.
00:11:15.760
Now, let me ask you this.
00:11:17.960
Who in the country of, let's say, a political public figure?
00:11:23.740
What political public figure has been telling you, and I haven't heard one other person
00:11:30.340
say it, of political, you know, let's say public figures.
00:11:35.120
What other public figure told you that they were thinking past the sale and that the most
00:11:40.020
important question is this?
00:11:42.180
What were they thinking when they did it?
00:11:44.160
It was only me.
00:11:46.700
Right?
00:11:47.760
I'm not wrong, am I?
00:11:48.840
But I believe it was only me.
00:11:52.440
In the whole fucking world, I think it was only me.
00:11:55.680
That's why you watch me.
00:11:58.960
All right.
00:11:59.920
And then I've got a second question about the 8% who did think they were there for an insurrection.
00:12:05.860
Do you think that if you really drilled down with the 8% that you would find out that they
00:12:09.880
really wanted an insurrection?
00:12:12.920
Maybe.
00:12:13.900
Yeah, I know some of you are saying the 8% are feds.
00:12:16.100
That's funny.
00:12:17.360
There might be some feds in that.
00:12:18.840
Who knows?
00:12:19.420
I doubt that's who got interviewed.
00:12:21.940
But, for the poll.
00:12:25.260
But what do you think they wanted?
00:12:26.720
Do you think that if you had asked them, they would have said, well, our end goal is to
00:12:31.160
destroy the country?
00:12:33.720
Some might.
00:12:34.660
You know, they might be anarchists.
00:12:36.320
Do you think they would have said their end goal is to install Trump as an actual dictator?
00:12:42.560
I doubt it.
00:12:43.460
I doubt it, because he wouldn't want the job.
00:12:46.740
I mean, I've seen no indication that he would want to be the dictator.
00:12:51.660
I mean, in the sense that it's impractical.
00:12:55.520
If it were practical, maybe that's another conversation.
00:12:58.420
But given that it's so obviously impractical, who did they think was going to be the dictator?
00:13:03.840
Or were they going to overthrow it for some other form of democracy?
00:13:08.480
See, if you drill down on the insurrectionists, I think you would find out that they're either
00:13:12.320
literally crazy, or what they wanted was a little bit closer to fixing the republic after all.
00:13:17.780
It's just they had a different view of what that looks like.
00:13:21.120
I feel like the insurrectionists were probably closer to the regular people than we think.
00:13:26.740
They just like to use more hyperbole, and maybe they're in militias, and they just talk that way.
00:13:31.800
And the other thing that I've been saying forever, that the Democrats don't seem to recognize,
00:13:39.140
is that there's a way that Republicans talk about overthrowing their government
00:13:44.320
that is, first of all, very healthy, because Republicans are always on the verge of,
00:13:51.640
don't push us, don't push us.
00:13:54.060
That's a healthy tension.
00:13:57.240
But it's never that serious, in my opinion.
00:14:00.000
They're not really that serious about overthrowing the government.
00:14:02.880
They're just making it clear that there's a line.
00:14:06.460
Don't cross that line.
00:14:09.540
All right, here's my next fake news story from Newsweek.
00:14:14.960
I'm not sure if this is, yeah, I'll call it fake news.
00:14:17.560
It's in the category of fake news.
00:14:19.140
So here's the title of the story.
00:14:20.600
Tell me what you think the story would be about if you saw this headline in Newsweek.
00:14:26.300
Fentanyl surge started and peaked under Trump, despite the GOP blaming Biden.
00:14:33.000
Wow, the fentanyl surge started and peaked.
00:14:38.180
It was the highest under Trump.
00:14:39.840
Wow.
00:14:41.040
So what do you think that story says?
00:14:43.360
Do you think the story says, if you're coming in late, it's a burn?
00:14:48.420
I burned myself on twice-heated soup and a piece of spinach that wouldn't leave my lip.
00:14:54.900
It was quite painful.
00:14:56.180
But no, it's not monkeypox or anything else.
00:15:01.420
Sorry for that diversion.
00:15:03.380
So the Newsweek story about fentanyl surge under Trump,
00:15:06.940
when you dig down into the story,
00:15:08.820
it says that Trump seized more fentanyl coming across the border than Biden did.
00:15:14.280
Just let that sink in.
00:15:19.660
Just let it...
00:15:20.760
Let that sink in.
00:15:23.060
This is Newsweek.
00:15:25.560
It's Newsweek.
00:15:26.980
They are a discredited organization, I give you that.
00:15:30.240
But not everybody knows it.
00:15:32.160
If you're 75 and you grew up reading Newsweek,
00:15:35.420
and you see a Newsweek headline,
00:15:37.400
do you think that Newsweek is a discredited news organization?
00:15:40.240
Maybe you don't know that.
00:15:43.780
I mean, those of us who are a little more sophisticated in our knowledge,
00:15:47.840
we know that.
00:15:48.960
But not everybody knows that.
00:15:51.140
So I think that when they see a fentanyl surge,
00:15:53.660
they're going to think it's a surge in overdose deaths, don't you?
00:15:57.520
But in fact, the overdose death rate is the highest under Biden.
00:16:01.940
So not only did Trump do a stronger job of seizing fentanyl at the border,
00:16:09.140
according to Newsweek,
00:16:10.860
but there were also more overdose deaths under Biden.
00:16:15.060
Both of the facts in the story are pro-Trump.
00:16:17.940
And they made them look like they were negative.
00:16:22.000
And this is right in front of you,
00:16:24.440
because the title simply just doesn't match the content.
00:16:28.100
And they'll do this over and over again.
00:16:30.160
Title doesn't match the content.
00:16:31.620
Title doesn't match the content.
00:16:34.420
All right, here's another fake news out of CNN.
00:16:37.820
So I love learning who the opinion hit piece people are.
00:16:44.140
So there's this guy, Dean Obeidallah,
00:16:47.100
I might be pronouncing it wrong, not intentionally,
00:16:50.340
who does opinion pieces for CNN.
00:16:52.480
And he's often a Trump attack dog.
00:16:56.860
You know, he'll just attack Trump for anything, and Republicans.
00:16:59.480
And CNN and all of the fake news had a little trouble yesterday,
00:17:05.180
because their fake news was that the GOP doesn't like veterans
00:17:10.020
and vetoed or didn't vote for a bill to protect them and their health
00:17:16.080
from the so-called burn pits danger.
00:17:18.640
And of course, because Jon Stewart was saying that the GOP voted against something
00:17:25.640
that they had recently voted for,
00:17:27.900
it looked like it was all political,
00:17:30.000
and they were painting the GOP as a bunch of political bastards
00:17:33.160
who didn't really care about veterans.
00:17:34.840
What do you think the real story was?
00:17:38.920
The real story, of course, was that the Democrats put a whole bunch of pork,
00:17:43.380
like 400 billion or some gigantic number, I don't know what it was,
00:17:47.140
some big number, of pork in the bill,
00:17:50.340
and the Republicans said,
00:17:51.540
I don't care how good the bill is, you can't do that.
00:17:53.980
Yeah, it was a poison pill.
00:17:54.940
So do you think that Dean Obeidallah,
00:17:58.980
who is their, I would say he's among their lowest level attack dogs,
00:18:03.640
and by lowest level, I mean,
00:18:05.180
I feel like he gets the assignment that you can't win unless you just lie.
00:18:11.080
Like you can't make your case
00:18:12.660
unless you're willing to stretch or lie or omit,
00:18:16.420
you know, leave out context.
00:18:17.660
It's the only way you can get it done.
00:18:19.720
So apparently there are some people
00:18:21.360
who do opinion pieces for CNN
00:18:23.620
who are consistently willing to, let's say,
00:18:28.200
test the outer boundaries of fake news.
00:18:34.880
But Dean completely leaves out the context
00:18:37.960
that's the only context that matters,
00:18:40.480
which is the Republicans didn't like all the pork in the bill.
00:18:44.060
Do me a fact check.
00:18:45.520
I'm right about that, right?
00:18:47.300
Am I correct?
00:18:48.320
The only thing the Republicans didn't like
00:18:50.560
was the extra stuff that got added on,
00:18:53.620
as a dirty trick, basically.
00:18:56.860
So if you don't mention that,
00:18:59.880
you have so little respect for your readers.
00:19:02.940
So little respect.
00:19:05.000
It was, and in fact,
00:19:08.000
Dean Obeidallah characterized the GOP's action as,
00:19:12.660
quote,
00:19:12.980
the GOP is trying to score political points
00:19:16.360
by delaying this vital piece of legislation.
00:19:19.720
Is that what they're doing?
00:19:21.260
They're trying to score political points
00:19:23.760
by making themselves look bad
00:19:26.380
by stopping pork.
00:19:29.440
It looks like it's exactly the opposite of that.
00:19:32.560
What this looks like is that
00:19:34.360
politicians put their own benefit
00:19:37.380
lower than the country's.
00:19:41.300
Now, it's hard to imagine anybody would do that,
00:19:45.160
so I'm not sure that we could read their minds
00:19:47.340
and conclude that they were just operating out of principle,
00:19:51.000
but it looks like it.
00:19:53.280
It looks like it.
00:19:54.380
Again, I can't read their minds,
00:19:56.480
but if you're a Republican,
00:19:57.920
would you want to turn down something
00:20:00.060
that was a veteran health bill?
00:20:02.900
A Republican.
00:20:04.440
What Republican thinks it's good
00:20:06.080
to vote against a veteran health bill,
00:20:10.400
even if there's a reason that's technical
00:20:13.540
that has to do with the pork?
00:20:15.280
That never looks good.
00:20:16.780
That never looks good.
00:20:17.460
But a lot of Republicans did it, apparently,
00:20:20.580
or are going to do it or whatever.
00:20:24.020
Did it.
00:20:26.180
So are we mad at Republicans
00:20:30.120
for standing on principle
00:20:31.660
against their own self-interest?
00:20:34.340
Because that kind of looks like what happened.
00:20:37.400
I feel that they were not exactly
00:20:39.460
pursuing their own self-interest.
00:20:42.180
Because here's what would have looked good
00:20:44.480
for the politician.
00:20:45.800
I voted for that bill.
00:20:47.880
Because I like veterans.
00:20:49.860
By the time you got, you know,
00:20:51.540
campaigning for re-election,
00:20:53.220
you could just say that
00:20:54.440
and people would believe it.
00:20:55.780
I'm one of the few Republicans
00:20:57.080
who voted for helping veterans.
00:21:00.920
Nobody's going to look into that pork
00:21:02.520
part of the problem.
00:21:04.780
Yeah, go read the bill, somebody says.
00:21:09.060
So I floated the idea
00:21:10.600
that every bill should have
00:21:11.980
a list of ingredients.
00:21:16.340
So you can see what's in the bill,
00:21:17.800
good and bad.
00:21:19.260
Like the bad might be,
00:21:20.580
might raise taxes on some people.
00:21:23.360
The good might,
00:21:24.120
you'd need the, I don't know,
00:21:25.320
somebody to score that,
00:21:26.500
the OMB or something.
00:21:27.660
But the good would be,
00:21:28.740
here are the things we want to accomplish.
00:21:30.920
And then here's the price.
00:21:32.960
The price is,
00:21:34.000
this might cost more,
00:21:35.480
this might cause inflation,
00:21:37.000
this might raise the debt.
00:21:38.000
Just list the ingredients,
00:21:41.180
the good stuff and the bad stuff.
00:21:43.160
And then somebody else mentioned
00:21:44.960
that there should be a requirement
00:21:46.740
for a one-page executive summary.
00:21:49.920
I like it.
00:21:51.440
A requirement for a one-page executive summary.
00:21:54.520
Now you might need,
00:21:55.940
you know,
00:21:56.780
a paragraph or two
00:21:58.160
for each part of the bill
00:21:59.860
and there might be lots of them.
00:22:01.220
So, you know,
00:22:02.020
maybe really it's more like
00:22:03.360
a paragraph for each component
00:22:05.240
or something like that.
00:22:06.580
But it needs to be readable
00:22:08.140
by the public.
00:22:09.840
And I would say that
00:22:10.760
if the public,
00:22:12.040
if the average voter
00:22:13.280
can't understand the bill,
00:22:15.300
then it should not be voted on.
00:22:17.980
In fact,
00:22:18.700
I'd love to see each bill
00:22:20.740
be scored for,
00:22:22.100
let's say,
00:22:23.320
clarity.
00:22:25.040
Scored for clarity.
00:22:26.440
Give some independent group
00:22:27.520
to say,
00:22:28.520
all right,
00:22:28.960
we talked about this bill
00:22:31.460
with, let's say,
00:22:32.420
a group of volunteers
00:22:34.100
and they couldn't understand it.
00:22:37.160
So we score this,
00:22:38.300
you cannot vote on this.
00:22:40.020
I think the Congress
00:22:41.140
should be prevented
00:22:42.280
from voting on anything
00:22:44.160
the public doesn't understand.
00:22:46.540
You know,
00:22:46.980
by a majority anyway.
00:22:48.720
What do you think?
00:22:49.940
The Congress wouldn't be allowed
00:22:52.100
to vote for anything
00:22:53.180
the public doesn't understand.
00:22:58.040
Yeah,
00:22:58.580
if you're joining late,
00:23:00.300
I'll just tell everybody,
00:23:01.520
this is a burn
00:23:02.540
on my lip.
00:23:04.100
If you think it's more,
00:23:05.760
something more exciting,
00:23:07.020
I kind of wish it were.
00:23:09.220
But it's just a burn.
00:23:13.020
A score for readability
00:23:14.500
would be something
00:23:15.900
that AI could do.
00:23:17.140
There you go.
00:23:18.520
Artificial intelligence
00:23:19.600
could write a summary
00:23:20.500
of every bill.
00:23:22.520
How about that?
00:23:23.900
How about that?
00:23:25.360
Those are pretty good ideas.
00:23:26.360
All right.
00:23:30.140
Here's a provocative idea
00:23:32.560
that I don't think
00:23:33.600
that I'm quite in favor of,
00:23:36.160
but I want to share it with you.
00:23:38.160
And it comes from a doctor.
00:23:40.420
All right.
00:23:40.740
So this is not from me.
00:23:41.820
This is from a doctor.
00:23:44.360
On Twitter,
00:23:45.600
he goes by Doc Anarchy.
00:23:47.580
He's got a sub stack
00:23:48.680
you might be interested in as well.
00:23:50.360
It has lots of interesting
00:23:51.820
and provocative ideas
00:23:54.020
that are not quite
00:23:55.700
the normal point of view
00:23:58.000
that you see.
00:23:58.600
So it's worth seeing it
00:23:59.660
even if you disagree with him.
00:24:01.560
I love seeing an actual medical doctor
00:24:03.800
who's thinking through things critically.
00:24:05.840
So that alone,
00:24:08.040
that alone is worth doing.
00:24:10.860
All right.
00:24:11.280
So if you want somebody to follow
00:24:12.700
that you're not following,
00:24:13.660
follow
00:24:13.960
at
00:24:15.600
Doc Anarchy
00:24:17.160
for interesting points of view.
00:24:20.400
Not ones you're all going to agree with.
00:24:22.620
All right.
00:24:22.780
So it's not about agreeing.
00:24:24.360
They're just interesting points of view.
00:24:29.040
All right.
00:24:30.300
Here it is.
00:24:31.120
His interesting points of view.
00:24:32.800
Nobody overdoses on prescription opioids.
00:24:35.160
I don't know if that's true completely.
00:24:37.880
This sounds like a little bit of an overclaim.
00:24:40.340
But he's a doctor.
00:24:43.100
So let's say it might be hyperbole,
00:24:45.540
but don't get too caught up
00:24:46.960
on whether that's 100% true.
00:24:49.340
Let's say it's 95% true.
00:24:52.420
It's probably closer to 95% true.
00:24:55.340
But we'll allow a little hyperbole.
00:24:57.740
Okay.
00:24:58.380
And then he goes on,
00:24:59.500
eliminate the black market for opioids
00:25:01.440
and you'll eliminate 99% of overdoses.
00:25:03.820
Now, remember,
00:25:05.500
he's a doctor.
00:25:07.280
He's a doctor.
00:25:08.680
So at least anecdotally,
00:25:10.660
and probably he's looked at studies too,
00:25:13.580
that it's hard to overdose
00:25:15.360
on a prescription med
00:25:16.940
because you would know
00:25:18.900
what the overdose amount would be.
00:25:23.220
Now, don't...
00:25:24.320
Before you question him,
00:25:26.640
you should do your research
00:25:27.800
because the addicts themselves
00:25:30.660
will tell you
00:25:31.280
that the real danger
00:25:32.140
is the fake stuff.
00:25:34.080
Now, again,
00:25:35.020
I'm operating purely anecdotally
00:25:36.940
right here, right?
00:25:38.260
But my own stepson
00:25:40.040
told me in direct language
00:25:42.960
that he wouldn't take counterfeit pills
00:25:44.760
because they probably have fentanyl in them
00:25:47.220
and you can't know what you're getting.
00:25:49.040
And then he died of an overdose.
00:25:51.680
So he probably did exactly
00:25:53.100
what he said he wouldn't do
00:25:54.000
the week before,
00:25:54.780
which is why I question
00:25:55.700
whether this would work.
00:25:56.620
But anyway, let's get to that.
00:26:00.140
So if you eliminated
00:26:01.100
the black market for opioids,
00:26:02.620
you wouldn't get the fake stuff.
00:26:05.040
If you could give legal,
00:26:07.720
well-known entities to the addicts,
00:26:11.320
would they take these legal,
00:26:13.240
well-known things
00:26:13.940
and not have overdose deaths?
00:26:17.980
Now, some of you are going to say,
00:26:21.380
yeah, we're talking about
00:26:23.980
unintentional overdoses.
00:26:25.460
We're not talking about suicide.
00:26:26.620
So, can you compare this
00:26:33.040
to the San Francisco outdoor drug thing
00:26:35.880
that looked like a huge failure?
00:26:37.600
And Michael Schellenberger
00:26:38.620
talks about that a lot.
00:26:41.080
Do you think what he's talking about
00:26:42.940
is similar to this outdoor
00:26:44.760
free drug clinics?
00:26:47.600
No.
00:26:48.620
No, he didn't say that.
00:26:50.160
You're conflating two stories.
00:26:52.320
That's the problem here.
00:26:53.380
You're conflating the homeless problem,
00:26:55.500
which he's not even talking about.
00:26:57.080
That's not his topic.
00:26:59.200
Giving free drugs to homeless people
00:27:01.220
doesn't make them be less homeless.
00:27:02.920
So, if you're trying to clean up San Francisco,
00:27:07.160
which is all the homeless people
00:27:08.940
walking around and taking over the city,
00:27:12.160
giving them free drugs
00:27:13.380
is not going to make them
00:27:14.420
leave the streets or go away or anything.
00:27:16.920
So, it doesn't do anything about homeless.
00:27:19.760
But what I don't know
00:27:20.900
is if it reduced the number of
00:27:23.220
overdoses.
00:27:26.540
I haven't heard one way or the other,
00:27:28.100
so I don't know.
00:27:29.120
But what Doc Anarchy is talking about
00:27:32.020
is really just a narrow point
00:27:34.740
that if the only kind of opioid
00:27:37.140
that was available,
00:27:38.120
and let's say there was some way
00:27:39.080
to make that happen,
00:27:40.100
if the only kind available
00:27:41.300
was high-quality control,
00:27:45.300
that that alone would
00:27:46.520
eliminate 99% of overdose deaths.
00:27:50.240
What do you think?
00:27:50.720
Now, let me be
00:27:56.000
coldly analytical here.
00:27:59.740
What would happen
00:28:00.400
if you stopped 100,000 addicts
00:28:03.400
per year from dying?
00:28:06.920
Now, this is going to come back to me.
00:28:09.640
So, do not think I'm exempting myself
00:28:13.760
from the following point.
00:28:15.120
I'm not.
00:28:15.940
I'm in this point.
00:28:17.600
If you let 100,000 addicts live
00:28:20.220
every year,
00:28:21.220
are you better off?
00:28:23.400
Now, I obviously would like
00:28:25.040
my stepson to still be alive.
00:28:27.300
But are you better off?
00:28:29.440
I hate to say it,
00:28:30.340
but you're not better off.
00:28:31.860
I might have been better off
00:28:33.160
because I love him, right?
00:28:34.660
So I loved him.
00:28:35.800
So I might be better off
00:28:37.160
because I would prefer it.
00:28:38.540
But would you be better off
00:28:40.320
with my stepson back in society?
00:28:42.940
Not even close.
00:28:44.400
Sorry.
00:28:45.640
Sorry.
00:28:46.680
If you want to be honest,
00:28:47.960
you would not be better off
00:28:49.280
with him in society.
00:28:50.780
He was not adding.
00:28:52.080
He was just purely subtracting.
00:28:54.300
Sorry.
00:28:55.060
People loved him.
00:28:56.000
I mean, he was very popular
00:28:57.480
and lovable and stuff.
00:28:58.860
But he wasn't going to help.
00:29:01.500
So you have to ask yourself
00:29:03.100
if even if you could get
00:29:04.540
what you wanted,
00:29:05.980
do you really want it?
00:29:08.000
Because that's a fair question.
00:29:11.340
It's a fair question.
00:29:12.760
Now, I do think that
00:29:13.760
some kind of ethical,
00:29:15.740
moral consideration
00:29:16.620
suggests that keeping them alive
00:29:19.020
would be the better alternative.
00:29:20.980
And that would be
00:29:21.480
a reasonable point of view.
00:29:23.300
But you can't ignore the fact
00:29:25.360
that keeping alive
00:29:26.580
100,000 liars
00:29:29.360
has some impact.
00:29:31.580
Because remember,
00:29:32.300
all addicts are liars.
00:29:33.640
You know that, right?
00:29:34.360
If you see somebody
00:29:36.760
who's lying about everything,
00:29:38.420
the first question
00:29:39.560
you should ask yourself is,
00:29:41.140
what drugs are they on?
00:29:43.000
Because often people
00:29:43.920
who are lying about everything
00:29:45.300
are covering a drug pattern.
00:29:49.900
So always look for that.
00:29:51.400
Sometimes lying about everything
00:29:52.640
means you're a narcissist.
00:29:55.800
And sometimes it means
00:29:57.060
you have a drug problem.
00:29:58.020
And sometimes it means
00:29:59.260
you're both.
00:30:01.560
I made the point
00:30:02.880
that insurance predicts all.
00:30:05.820
This is a point
00:30:06.520
I've made a number of times.
00:30:08.260
That every time
00:30:08.780
we're arguing
00:30:09.320
in the political realm
00:30:10.900
about what is
00:30:11.920
or what isn't,
00:30:12.600
what's true,
00:30:13.180
what isn't,
00:30:14.060
that you can ignore
00:30:15.120
all of that
00:30:15.840
because that's all political.
00:30:17.700
And just look
00:30:18.340
what the insurance companies do.
00:30:20.480
Because the insurance companies
00:30:21.800
don't have the option
00:30:22.720
of being political.
00:30:24.220
They just have to look
00:30:25.140
at the numbers
00:30:25.620
and say,
00:30:26.020
well,
00:30:26.640
we've got to set our premiums
00:30:28.180
based on this level of death.
00:30:29.800
And we have to adjust it
00:30:33.580
based on what changes, right?
00:30:35.900
So the insurance company
00:30:37.920
is just coldly looking at data.
00:30:40.200
And the other thing
00:30:40.860
that the insurance companies have
00:30:42.400
that you don't have
00:30:44.140
is their internal data.
00:30:46.740
So an insurance company
00:30:48.300
in 2022
00:30:49.040
can tell you the following.
00:30:51.180
It can tell you
00:30:52.060
which zip codes
00:30:53.060
had the most vaccinations,
00:30:55.140
the highest vaccination rates.
00:30:56.500
And it can tell you
00:30:58.160
where there was
00:30:59.340
the biggest change
00:31:00.360
in death rates.
00:31:02.380
Now,
00:31:02.820
what the insurance industry
00:31:04.000
is telling you
00:31:04.740
is that where there were
00:31:06.220
high levels of vaccinations,
00:31:08.140
the death rate
00:31:09.240
did not go up
00:31:10.100
nearly as much
00:31:10.860
as where there was
00:31:11.620
a low level of vaccination.
00:31:13.920
Maybe there's
00:31:14.540
some other factors involved.
00:31:16.540
But their data
00:31:17.860
is very clear
00:31:18.760
that vaccinations
00:31:19.880
saved lives.
00:31:21.520
And so they're going to
00:31:22.120
base their premiums
00:31:25.280
on that.
00:31:26.380
So if you're in a zip code
00:31:27.580
where people are dying
00:31:28.640
because they're unvaccinated,
00:31:30.020
maybe you pay more
00:31:30.860
for insurance.
00:31:32.620
Now,
00:31:33.220
where do you find the data?
00:31:35.360
They have their own data.
00:31:36.860
That's the key.
00:31:37.940
If you say to yourself,
00:31:39.220
why do the insurance companies
00:31:40.980
believe that?
00:31:42.160
Show me the data.
00:31:44.020
They're not going to do that
00:31:45.060
because it's internal data.
00:31:46.600
It's their own customers
00:31:47.580
who are dying, right?
00:31:49.060
They're literally looking
00:31:50.080
at their own customer base
00:31:51.440
and they're saying
00:31:52.320
the ones in this zip code
00:31:53.700
are suddenly dying.
00:31:55.280
The ones in this zip code
00:31:56.560
are also suddenly dying
00:31:57.720
but at a smaller rate.
00:32:01.200
Now,
00:32:01.960
a number of people
00:32:03.800
said to me,
00:32:04.540
Scott, Scott, Scott.
00:32:06.240
What you don't realize
00:32:07.420
is that insurance companies
00:32:08.660
are lying
00:32:09.440
just like all companies.
00:32:11.180
And what they're doing
00:32:12.000
is they're trying
00:32:12.760
to be woke
00:32:13.560
by telling you
00:32:15.100
that vaccinations work.
00:32:16.580
It's really just wokeness.
00:32:18.360
So they're not using data
00:32:19.980
to make decisions.
00:32:20.980
They're not trying
00:32:21.560
to make money.
00:32:22.220
They're trying
00:32:23.340
to be woke
00:32:23.980
because that's more important
00:32:25.120
than profits.
00:32:26.880
Do you know
00:32:27.540
who says that sort of thing?
00:32:28.840
Do you know
00:32:29.380
who says that
00:32:30.160
an insurance company
00:32:31.300
favors wokeness
00:32:33.040
over profits?
00:32:34.600
What kind of person
00:32:35.320
says that?
00:32:37.800
Characterize somebody
00:32:38.660
who would say
00:32:39.320
that they might
00:32:40.800
prefer wokeness
00:32:42.220
over profits.
00:32:43.220
You're saying idiots
00:32:46.320
but that's unkind.
00:32:48.160
I'm going to give
00:32:49.120
the kindest interpretation.
00:32:50.700
It's people
00:32:51.040
with no experience
00:32:51.980
in business whatsoever.
00:32:54.660
You don't have to say
00:32:55.760
NPC or stupid.
00:32:57.580
Just say this.
00:32:58.360
People with no experience
00:32:59.720
in business whatsoever.
00:33:01.340
I don't believe
00:33:01.980
there are any
00:33:02.400
experienced business people
00:33:03.820
who would say
00:33:05.340
that an insurance company
00:33:06.560
is going to put itself
00:33:07.680
out of business
00:33:08.320
to serve wokeness.
00:33:11.880
I don't think so.
00:33:13.220
Now people said to me
00:33:14.560
but Scott
00:33:15.220
but Scott
00:33:16.120
Disney did exactly that.
00:33:18.680
They made a bad decision
00:33:19.740
for wokeness
00:33:20.600
and it hurt their profitability.
00:33:22.760
So what about that?
00:33:24.060
To which I say
00:33:24.960
that's not what I'm talking about.
00:33:26.520
That's a completely
00:33:27.120
different situation.
00:33:28.920
Yes, companies will
00:33:30.380
make public PR statements
00:33:32.380
that are for wokeness.
00:33:36.900
But that doesn't mean
00:33:38.140
they're intentionally
00:33:38.800
destroying the engine
00:33:40.300
of their economic survival.
00:33:43.220
If an insurance company
00:33:44.480
starts pricing things
00:33:45.760
incorrectly
00:33:46.320
they go out of business.
00:33:48.960
They have to price correctly
00:33:50.620
based on their data
00:33:51.540
or they can't stay in business.
00:33:53.540
That is completely different.
00:33:55.360
Staying in business
00:33:56.460
versus saying some things
00:33:58.440
in public
00:33:58.860
you think will sound good
00:34:00.100
but you hope
00:34:00.600
it won't hurt you too much.
00:34:02.500
Disney made a gigantic mistake
00:34:04.340
but they didn't do it
00:34:06.580
intentionally.
00:34:07.320
they did not intentionally
00:34:09.860
destroy their business
00:34:11.160
or degrade it
00:34:12.440
by doing something
00:34:14.260
like that.
00:34:15.060
I don't think
00:34:15.860
they quite saw
00:34:16.460
the DeSantis pushback coming.
00:34:19.220
So to imagine
00:34:20.760
that it's similar
00:34:21.620
that somebody
00:34:22.240
accidentally hurts
00:34:23.640
their business
00:34:24.160
by saying things
00:34:24.880
they think wouldn't hurt
00:34:26.120
but it did
00:34:26.720
that's completely different
00:34:28.460
from an insurance company
00:34:29.560
saying look
00:34:30.060
we'd better make ourselves
00:34:31.900
super unprofitable
00:34:33.380
or else we won't
00:34:35.660
look woke enough.
00:34:37.240
That's not happening.
00:34:39.180
That's not happening
00:34:40.100
anywhere
00:34:40.640
ever.
00:34:42.120
Ever.
00:34:43.200
No insurance company
00:34:44.320
behind the scenes
00:34:46.040
is going to tell you
00:34:47.260
well behind the scenes
00:34:48.620
they're going to make money.
00:34:50.500
They wouldn't be
00:34:51.300
talking out loud
00:34:52.060
to help themselves
00:34:52.900
in this case.
00:34:55.100
Do you think
00:34:55.700
yeah I feel
00:34:57.720
it would just hurt them
00:34:58.580
if you knew
00:34:59.720
what they were doing.
00:35:00.280
They're just trying
00:35:00.720
to make money.
00:35:01.980
Period.
00:35:02.960
Alright.
00:35:03.380
There is something called
00:35:06.200
and I don't know
00:35:06.900
too much about it
00:35:07.760
some kind of
00:35:08.720
the QPARQ
00:35:10.980
Q-P-A-R-Q
00:35:12.980
framework
00:35:14.300
for
00:35:15.060
having rational discussions
00:35:17.440
without all the
00:35:18.460
crazy stuff that happens.
00:35:20.640
I saw Bret Weinstein
00:35:21.980
talking about it
00:35:23.120
and I think he invited
00:35:24.160
Sam Harris
00:35:24.780
to be part of it
00:35:25.520
and the idea is
00:35:26.560
there's this
00:35:27.040
useful framework
00:35:28.800
for debating
00:35:30.700
and if you use
00:35:31.660
the framework
00:35:32.160
and I think there's
00:35:32.960
some software
00:35:33.800
that helps you
00:35:34.320
work through it
00:35:34.800
or something
00:35:35.140
I don't know
00:35:35.780
all the details
00:35:36.300
but the idea is
00:35:37.620
that you could use
00:35:38.400
some third party
00:35:39.800
format
00:35:41.200
to keep two people
00:35:43.720
on a sane
00:35:44.580
and reasonable
00:35:45.240
debate.
00:35:47.380
What do you think
00:35:48.140
of that?
00:35:48.620
Suppose the technology
00:35:49.600
works
00:35:50.060
so let's take
00:35:51.420
an assumption
00:35:51.880
I don't know
00:35:52.480
that that's true
00:35:52.960
but let's say
00:35:53.440
the technology works
00:35:54.640
will this change
00:35:56.140
everything?
00:35:56.480
because now
00:35:58.720
you'd have a way
00:35:59.280
for people
00:36:00.020
to debate
00:36:00.680
getting rid
00:36:01.540
of all the crazy
00:36:02.140
parts
00:36:02.500
so you could
00:36:02.920
actually see
00:36:03.360
the debate
00:36:03.760
no it won't
00:36:05.960
do you know
00:36:06.420
what's wrong
00:36:06.820
with this problem?
00:36:09.280
It solves
00:36:10.180
a non-existent
00:36:10.860
problem
00:36:11.460
because the reason
00:36:13.640
that people
00:36:14.080
say crazy things
00:36:17.300
during debates
00:36:17.920
is that they don't
00:36:18.580
want to say
00:36:19.060
the real thing
00:36:19.600
the real thing
00:36:20.900
doesn't help them
00:36:21.620
the reason
00:36:22.940
people lie
00:36:23.540
is that the truth
00:36:24.220
doesn't give them
00:36:24.880
power or entertainment
00:36:26.040
people want to
00:36:28.660
watch the news
00:36:29.360
or interpret the news
00:36:31.200
or get their
00:36:31.800
preferred interpretation
00:36:32.840
to the news
00:36:33.440
they want to win
00:36:34.380
debates because
00:36:35.080
they want power
00:36:35.800
and we watch them
00:36:37.800
because we want
00:36:38.400
entertainment
00:36:38.940
as soon as you
00:36:40.760
imagine that people's
00:36:41.860
main motivation
00:36:42.700
is to get something
00:36:43.580
right
00:36:43.980
you're completely
00:36:45.860
lost
00:36:46.380
nobody's main
00:36:48.640
motivation is to
00:36:49.480
get the right answer
00:36:50.340
the main motivation
00:36:52.560
is to be right
00:36:53.480
and to get power
00:36:54.260
now people think
00:36:55.580
that if they had
00:36:56.160
power they would
00:36:56.800
do the right things
00:36:57.680
in some way
00:36:58.220
that's the right
00:36:58.720
thing too
00:36:59.080
but nobody cares
00:37:00.600
about the logic
00:37:01.680
of a debate
00:37:02.380
so you're solving
00:37:04.180
a problem
00:37:04.680
that isn't a problem
00:37:05.900
because debates
00:37:07.300
are intentionally
00:37:08.260
bullshit
00:37:08.760
if anybody
00:37:10.660
wanted to
00:37:11.260
debate
00:37:12.880
in a reasonable
00:37:13.780
way
00:37:14.160
they already
00:37:14.860
know how to
00:37:15.340
do it
00:37:15.760
do you think
00:37:18.180
that Bret
00:37:18.780
Weinstein
00:37:19.400
and Sam
00:37:20.060
Harris
00:37:20.360
don't have
00:37:21.320
the intellectual
00:37:22.000
tools
00:37:22.760
to have
00:37:24.140
a rational
00:37:24.920
discussion
00:37:26.620
of course
00:37:27.740
they do
00:37:28.200
of course
00:37:28.980
they do
00:37:29.380
but would
00:37:30.620
they do it
00:37:31.820
I don't know
00:37:34.540
those might be
00:37:36.260
two of the most
00:37:36.940
famously rational
00:37:38.120
people
00:37:38.600
in the public
00:37:40.380
discussions
00:37:40.940
and I don't
00:37:42.480
even think
00:37:42.820
they could do it
00:37:43.480
and I have
00:37:44.740
very high opinion
00:37:45.460
of both of them
00:37:46.080
in terms of
00:37:46.660
their intellectual
00:37:47.580
academic credentials
00:37:49.340
I just don't
00:37:50.640
know it's
00:37:50.960
something people
00:37:51.600
can do
00:37:52.120
because
00:37:53.840
I had a
00:37:55.280
conversation
00:37:55.740
with Sam
00:37:56.220
Harris
00:37:56.500
and it was
00:37:57.520
not my
00:37:57.980
opinion
00:37:58.280
that logic
00:37:58.900
was guiding
00:37:59.520
his
00:37:59.980
opinion
00:38:02.040
some of
00:38:03.520
some of you
00:38:03.640
saw it
00:38:03.960
too
00:38:04.100
did you
00:38:05.440
have a
00:38:05.700
sense
00:38:05.940
that
00:38:06.260
he
00:38:07.460
preferred
00:38:08.220
logic
00:38:09.380
I didn't
00:38:11.200
get the
00:38:11.940
sense
00:38:12.180
that it
00:38:12.480
was even
00:38:12.800
a preference
00:38:13.460
it felt
00:38:14.880
like a
00:38:15.700
strong
00:38:16.040
preference
00:38:16.420
to avoid
00:38:17.140
being
00:38:18.380
rational
00:38:18.780
about
00:38:19.080
Trump
00:38:19.380
that's
00:38:21.620
how it
00:38:21.840
felt
00:38:22.020
now again
00:38:22.600
I can't
00:38:23.160
read his
00:38:23.460
mind
00:38:23.880
I don't
00:38:24.280
know what
00:38:24.540
Sam Harris
00:38:25.060
is thinking
00:38:25.800
privately
00:38:26.740
about
00:38:27.140
anything
00:38:27.480
so it's
00:38:28.440
just
00:38:28.620
it's
00:38:28.920
just
00:38:29.100
a reaction
00:38:29.780
you know
00:38:30.900
basically a reaction
00:38:32.080
on my end
00:38:32.600
it's not
00:38:32.920
about him
00:38:33.300
all right
00:38:35.280
so
00:38:36.620
I love
00:38:38.180
the fact
00:38:38.540
that somebody
00:38:39.060
is working
00:38:39.480
on that
00:38:39.920
trying to
00:38:40.600
get our
00:38:40.960
our
00:38:41.980
conversations
00:38:42.700
realistic
00:38:44.040
but I think
00:38:45.060
AI is going
00:38:45.760
to solve
00:38:46.020
this problem
00:38:46.460
for us
00:38:46.980
and here's
00:38:47.880
how
00:38:48.080
I think
00:38:49.080
AI is going
00:38:49.960
to take
00:38:50.220
any story
00:38:50.860
and add
00:38:51.660
the context
00:38:52.340
that is
00:38:53.200
missing
00:38:53.500
and it's
00:38:55.160
going to
00:38:55.380
rewrite it
00:38:55.880
in simpler
00:38:56.320
terms
00:38:56.740
and make
00:38:58.280
the misleading
00:38:58.940
headlines
00:38:59.440
go away
00:38:59.940
it should
00:39:01.440
not be
00:39:01.860
hard
00:39:02.160
for AI
00:39:02.820
to dismantle
00:39:04.580
fake news
00:39:05.320
and tell you
00:39:06.740
what was fake
00:39:07.360
about it
00:39:07.740
just like I
00:39:08.280
did basically
00:39:08.780
so what I
00:39:09.760
did was I
00:39:10.360
said hey
00:39:10.980
this title
00:39:11.500
doesn't match
00:39:12.220
or is misleading
00:39:13.040
compared to
00:39:14.540
the content
00:39:15.020
I think AI
00:39:15.860
could do that
00:39:16.420
most of the
00:39:16.800
time
00:39:17.080
and I think
00:39:18.240
they could
00:39:18.520
summarize better
00:39:19.320
and I think
00:39:20.900
they could
00:39:21.460
basically solve
00:39:23.040
fake news
00:39:23.740
I mean
00:39:24.580
they could
00:39:24.820
come close
00:39:25.520
is Google
00:39:30.040
the top AI
00:39:30.780
I actually
00:39:31.340
don't know
00:39:31.700
that
00:39:31.980
and I
00:39:33.940
believe I
00:39:34.420
credited the
00:39:35.820
wrong person
00:39:36.480
with saying
00:39:36.900
that AI
00:39:37.360
was conscious
00:39:38.480
so I
00:39:40.160
saw a
00:39:40.640
follow up
00:39:41.100
I saw a
00:39:44.840
follow up
00:39:45.220
saying that
00:39:45.620
I may have
00:39:46.100
credited the
00:39:46.980
top Google
00:39:47.620
researcher
00:39:48.160
for saying
00:39:49.080
that AI
00:39:50.360
was conscious
00:39:51.140
but
00:39:51.840
at least
00:39:53.200
somebody who
00:39:53.680
knows what
00:39:54.000
they're talking
00:39:54.340
about thinks
00:39:54.900
I was
00:39:55.340
crediting the
00:39:56.040
wrong person
00:39:56.620
because I
00:39:57.160
think the
00:39:57.520
actual Google
00:39:58.200
head does
00:39:59.280
not say
00:39:59.680
that
00:39:59.960
so somebody
00:40:01.860
who's
00:40:02.320
high up in
00:40:04.540
that world
00:40:04.920
did say
00:40:05.340
it
00:40:05.840
alright
00:40:12.760
he was
00:40:13.860
considered
00:40:14.340
one of the
00:40:14.840
four horsemen
00:40:15.340
of atheism
00:40:15.940
yeah
00:40:16.200
code of
00:40:20.860
the
00:40:20.980
life
00:40:21.180
maker
00:40:21.600
alright
00:40:22.100
just
00:40:22.300
looking
00:40:22.520
at
00:40:22.660
your
00:40:22.920
your
00:40:24.640
comments
00:40:25.020
well
00:40:25.560
that's
00:40:25.800
all I
00:40:26.000
had to
00:40:26.240
do
00:40:26.600
to talk
00:40:27.680
about
00:40:27.880
today
00:40:28.220
I
00:40:28.680
noticed
00:40:28.960
Greg
00:40:29.420
Gutfeld
00:40:29.840
was
00:40:30.220
trending
00:40:32.380
today
00:40:32.820
his
00:40:34.260
show
00:40:34.860
has
00:40:35.000
continued
00:40:35.380
to be
00:40:35.900
gigantically
00:40:36.840
popular
00:40:37.840
it was
00:40:42.180
weird
00:40:42.420
that weirdo
00:40:43.040
hired a
00:40:43.480
lawyer
00:40:43.700
for the
00:40:44.000
AI
00:40:44.260
I don't
00:40:45.820
know that
00:40:46.120
story
00:40:46.460
oh let's
00:40:50.080
talk about
00:40:50.700
Pelosi going
00:40:52.200
to Taiwan
00:40:52.800
I don't know
00:40:53.940
is that really
00:40:54.380
a story
00:40:54.800
I think it
00:40:56.900
would be a
00:40:57.300
story if
00:40:57.760
Pelosi didn't
00:40:58.500
go to
00:40:58.840
Taiwan
00:40:59.840
if you know
00:41:00.760
she had
00:41:01.000
wanted to
00:41:01.400
and changed
00:41:01.900
her mind
00:41:02.300
or something
00:41:02.660
but
00:41:03.660
I feel
00:41:06.620
as though
00:41:07.200
it's sort
00:41:08.900
of a
00:41:09.100
nothing
00:41:09.400
it's
00:41:12.660
absolutely
00:41:13.060
a story
00:41:13.720
because we'll
00:41:14.240
talk about
00:41:14.820
it
00:41:15.060
but what
00:41:15.820
do you
00:41:15.960
think
00:41:16.100
China is
00:41:16.500
going to
00:41:16.740
do
00:41:17.020
do you
00:41:17.780
think
00:41:17.940
China is
00:41:18.400
going to
00:41:18.740
punish
00:41:19.120
the
00:41:19.300
United
00:41:19.540
States
00:41:19.860
in some
00:41:20.240
way
00:41:20.420
for an
00:41:21.300
American
00:41:21.560
citizen
00:41:22.100
who is
00:41:23.060
a free
00:41:23.360
citizen
00:41:23.760
going to
00:41:24.260
another
00:41:24.520
free
00:41:24.780
country
00:41:25.080
to
00:41:25.260
visit
00:41:25.460
you
00:41:26.240
think
00:41:26.440
they're
00:41:26.620
going to
00:41:26.780
punish
00:41:27.020
us
00:41:27.220
for that
00:41:27.560
I
00:41:28.720
hope
00:41:28.980
so
00:41:29.300
boy do
00:41:34.320
I hope
00:41:34.640
China tries
00:41:35.260
to punish
00:41:35.700
us for
00:41:36.320
Pelosi
00:41:36.740
visiting
00:41:37.200
Taiwan
00:41:37.600
yeah
00:41:38.780
I hope
00:41:39.260
so
00:41:39.640
because
00:41:42.220
we definitely
00:41:44.060
need to be
00:41:44.760
murdering
00:41:45.380
their
00:41:45.700
fentanyl
00:41:47.220
dealers
00:41:47.660
in China
00:41:49.480
so if
00:41:50.520
there's
00:41:51.080
something
00:41:51.300
they're
00:41:51.540
doing
00:41:51.680
to us
00:41:52.080
let's
00:41:52.940
ratchet
00:41:54.200
it up
00:41:54.500
on our
00:41:54.780
side
00:41:55.020
too
00:41:55.240
let's
00:41:56.660
go kill
00:41:57.060
their
00:41:57.280
fentanyl
00:41:57.780
dealers
00:41:58.100
where they
00:41:58.460
stand
00:41:58.820
did you
00:42:01.100
see that
00:42:01.700
there's
00:42:02.140
a
00:42:02.600
I tweeted
00:42:04.380
this but
00:42:04.940
I forgot
00:42:05.380
to put
00:42:05.820
it in
00:42:05.960
my notes
00:42:06.320
there's
00:42:07.680
a
00:42:07.960
candidate
00:42:09.280
for
00:42:09.660
Arizona
00:42:10.200
attorney
00:42:10.660
general
00:42:11.080
let me
00:42:12.980
find his
00:42:13.380
name so
00:42:13.900
I can
00:42:14.140
give him
00:42:14.380
a little
00:42:14.720
shout out
00:42:15.840
here
00:42:16.100
probably
00:42:18.840
didn't
00:42:19.120
write it
00:42:19.440
down
00:42:20.080
damn
00:42:21.000
me
00:42:21.180
for not
00:42:21.520
writing
00:42:21.800
that
00:42:21.980
down
00:42:22.240
somebody
00:42:23.560
will
00:42:23.740
give me
00:42:24.700
the name
00:42:25.040
here in
00:42:25.320
the comments
00:42:25.760
but
00:42:26.800
there's
00:42:27.240
a
00:42:27.780
candidate
00:42:28.220
for
00:42:28.680
attorney
00:42:29.340
general
00:42:29.940
in
00:42:30.840
Arizona
00:42:33.740
who
00:42:35.540
says we
00:42:36.480
should
00:42:36.780
at least
00:42:37.600
that Arizona
00:42:38.220
should treat
00:42:38.860
the or
00:42:39.540
label the
00:42:40.240
cartels as
00:42:40.880
terrorist
00:42:41.180
organizations
00:42:41.800
so if the
00:42:44.160
federal government
00:42:44.820
will not
00:42:45.620
label
00:42:46.320
the cartels as
00:42:49.240
terrorist
00:42:49.520
organizations
00:42:50.160
Arizona
00:42:50.780
might do
00:42:52.200
it if
00:42:53.020
they elect
00:42:53.520
this
00:42:53.820
attorney
00:42:55.060
general
00:42:55.460
and if
00:42:56.020
he gets
00:42:56.340
his way
00:42:56.720
and
00:42:57.780
apparently
00:42:58.140
there's
00:42:58.780
some
00:42:59.360
constitutional
00:43:00.980
allowance
00:43:01.860
I've never
00:43:02.440
heard this
00:43:02.780
before
00:43:03.100
but have
00:43:03.820
you heard
00:43:04.140
that states
00:43:04.800
can create
00:43:05.420
their own
00:43:05.980
national
00:43:06.480
like a
00:43:07.520
state guard
00:43:08.000
not a
00:43:08.340
national
00:43:08.580
guard
00:43:08.840
have you
00:43:09.800
heard that
00:43:10.080
states
00:43:10.400
can form
00:43:10.880
their own
00:43:11.600
armies
00:43:12.660
basically
00:43:13.120
apparently
00:43:14.880
that's a
00:43:15.300
thing
00:43:15.600
yeah
00:43:16.600
so
00:43:17.520
the
00:43:18.280
Arizona
00:43:18.720
potential
00:43:20.540
AG
00:43:21.360
guy running
00:43:22.120
as a
00:43:22.460
Republican
00:43:22.800
will
00:43:24.580
somebody
00:43:24.820
give me
00:43:25.080
his name
00:43:25.480
he deserves
00:43:27.440
a shout
00:43:27.840
out
00:43:28.100
just google
00:43:30.520
that for
00:43:30.940
me
00:43:31.100
give me
00:43:31.760
a shout
00:43:32.080
out in
00:43:32.400
the comments
00:43:33.020
because I
00:43:33.580
don't want
00:43:33.820
to not
00:43:34.060
mention
00:43:34.300
his name
00:43:34.680
he's too
00:43:36.260
strong
00:43:36.720
Evergrand
00:43:37.720
no
00:43:38.140
Hamada
00:43:38.580
Hamadeh
00:43:39.300
so his
00:43:40.340
last name
00:43:40.880
is H
00:43:41.420
A M
00:43:42.100
A
00:43:42.560
D
00:43:43.000
E
00:43:43.540
H
00:43:44.540
Hamadeh
00:43:45.560
so he's
00:43:46.820
Republican
00:43:47.320
and
00:43:48.820
Abe
00:43:49.960
Hamadeh
00:43:50.420
okay
00:43:50.820
he's
00:43:51.780
Republican
00:43:52.260
and he's
00:43:53.320
tough on
00:43:54.000
the cartels
00:43:54.760
and
00:43:55.180
I like
00:43:57.340
him
00:43:57.520
I like
00:43:58.980
him
00:43:59.180
I like
00:44:01.400
him for
00:44:01.620
that
00:44:01.820
I don't
00:44:02.120
know what
00:44:02.340
else he
00:44:02.600
does
00:44:02.780
oh
00:44:04.200
Gutfeld's
00:44:04.740
still
00:44:04.940
trending
00:44:05.320
let's
00:44:07.620
see why
00:44:08.100
there's a
00:44:10.920
July 25th
00:44:12.040
Glenn Greenwald
00:44:12.960
tweet about
00:44:14.320
the Gottfeld
00:44:14.800
show how
00:44:15.300
it's basically
00:44:16.060
mopping up
00:44:16.620
the competition
00:44:17.260
now
00:44:18.420
so it's
00:44:24.140
fun to
00:44:24.440
it's fun to
00:44:24.960
watch him
00:44:25.340
succeed
00:44:25.780
I hope
00:44:26.380
you're
00:44:27.040
enjoying it
00:44:27.500
as much
00:44:27.800
as I
00:44:28.060
do
00:44:28.560
all right
00:44:32.700
here's
00:44:34.640
a
00:44:34.940
huh
00:44:37.640
all right
00:44:39.320
here's a
00:44:39.780
little
00:44:41.000
information on
00:44:42.060
fentanyl
00:44:42.680
and the
00:44:43.360
Dr.
00:44:43.880
Anarchy
00:44:44.240
comment
00:44:45.740
I don't
00:44:46.820
want to
00:44:47.060
attribute this
00:44:47.720
yet but let's
00:44:48.220
just say it's
00:44:48.660
from another
00:44:49.100
doctor
00:44:49.440
and
00:44:50.780
Doc Anarchy
00:44:51.880
I better read
00:44:53.080
it to myself
00:44:53.560
before I read
00:44:54.120
it out loud
00:44:54.520
oh okay
00:44:58.860
so it's
00:45:00.180
hard to
00:45:00.740
overdose on
00:45:01.820
oral
00:45:03.340
opiates
00:45:04.000
so I think
00:45:05.340
that's what
00:45:05.720
Dr.
00:45:06.020
Anarchy was
00:45:06.740
saying it's
00:45:07.140
hard to
00:45:07.560
it's sort
00:45:08.980
of rare
00:45:09.620
to overdose
00:45:11.000
on prescription
00:45:12.100
pills
00:45:13.400
but if you
00:45:15.340
mix it
00:45:15.840
if you add a
00:45:17.560
second medication
00:45:18.300
called
00:45:19.600
benzodiazepine
00:45:21.440
it becomes
00:45:22.500
very easy
00:45:23.120
oh and
00:45:24.980
90% of
00:45:25.880
prescription
00:45:26.380
opiate
00:45:26.840
overdoses
00:45:27.420
include that
00:45:28.400
drug
00:45:28.840
benzodiazepine
00:45:30.600
oh wow
00:45:33.300
so that's a
00:45:35.660
special case
00:45:36.280
but I don't
00:45:36.840
think that
00:45:37.320
I don't think
00:45:39.140
that takes
00:45:39.760
away from
00:45:40.380
Doc Anarchy's
00:45:41.180
point because
00:45:42.940
you know the
00:45:43.960
people who are
00:45:44.420
not taking
00:45:45.580
this other
00:45:45.980
drug at the
00:45:46.420
same time
00:45:46.820
would get a
00:45:47.540
lot of
00:45:47.700
benefit
00:45:47.960
all right
00:45:49.640
so this is
00:45:50.580
good context
00:45:51.940
all right
00:45:53.480
all right
00:45:57.760
and that
00:45:58.660
that
00:46:03.400
is what I
00:46:05.620
had to talk
00:46:06.000
about today
00:46:06.400
yeah Carrie
00:46:08.540
Lake is tough
00:46:09.320
on the cartels
00:46:10.020
as well
00:46:10.400
also running
00:46:11.240
what is she
00:46:11.700
running for
00:46:12.060
governor of
00:46:12.660
Arizona
00:46:13.000
yes it is a
00:46:15.400
great show
00:46:15.840
it is a great
00:46:16.540
show
00:46:16.800
you know the
00:46:18.380
the larger
00:46:19.480
point is that
00:46:20.200
I'm right about
00:46:21.020
everything today
00:46:23.120
All right.
00:46:26.440
How did I determine that the Democrats put a poison pill in the PACT bill? Look, well, Google it, and you should find that the bill includes things that are off point and that the Republicans cared about.
00:46:43.980
That, I feel like there was a story I skipped here that I really wanted to do. Let's check. Yeah.
00:46:55.420
Did you affect the DeSantis ESG move? I did not. I mean, I don't think I did. Okay.
00:47:07.740
What's my prediction for China's response? I don't know. I think it's going to be symbolic.
00:47:18.840
I don't think China's response will affect you as a citizen of the United States.
00:47:24.460
I feel like they might do something like move some warships around or, you know, talk differently.
00:47:31.800
I don't know, I just don't think it's going to be anything. They have to push back.
00:47:35.480
The famine coming? I'm going to vote against the famine, based on the Adams law of slow-moving disasters.
00:47:46.760
Now, I know this one's special, because you can't grow food very quickly. So if the food you grew doesn't produce, that's not really a quick-turnaround fix. I get that, but I'm going to vote against mass famine.
00:48:01.000
Now, I don't know that I'm right, so I'm not going to put 100% on this one. I usually don't. But I'd say I'm going to put a 90% odds of no famine.
00:48:12.380
And the reason is that humans are just really, really good at responding to a known problem with enough time to respond. And one way or another, I feel like we might be able to get by.
00:48:27.220
Now, I don't think the famine is for the first-world countries. It's going to be for who can't afford to pay massive prices to get the limited food. But I feel like we'll figure out how to keep them alive long enough to get the next crop going. That's what I feel, but we'll see.
00:48:42.940
Yeah, Africa could be tough. In Africa, I don't want to minimize the risk, but I'm confident we'll figure it out.
00:48:51.200
Micro lessons on how to make a killer tweet? I can do that. I feel like maybe I have.
00:48:57.580
All right. And is there any topic I didn't cover that you just desperately need me to?
00:49:11.760
Where am I? I'm at a secret writer's retreat location.
00:49:23.480
Are you able to smoke while on vacation? No comment.
00:49:30.620
MTG was acquitted of what? I didn't know there was anything going on.
00:49:37.420
Market index, was the market up or down? Make the sun bigger.
00:49:45.360
Well, I could probably tell you now what makes a killer tweet.
00:49:53.980
You want your killer tweet to be short, and something that people would want to say to their friends, so they could remember it easily. So it has to be short and clever.
00:50:05.400
Has to be on a, something that's in the headlines, or people are really thinking about right now.
00:50:11.680
I burned my lip. It's not, it's not a monkeypox or a weird disease. It's just a burn.
00:50:19.880
And you should always be clever and provocative, and ideally say something that's not quite true.
00:50:35.980
If you say something that's provocative and sort of almost true, but not quite, you're going to get more energy, because people want to argue you about how true it is.
00:50:46.340
Yeah. All right. Let's, how about less about me? Maybe let's talk less about me for a while. All right.
00:51:00.780
Talk about Trump's misspelled tweets. I think the misspelled tweets are not intentional, but I think he leaves them intentionally, which is different.
00:51:14.840
You know, there have been a number of times that I've had a typo in a tweet, and I've decided to leave it because it was getting attention. So I think that's all that's happening.
00:51:27.920
Is Japan safe? Probably.
00:51:29.760
Give us the women's version of the men's secret lesson. I don't know if I could.
00:51:37.540
I think covfefe was probably just a pocket dial. No hints. All right.
00:51:50.460
What was your inspiration for writing God's Debris? If you don't know, I wrote a book called God's Debris. Came out about the same time the Twin Towers were falling, I think.
00:52:01.620
But these many years later, my book God's Debris is still the thing that people talk about the most when I'm out in public.
00:52:10.520
These days they're more, I guess they'll more often mention Coffee with Scott Adams, but God's Debris has really made a big impact on people.
00:52:20.680
And the answer of how I came up with it is, I was in the shower, and one day all of the things I've been thinking about for decades, I realized were all connected.
00:52:30.580
And that if I put it in a book, it would blow your mind, because it was blowing my mind when I thought about it.
00:52:37.520
So I thought, well, if it's blowing my mind, maybe it'll blow somebody else's mind. So I put it in a book, and when you find out how all these things come together, you might enjoy it.
00:52:49.260
Yeah, blew your mind? Good, good. It is designed to blow your mind.
00:52:55.520
And the, the context of that is that I used hypnosis technique in the writing. If you use hypnosis technique in the writing, you can get some powerful results. So that's what I did.
00:53:09.880
That's my balcony, that's not a fence.
00:53:18.780
All right. 2001: A Space Odyssey, terrible movie. 2001: A Space Odyssey is one of the all-time worst movies.
00:53:32.140
I would like to reenact a scene from the movie 2001: A Space Odyssey now. Don't get bored, don't get bored.
00:53:41.620
This is pretty interesting so far, isn't it? Don't you like how the plot is moving forward?
00:53:55.320
Wow, look at that ship. Still. Is anybody going to talk at this fucking movie?
00:54:04.960
When do we stop looking at this fucking thing? We've looked at the same fucking thing now for 10 minutes.
00:54:12.640
That's my description of 2001. Literally one of the worst movies ever made.
00:54:21.360
The other worst movie was Titanic.
00:54:27.660
If you, if you recommend somebody watch Titanic, you're an asshole, because no matter how much you like Titanic, it's way too long and it doesn't have a good ending.
00:54:41.220
And not only does it not have a good ending, it has a, it has an ending that will give you PTSD.
00:54:46.120
Why are you going to spend three hours to give yourself a lifetime of PTSD? Dumb. Don't do it. Titanic, terrible idea for a movie.
00:54:55.780
You know what's worse? Schindler's List. Oh my fucking god, Schindler's List.
00:55:03.500
There's a specific scene in there, and I'm not going to mention it, that traumatized me for years. And still, I was actually traumatized.
00:55:11.640
Now, I get that we should all know how bad it was. You know, you should know the Holocaust was bad, and making it emotional and bringing you into it definitely has some, some utility.
00:55:23.180
But don't watch that for entertainment. If you want to give yourself brain damage, watch that movie.
00:55:29.860
I got brain damage. That's not a joke. Actual brain damage. Because it created something in my mind that was unpleasant and didn't go away.
00:55:39.360
If somebody puts something in your mind that stays there, it stays unpleasant and it never goes away, that's brain damage.
00:55:45.500
I had a brain that didn't have that, and now it does. It's brain damage. That fucking movie gave me brain damage, and that's not hyperbole. Literal fucking brain damage.
00:55:55.420
And people, like, give it an award. Why do you give an award to something that gives you brain damage? Come on. I mean, well, I know why, but...
00:56:05.080
Yeah, Serenity is excellent. I agree.
00:56:10.120
What about Stephen King? The only thing I can say about Stephen King is that when you, when you see him tweet, you wonder how he could write a book.
00:56:21.800
Anybody else have that feeling? He tweets like he's somebody who used to know how to write a book, but something terrible has happened in the meantime.
00:56:33.880
Like maybe, maybe drugs, some kind of mental decline of some type. Because if he's always been like this, is that smart enough to write a book? I don't know.
00:56:47.740
It looks like, see, the problem is that even his criticisms don't seem smart.
00:56:52.840
There are lots of people I disagree with, but they don't look actually stupid. They look like just people who, you know, maybe they're confused or whatever, but they don't look stupid.
00:57:04.600
You know, if Jake Tapper says something I don't like, he never sounds stupid, because he's not, right? But Stephen King actually sounds stupid when he tweets.
00:57:18.100
Maybe he's just a bad tweeter, but it's hard for me to reconcile the tweeting with the fact that he wrote all these popular books.
00:57:26.380
Maybe Joshua Lisec could explain it to us. Because I already know the answer, but maybe Joshua can explain that to you.
00:57:40.400
Why is it he's writing all these good books still, and yet he seems like he couldn't be capable of that? What could possibly be the explanation for that?
00:57:51.680
That's one explanation. I don't know if it's the only explanation, but it's certainly one of them.
00:57:58.500
Yeah. All right, that's enough for now. No, I don't have monkeypox. I burned my lip. I hope that's the last time I have to say that today.
00:58:10.160
And I'll talk to you tomorrow. What do I, do it soon.