Episode 1093 Scott Adams: Post Office Efficiency, Biden Botching Everything, ACLU Backs China, Barr Disses Teachers Union for Racism
Episode Stats
Words per Minute
145.02
Summary
In this episode of Coffee with Scott Adams, host Scott Adams talks about the latest in the Russia investigation, including the latest charges against former FBI lawyer Kevin Clinesmith, and why he should get the death penalty. Plus, Bill Mitchell gets banned from Twitter.
Transcript
00:00:00.000
Hey everybody, come on in, come on in, gather around, grab your beverage.
00:00:19.300
Is it going to be the best coffee with Scott Adams you've ever seen?
00:00:27.040
Until tomorrow, because it gets better every day.
00:00:33.840
And all you need to join in on this incredible experience that people around the world are enjoying simultaneously.
00:00:48.580
All you need is a cup or mug or a glass, a tanker, chalice or stein, a canteen, a jug or flask, a vessel of any kind.
00:00:55.040
Fill it with your favorite liquid. I like coffee.
00:00:59.340
And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that makes everything better.
00:01:12.580
I feel justice breaking out all over the place.
00:01:24.560
If you are sitting around tonight and saying to yourself, what could I do in the middle of this pandemic?
00:01:30.660
What type of entertainment can I find that's in my own house without even leaving the house?
00:01:37.880
The answer is you should watch the Greg Gutfeld show tonight.
00:02:01.340
The big news, unless you watch the fake news, is that lawyer Kevin Clinesmith is being indicted for falsifying evidence that continued the investigation into Carter Page, which we know now was all a bunch of BS.
00:02:27.860
If you have a high-level lawyer working for the FBI, and this high-level lawyer falsifies documents to keep a guy under surveillance and risk his life in the larger service of overthrowing the legal government of the United States,
00:02:54.220
Because it's a weird thing; it doesn't feel blue-collar, it feels white-collar-y.
00:03:02.860
You know, it's like if you change some words on a document, well, that's not so bad, right?
00:03:09.200
I mean, it's not like you murdered somebody, is it?
00:03:21.320
Treason doesn't say you murdered this one person.
00:03:25.880
It says you created a situation that is so dangerous for the country that the entire republic is in danger.
00:03:34.280
So this one lawyer did a crime which, in my opinion, put the entire republic at risk.
00:03:42.180
Now, granted, you know, he got caught.
00:03:50.500
But in terms of how bad it is, potentially, it's treason level.
00:03:56.760
You know, I don't think there's any chance that the death penalty could be, you know, actually used.
00:04:10.020
Apparently, MSNBC and CNN don't think this is a big story.
00:04:19.500
But you're watching it in real time, and it's probably blowing your mind, that the press can make anything important seem like it's completely unimportant.
00:04:33.000
And they can take something that's completely unimportant and blow it into something that looks like it's the big national story.
00:04:40.260
And you stand there, and you look at it objectively, and you think to yourself, are we spending all our time on the unimportant stuff?
00:04:55.520
Bill Mitchell got banned from Twitter, I understand.
00:05:00.760
Jack Posobiec was reporting on this on Twitter.
00:05:07.060
He said that Bill Mitchell first received a seven-day ban for tweets talking about mask dangers,
00:05:14.380
then was using a second account to tweet during the ban and had his main account permanently suspended.
00:05:33.120
And he's very nice personally, so it's not a personal thing.
00:05:36.560
But I blocked him because his certainty on a medical thing just is not good.
00:05:46.920
Now, I wouldn't have necessarily banned him from Twitter.
00:05:53.000
I don't know if that's something I would have done.
00:05:55.100
But I certainly banned him from my life because I didn't want to interact with someone who was not a medical professional
00:06:08.860
Now, I'm perfectly okay with anybody who tweets a study that turned out not to be valid.
00:06:18.460
There's nobody on Twitter who hasn't at one time or another retweeted a story that turned out to be false,
00:06:24.720
a study that turned out to be not reproducible.
00:06:34.360
Most of us would say, well, this is interesting.
00:06:40.060
Once you cross over into "I'm telling you the facts, listen to me, people,"
00:06:46.980
So I don't know the full details of this, but I'm guessing that Bill Mitchell was saying something as a medical fact
00:06:56.100
when maybe we shouldn't be speaking in terms of certainty.
00:07:01.680
I guess more information on that will come out.
00:07:04.920
All right, here's my favorite stupid story of the day.
00:07:08.440
Now, as the creator of the Dilbert comic, you know my filters are tuned for corporate behavior that's a little bit ridiculous.
00:07:17.080
And I've always wondered, will there be a day sometime in the future where there'll be some company that's so big and so unwieldy,
00:07:26.340
that has so many moving parts, so big, the bureaucracy, that it would someday sue itself?
00:07:37.060
Would a company ever get so big that it wouldn't know that it was suing itself?
00:07:47.080
So the ACLU is, first of all, it's backing China, saying that TikTok and WeChat,
00:07:58.180
the two Chinese apps that are under review for being banned in this country,
00:08:02.700
it looks like they will be banned, and the ACLU is trying to stop that.
00:08:09.740
The ACLU is trying to stop a national security decision by the Commander-in-Chief?
00:08:21.680
And nobody's saying that these products can't exist.
00:08:25.020
They just can't exist under the control of our nemesis who wants to destroy our country.
00:08:30.260
Now, if that doesn't make you think that the ACLU has lost all credibility,
00:08:45.160
The ACLU gets a lot of its funding from Google, for example.
00:08:50.020
So Google is one of the main, you know, big funders of the ACLU.
00:08:55.180
Google, of course, is maybe a little bit friendly with China.
00:08:59.820
So what do you make of the fact that Google is maybe a little bit friendly with China
00:09:06.020
and is funding the ACLU, and the ACLU is siding with China over the United States?
00:09:14.980
Now, they wouldn't put it that way, but that's the way it looks to me.
00:09:17.960
Because even if, you know, even if this is like a gray area,
00:09:24.800
it's still the United States' call how much risk we're willing to take.
00:09:30.100
It's not up to China to decide how much risk the United States takes.
00:09:39.760
I don't think the ACLU should get to decide how much risk the United States takes
00:10:01.620
The ACLU is also going after this small startup called Clearview AI,
00:10:08.040
who does an app that if you have a photo of anybody,
00:10:14.520
it'll find you more photos of that person so you can figure out who they are.
00:10:19.240
So it's used by law enforcement to identify people.
00:10:23.160
And I guess the ICE, so the federal government's ICE, you know, immigration guys,
00:10:29.620
have signed a contract with Clearview to use the app to catch child traffickers.
00:10:38.900
There is a technology which we know with complete certainty works,
00:10:44.000
because it's already being used and is already successful at exactly this.
00:10:49.420
So we don't have to speculate whether it works.
00:11:00.760
There are lots of examples already of it working in other contexts, etc.
00:11:03.860
So ICE decides that this technology will help them catch child traffickers.
00:11:15.240
Like, who's on the other side of using, you know, a standard technology,
00:11:20.640
which will be widespread far more than it is now, to catch child traffickers?
00:11:27.700
So the ACLU is trying to stop ICE from using a technology
00:11:43.760
It's not so much that they care about this particular use.
00:11:47.760
They care about the company and what it might mean for, you know, privacy, etc.
00:11:54.000
But that's just a big question for the country.
00:11:59.640
It doesn't really have to do with this contract.
00:12:02.660
And they're using, and there's a punchline to this.
00:12:06.440
If you think this is crazy, there's a whole good part coming.
00:12:15.480
There's, I guess, an Illinois law recently passed that said it's illegal to scrape personal data,
00:12:23.920
like somebody's picture and identity or something, from other websites
00:12:31.840
So since Illinois has this law, that gives the ACLU the ability to, you know, sue.
00:12:39.820
Now, they're saying that Clearview's a problem in general,
00:12:44.240
and then they're taking this ICE thing as the little fight they're going to make,
00:12:50.300
They're literally on the side of child traffickers.
00:12:55.180
They actually, literally, no kidding, took the side of child traffickers.
00:13:09.820
So Google, as you know, is one of the big funders of the ACLU.
00:13:17.600
And the ACLU is going after this company, Clearview.
00:13:20.680
And the cause is that they're scraping this personal information, right?
00:13:26.400
There's a story in the Wall Street Journal saying the same thing,
00:13:30.540
scraping this personal information, except here's the punchline.
00:13:35.340
The whole basis of the story is that Clearview is scraping the photos and identification of people.
00:13:47.520
What they do is they create an index of stuff that's available on the Internet.
00:13:56.360
What Clearview AI does is a subset of what Google itself does.
00:14:05.340
Google does not store on their own servers all the information on the Internet.
00:14:17.860
So Clearview makes an index of faces and does not store any of that data itself.
00:14:28.540
Rather, it stores the index information so you can go find it if you want to see it.
00:14:39.500
But the main point is they're not storing a copy of the whole Internet.
00:14:43.060
It's Google's own business model: if you did a Google image search, it would use an index
00:14:52.960
to the images, and then it would show them to you.
00:14:57.320
They just have an index of where all the faces are, and then it just finds it for you.
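The index-versus-copy distinction he's describing can be sketched in a few lines of Python. Everything here, the function names and the hash standing in for a face embedding, is a hypothetical illustration under those assumptions, not Clearview's or Google's actual implementation.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a face embedding: a short hash of the image content."""
    return hashlib.sha256(image_bytes).hexdigest()[:16]

def build_index(pages: dict[str, bytes]) -> dict[str, list[str]]:
    """Map each image fingerprint to the URLs where that image appears.
    Only fingerprints and URLs are kept -- never the images themselves."""
    index: dict[str, list[str]] = {}
    for url, image in pages.items():
        index.setdefault(fingerprint(image), []).append(url)
    return index

def lookup(index: dict[str, list[str]], query_image: bytes) -> list[str]:
    """Return the URLs hosting matching images; the caller fetches them."""
    return index.get(fingerprint(query_image), [])

# Usage: two public pages host the same photo; the index only points back to them.
pages = {
    "https://example.com/profile": b"photo-of-alice",
    "https://example.org/news":    b"photo-of-alice",
    "https://example.net/other":   b"photo-of-bob",
}
idx = build_index(pages)
print(lookup(idx, b"photo-of-alice"))
# -> ['https://example.com/profile', 'https://example.org/news']
```

The point of the sketch is that the index stores pointers, not copies: dropping the index loses nothing from the source pages, which is the distinction the scraping argument glosses over.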
00:15:02.760
So Google is funding the ACLU to sue Clearview for the business model that is identical to
00:15:13.500
And the only reason this is happening is because apparently the people doing the suing don't
00:15:21.800
know the difference between an index and a copy.
00:15:28.260
This whole thing, it's like national news, and everybody's writing about it, and it's
00:15:33.280
all based on the fact they can't tell the difference between an index, which is perfectly legal, and a copy.
00:15:47.400
It's basically an extension of a fingerprint database.
00:15:52.880
If taking fingerprints is legal, and it is, this is just a better version
00:16:00.740
of it, using public information that's already available.
00:16:03.540
It's on Google, and if you did a little work, you could find it.
00:16:13.120
Not technically, but they're basically suing, through the ACLU that they fund, trying to stop
00:16:26.820
So the ACLU is managing to sue on behalf of Google, if you count them as the funding source,
00:16:40.640
somebody doing the same thing that Google's doing.
00:16:49.560
Well, somebody's saying the one I blocked just said "you were wrong," so maybe I blocked the wrong person.
00:16:55.300
No, I don't block people for disagreeing with me.
00:17:04.260
You can disagree with me all you want, but you just have to show a reason.
00:17:09.400
It could be a summary of a reason, just, oh, I think you didn't look at this data or whatever.
00:17:13.960
But if you just say you're wrong, I block you, because I don't need it.
00:17:19.280
Anyway, so keep an eye on this ACLU story, because it seems they've turned bad.
00:17:25.760
Now, this is an example of Gell-Mann amnesia.
00:17:31.120
Gell-Mann was the name of a physicist who, whenever he saw a story about physics, would notice that it was full of errors.
00:17:37.620
But then if he saw a story about some other topic, he would say, oh, that's probably right,
00:17:43.300
even though every story he saw on a topic he knew about had been wrong.
00:17:47.840
Now, is it more likely that they're only wrong about the things he knows about?
00:17:53.000
Now, in this case, with the Clearview AI, I know this much about technology,
00:18:02.040
and I've looked into this specific issue, so I know that it's an index.
00:18:08.020
So if you know just a little bit about how things work,
00:18:11.380
you can tell that all the headline stories are just complete bullshit,
00:18:14.780
because they think that they're scraping and keeping the personal information,
00:18:19.940
All right, I live in California, which, as you know, is a third-world country.
00:18:31.460
But last night, I'm getting an alert on my phone about which parts of California
00:18:38.400
would have their power shut off, while the temperature yesterday at my house was 106.
00:18:42.440
Now, my house is built to be very green, so it could be 106 outside,
00:18:48.600
and my house wouldn't get warmer than probably 80.
00:18:53.660
But it's pretty rugged if you live in any less insulated house.
00:19:03.460
So I actually live in a state that can't figure out how to keep the lights on all year.
00:19:09.820
Now, could you even think of any greater indicator of incompetence
00:19:20.220
In 2020, you can't keep the lights on all year.
00:19:32.380
It got warm, and my state is not well-run enough to keep the lights on?
00:19:42.800
You know, it wasn't long ago that we didn't have enough water.
00:19:49.640
And it looks like we're going to get rid of police
00:19:58.040
So I saw some stats that people are leaving San Francisco
00:20:06.840
It's basically all the people with money are leaving.
00:20:11.640
And I have to say that I've wondered for some time
00:20:32.280
or, you know, press tour or something like that.
00:20:39.520
which is, you know, there's a great energy here,
00:20:46.500
of course I'm talking about before any of the protests happened,
00:21:05.620
and you just can't even wait to get out of there.
00:21:19.700
I'm not surprised that San Francisco and New York
00:21:25.560
I also don't necessarily think it's the end of the world
00:21:35.520
were they shopping at those retail stores anyway?
00:21:58.120
and their police department just endorsed Trump.
00:22:20.420
if they're going to back the president of the United States.