Episode 1960 Scott Adams: It Turns Out The FBI Was A Big Part Of The Twitter 1.0 World & Lots More
Episode Stats
Length
1 hour and 20 minutes
Words per Minute
149
Summary
A nuclear breakthrough at the Lawrence Livermore Lab, a black hole being formed in the middle of the night, and a huge earthquake in the Bay Area. Scott Adams talks about it all on this morning's episode of Coffee with Scott Adams.
Transcript
00:00:00.400
Good morning everybody and welcome to the highlight of human civilization.
00:00:06.020
It's called coffee with Scott Adams and let me tell you, there has never been a finer thing that's ever happened.
00:00:13.400
And if you'd like to take this up to levels heretofore unimaginable,
00:00:18.680
all you need is a cup or mug or a glass, a tank or chalice or stein, a canteen jug or a flask, a vessel of any kind.
00:00:25.680
Filled with your favorite liquid. I like coffee.
00:00:31.500
And join me now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better.
00:00:36.600
It's called the Simultaneous Sip and it happens now. Go.
00:00:47.420
If you were going to do a national advertising campaign in support of the Simultaneous Sip, what would be your slogan? Go.
00:00:59.120
Has to rhyme, has to be fast, has to get to the point.
00:01:03.460
Don't miss the Sip. Don't skip the Sip. Don't skip the Sip. Don't skip the Sip.
00:01:18.460
You heard the other day that there was a huge breakthrough at the Lawrence Livermore Lab just down the road from me.
00:01:31.940
And they, you already heard this, they cracked fusion.
00:01:36.900
They actually did a fusion experiment that created more energy than it used to do the experiment.
00:01:42.020
A gigantic step in the, maybe, maybe one of the greatest scientific breakthroughs of all time.
00:01:52.400
The other thing that happened was early this morning, there was an earthquake.
00:02:04.940
So, yeah, you got the earthquake and the gigantic nuclear fusion experiment roughly in the same place.
00:02:12.720
Well, I didn't want to tell you this, but a black hole has formed in Livermore, California.
00:02:22.560
So far, it has only sucked most of Main Street into it.
00:02:35.780
How many of you believe that Klaus Schwab once said, someday you'll own nothing and be happy?
00:02:45.100
How many believe that is something that actually happened in your actual world?
00:03:04.500
See if you can find the most famous quote in all of the right-leaning world.
00:03:25.160
And if you want to know, I just tweeted the link to an article that explains where that rumor came from.
00:03:34.500
So it appears to come from an opinion piece that was on the World Economic Forum website.
00:03:42.480
So the first thing that's true is that there's something like that opinion that was, in reality, in truth, part of the World Economic Forum website, I think.
00:03:55.280
So it was associated with Klaus, but you'd have to read the context.
00:04:01.900
The context is not, we want to do this for you.
00:04:06.580
It was more like, this is what we foresee might happen.
00:04:15.000
It wasn't somebody saying, hey, let's take away all your property rights, something like that.
00:04:21.220
Now, does anybody, would you have a problem if you knew it was an opinion piece that was saying more like, well, this stuff is already going to happen?
00:04:33.520
For example, the existence of Uber allowed at least one person I know who had plenty of money to not own a car.
00:04:43.000
And I remember talking to him and he would say, I don't need to own a car now because where I live, Uber is just always available and it's just easier.
00:04:54.120
There's someone who doesn't own a car, who easily could, he's rich, but he doesn't own a car and he's happier.
00:05:01.380
And that's basically the essence of the article, is that the free market would create situations in which you optionally could own something or not.
00:05:11.140
And if you decide not to, you might be happier.
00:05:14.400
Do you know what I was thinking about all yesterday?
00:05:20.860
Because I own an automobile, I don't know if you've heard one of these.
00:05:25.380
There's a, I drive something called a Christmas tree.
00:05:36.240
It has a steering wheel and it has tires that often are inflated.
00:05:40.920
Those tires often have the proper amount of inflation.
00:05:43.540
But otherwise on the inside, there's this thing in front of you, like an electric panel,
00:05:51.420
So it's usually telling me, this is broken, that's broken, and you better get a service, and your tires are flat.
00:05:58.000
And so when I get in my car, the Christmas tree comes on, and I feel festive.
00:06:04.000
But as I was walking three miles to pick up my car with my flat tire yesterday, it was being serviced,
00:06:12.060
I got home and, of course, the parking assistance doesn't work, which is why I ran my car into the side of my garage door.
00:06:21.940
Because I was waiting for that little light to say, you're getting a little close to the edge of your door.
00:06:29.080
And when that little thing goes on, I'm going to stop.
00:06:34.840
So I just took out part of my garage door and my car.
00:06:41.840
Because BMWs don't have good electrical systems.
00:06:46.820
I once bought a brand new BMW, the one, two cars before this one, and the entire electrical system failed as I drove off the lot.
00:07:00.100
As my car, literally, as I pulled out from the lot, brand new car, the entire electrical system of the car failed.
00:07:08.140
It was still running, but there was no electrical.
00:07:13.820
You know, just sort of back up, back onto the lot.
00:07:15.700
Now, how many of you have heard me tell this story about my weird association with electrical and mechanical objects?
00:07:25.800
It's a theme my entire life, which I talk about all the time.
00:07:30.400
I can make streetlights stop by walking under them.
00:07:37.200
You know, things just don't work around me electrically.
00:07:42.220
So my automobiles always have, and so I wanted to ask this question.
00:07:49.760
95% of the time I drive any automobile that's mine, my own car, there's a warning light on the panel.
00:08:01.740
How many of you would say that 95% of the time you drive your car, there's a warning light on?
00:08:14.080
Now, the reason is, it would be a full-time job.
00:08:18.240
The one that's broken now is this PDC light or something that tells you if you're close to something when you're parking.
00:08:36.440
Now, every time I turn it on, I have to clear the error before I can, you know, use my panel and stuff.
00:08:46.020
How many things in your life are there that you know if you took an hour, you could fix it, but you never have an hour?
00:08:52.260
So you accumulate all of these things that you're just temporarily making not bother you?
00:08:59.620
Like, for the last several days, I drove on a flat tire that I knew was flat.
00:09:08.960
And my whole life is full of things, like, you know, driving on a flat tire.
00:09:16.760
Because, you know, do you know how many problems I have with my computer?
00:09:31.220
But if I did everything, it would be like thousands of hours.
00:09:34.760
So I live in a world where everything's broken.
00:09:51.360
Of owning just a lot of shit that's broken all the time?
00:10:14.440
The idea that not owning things would make you happy
00:10:17.440
is something you might not understand unless you own enough stuff.
00:10:26.000
If you don't own stuff, you really want more stuff.
00:10:29.840
If you're struggling, the whole time you're thinking,
00:10:43.280
I mean, I'd still prefer it, but it doesn't get you close to happy.
00:10:52.200
Has he been, has Rupar been brought back on Twitter yet?
00:10:56.060
It looked like Musk had, he ran a poll about when to bring back the people he had just banned.
00:11:03.940
And I believe I saw just a few minutes ago that he was going to do that.
00:11:14.360
And his defense for why what he did, which was referring to Facebook still having information on Musk's airplane's location,
00:11:25.240
his argument is that that wasn't so bad and it didn't really endanger Musk because all he did was make it much, much easier for somebody to hurt Musk.
00:11:35.940
And his point is that if somebody wanted to, they could still work a little harder to hurt Musk and his family.
00:11:45.360
But all he did was make it a little easier and more convenient to hurt them.
00:11:58.460
Independent of whether he should have been banned for his tweet.
00:12:03.120
Most people would say, you know, there's an argument both ways.
00:12:07.500
But independent of whether the first tweet was bannable, his response to why he thought that tweet was okay should ban him for life.
00:12:18.980
Do you want to have any association with somebody who says,
00:12:22.280
no, I only put your family at a little bit of risk, which I decided would be appropriate for you.
00:12:28.700
I would like to take on what was a right-leaning misinformation site.
00:12:35.040
It doesn't seem like YouTube always fucks with me when, maybe it's because I swore.
00:12:44.700
But do you remember that the FBI was warning Twitter that when Rupar was spreading the fine people hoax,
00:12:57.600
as he frequently does, that that would influence the elections?
00:13:01.880
Do you remember the FBI tried to get Aaron Rupar removed from Twitter for spreading the fine people hoax,
00:13:09.680
which was such a defining moment in American history that Biden actually based his entire campaign on it.
00:13:16.620
But luckily, luckily the FBI was watching and they wouldn't want the election to be, you know, affected by a hoax.
00:13:24.300
So the FBI immediately reported to Twitter and then Twitter removed all of the, all of the references to the fine people hoax because that would affect it.
00:13:38.200
The entire hoax quiz of now, what, 20 fucking, I think there are like 20 fucking hoaxes.
00:13:46.800
Do you think the FBI targeted or tagged any of those hoaxes?
00:13:56.940
Now, if you ask the FBI, would they say, oh, no, the things we're concerned about would be election misinformation such as what day the election is.
00:14:07.640
So a number of the examples were people who were joking that the election was on Wednesday, right?
00:14:15.100
Now, I don't mind if those get banned, or at least, you know, temporarily.
00:14:22.540
Now, I get free speech and all that, blah, blah, blah.
00:14:25.360
But do you have any problem with a social media network banning misinformation about the day of voting?
00:14:34.920
Maybe, you know, a one-week ban or something, right?
00:14:46.780
I don't have a problem if Twitter handles it with a notice.
00:14:59.100
Because I think it's just trolls messing around.
00:15:07.180
Because that goes right to the heart of democracy, right?
00:15:18.480
As long as they didn't, you know, put anybody in jail for it or anything.
00:15:25.580
The evidence suggests that they were banning funny memes that didn't have any informational value, really.
00:15:31.520
So, do we know for sure that the FBI was in no way involved in anything useful?
00:15:47.100
Because even if they were doing something for their own team, it didn't seem to be doing that.
00:15:51.100
The accounts that they were targeting were all these minor, unimportant accounts that who knows if they had any effect at all.
00:16:00.540
And I'm sure that whatever rumors they were trying to stop, they didn't stop at all.
00:16:06.540
Can you name a rumor that the FBI tried to stop that you haven't heard?
00:16:15.400
Is there any rumor that the FBI tried to stop, based on the files?
00:16:26.500
You know, what a lot of it looked like is just a bureaucracy doing what a bureaucracy does.
00:16:31.620
Don't you think there are a lot of people in the FBI who are not qualified to do field work?
00:16:40.300
Do you think there's anybody at the whole FBI who might be, let's say, unqualified to be in the field doing important work,
00:16:49.000
like catching counterfeiters and terrorists, probably.
00:16:53.040
Now, what would you do with somebody who some...
00:16:56.180
You didn't want to get rid of them, but they didn't have the ability to do field work.
00:17:01.940
Can you think of any kind of assignment which would be a way to just park somebody useless?
00:17:21.060
If you wouldn't mind, maybe you could leave your gun locked in the boss's safe.
00:17:30.460
But we don't think you should have a gun right now.
00:17:36.760
However, if I get a job for you, how would you like to watch Twitter all day
00:17:42.600
and then write memos about stuff you don't like?
00:17:51.540
That's pretty much you described my entire entertainment day.
00:17:55.700
Looking at Twitter and bitching about shit I didn't like.
00:18:07.240
It's going to be my job to shitpost about Twitter,
00:18:10.540
except it's even going to be better than tweeting what I don't like.
00:18:13.840
I can go directly to Twitter management and tell them to get rid of it.
00:19:01.200
Can you take my lead on this as the creator of Dilbert?
00:19:04.380
Do I have enough credibility to make the following statement?
00:19:40.720
Some of the, I would imagine, some of the finest employees ever selected for a major business.
00:19:55.600
Now, imagine if you had some way to identify the 10% or less.
00:20:00.620
Let's say the 1% of Apple employees who are not good.
00:20:09.640
Now, let's collect them all together and put them on the same project.
00:20:18.060
Remember, they're Apple employees and some of the best in the world.
00:20:21.240
But for this one project, you did somehow scour the organization to find the worst 1% of the employees.
00:20:32.620
It was a bunch of FBI people who were not qualified for real work.
00:20:36.720
Doing some of the most important work in the country.
00:20:39.800
Because we didn't know how important it was at the time.
00:20:42.200
Now, I didn't really see anything that they went after that mattered to me.
00:20:50.140
Did you see anything that the FBI banned that you or the country would have been better off if it had been fully expressed?
00:21:04.160
I think the bigger story is that it doesn't matter how many of the worst employees you throw at something.
00:21:30.220
It was just 80 bad employees bitching about Twitter and making the Twitter employees really busy.
00:21:49.500
And Matt Taibbi also explains it in his own words better.
00:21:56.300
It's not so much that there's a deep state with some kind of an organized structure that's moving the puppet strings.
00:22:03.900
It's just a bunch of bureaucracy with people who have often common opinions.
00:22:08.960
But it's basically just a rat's nest of bureaucracy doing whatever the rat's nest can do.
00:22:16.920
So basically the FBI, 80 FBI people working with Twitter was a rat's nest of no particular importance.
00:22:26.140
Now, the fact that as far as I know, it didn't change anything, does that mean I'm okay with it?
00:22:33.860
I hope my audience is the ones who can handle nuance.
00:22:39.740
I hope I've filtered you well enough that you can.
00:22:46.120
I'm completely against the FBI having practically a permanent role at Twitter.
00:22:52.620
But the fact is it probably didn't make any difference.
00:22:57.540
Or let me say nothing has been reported yet where it made any difference at all.
00:23:05.920
You say it did, but would you agree there's no evidence?
00:23:10.340
If you're saying you think it did, I won't disagree.
00:23:14.240
I'm just saying there's no evidence of that, right?
00:23:17.380
Hunter's laptop, was Hunter's laptop because of these 80 FBI?
00:23:23.040
The 80 FBI employees were not the cause of the Hunter laptop.
00:23:26.900
I believe that came from actual management and, you know, a conversation.
00:23:31.780
So it did come from the FBI, but if the only thing that was happening was those 80 people reporting stuff, probably it wouldn't have happened.
00:23:41.520
Because I think it was just 80 employees bitching about stuff.
00:23:51.040
I think nothing's been demonstrated that was bad.
00:23:58.100
But we both agree that whether it has or has not been demonstrated by the information we have, you can't let it stand.
00:24:12.340
The fact that maybe there's no direct evidence that I've seen personally doesn't really mean much.
00:24:20.260
You can't have a situation with that much exposure.
00:24:32.440
If you took everything you saw about the FBI, as I said, I don't think it mattered too much, except maybe that laptop story, which I think would have happened in any situation.
00:24:46.320
But what's the biggest thing you're still wondering about Twitter suppression?
00:24:56.840
What's the biggest remaining question you have about what was really happening at Twitter?
00:25:05.120
We heard a little bit about the, yeah, we do have questions about the underage stuff, you're right.
00:25:14.740
I still don't know the answer to whether an account like mine, and specifically mine, but those like it, who have no strikes against them,
00:25:24.200
why were we shadow banned and who made that decision?
00:25:28.880
So the first question is, could it be confirmed that I was ever shadow banned?
00:25:34.160
I mean, it looks like it in every way, but I still need confirmation.
00:25:43.420
How is it possible that I don't know the answer to that at this point?
00:25:49.660
Because the accounts that the FBI tagged to get rid of were small accounts of no impact.
00:25:59.380
I actually move probably tens of thousands of votes.
00:26:06.620
So if somebody shadow banned me, and they did it, if they did it, by name, as in I'm on a list, if that happened, there's no evidence of that.
00:26:15.400
But if that happened, what would be the reason that I would be shadow banned?
00:26:51.260
Knowing if I got shadow banned, and people like me, so I'll say, let's say, a Cernovich, a Posobiec, people who can actually move the dial.
00:27:01.760
I'm not talking about just people who are users, but people you know can move the dial.
00:27:24.780
But if he was, he also, I believe, yeah, Dan Bongino.
00:27:33.700
Dan Bongino was called out by name as somebody who was shadow banned.
00:27:40.400
But I think they also had a specific reason, or was there not?
00:27:46.680
Did the Dan Bongino case, there was a topic he was talking about they didn't like, right?
00:27:55.140
But was the topic, did that give them any justification, even weasel justification?
00:28:02.440
Because I'm wondering why he was called out, and so many other big right-leaning accounts were not.
00:28:15.960
But why would Dan Bongino be, you can tell me that those accounts did or did not get shadow banned.
00:28:29.280
Everything we've seen so far is just like, it's like the teaser to the story or something.
00:28:42.400
So I know that several people were confirmed, but I believe that there was some excuse in each of those cases, right?
00:28:49.720
That whether it was a good excuse or not, they had some, like, reason to act.
00:28:57.040
My question is, did people like me get picked up by the algorithm without doing anything that anybody detected as a problem?
00:29:06.220
Because if the algorithm was just saying, oh, you tweeted, if I tweeted Bongino three times,
00:29:12.600
do you think the algorithm would have started suppressing me just automatically?
00:29:18.500
If they're going to ban anybody who's boosting them, isn't that automatic?
00:29:37.900
But I'm not seeing a direct, I'm not seeing a direct refutation or agreement with me.
00:29:45.020
I want to see that because I want to know if I'm on the right track.
00:29:48.500
True or false, that the part we don't know is the big part?
00:30:01.960
And did you feel that you were being distracted and you missed that story?
00:30:09.600
Because we've all been putting our attention on these little, you know, sparkling objects.
00:30:14.380
But so far, the sparkling objects have all been smallish, you know, of great concern because they have the potential to be biggish.
00:30:22.680
So that's why you have maximum concern, even if the examples so far are not fully expressed as bad.
00:30:30.640
But the other thing is already fully expressed, right?
00:30:36.020
If, in fact, people like my account were shadow banned, don't know that, but if they were, that would be a fully expressed plot.
00:30:44.900
The other stuff is just stuff that could have been bad, but it's a good thing we stopped it when we did.
00:30:50.460
The other stuff is what actually happened, if it happened.
00:31:07.120
Let's see if I can make all of your heads explode.
00:31:10.600
There were 80 FBI employees looking at content on Twitter.
00:31:17.120
How many FBI agents were continuously monitoring TikTok and reporting to TikTok's management what things should be removed?
00:31:33.000
Now, it might be that the FBI is monitoring TikTok, and it might be that they are reporting things.
00:31:39.400
But we might expect that TikTok would not necessarily respond.
00:31:45.480
TikTok might respond to the ones that are just really clean, like this is misinformation.
00:31:50.380
I could imagine that TikTok would take down a video that said the wrong date to vote.
00:31:59.200
I can imagine FBI telling TikTok to take it down.
00:32:03.000
And I can imagine TikTok knowing that that's not where the big win is for China.
00:32:12.260
They're not like, oh, I'll start this little rumor and that'll cause some chaos.
00:32:17.580
If China wants to weaponize TikTok, it's going to be through the major persuasion in general, you know, telling you to go crazy on climate change if they think it's not going to help you, that sort of thing.
00:32:35.620
So I wouldn't be surprised if FBI is, in fact, watching TikTok and, in fact, reporting things that look bad.
00:32:50.640
It feels like a really important thing to know.
00:32:57.160
Now, if you say, Scott, Scott, Scott, you know they're not, you could be right.
00:33:09.640
I don't know why we don't know the answer to it.
00:33:15.240
Have I ever mentioned to you that analogies are the best way to know that you've won a debate?
00:33:42.080
But if you just say they're all analogies, then you miss that distinction.
00:33:47.260
I'm going to try to explain this thing called a zebra that you've never heard of,
00:33:52.040
But somehow you've already learned what a horse is.
00:33:55.320
So I'm going to do a quick definition of a zebra.
00:34:03.900
It's a little genetically different from a horse,
00:34:11.600
Because all I did was quickly bring you up to speed
00:34:14.820
on something that would have taken longer to explain.
00:34:20.080
And you would all agree with that as a good use of an analogy, right?
00:34:26.360
A bad use is that the logic of, or the situation in one analogy can be ported over to the other one.
00:34:37.100
It's not, you can't port logic from one analogy to the other.
00:34:40.460
Well, if it's true in this unrelated story, it must also be true over here.
00:34:46.760
Now, you notice none of that is the case with my zebra horse thing.
00:34:51.380
The zebra horse thing, everybody, everybody, everybody would hear it and say,
00:35:02.680
The zebra one gives nobody anything to complain about when it's done.
00:35:13.420
and I'll explain to you why it's the end of the conversation.
00:35:17.880
Again, here's the, I'm going to put some ground rules here.
00:35:22.300
This is not a conversation about whether masks should be used.
00:35:33.420
But I'm going to talk about something related to it,
00:35:35.280
and it's the related to it part that's important.
00:35:42.200
But in the conversation I had with a woman who was talking about
00:35:46.800
whether the initial viral load made any difference.
00:35:55.880
Do you believe that if you were exposed to more virus initially
00:36:05.280
So, like everything, the science is not as good as you would like.
00:36:17.420
There's at least one preprint that says that people who lived in a house
00:36:23.680
So, in the conversation, somebody jumped into the conversation with this analogy.
00:36:34.180
Because I'd been making the argument that any kind of a barrier makes a difference.
00:36:40.640
It just might not be enough difference to be worth it.
00:36:52.880
There's no such thing as a barrier that doesn't do some work, right?
00:37:22.700
And that's the argument for why masks don't work, right?
00:37:27.540
If it's a poorly fitted mask, it gets out anyway, right?
00:37:35.560
If your air goes sideways and up and down, then it'll stop a hurricane.
00:37:47.040
If you give me a chain-link fence, it'll do the same thing as a mask,
00:37:51.180
which is it redirects some of the air sideways and up.
00:37:55.560
If your chain-link fence can do that, it would stop a hurricane.
00:38:01.120
So, now you say to me, Scott, you're arguing like the details of an imaginary situation with a hurricane.
00:38:14.740
As soon as you talk about an analogy, you're in a different conversation.
00:38:22.160
It tells you nothing about the one you're talking about.
00:38:24.700
So, remember, if it's like the zebra, it's a good analogy.
00:38:29.120
It just says something that everybody understands and agrees with.
00:38:34.540
But as soon as you get to chain-link fence and hurricane, that's just a different conversation.
00:38:48.680
Anyway, so after this long conversation with the woman who was a subject matter expert on masks, I did what I should have done the first minute.
00:39:01.700
Because the problem seemed to be a logic problem and not a knowledge problem.
00:39:08.920
So, right away, I was like, we're losing the audio on YouTube.
00:39:14.840
So, right away, as soon as the conversation happened, I kept thinking, why is it, why is it, it's almost like I'm talking to somebody who doesn't understand reason and logic.
00:39:31.560
So, I was talking to, I was having a science conversation with an author.
00:39:35.900
And it went exactly the way you think it would.
00:39:42.640
I couldn't even, and in the end, it ended up being just a logic disagreement and not a factual disagreement at all.
00:39:52.660
Do you know why she would disagree with my characterization of what it was?
00:39:58.920
Anyway, the most consistent pattern in all of social media is that people who are writers for a living, and I'm guilty, are very unlikely to have logical arguments on science and stuff like that.
00:40:19.480
Have you ever seen me disagree with engineers on Twitter?
00:40:26.200
Have you ever seen me having a, like an, those of you who follow me on Twitter, have you ever seen me have an extended disagreement with an engineer on Twitter?
00:40:40.620
Because they're trained largely the same way I am, which is how to look at the costs and the benefits and make sure you didn't leave anything out.
00:40:47.720
Like, everybody who's trained the same way, you can have a pretty quick conversation with.
00:40:53.480
Because even if you disagree, what happens if I disagree with an engineer?
00:41:01.740
Because, like, I say this, I say this, show me your link.
00:41:09.260
Yeah, conversations with engineers are just, yes, no, is it logical?
00:41:17.720
So, no, I am exaggerating, of course, engineers can disagree as well, but their disagreement is a completely different type.
00:41:24.640
The disagreement with a writer or an artist, you're never really even talking about the same thing.
00:41:31.720
It's like you're just trying to mold the water into a sculpture.
00:41:39.060
With an engineer, at least, that's an actual disagreement, so you can work it out.
00:41:43.860
What do you think of the United States deciding to arm Taiwan much more aggressively than we've done before and giving them much better weapons so they can defend against China?
00:42:08.960
Yeah, this one, there's no way to, there's no way to game it.
00:42:12.200
The only thing you can know for sure is that the decision will always go in the direction of follow the money.
00:42:26.780
So, here's the part that I don't know how to predict.
00:42:31.080
If Taiwan and mainland China got into an actual serious war where China decided, you know, we're just going to end this, would that be good or bad for arms manufacturers in America?
00:42:53.320
Because it would be good in terms of maybe immediate orders for products.
00:43:02.460
But remember, the people who are operating in their own self-interest still have to live in the world.
00:43:08.140
So, there's no arms manufacturer who's going to start a nuclear war to sell more arms.
00:43:15.520
Because that's so clearly not in anybody's interest, including the arms dealers.
00:43:19.860
Because they would be dead just like everybody else.
00:43:22.000
So, I think the arms dealers have to play a game where they have the maximum amount of conflict short of a nuclear confrontation.
00:43:30.300
Now, I think that they went as close to the line with Ukraine and Russia as anybody should ever go.
00:43:37.880
Now, they went as close to the line with Ukraine.
00:43:39.060
They went closer to the line than most of us would have said was prudent.
00:43:55.440
The experience in Ukraine is that Russia, so far, is backing down from nuclear use.
00:44:01.480
That is the most dangerous fucking situation we could ever be in.
00:44:06.520
Because that allows those same arms manufacturers to say, you know, Taiwan will be fine too.
00:44:20.800
There's no reason, no reason to think that because Ukraine has so far not sparked a nuclear war that anything we did involving Taiwan and China would turn out the same.
00:44:34.900
That is a big, wrong assumption that somehow those would be similar.
00:44:40.240
The Ukraine war, you can never anticipate the real ripple effect of what a war does.
00:44:48.680
Because sometimes you, you know, you create new technologies because of the war.
00:44:53.320
And it's like, great, at least that part of it.
00:44:56.600
And sometimes there's some lingering thing like, you know, the end of World War I, some would say, maybe you could debate this,
00:45:04.820
but some would say the way we handled the end of World War I wasn't good news.
00:45:08.520
Because it basically created the seeds of World War II.
00:45:12.740
So, I've got a feeling that the Ukraine situation is really going to screw things up for Taiwan.
00:45:20.600
I don't know how, but I hate that we're going to be informed by one thing to handle a separate thing.
00:45:33.100
But, you know, if they thought they could conquer Taiwan in a month, they would have their chips.
00:45:38.520
They just have to rebuild, just don't bomb the chip factories.
00:45:42.200
China would just have to not bomb the chip factories.
00:45:45.100
And if they thought they could gain control in a month, they'd be fine.
00:45:51.820
Yes, we're, we're pronouncing it zee-bra, not zeh-bra.
00:45:55.380
I'm getting a disagreement that zee-bra should be zeh-bra.
00:46:00.620
Now, I don't know what country you're from, with your zebras.
00:46:18.720
So, now it's been a few days since Trump's NFT successes.
00:46:23.400
And I wonder if there are more people like me who had a Covington Kids initial reaction.
00:46:31.880
Remember the Covington Kids hoax when you saw the first deceptively edited videos?
00:46:43.420
You know, just like being a jerk to this guy in public.
00:46:51.940
Completely opposite of what the deceptive video was.
00:46:57.120
And I think that happened to me with Trump's NFT.
00:47:11.340
There's a rumor that I never admit when I'm wrong.
00:47:18.100
And yeah, the funny thing is, nobody does it more often than I do.
00:47:22.500
I'm going to say that in the world, nobody has consistently admitted when they're wrong
00:47:34.000
My first reaction to the NFT release, especially since he teased it as a major announcement,
00:47:52.120
Now, you saw that was my first response, right?
00:48:02.020
Let me say, with no reservations, sometimes, sometimes you hear me say stuff like, well,
00:48:09.080
I was wrong, but really I was right if you looked at it this way.
00:48:17.040
Yes, I was wrong when I said Putin wouldn't attack, but I was really right in a way, because
00:48:23.080
of the reason I said he wouldn't attack is because it wouldn't work.
00:48:28.200
So, usually, you see me saying, well, I was wrong, but not really completely wrong.
00:48:38.640
I just want you to accept my complete wrongness on this, with no reservations.
00:48:56.900
And it was like a breathtakingly successful fundraiser, right?
00:49:02.260
For the amount of work he put into it, which was one video, probably, and maybe he reviewed
00:49:08.260
So, he did maybe probably one hour of total work and probably netted, we don't know, two
00:49:17.060
to four million dollars or something for like an hour.
00:49:22.500
The best fundraiser ever done, and he owns that now, right?
00:49:27.140
Would you agree that for the bang for the buck, was that not the best fundraising ever done?
00:49:37.360
Now, somebody's saying he did a license deal, which means that he doesn't get the full amount
00:49:42.020
But even as a license deal, it would be tremendous.
00:49:52.040
Now, here's the other reason that it wasn't the disaster it looked like.
00:49:58.640
Once there was a lot of energy around him, he was getting a lot of attention.
00:50:02.040
He followed up quickly with a very strong campaign video about free speech, which was free
00:50:10.180
He said exactly what his base wanted him to say.
00:50:25.000
So, here he takes what I thought was this big mistake, but it created energy.
00:50:31.320
And then, here's why I'm embarrassed that I didn't see this coming.
00:50:38.860
We created all this NFT negative energy, and he just gathered it up and put it to use.
00:50:52.840
You know, it could be that they sped up the free speech video faster than he planned to
00:50:58.360
get it in the same space with the NFT thing, because maybe it wasn't going as well as he thought.
00:51:02.720
But I don't know if it was planned, or he just found a way to snatch victory from the jaws
00:51:17.700
Because Trump is being vindicated on, you know, all the Twitter stuff.
00:51:25.740
So, he's vindicated in seeing that the system was rigged against him, and especially because
00:51:30.460
it's the FBI, because the FBI in this story just makes Trump look more and more like he
00:51:36.540
was always the victim, and more and more like he was right the whole time, even though it's
00:51:48.160
Now, do you think it's a mistake when an 80-year-old releases an NFT?
00:51:56.460
When age is a question, and Trump releases an NFT?
00:52:22.980
You know, if somebody asked him about it, he wouldn't be able to explain it.
00:52:32.600
Do you think that Biden could explain to you what an NFT is?
00:52:42.380
Now, Trump maybe can't give you the details of what a blockchain is.
00:52:47.820
You know, maybe he can give you the big picture: that it's a public record that nobody can change.
00:52:55.500
But think about the persuasion that is related to Trump embracing a modern technology that even most of the public is not familiar with.
00:53:09.520
Most of the public is not even familiar with NFTs.
00:53:12.340
And he basically just tied his 76-year-old brand to the most current, happening young technology, and then made it the most successful one lately.
00:53:28.380
Has anybody had a successful NFT better than this one?
00:53:39.920
The value of the ownership of one of those has risen.
00:53:48.680
So, I think you're missing the best part of the play.
00:53:51.900
The best part of the play is he younged himself down.
00:53:54.540
He younged himself down when he's running, maybe, against, you know, the oldest potential candidate.
00:53:59.980
He educated the masses that NFTs are like trading cards.
00:54:08.880
I'll bet you there's a whole segment of the population who goes, oh, I get it.
00:54:14.720
And the first thing they ask is, why can't I just make my own copy?
00:54:18.120
And then somebody says, well, it's something about the blockchain.
00:54:20.940
You go, okay, I don't need to understand that, but I get it.
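The "public record that nobody can change" idea he's gesturing at is a hash chain: each record's hash commits to everything before it, so a forged copy is detectable. A minimal illustrative sketch (the NFT entries here are hypothetical, using only Python's standard hashlib):

```python
import hashlib

def block_hash(data: str, prev_hash: str) -> str:
    """Hash a record's data together with the previous record's hash."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

# A tiny three-entry chain: each hash depends on all earlier entries.
h0 = block_hash("genesis", "")
h1 = block_hash("NFT #1 owned by Alice", h0)
h2 = block_hash("NFT #1 transferred to Bob", h1)

# Tampering with an earlier record changes every later hash,
# so everyone holding the original chain can see the copy is fake.
h1_forged = block_hash("NFT #1 owned by Mallory", h0)
h2_forged = block_hash("NFT #1 transferred to Bob", h1_forged)
assert h2_forged != h2
```

That's why "just make my own copy" doesn't work: your copy's hashes won't match the ones the rest of the network agrees on.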
00:54:29.980
And also, I didn't mind that he was pitching a product because he's a salesperson.
00:54:37.360
Isn't Trump the ultimate salesperson, which is what he calls himself?
00:54:41.180
He basically says, I'm going to try to sell the country.
00:54:45.780
And then he goes and he sells stuff for his campaign.
00:54:52.100
The perfect Trump brand, I say, is when he creates all this attention.
00:55:01.760
But when it all settles, you say to yourself, oh, that was a lot better than I expected.
00:55:08.540
How often do you see that pattern where you think he's really stepped in it now?
00:55:13.220
And when the dust clears, you're like, oh, that wasn't so bad after all.
00:55:20.780
I see that Jordan Peterson is warning Western countries that a totalitarian social credit system
00:55:29.020
is coming to their societies in a highly probable way.
00:55:33.800
So Jordan Peterson says it's highly probable we'll have a totalitarian social credit system.
00:55:50.940
What country already or countries already have?
00:55:57.280
Is there any other country on Earth besides China?
00:56:11.300
But in terms of a total social credit system, nobody else is doing it, right?
00:56:22.600
They're doing things that look like they're moving in that direction, right?
00:56:33.860
There are a lot of things that are not social tracking systems but that rhyme with them, feel like them, and seem to be in the same domain.
00:56:44.060
I think you have to treat them differently, right?
00:56:46.320
I believe that in the United States, and there's no way to prove this, because I know there's plenty of pattern recognition that goes against it.
00:56:54.240
But in my opinion, in the United States, if we had implemented a social credit system, and I'm not arguing we should have,
00:57:03.380
but if we had, do you think that we would not, as a public, have been able to get rid of it?
00:57:08.040
Do you think once it was in there, it would just take root, and it would just expand to a total full system?
00:57:16.460
So a lot of you say, yes, that's exactly what the government does.
00:57:19.980
When the government starts a program, it never stops it, right?
00:57:26.100
Because the government always wants more power, and if they get your money, they never say, we don't need it anymore.
00:57:38.040
I'm not going to say that your reasoning is flawed.
00:57:51.600
It's a well-established pattern that this is exactly the sort of thing you don't want to trust your government with.
00:58:01.140
If it were unimportant, they could certainly get away with it, if we weren't watching and stuff.
00:58:06.460
But I think that the United States, and I won't speak for other Western countries, but I think the United States wouldn't put up with it.
00:58:13.820
So I think you need a totalitarian government to do it, and I think we don't have one.
00:58:23.680
I think the base requirement of a social credit system is a totalitarian government, and we don't have one.
00:58:29.780
Now, does that mean the government would not try to implement it?
00:58:37.940
Definitely the government would try to get away with anything it could, but I don't think the citizens would allow it.
00:58:46.860
And it might be, you know, you can imagine a number of ways that would work.
00:58:59.720
All you'd have to do is say, if you're going to give us a social credit system, we want to be able to track the social credit system of Congress.
00:59:11.180
Because there's no way that Congress could impose social tracking on the public and say we're exempt.
00:59:22.980
If Congress ever said, we're going to put a social tracking system on the public that will not apply to us in Congress,
00:59:30.660
I will be with you to take over the Capitol building.
00:59:39.000
I will look to overthrow the government of the United States if they were to do that.
00:59:54.260
No, you're not going to treat us as second-class citizens while you're exempt from your own rule.
00:59:59.880
And do you think that they would say, all right, well, we don't like it applying to us in Congress,
01:00:05.460
but it's so beneficial we're going to apply it to the people?
01:00:12.700
There's no way Congress would allow that to happen to themselves.
01:00:16.300
And I think every one of you would confirm if they were going to, if they try to impose that on us, the public,
01:00:25.240
and make themselves exempt, we will sweep them out of office with whatever means necessary.
01:00:37.440
So I'm not recommending any violence in any current situation.
01:00:40.580
The whole context is we will never be in that situation.
01:00:45.300
So in case you're monitoring me on YouTube for inciting violence, no.
01:00:52.340
I'm saying there's no possibility of inciting violence because the United States will never get to that point
01:00:58.260
because Congress will never impose it on themselves.
01:01:04.540
So can I be sure that my predicted future is more accurate than those of you who have pattern recognition?
01:01:19.460
Because if you say, this looks familiar, you're right.
01:01:41.220
Your pattern recognition, they're all analogies.
01:01:44.260
It happened in this other case, completely different case.
01:01:53.340
That pattern is a very well-established pattern.
01:01:55.840
But the trouble is, if you don't look for what's different about each situation,
01:02:00.480
you might imagine that the pattern can work in every situation,
01:02:04.160
where it might be a pattern that is limited to some kind of domain.
01:02:16.960
Would you stipulate that this is true, that I'm right?
01:02:20.620
Will you stipulate that patterns are really analogies?
01:02:24.400
And if you don't look at the specific situation, you don't really know what's going to go on.
01:02:30.840
Many of you just love your pattern recognition and will never change.
01:02:39.160
So I'm not terribly worried about that, but it's not a zero.
01:02:45.740
Musk said maybe yesterday morning or the day before that the coup de grace was still coming.
01:02:55.200
Was the coup de grace what we saw about the FBI?
01:03:07.740
You're guessing it's the COVID files or you're guessing it's Fauci.
01:03:25.100
I guess there's still a question whether there's something big coming or not.
01:03:41.380
Yeah, it looks like the YouTube feed is totally dead.
01:04:00.660
They're not responding to me, so I think that they can't hear me.
01:04:16.940
So I've got two feeds going on two new iPads, both of them on the same Wi-Fi.
01:04:25.280
Or are you having troubles on this one as well?
01:04:47.920
It's not my Wi-Fi, because people are getting different outcomes,
01:04:51.600
and I have the same Wi-Fi for everybody, of course.
01:05:04.740
So it might have something to do with just traffic or something.
01:05:12.920
Now, do you assume that what we found out about the FBI and Twitter,
01:05:19.160
do you assume that the FBI has equal operations in the other platforms?
01:05:25.480
Do you think Google and YouTube and Facebook all have their FBI teams?
01:05:42.920
Now, the only way I could see that making sense
01:05:45.340
is if they imagined that Twitter was sort of the tail that wags the dog,
01:05:50.600
and if you get Twitter right, everything else ends up right, but I doubt it.
01:05:54.320
Because a bureaucracy would never ignore Facebook, would it?
01:06:00.220
All right, we're going to put 80 people into Twitter,
01:06:03.440
but Facebook said no, so we won't have anything over there?
01:06:08.180
That doesn't sound like anything that happens in the real world, does it?
01:06:11.240
Because who doesn't take a meeting with the FBI?
01:06:14.120
If the FBI says we want a meeting with you, and you're a business, and you're not in trouble,
01:06:24.500
they want to work with you to make your job better.
01:06:38.540
We can help you monitor the bad people and the election interference.
01:06:43.980
And you just have to give us a little, like, access or something.
01:06:52.800
This is the same argument I make about Don Jr.'s famous meeting with those Russian-related people
01:06:59.380
in the last election, the first election for Trump.
01:07:07.480
And sure, they promised something they didn't have, which was some dirt on Hillary.
01:07:14.740
Are you telling me that if you were part of a major campaign, and somebody that you knew,
01:07:20.140
it's not a stranger, it's somebody you know personally.
01:07:23.340
Somebody you know personally says, I have something that will change everything, some dirt on Hillary.
01:07:28.240
All you have to do is come down two floors and sit in this room for five minutes, and you'd have it all.
01:07:36.220
Now, if something came up in the meeting, there was, like, a national security concern,
01:07:58.140
Why do you get the FBI involved when there's no indication of a crime?
01:08:01.300
There's just somebody who says they know something, and most people are lying when they say that.
01:08:05.120
I mean, usually it's hyperbole, usually it's not what they say.
01:08:16.720
If it does, then the second thing you do is call in the FBI.
01:08:20.620
And everybody acts like it should have been the other way around.
01:08:45.180
Yeah, you have to pass the bill to see what's in it.
01:08:47.200
She met with Fusion GPS before and after the meeting.
01:08:56.720
Yeah, I saw the memes about Sam Brinton stealing Santa's bag.
01:09:24.880
Don't forget, Jack doesn't know anything, Flower Girl says.
01:09:28.180
All right, are you surprised that we've reached this point in the process
01:09:50.180
that Jack was directly involved in anything bad?
01:09:56.340
Now, you could argue he should have known, right?
01:10:00.180
Yeah, the argument of whether he should have known,
01:10:07.400
And by the way, I imagine he would agree with that.
01:10:13.240
But I think if you ask Jack, you know, was it like your job?
01:10:16.540
Should you have known what was going on, you know, in greater detail?
01:10:22.400
Because he's been pretty transparent about everything, I think.
01:10:27.340
You say he knew, but there's no evidence to that.
01:10:30.320
In fact, the evidence strongly suggests the opposite.
01:10:39.920
Well, I'm only talking about what the evidence is, not what the truth is.
01:10:47.380
But the evidence suggests exactly what I predicted.
01:10:50.660
I mean, I had exactly zero people agreeing with me.
01:10:57.220
I think zero people agreed with me when I said, I don't think he was involved directly.
01:11:09.100
You know, tomorrow we could find out something where I'm wrong.
01:11:11.540
But the fact that we got this far with my hypothesis that he wasn't involved still active, still a good hypothesis this far in, that's surprising to some of you.
01:11:35.640
Well, for those of you who are saying, I hear you saying that Jack was a bad manager, blah, blah, blah, because he didn't do it.
01:11:52.920
Jack's management of Twitter did not begin and end with his role as CEO.
01:12:01.580
If you expand your frame, Jack managed it properly by selling it to or supporting the sale to Musk.
01:12:09.460
Because selling it to Musk was what solved the problem.
01:12:16.660
And I think that Jack knew that it would require devastating the company to fix it.
01:12:25.100
It wasn't like disciplining a few employees and moving some people around.
01:12:29.080
Like, you had to, like, take it down by its roots.
01:12:33.000
And I think that Jack probably knew Musk well enough, or at least his operating methods,
01:12:39.920
to know that Musk could potentially root it out at the grassroots and just pull it out of the ground.
01:12:46.140
But it was hard for the existing CEO to do anything like that.
01:12:50.100
There's a long history of bringing in an outsider when you need to do major layoffs and a major restructuring.
01:12:57.400
That's usually the right way to do it.
01:12:59.780
Because the person who's there has personal relationships that are going to influence whether you can do the hard stuff.
01:13:06.440
The person who's been working there has too many personal stakes.
01:13:10.860
People they put in a position, they don't want to fire their friend, that sort of thing.
01:13:15.920
So, if you say that Jack didn't do his job while he was in the job,
01:13:22.040
I would argue that that is refuted by the fact that he supported bringing in Musk,
01:13:29.880
which had to be very directly related to figuring out what was going on in a more aggressive way.
01:13:36.580
So, I feel like he maybe did what was right to compensate for the problems that had grown up during his reign.
01:13:47.700
Jack knew that Twitter was full of leftists and the effect of that was to bias their operation.
01:14:04.800
He's never argued that Twitter was a fair arbiter.
01:14:10.380
Jack has never said Twitter's operating the way I want it to operate.
01:14:15.620
He actually agreed with the critics that Twitter was operating like a left-leaning organization
01:14:28.980
You said Jack didn't know Twitter was actively throttling conservatives.
01:14:45.620
The part we still don't know is if people like me were banned just for political reasons.
01:14:52.240
Now, we know there's a bias, which is that people are banning people for sketchy reasons that really, you suspect, might be political.
01:15:02.600
So there's the gray area, but I think Jack fully accepted that the gray area was exactly what you thought it was.
01:15:10.320
That if you're staffed with leftists, even though what they're trying to do is just the right thing, like get rid of bad information and stuff, it will end up being focused on conservatives.
01:15:23.560
I feel like he was fully transparent about that, right?
01:15:28.620
It's not my job to defend him, but I'm just telling you what I saw.
01:15:31.900
So the part that he has denied is the part I said has not been demonstrated: that they had a programmatic policy, an algorithm, that would ban people like me who had no strikes.
01:15:50.520
And as far as I know, there was no conversation about me.
01:15:52.960
So did I get scooped up in some kind of business?
01:15:55.800
So that's the part that I think he says no to, and we do not have evidence of it.
01:16:03.580
I mean, to me it seems highly probable we will, but so far not.
01:16:10.260
And if we haven't seen it yet, it means Musk either hasn't seen it or for some reason needs to know more or something.
01:16:18.080
But that does suggest that you could be the head of the organization and not know exactly how that algorithm got tweaked.
01:16:30.900
Yeah, we all agree with the buck stops at the top, right?
01:16:47.540
I mean, I don't know, but everything suggests he would agree with you on that, that it was his responsibility.
01:16:59.200
You say, if Jack didn't drill down for the answers, that's unbelievable.
01:17:06.340
Jack personally contacted me, you know, a few years ago, when I was complaining that I was shadowbanned.
01:17:13.040
He personally introduced me to Del, somebody, whose job it was to work in that area.
01:17:21.600
I personally worked with her and quickly determined that she was the problem and that she knew exactly what was going on because she sort of started ghosting me when I got too close.
01:17:33.340
Now, the fact that he connected me with her suggests that he couldn't penetrate her barrier any more than I did.
01:17:43.500
Now, he could have fired her, but that's why you need a musk, somebody who just says, that's not an answer, you're fired, right?
01:17:52.040
I feel like Jack was more of a, well, I'm not getting the answer I want, but I'll try harder.
01:18:00.800
Maybe I'll see if Scott talking to you gets the answer.
01:18:04.760
You know, he was like working around the edges.
01:18:07.480
But to say that he didn't drill down is inaccurate because I actually was part of a process of trying to drill down.
01:18:15.680
It's just I hit an employee wall, you know, a lower level employee was just a brick wall.
01:18:22.080
But don't you think he hit the same brick wall?
01:18:32.260
And she would say, I see no evidence that that's happening.
01:18:48.440
He can't because he doesn't know if there's nothing there.
01:18:54.400
So all he knows is what his employees tell him.
01:18:57.360
And when do employees ever tell the truth to the CEO?
01:19:04.120
Everybody's shading things when they tell the CEO.
01:19:06.180
They're telling him what they think is going to be good for them to hear.
01:19:11.640
Now, if somebody says, I looked and there's nothing there, how do you fire him?
01:19:28.380
I don't think any level of management excellence could have solved it unless people were replaced.
01:19:34.020
You can't keep the same employees in place and then get to the bottom of it.
01:19:47.520
Yeah, the fact that Elon called Twitter a crime scene.
01:19:50.600
I don't know how literal that is, but certainly it feels right.
01:20:12.960
Being technical doesn't mean he's going to go scour the algorithm on his own if he's the CEO.