Real Coffee with Scott Adams - December 17, 2022


Episode 1960 Scott Adams: It Turns Out The FBI Was A Big Part Of The Twitter 1.0 World & Lots More


Episode Stats

Length

1 hour and 20 minutes

Words per Minute

149.17484

Word Count

12,013

Sentence Count

974

Misogynist Sentences

10

Hate Speech Sentences

9


Summary

A nuclear breakthrough at the Lawrence Livermore Lab, a black hole being formed in the middle of the night, and a huge earthquake in the Bay Area. Scott Adams talks about it all on this morning's episode of Coffee with Scott Adams.


Transcript

00:00:00.400 Good morning everybody and welcome to the highlight of human civilization.
00:00:06.020 It's called coffee with Scott Adams and let me tell you, there has never been a finer thing that's ever happened.
00:00:13.400 And if you'd like to take this up to levels heretofore unimaginable,
00:00:18.680 all you need is a cup or mug or a glass, a tank or chalice or stein, a canteen jug or a flask, a vessel of any kind.
00:00:25.680 Filled with your favorite liquid. I like coffee.
00:00:31.500 And join me now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better.
00:00:36.600 It's called the Simultaneous Sip and it happens now. Go.
00:00:45.300 Oh.
00:00:47.420 If you were going to do a national advertising campaign in support of the Simultaneous Sip, what would be your slogan? Go.
00:00:56.680 Slogans.
00:00:59.120 Has to rhyme, has to be fast, has to get to the point.
00:01:03.460 Don't miss the Sip. Don't skip the Sip. Don't skip the Sip. Don't skip the Sip.
00:01:11.280 Yeah, you can't top that.
00:01:13.820 All right, let's talk about some things.
00:01:16.520 A couple things happened in my neighborhood.
00:01:18.460 You heard the other day that there was a huge breakthrough at the Lawrence Livermore Lab just down the road from me.
00:01:28.600 It's e-bike riding distance for me.
00:01:31.940 And they, you already heard this, they cracked fusion.
00:01:36.900 They actually did a fusion experiment that created more energy than it used to do the experiment.
00:01:42.020 A gigantic step in the, maybe, maybe one of the greatest scientific breakthroughs of all time.
00:01:49.260 Of all time.
00:01:49.820 It's just right down the road from me.
00:01:52.400 The other thing that happened was early this morning, there was an earthquake.
00:01:56.280 An earthquake in, where I live, Bay Area.
00:02:02.120 It was like a little bit under a four.
00:02:04.940 So, yeah, you got the earthquake and the gigantic nuclear fusion experiment roughly in the same place.
00:02:12.720 Well, I didn't want to tell you this, but a black hole is formed in Livermore, California.
00:02:19.560 It's small.
00:02:21.020 It's only about this big.
00:02:22.560 So far, it has only sucked most of Main Street into it.
00:02:26.680 I think we'll be fine.
00:02:28.720 Don't panic.
00:02:31.760 Not really.
00:02:33.280 Not really.
00:02:34.720 Parody.
00:02:35.360 It's parody.
00:02:35.780 How many of you believe that Klaus Schwab once said, someday you'll own nothing and be happy?
00:02:45.100 How many believe that is something that actually happened in your actual world?
00:02:51.640 A lot of yes.
00:02:52.840 A lot of yes.
00:02:54.060 Have you ever tried to Google it?
00:02:56.820 Have you ever tried to find him saying that?
00:03:01.040 Anybody?
00:03:02.040 You should try doing that.
00:03:03.600 Just for fun.
00:03:04.500 See if you can find the most famous quote in all of the right-leaning world.
00:03:12.540 Nope.
00:03:14.040 Do you know why you can't find it?
00:03:16.100 Because it never happened.
00:03:18.120 Do you know what did happen?
00:03:20.040 So there's something like it that happened.
00:03:22.720 And that's how that all got started.
00:03:25.160 And if you want to know, I just tweeted the link to an article that explains where that rumor came from.
00:03:32.080 And you can see for yourself.
00:03:34.500 So it appears to come from an opinion piece that was on the World Economic Forum website.
00:03:42.480 So the first thing that's true is that there's something like that opinion that was, in reality, in truth, part of the World Economic Forum website, I think.
00:03:55.280 So it was associated with Klaus, but you'd have to read the context.
00:04:01.900 The context is not, we want to do this for you.
00:04:06.580 It was more like, this is what we foresee might happen.
00:04:11.240 Is it as bad as it looks?
00:04:13.460 You know, that sort of thing.
00:04:14.120 It was an opinion piece.
00:04:15.000 It wasn't somebody saying, hey, let's take away all your property rights, something like that.
00:04:21.220 Now, does anybody, would you have a problem if you knew it was an opinion piece that was saying more like, well, this stuff is already going to happen?
00:04:33.520 For example, the existence of Uber allowed at least one person I know who had plenty of money to not own a car.
00:04:43.000 And I remember talking to him and he would say, I don't need to own a car now because where I live, Uber is just always available and it's just easier.
00:04:50.360 And I thought to myself, well, there you go.
00:04:52.880 Perfect example.
00:04:54.120 There's someone who doesn't own a car, who easily could, he's rich, but he doesn't own a car and he's happier.
00:05:01.380 And that's basically the essence of the article, is that the free market would create situations in which you optionally could own something or not.
00:05:11.140 And if you decide not to, you might be happier.
00:05:14.400 Do you know what I was thinking about all yesterday?
00:05:17.160 Is how much I don't want to own a car.
00:05:19.360 All yesterday.
00:05:20.860 Because I own an automobile. I don't know if you've heard of one of these.
00:05:25.380 There's a, I drive something called a Christmas tree.
00:05:29.220 Have you seen one of those?
00:05:31.080 It's a Christmas tree.
00:05:32.780 It's like a, it's like an automobile.
00:05:36.240 It has a steering wheel and it has tires that often are inflated.
00:05:40.920 Those tires often have the proper amount of inflation.
00:05:43.540 But otherwise on the inside, there's this thing in front of you, like an electric panel,
00:05:48.800 that is lit up like a Christmas tree.
00:05:51.420 So it's usually telling me, this is broken, that's broken, and you'd better get it serviced, and your tires are flat.
00:05:58.000 And so when I get in my car, the Christmas tree comes on, and I feel festive.
00:06:02.000 I feel very festive.
00:06:04.000 But as I was walking three miles to pick up my car with my flat tire yesterday, it was being serviced,
00:06:12.060 I got home and, of course, the parking assistance doesn't work, which is why I ran my car into the side of my garage door.
00:06:21.940 Because I was waiting for that little light to say, you're getting a little close to the edge of your door.
00:06:29.080 And when that little thing goes on, I'm going to stop.
00:06:32.520 Well, it wasn't working.
00:06:34.840 So I just took out part of my garage door and my car.
00:06:39.080 This is about a $5,000 expense.
00:06:41.840 Because BMWs don't have good electrical systems.
00:06:45.880 I'm just going to say that.
00:06:46.820 I once bought a brand new BMW, the one, two cars before this one, and the entire electrical system failed as I drove off the lot.
00:07:00.100 As my car, literally, as I pulled out from the lot, brand new car, the entire electrical system of the car failed.
00:07:08.140 It was still running, but there was no electrical.
00:07:11.540 Just the whole thing failed.
00:07:13.820 You know, just sort of back up, back onto the lot.
00:07:15.700 Now, how many of you have heard me tell this story about my weird association with electrical and mechanical objects?
00:07:25.800 It's a theme my entire life, which I talk about all the time.
00:07:30.400 I can make streetlights stop by walking under them.
00:07:35.160 I've done it a lot.
00:07:37.200 You know, things just don't work around me electrically.
00:07:39.380 It's just a very consistent thing.
00:07:42.220 So my automobiles always have, and so I wanted to ask this question.
00:07:49.760 95% of the time I drive any automobile that's mine, my own car, there's a warning light on the panel.
00:07:59.860 How many of you would say the same?
00:08:01.740 That 95% of the time you drive your car, there's a warning light on.
00:08:06.760 95% of the time.
00:08:07.920 I'm seeing a few yeses, yeah.
00:08:12.700 Yep, yeses.
00:08:14.080 Now, the reason is, it would be a full-time job.
00:08:18.240 The one that's broken now is this PDC light or something that tells you if you're close to something when you're parking.
00:08:25.260 That warning light has been on for months.
00:08:30.700 Have I fixed it?
00:08:31.580 Yes, of course.
00:08:32.340 I've taken it in twice.
00:08:33.360 And when I take it home, it stops working.
00:08:36.440 Now, every time I turn it on, I have to clear the error before I can, you know, use my panel and stuff.
00:08:42.240 Every time.
00:08:43.660 Now, let me ask you this.
00:08:46.020 How many things in your life are there that you know if you took an hour, you could fix it, but you never have an hour?
00:08:52.260 So you accumulate all of these things that you're just temporarily making not bother you?
00:08:59.620 Like, for the last several days, I drove on a flat tire that I knew was flat.
00:09:05.560 Why?
00:09:06.300 I didn't have an hour.
00:09:07.940 I just didn't have an hour.
00:09:08.960 And my whole life is full of things, like, you know, driving on a flat tire.
00:09:14.860 Totally normal for me.
00:09:16.760 Because, you know, do you know how many problems I have with my computer?
00:09:20.700 My lights, my plumbing, my whole house?
00:09:24.760 Almost all of it is a little bit broken.
00:09:27.860 All of it.
00:09:28.720 Every one of those would take an hour to fix.
00:09:31.220 But if I did everything, it would be like thousands of hours.
00:09:33.720 I don't have that kind of time.
00:09:34.760 So I live in a world where everything's broken.
00:09:39.120 Everything's broken.
00:09:40.740 Everything I own is broken.
00:09:42.560 All of it.
00:09:43.460 A little bit.
00:09:45.740 Now, you know what would solve that problem?
00:09:49.220 Does anybody know how to solve that problem?
00:09:51.360 Of owning just a lot of shit that's broken all the time?
00:09:57.060 Say it.
00:09:58.280 Say it.
00:10:00.280 Thank you.
00:10:02.120 You know what would be great?
00:10:03.140 If I didn't own anything, then I'd be happy.
00:10:10.340 I'm not joking.
00:10:12.160 I'm not joking at all.
00:10:14.440 The idea that not owning things would make you happy
00:10:17.440 is something you might not understand unless you own enough stuff.
00:10:21.900 If you own enough stuff, you don't like it.
00:10:26.000 If you don't own stuff, you really want more stuff.
00:10:29.500 Don't you?
00:10:29.840 If you're struggling, the whole time you're thinking,
00:10:34.240 oh, I wish I had more stuff.
00:10:36.880 Nicer home.
00:10:38.240 Better car.
00:10:40.440 It doesn't make you happier.
00:10:42.280 It really doesn't.
00:10:43.280 I mean, I'd still prefer it, but it doesn't get you close to happy.
00:10:47.940 You've got to do other stuff for that.
00:10:49.400 All right, let's talk about Aaron Rupar.
00:10:52.200 Has he been, has Rupar been brought back on Twitter yet?
00:10:56.060 It looked like Musk had, he ran a poll about when to bring back the people he had just banned.
00:11:01.860 And I think the poll said bring them back.
00:11:03.940 And I believe I saw just a few minutes ago that he was going to do that.
00:11:06.960 So anyway, Rupar has given some interviews.
00:11:14.360 And his defense for what he did, which was referring to Facebook still having information on Musk's airplane's location,
00:11:25.240 his argument is that that wasn't so bad and it didn't really endanger Musk because all he did was make it much, much easier for somebody to hurt Musk.
00:11:35.940 And his point is that if somebody wanted to, they could still work a little harder to hurt Musk and his family.
00:11:45.360 But all he did was make it a little easier and more convenient to hurt them.
00:11:50.120 And that shouldn't be wrong.
00:11:52.660 That sound like a pretty good argument?
00:11:55.460 Well, let me tell you my opinion.
00:11:58.460 Independent of whether he should have been banned for his tweet.
00:12:03.120 Most people would say, you know, there's an argument both ways.
00:12:07.500 But independent of whether the first tweet was bannable, his response to why he thought that tweet was okay should get him banned for life.
00:12:18.980 Do you want to have any association with somebody who says,
00:12:22.280 no, I only put your family at a little bit of risk, which I decided would be appropriate for you.
00:12:28.700 I would like to take on what was a right-leaning misinformation site.
00:12:33.120 Did anybody see them banned?
00:12:35.040 Doesn't it seem like YouTube always fucks with me? Maybe it's because I swore.
00:12:39.800 I don't know, it's possible.
00:12:42.200 Anyway, could be a coincidence.
00:12:44.700 But do you remember that the FBI was warning Twitter that when Rupar was spreading the fine people hoax,
00:12:57.600 as he frequently does, that that would influence the elections?
00:13:01.880 Do you remember the FBI tried to get Aaron Rupar removed from Twitter for spreading the fine people hoax,
00:13:09.680 which was such a defining moment in American history that Biden actually based his entire campaign on it.
00:13:16.620 But luckily, luckily the FBI was watching and they wouldn't want the election to be, you know, affected by a hoax.
00:13:24.300 So the FBI immediately reported to Twitter and then Twitter removed all of the, all of the references to the fine people hoax because that would affect it.
00:13:34.400 No, that didn't fucking happen.
00:13:35.940 Of course not.
00:13:36.820 Not one of those things happened.
00:13:38.200 The entire hoax quiz is now at, what, I think there are like 20 fucking hoaxes.
00:13:46.800 Do you think the FBI targeted or tagged any of those hoaxes?
00:13:52.440 Of course not.
00:13:53.720 Because there were all left-leaning hoaxes.
00:13:56.720 None.
00:13:56.940 Now, if you ask the FBI, would they say, oh, no, the things we're concerned about would be election misinformation such as what day the election is.
00:14:07.640 So a number of the examples were people who were joking that the election was on Wednesday, right?
00:14:12.960 Maybe they were joking, but maybe not.
00:14:15.100 Now, I don't mind if those get banned, or at least, you know, temporarily.
00:14:18.920 I don't have any problem with that.
00:14:21.320 Do you?
00:14:22.540 Now, I get free speech and all that, blah, blah, blah.
00:14:25.360 But do you have any problem with a social media network banning misinformation about the day of voting?
00:14:34.920 Maybe, you know, a one-week ban or something, right?
00:14:38.760 Or maybe even put a notice on it.
00:14:41.160 Put a notice on it, this is fake.
00:14:44.140 Something like that.
00:14:45.080 But I don't really have a problem with that.
00:14:46.780 I don't have a problem if Twitter handles it with a notice.
00:14:50.220 I don't have a problem if they suppress it.
00:14:52.820 I don't have a problem if they...
00:14:54.560 I think I would...
00:14:56.080 I wouldn't favor a ban for life.
00:14:59.100 Because I think it's just trolls messing around.
00:15:01.720 Like, it's not quite that level.
00:15:03.820 But I would definitely act on it.
00:15:05.500 I don't think that would be inappropriate.
00:15:07.180 Because that goes right to the heart of democracy, right?
00:15:11.260 Or the republic.
00:15:13.140 It goes right to the heart of it.
00:15:14.280 You've got to show up on the right day, right?
00:15:15.840 So I'd be okay with that.
00:15:18.480 As long as they didn't, you know, put anybody in jail for it or anything.
00:15:22.280 But they did more than that, didn't they?
00:15:25.580 The evidence suggests that they were banning funny memes that didn't have any informational value, really.
00:15:31.520 So, do we know for sure that the FBI was in no way involved in anything useful?
00:15:45.400 Could we say that?
00:15:47.100 Because even if they were doing something for their own team, it didn't seem to be doing that.
00:15:51.100 The accounts that they were targeting were all these minor, unimportant accounts that who knows if they had any effect at all.
00:16:00.540 And I'm sure that whatever rumors they were trying to stop, they didn't stop at all.
00:16:05.620 Do you think...
00:16:06.540 Can you name a rumor that the FBI tried to stop that you haven't heard?
00:16:14.340 Anybody?
00:16:15.400 Is there any rumor that the FBI tried to stop, based on the files?
00:16:18.800 We don't know everything they did.
00:16:19.940 Did they stop any rumor?
00:16:21.840 I don't think so.
00:16:24.500 So, why were they there?
00:16:26.500 You know, what a lot of it looked like is just a bureaucracy doing what a bureaucracy does.
00:16:31.620 Don't you think there are a lot of people in the FBI who are not qualified to do field work?
00:16:38.980 What do you think?
00:16:40.300 Do you think there's anybody at the whole FBI who might be, let's say, unqualified to be in the field doing important work,
00:16:49.000 like catching counterfeiters and terrorists, probably.
00:16:53.040 Now, what would you do with somebody who some...
00:16:56.180 You didn't want to get rid of them, but they didn't have the ability to do field work.
00:17:01.940 Can you think of any kind of assignment which would be a way to just park somebody useless?
00:17:13.620 Hey, Bob, here's your new assignment.
00:17:16.960 And you won't need your gun for this.
00:17:21.060 If you wouldn't mind, maybe you could leave your gun locked in the boss's safe.
00:17:28.000 Oh, it'll be there any time you want it.
00:17:30.460 But we don't think you should have a gun right now.
00:17:36.760 However, if I get a job for you, how would you like to watch Twitter all day
00:17:42.600 and then write memos about stuff you don't like?
00:17:46.760 And I'd say, I do that for fun.
00:17:51.540 That's pretty much you described my entire entertainment day.
00:17:55.700 Looking at Twitter and bitching about shit I didn't like.
00:17:59.180 Oh, there's some shit I didn't like.
00:18:00.520 Hey, how about I make that your job?
00:18:04.900 Wait, am I hearing this right?
00:18:07.240 It's going to be my job to shitpost about Twitter,
00:18:10.540 except it's even going to be better than tweeting what I don't like.
00:18:13.840 I can go directly to Twitter management and tell them to get rid of it.
00:18:17.600 Seriously?
00:18:18.860 And that's my job?
00:18:20.360 That's my job.
00:18:22.320 Let me see if I understand this.
00:18:25.120 I'm not getting this.
00:18:26.320 I'm not getting this.
00:18:26.860 I'm confused.
00:18:27.160 You're also going to give me money for this.
00:18:31.720 Yes, yes.
00:18:32.980 We're going to pay you money to read Twitter
00:18:35.740 and complain about shit you don't like.
00:18:39.200 I'm not getting it.
00:18:40.240 I'm not getting it.
00:18:40.960 There's a trick here, right?
00:18:42.260 What is the trick?
00:18:43.460 What's the trick?
00:18:45.500 You also get excellent employee benefits.
00:18:51.340 I don't get it.
00:18:53.080 I don't know.
00:18:53.660 There's something you're not telling me.
00:18:55.020 It's too good to be true.
00:18:57.160 So, the first thing we need to all agree on.
00:19:01.200 Can you take my lead on this as the creator of Dilbert?
00:19:04.380 Do I have enough credibility to make the following statement?
00:19:08.340 They weren't sending their best.
00:19:11.320 Any disagreement?
00:19:13.220 They weren't sending their best.
00:19:15.040 That's got to be part of the story.
00:19:22.600 Has anybody mentioned that yet?
00:19:25.100 It's got to be a huge part of the story.
00:19:28.840 Imagine this.
00:19:30.040 Take any large organization.
00:19:32.140 Could be government.
00:19:33.180 Could be the FBI.
00:19:34.460 Could be Congress.
00:19:35.660 Could be any business.
00:19:36.980 Could be Apple Computer.
00:19:37.880 Let's say you took Apple Computer.
00:19:40.720 Some of the, I would imagine, some of the finest employees ever selected for a major business.
00:19:48.120 Would you agree?
00:19:49.720 Would you agree?
00:19:50.340 They have very high standards.
00:19:51.600 It's very hard to get a job at Apple.
00:19:53.160 The finest employees in the land at Apple.
00:19:55.600 Now, imagine if you had some way to identify the 10% or less.
00:20:00.620 Let's say the 1% of Apple employees who are not good.
00:20:05.860 Because there's always some, right?
00:20:07.420 You can find the worst employees at Apple.
00:20:09.640 Now, let's collect them all together and put them on the same project.
00:20:15.320 How would that go?
00:20:18.060 Remember, they're Apple employees and some of the best in the world.
00:20:21.240 But for this one project, you did somehow scour the organization to find the worst 1% of the employees.
00:20:28.380 And they're all working on the same project.
00:20:30.620 That's what the Twitter files was.
00:20:32.620 It was a bunch of FBI people who were not qualified for real work.
00:20:36.720 Doing some of the most important work in the country.
00:20:39.800 Because we didn't know how important it was at the time.
00:20:42.200 Now, I didn't really see anything that they went after that mattered to me.
00:20:49.100 Did you?
00:20:50.140 Did you see anything that the FBI banned that you or the country would have been better off if it had been fully expressed?
00:20:59.340 I didn't see an example.
00:21:01.600 Did you?
00:21:02.640 So I think it didn't matter.
00:21:04.160 I think the bigger story is that it doesn't matter how many of the worst employees you throw at something.
00:21:09.120 Nothing happens.
00:21:09.960 In the end, nothing really happens.
00:21:14.580 It was just a bunch of employees doing stuff.
00:21:16.720 But nothing really.
00:21:18.360 They didn't change the world.
00:21:19.740 They didn't change the election.
00:21:21.060 They didn't support the election.
00:21:22.420 They didn't protect the election.
00:21:23.820 They didn't protect the country.
00:21:25.140 They didn't support Twitter.
00:21:26.520 They didn't protect the public.
00:21:29.140 Actually, nothing happened.
00:21:30.220 It was just 80 bad employees bitching about Twitter and making the Twitter employees really busy.
00:21:40.420 I'll tell you what I didn't see.
00:21:43.300 A finely oiled deep state plot.
00:21:48.400 Didn't see that.
00:21:49.500 And Matt Taibbi also explains it better in his own words.
00:21:56.300 It's not so much that there's a deep state with some kind of an organized structure that's moving the puppet strings.
00:22:03.900 It's just a bunch of bureaucracy with people who have often common opinions.
00:22:08.960 But it's basically just a rat's nest of bureaucracy doing whatever the rat's nest can do.
00:22:15.380 And basically that is it.
00:22:16.920 So basically the FBI, 80 FBI people working with Twitter was a rat's nest of no particular importance.
00:22:26.140 Now, the fact that as far as I know, it didn't change anything, does that mean I'm okay with it?
00:22:33.860 I hope my audience is the ones who can handle nuance.
00:22:39.740 I hope I've filtered you well enough that you can.
00:22:43.620 I'm completely against it.
00:22:46.120 I'm completely against the FBI having practically a permanent role at Twitter.
00:22:51.020 Completely against it.
00:22:52.620 But the fact is it probably didn't make any difference.
00:22:56.160 Any disagreement?
00:22:57.540 Or let me say nothing has been reported yet where it made any difference at all.
00:23:03.860 Did it?
00:23:05.920 You say it did, but would you agree there's no evidence?
00:23:10.340 If you're saying you think it did, I won't disagree.
00:23:14.240 I'm just saying there's no evidence of that, right?
00:23:17.380 Hunter's laptop, was Hunter's laptop because of these 80 FBI?
00:23:22.440 No.
00:23:23.040 The 80 FBI employees were not the cause of the Hunter laptop.
00:23:26.900 I believe that came from actual management and, you know, a conversation.
00:23:31.780 So it did come from the FBI, but if the only thing that was happening was those 80 people reporting stuff, probably it wouldn't have happened.
00:23:41.520 Because I think it was just 80 employees bitching about stuff.
00:23:45.800 We don't know.
00:23:46.860 But anyway, let's agree on this.
00:23:49.800 Here's our point of agreement.
00:23:51.040 I think nothing's been demonstrated that was bad.
00:23:56.020 You think maybe it has.
00:23:58.100 But we both agree that whether it has or has not been demonstrated by the information we have, you can't let it stand.
00:24:05.540 It's not a situation we can tolerate, right?
00:24:09.140 Because the potential for abuse is so obvious.
00:24:12.340 The fact that maybe there's no direct evidence that I've seen personally doesn't really mean much.
00:24:20.260 You can't have a situation with that much exposure.
00:24:23.720 That's a clean, easy decision.
00:24:27.300 And Musk is making the right one, I'm sure.
00:24:30.780 All right, here's the dog not barking.
00:24:32.440 If you took everything you saw about the FBI, as I said, I don't think it mattered too much, except maybe that laptop story, which I think would have happened in any situation.
00:24:46.320 But what's the biggest thing you're still wondering about Twitter suppression?
00:24:52.500 Not counting the COVID stuff.
00:24:55.640 Not counting COVID.
00:24:56.840 What's the biggest remaining question you have about what was really happening at Twitter?
00:25:00.840 Something that hasn't been addressed.
00:25:05.120 We heard a little bit about the, yeah, we do have questions about the underage stuff, you're right.
00:25:11.620 Here's what it is to me.
00:25:14.740 I still don't know the answer to whether an account like mine, and specifically mine, but those like it, who have no strikes against them,
00:25:24.200 why were we shadow banned and who made that decision?
00:25:27.020 And were we?
00:25:28.020 Could it be confirmed?
00:25:28.880 So the first question is, could it be confirmed that I was ever shadow banned?
00:25:34.160 I mean, it looks like it in every way, but I still need confirmation.
00:25:38.320 Could be some other explanation.
00:25:40.980 So number one, why don't we know?
00:25:43.420 How is it possible that I don't know the answer to that at this point?
00:25:48.180 It's the biggest question.
00:25:49.660 Because the accounts that the FBI tagged to get rid of were small accounts of no impact.
00:25:55.000 I'm not a small account of no impact.
00:25:59.380 I actually move probably tens of thousands of votes.
00:26:03.180 And I got shadow banned.
00:26:06.620 So if somebody shadow banned me, and they did it, if they did it, by name, as in I'm on a list, if that happened, there's no evidence of that.
00:26:15.400 But if that happened, what would be the reason that I would be shadow banned?
00:26:19.520 There's only one reason.
00:26:21.420 Because I have no strikes.
00:26:23.340 Right?
00:26:24.000 I have no strikes.
00:26:24.840 So the only reason would be political.
00:26:27.960 There couldn't be another reason.
00:26:29.940 I've given nobody any reason otherwise.
00:26:33.580 So, question number one, am I shadow banned?
00:26:39.260 If I am, who did it?
00:26:42.700 And if somebody did it, why?
00:26:46.640 And when would be interesting, too.
00:26:49.120 Right?
00:26:49.800 Now, correct me if I'm wrong.
00:26:51.260 Knowing if I got shadow banned, and people like me, so I'll say, let's say, a Cernovich, a Posobiec, people who can actually move the dial.
00:27:01.760 I'm not talking about just people who are users, but people you know can move the dial.
00:27:06.700 You know, the Ben Shapiro, right?
00:27:10.280 Can Ben Shapiro move the dial in politics?
00:27:13.120 Yes.
00:27:13.620 Yes, he can.
00:27:14.480 If he was shadow banned, that's a big deal.
00:27:17.700 A big, big deal.
00:27:19.180 Would you agree?
00:27:19.700 Now, I don't know that he was shadow banned.
00:27:22.580 I have no direct evidence of that.
00:27:24.780 But if he was, he also, I believe, yeah, Dan Bongino.
00:27:32.000 But now this is interesting.
00:27:33.700 Dan Bongino was called out by name as somebody who was shadow banned.
00:27:40.400 But I think they also had a specific reason, or was there not?
00:27:45.480 Give me a fact check.
00:27:46.680 Did the Dan Bongino case, there was a topic he was talking about they didn't like, right?
00:27:55.140 But was the topic, did that give them any justification, even weasel justification?
00:28:02.440 Because I'm wondering why he was called out, and so many other big right-leaning accounts were not.
00:28:09.100 It was masks, was it?
00:28:12.080 Yeah.
00:28:13.900 All right, so we heard some anecdotes.
00:28:15.960 But why would Dan Bongino be, you can tell me that those accounts did or did not get shadow banned.
00:28:23.680 That's the whole story.
00:28:24.900 Am I wrong?
00:28:27.520 That's the whole story.
00:28:29.280 Everything we've seen so far is just like, it's like the teaser to the story or something.
00:28:35.200 It's, yes, you're wrong, but tell me why.
00:28:41.520 Right.
00:28:42.400 So I know that several people were confirmed, but I believe that there was some excuse in each of those cases, right?
00:28:49.720 That whether it was a good excuse or not, they had some, like, reason to act.
00:28:54.700 I don't think that's the whole story.
00:28:57.040 My question is, did people like me get picked up by the algorithm without doing anything that anybody detected as a problem?
00:29:06.220 Because if the algorithm was just saying, oh, you tweeted, if I tweeted Bongino three times,
00:29:12.600 do you think the algorithm would have started suppressing me just automatically?
00:29:16.320 Why wouldn't it?
00:29:17.900 Why wouldn't it?
00:29:18.500 If they're going to ban anybody who's boosting them, isn't that automatic?
00:29:23.600 It would be automatic, I would think.
00:29:29.200 Bongino said Epstein didn't Clinton himself.
00:29:33.860 All right.
00:29:37.900 But I'm not seeing a direct, I'm not seeing a direct refutation or agreement with me.
00:29:45.020 I want to see that because I want to know if I'm on the right track.
00:29:48.500 True or false, that the part we don't know is the big part?
00:29:54.440 Yes or no?
00:29:56.600 Am I misreading this?
00:29:59.960 Okay.
00:30:00.460 A few people say yes.
00:30:01.760 Right.
00:30:01.960 And did you feel that you were being distracted and you missed that story?
00:30:07.920 Like, I'm wondering, did you miss that?
00:30:09.600 Because we've all been putting our attention on these little, you know, sparkling objects.
00:30:14.380 But so far, the sparkling objects have all been smallish, you know, of great concern because they have the potential to be biggish.
00:30:22.680 So that's why you have maximum concern, even if the examples so far are not fully expressed as bad.
00:30:30.640 But the other thing is already fully expressed, right?
00:30:36.020 If, in fact, people like my account were shadow banned, don't know that, but if they were, that would be a fully expressed plot.
00:30:44.900 The other stuff is just stuff that could have been bad, but it's a good thing we stopped it when we did.
00:30:50.460 The other stuff is what actually happened, if it happened.
00:30:54.700 All right.
00:30:55.220 Too much on that.
00:30:57.660 All right.
00:31:00.900 Would you like your head to totally explode?
00:31:03.000 This is just a question.
00:31:07.120 Let's see if I can make all of your heads explode.
00:31:10.600 There were 80 FBI employees looking at content on Twitter.
00:31:15.100 That is now confirmed.
00:31:17.120 How many FBI agents were continuously monitoring TikTok and reporting to TikTok's management what things should be removed?
00:31:28.620 You don't know, do you?
00:31:30.380 It might be.
00:31:32.020 It might be.
00:31:33.000 Now, it might be that the FBI is monitoring TikTok, and it might be that they are reporting things.
00:31:39.400 But we might expect that TikTok would not necessarily respond.
00:31:43.540 Or TikTok might respond.
00:31:45.480 TikTok might respond to the ones that are just really clean, like this is misinformation.
00:31:50.380 I could imagine that TikTok would take down a video that said the wrong date to vote.
00:31:58.040 I can imagine that.
00:31:59.200 I can imagine FBI telling TikTok to take it down.
00:32:03.000 And I can imagine TikTok knowing that that's not where the big win is for China.
00:32:08.320 China isn't playing small ball.
00:32:12.260 They're not like, oh, I'll start this little rumor and that'll cause some chaos.
00:32:16.040 It's not small ball.
00:32:17.580 If China wants to weaponize TikTok, it's going to be through the major persuasion in general, you know, telling you to go crazy on climate change if they think it's not going to help you, that sort of thing.
00:32:30.420 So it would be a whole topic persuasion.
00:32:32.420 So it would be a whole topic persuasion.
00:32:33.680 That's the danger.
00:32:35.620 So I wouldn't be surprised if FBI is, in fact, watching TikTok and, in fact, reporting things that look bad.
00:32:44.060 And maybe TikTok is taking some of them down.
00:32:47.080 Maybe.
00:32:47.400 But wouldn't you like to know?
00:32:50.640 It feels like a really important thing to know.
00:32:54.060 Now, if I'm seeing Angela laughing at me.
00:32:57.160 Now, if you say, Scott, Scott, Scott, you know they're not, you could be right.
00:33:03.340 You could be right.
00:33:05.440 I don't know.
00:33:08.160 But we'll see.
00:33:08.980 It's a big story.
00:33:09.640 I don't know why we don't know the answer to it.
00:33:15.240 Have I ever mentioned to you that analogies are the best way to know that you've won a debate?
00:33:22.860 But I feel like I've never explained why.
00:33:26.580 And I've had so much confusion about that.
00:33:28.860 I'm going to try it again.
00:33:32.800 Analogies have two functions.
00:33:35.340 One of them is good, and I support it.
00:33:38.120 The other one is ridiculous, and I don't.
00:33:42.080 But if you just say they're all analogies, then you miss that distinction.
00:33:45.540 Here's a good analogy.
00:33:47.260 I'm going to try to explain this thing called a zebra that you've never heard of,
00:33:50.380 because you're from another planet.
00:33:52.040 But somehow you've already learned what a horse is.
00:33:55.100 All right.
00:33:55.320 So I'm going to do a quick definition of a zebra.
00:33:57.700 All right.
00:33:57.820 You remember a horse?
00:33:58.680 Imagine a horse.
00:33:59.820 Now imagine it's white and black stripes.
00:34:03.900 It's a little genetically different from a horse,
00:34:05.760 but that's basically, that's the idea.
00:34:08.680 That's a good use of an analogy.
00:34:10.540 Why?
00:34:11.600 Because all I did was quickly bring you up to speed
00:34:14.820 on something that would have taken longer to explain.
00:34:19.680 Okay?
00:34:20.080 And you would all agree with that as a good use of an analogy, right?
00:34:23.880 Good use.
00:34:24.760 Now, what's a bad use of an analogy?
00:34:26.360 A bad use is that the logic of, or the situation in one analogy can be ported over to the other one.
00:34:37.100 It's not, you can't port logic from one analogy to the other.
00:34:40.460 Well, if it's true in this unrelated story, it must also be true over here.
00:34:46.760 Now, you notice none of that is the case with my zebra horse thing.
00:34:51.380 The zebra horse thing, everybody, everybody, everybody would hear it and say,
00:34:54.880 Oh, thank you.
00:34:55.940 I understand the zebra better now.
00:34:58.340 Right?
00:35:01.040 Everybody okay so far?
00:35:02.680 The zebra one gives nobody anything to complain about when it's done.
00:35:05.920 That is a complete, perfect approach.
00:35:10.340 Now, I'm going to show you a bad analogy,
00:35:13.420 and I'll explain to you why it's the end of the conversation.
00:35:17.880 Again, here's the, I'm going to put some ground rules here.
00:35:22.300 This is not a conversation about whether masks should be used.
00:35:27.360 We're all anti-mask.
00:35:29.700 We're all anti-mask.
00:35:31.620 Let's not talk about that, right?
00:35:33.420 But I'm going to talk about something related to it,
00:35:35.280 and it's the related to it part that's important.
00:35:37.700 We don't want any mask mandates.
00:35:40.280 Let's not talk about that.
00:35:42.200 But in the conversation I had with a woman who was talking about
00:35:46.800 whether the initial viral load made any difference.
00:35:51.120 So that was the conversation.
00:35:53.380 And what do you think?
00:35:54.660 By the way, just check in with you.
00:35:55.880 Do you believe that if you were exposed to more virus initially
00:35:59.760 that you would get sicker?
00:36:02.640 What do you think?
00:36:03.320 What's your current understanding of that?
00:36:05.280 So, like everything, the science is not as good as you would like.
00:36:15.020 But there is a preprint.
00:36:17.420 There's at least one preprint that says that people who lived in a house
00:36:22.080 with somebody who had COVID.
00:36:23.680 So, in the conversation, somebody jumped into the conversation with this analogy.
00:36:32.820 And the analogy was this.
00:36:34.180 Because I'd been making the argument that any kind of a barrier makes a difference.
00:36:40.640 It just might not be enough difference to be worth it.
00:36:44.360 So, for example, right?
00:36:45.860 Like, less than one percent.
00:36:47.840 That's all.
00:36:50.500 That's my entire argument right there.
00:36:52.880 There's no such thing as a barrier that doesn't do anything, right?
00:36:55.760 So, I'm going to put up a chain-link fence.
00:36:57.940 Chain-link fence.
00:36:59.140 How does that do for the hurricane?
00:37:00.800 Bingo.
00:37:01.220 Gotcha.
00:37:02.840 Gotcha.
00:37:05.180 Analogy has made you look like a chimp.
00:37:07.120 So, I guess we're done here, right?
00:37:11.840 I don't think you would even detect the toe.
00:37:16.240 So, my critic has a good point, doesn't he?
00:37:19.520 Wouldn't the air just go sideways?
00:37:21.320 And up and down?
00:37:22.700 And that's the argument for why masks don't work, right?
00:37:26.120 Because the stuff gets out.
00:37:27.540 If it's a poorly fitted mask, it gets out anyway, right?
00:37:32.140 So, I said this.
00:37:33.660 Walk up to that fence and blow on it.
00:37:35.560 If your air goes sideways and up and down, then it'll stop a hurricane.
00:37:47.040 If you give me a chain-link fence, they'll do the same thing as a mask,
00:37:51.180 which is it redirects some of the air sideways and up.
00:37:54.700 Yes.
00:37:55.560 If your chain-link fence can do that, it would stop a hurricane.
00:38:00.640 Right?
00:38:01.120 So, now you say to me, Scott, you're arguing like the details of an imaginary situation with a hurricane.
00:38:07.960 You're wasting your time.
00:38:10.060 Exactly.
00:38:11.740 Exactly.
00:38:13.300 That's the point.
00:38:14.740 As soon as you talk about an analogy, you're in a different conversation.
00:38:19.720 It's just a different conversation.
00:38:22.160 It tells you nothing about the one you're talking about.
00:38:24.580 Right?
00:38:24.700 So, remember, if it's like the zebra, it's a good analogy.
00:38:29.120 It just says something that everybody understands and agrees with.
00:38:31.820 There's no debate on the zebra.
00:38:34.540 But as soon as you get to chain-link fence and hurricane, that's just a different conversation.
00:38:41.100 Because your mask is not a chain-link fence.
00:38:43.900 You know, all the dynamics are different.
00:38:45.580 There's nothing you can learn from that.
00:38:46.900 You'll just end up debating the details.
00:38:48.680 Anyway, so after this long conversation with the woman who was a subject matter expert on masks, I did what I should have done the first minute.
00:38:59.300 Which was?
00:39:01.700 Because the problem seemed to be a logic problem and not a knowledge problem.
00:39:08.920 So, right away, I was like, we're losing the audio on YouTube.
00:39:14.840 So, right away, as soon as the conversation happened, I kept thinking, why is it, why is it, it's almost like I'm talking to somebody who doesn't understand reason and logic.
00:39:27.460 Yeah, she's an author.
00:39:29.140 She's a writer.
00:39:31.560 So, I was talking to, I was having a science conversation with an author.
00:39:35.900 And it went exactly the way you think it would.
00:39:41.020 Exactly the way you think it would.
00:39:42.640 I couldn't even, and in the end, it ended up being just a logic disagreement and not a factual disagreement at all.
00:39:50.560 Although she would disagree with that.
00:39:52.660 Do you know why she would disagree with my characterization of what it was?
00:39:58.260 She's a writer.
00:39:58.920 Anyway, the most consistent pattern in all of social media is that people who are writers for a living, and I'm guilty, are very unlikely to have logical arguments on science and stuff like that.
00:40:17.500 It's just the wrong, it's the wrong group.
00:40:19.480 Have you ever seen me disagree with engineers on Twitter?
00:40:25.000 I'm just curious.
00:40:26.200 Have you ever seen me having a, like an, those of you who follow me on Twitter, have you ever seen me have an extended disagreement with an engineer on Twitter?
00:40:35.440 Nope.
00:40:37.860 Nope.
00:40:38.720 Do you know why?
00:40:40.620 Because they're trained largely the same way I am, which is how to look at the costs and the benefits and make sure you didn't leave anything out.
00:40:47.720 Like, everybody who's trained the same way, you can have a pretty quick conversation with.
00:40:53.480 Because even if you disagree, what happens if I disagree with an engineer?
00:40:58.640 It takes 10 seconds to work it out.
00:41:01.740 Because, like, I say this, I say this, show me your link.
00:41:05.200 Oh, okay, we're done.
00:41:09.260 Yeah, conversations with engineers are just, yes, no, is it logical?
00:41:14.920 Oh, okay.
00:41:15.560 Yeah.
00:41:17.720 So, no, I am exaggerating, of course; engineers can disagree as well, but their disagreement is a completely different type.
00:41:24.640 The disagreement with a writer or an artist, you're never really even talking about the same thing.
00:41:31.720 It's like you're just trying to mold the water into a sculpture.
00:41:36.820 It's just, it's water.
00:41:37.900 There's nothing you can do with it.
00:41:39.060 With an engineer, at least, that's an actual disagreement, so you can work it out.
00:41:43.860 What do you think of the United States deciding to arm Taiwan much more aggressively than we've done before and giving them much better weapons so they can defend against China?
00:41:58.560 Good idea or bad?
00:41:59.400 Good idea or bad?
00:42:06.580 I don't know.
00:42:08.460 Yeah.
00:42:08.960 Yeah, this one, there's no way to, there's no way to game it.
00:42:12.200 The only thing you can know for sure is that the decision will always go in the direction of, will always go in the direction of, follow the money.
00:42:23.320 Yeah, yeah, you could predict it.
00:42:26.780 So, here's the part that I don't know how to predict.
00:42:31.080 If Taiwan and mainland China got into an actual serious war where China decided, you know, we're just going to end this, would that be good or bad for arms manufacturers in America?
00:42:47.560 You say good?
00:42:50.540 I don't know.
00:42:51.260 Because that was a tough one.
00:42:53.320 Because it would be good in terms of maybe immediate orders for products.
00:42:58.360 So, that part we could agree on.
00:43:00.120 It would definitely boost orders.
00:43:02.460 But remember, the people who are operating in their own self-interest still have to live in the world.
00:43:08.140 So, there's no arms manufacturer who's going to start a nuclear war to sell more arms.
00:43:15.520 Because that's so clearly not in anybody's interest, including the arms dealers.
00:43:19.860 Because they would be dead just like everybody else.
00:43:22.000 So, I think the arms dealers have to play a game where they have the maximum amount of conflict short of a nuclear confrontation.
00:43:30.300 Now, I think that they went as close to the line with Ukraine and Russia as anybody should ever go.
00:43:36.680 Now, agree?
00:43:37.880 Now, they went closer to the line with Ukraine
00:43:39.060 than most of us would have said would have been prudent.
00:43:43.560 And here's the problem.
00:43:46.500 You see the problem?
00:43:48.680 So far, they fucking got away with it.
00:43:53.280 That's a problem.
00:43:55.440 The experience in Ukraine is that Russia, so far, is backing down from nuclear use.
00:44:01.480 That is the most dangerous fucking situation we could ever be in.
00:44:06.520 Because that allows those same arms manufacturers to say, you know, Taiwan will be fine too.
00:44:13.900 What would that be?
00:44:16.240 An analogy.
00:44:19.420 That would be an analogy.
00:44:20.800 There's no reason, no reason to think that because Ukraine has so far not sparked a nuclear war that anything we did involving Taiwan and China would turn out the same.
00:44:34.900 That is a big, wrong assumption that somehow those would be similar.
00:44:40.240 The Ukraine war, you can never anticipate the real ripple effect of what a war does.
00:44:48.680 Because sometimes you, you know, you create new technologies because of the war.
00:44:53.320 And it's like, great, at least that part of it.
00:44:56.600 And sometimes there's some lingering thing like, you know, the end of World War I, some would say, maybe you could debate this,
00:45:04.820 but some would say the way we handled the end of World War I wasn't good news.
00:45:08.520 Because it basically created the seeds of World War II.
00:45:12.740 So, I've got a feeling that the Ukraine situation is really going to screw things up for Taiwan.
00:45:20.600 I don't know how, but I hate that we're going to be informed by one thing to handle a separate thing.
00:45:29.280 China needs Taiwan's chips.
00:45:33.100 But, you know, if they thought they could conquer Taiwan in a month, they would have their chips.
00:45:38.520 They just have to rebuild, just don't bomb the chip factories.
00:45:42.200 China would just have to not bomb the chip factories.
00:45:45.100 And if they thought they could gain control in a month, they'd be fine.
00:45:51.820 Yes, we're, we're pronouncing it zee-bra, not zeb-ra.
00:45:55.380 I'm getting a disagreement that zee-bra should be zeb-ra.
00:46:00.620 Now, I don't know what country you're from, with your zeb-ras.
00:46:05.240 But in America, damn it, they're zee-bras.
00:46:09.460 Zee-bras.
00:46:10.760 Australia?
00:46:11.820 British?
00:46:12.660 We don't know.
00:46:13.180 All right.
00:46:18.720 So, now it's been a few days since Trump's NFT successes.
00:46:23.400 And I wonder if there are more people like me who had a Covington Kids initial reaction.
00:46:31.880 Remember the Covington Kids hoax when you saw the first deceptively edited videos?
00:46:38.240 My first response was, God, I hate that kid.
00:46:41.980 Frickin' jerk.
00:46:43.420 You know, just like being a jerk to this guy in public.
00:46:46.840 I hated that kid.
00:46:48.260 And then you see the full video.
00:46:49.960 Nothing like that happened.
00:46:51.940 Completely opposite of what the deceptive video was.
00:46:55.960 Yeah, it was a Rupar video.
00:46:57.120 And I think that happened to me with Trump's NFT.
00:47:05.860 Would you like me to admit I was wrong?
00:47:09.360 Because I hear that I never do that.
00:47:11.340 There's a rumor that I never admit when I'm wrong.
00:47:14.900 Have you heard that rumor?
00:47:16.080 It's widespread.
00:47:18.100 And yeah, the funny thing is, nobody does it more often than I do.
00:47:22.500 I'm going to say that in the world, nobody has consistently admitted when they're wrong
00:47:27.580 more often than I do.
00:47:30.160 Would you agree?
00:47:31.200 If you watch me, you probably say yes.
00:47:32.940 All right, here's where I was wrong.
00:47:34.000 My first reaction to the NFT release, especially since he teased it as a major announcement,
00:47:39.540 was, oh my God, he's done.
00:47:41.740 He's so fucking done.
00:47:43.720 Like, there's just nothing.
00:47:45.800 There's nothing I can say good about this.
00:47:48.500 It's just done.
00:47:49.720 He's just cashing in.
00:47:51.180 It's just done.
00:47:52.120 Now, you saw that was my first response, right?
00:47:54.640 Can you confirm that was my first response?
00:47:57.660 Okay.
00:47:59.040 I was totally wrong.
00:48:02.020 Let me say, with no reservations, sometimes, sometimes you hear me say stuff like, well,
00:48:09.080 I was wrong, but really I was right if you looked at it this way.
00:48:14.060 That's what I say about Ukraine, for example.
00:48:17.040 Yes, I was wrong when I said Putin wouldn't attack, but I was really right in a way, because
00:48:23.080 of the reason I said he wouldn't attack is because it wouldn't work.
00:48:25.680 It would go poorly.
00:48:27.120 And that part I got right.
00:48:28.200 So, usually, you see me saying, well, I was wrong, but not really completely wrong.
00:48:33.480 This time, totally wrong.
00:48:36.940 There's nothing I can say about this.
00:48:38.640 I just want you to accept my complete wrongness on this, with no reservations.
00:48:45.860 This was a good idea that worked out well.
00:48:49.840 Here's why it worked out well.
00:48:53.680 Number one, it's sold out.
00:48:56.900 And it was like a breathtakingly successful fundraiser, right?
00:49:02.260 For the amount of work he put into it, which was one video, probably, and maybe he reviewed
00:49:07.320 it or something.
00:49:08.260 So, he did maybe probably one hour of total work and probably netted, we don't know, two
00:49:17.060 to four million dollars or something for like an hour.
00:49:20.220 Whoever did better than that, right?
00:49:22.500 The best fundraiser ever done, and he owns that now, right?
00:49:27.140 Would you agree that for the bang for the buck, was that not the best fundraising ever done?
00:49:37.360 Now, somebody's saying he did a license deal, which means that he doesn't get the full amount
00:49:41.420 of that.
00:49:42.020 But even as a license deal, it would be tremendous.
00:49:47.060 Tremendous.
00:49:48.140 So, I don't know what percentage he gets.
00:49:49.680 We don't know that.
00:49:51.400 All right.
00:49:52.040 Now, here's the other reason that it wasn't the disaster it looked like.
00:49:56.660 He followed up quickly.
00:49:58.640 Once there was a lot of energy around him, he was getting a lot of attention.
00:50:02.040 He followed up quickly with a very strong campaign video about free speech, which was free
00:50:09.200 money, basically.
00:50:10.180 He said exactly what his base wanted him to say.
00:50:12.480 And he said it well.
00:50:14.560 He said it strong.
00:50:16.420 And I didn't see anything wrong with it.
00:50:18.900 Like, I liked every bit of it.
00:50:20.800 Now, who knows if he could execute.
00:50:22.740 But what he said was perfect.
00:50:25.000 So, here he takes what I thought was this big mistake, but it created energy.
00:50:31.320 And then, here's why I'm embarrassed that I didn't see this coming.
00:50:35.100 He's an energy monster.
00:50:38.860 We created all this NFT negative energy, and he just gathered it up, and then it put it
00:50:45.360 behind the release of his free speech video.
00:50:49.200 Now, who knows how much was planned?
00:50:52.840 You know, it could be that they sped up the free speech video faster than he planned to
00:50:58.360 get it in the same space with the NFT thing, because maybe it wasn't going as well as he thought.
00:51:02.720 But I don't know if it was planned, or he just found a way to snatch victory from the jaws
00:51:09.180 of defeat, but he had a good week.
00:51:12.420 Am I wrong?
00:51:14.920 I think it was one of Trump's best weeks.
00:51:17.700 Because Trump is being vindicated on, you know, all the Twitter stuff.
00:51:24.560 Totally vindicated.
00:51:25.740 So, he's vindicated in seeing that the system was rigged against him, and especially because
00:51:30.460 it's the FBI, because the FBI in this story just makes Trump look more and more like he
00:51:36.540 was always the victim, and more and more like he was right the whole time, even though it's
00:51:40.420 a slightly, you know, off Trump story.
00:51:42.940 It's still, everything is connected.
00:51:45.440 So, I think he had one of the best weeks ever.
00:51:48.160 Now, do you think it's a mistake when an 80-year-old releases an NFT?
00:51:55.400 Go.
00:51:56.460 When age is a question, and Trump releases an NFT?
00:52:01.560 Do you see this play?
00:52:04.140 It's fucking brilliant.
00:52:06.700 It's brilliant.
00:52:08.600 Do you know who's not going to release an NFT?
00:52:12.000 Biden.
00:52:13.580 Do you know why?
00:52:15.560 He's too old.
00:52:16.340 He's too old.
00:52:18.960 Yeah, he wouldn't know what it was.
00:52:20.840 He wouldn't be able to tell you what it was.
00:52:22.980 You know, if somebody asked him about it, he wouldn't be able to explain it.
00:52:26.400 But Trump does licensing for a living.
00:52:30.380 Trump knows what it is.
00:52:32.600 Do you think that Biden could explain to you what an NFT is?
00:52:35.660 You just stop and say, what's an NFT?
00:52:37.480 Nope.
00:52:38.160 Nope.
00:52:38.680 Do you think that Trump could?
00:52:40.440 Yes.
00:52:41.800 Yes.
00:52:42.380 Now, Trump maybe can't give you the details of what a blockchain is.
00:52:46.340 Or maybe he can.
00:52:47.820 You know, maybe he can give you the big picture that it's like a, it's a public record that nobody can change.
00:52:52.880 Like, maybe he knows that.
00:52:54.200 And that would be the only important part.
00:52:55.500 But think about the persuasion that is related to Trump embracing a modern technology that even most of the public is not familiar with.
00:53:09.520 Most of the public is not even familiar with NFTs.
00:53:12.340 And he basically just tied his 76-year-old brand to the most current happening young technology and then, and then, made it the most successful one lately.
00:53:28.380 Has anybody had a successful NFT better than this one?
00:53:33.960 Because the NFT market's garbage, right?
00:53:35.800 It's falling apart.
00:53:36.560 And he still had a successful one.
00:53:37.960 Because, by the way, the value has risen.
00:53:39.920 The value of the ownership of one of those has risen.
00:53:42.680 You could resell it for more.
00:53:43.780 That's completely successful.
00:53:48.680 So, I think you're missing the best part of the play.
00:53:51.900 The best part of the play is he younged himself down.
00:53:54.540 He younged himself down when he's running, maybe, against, you know, the oldest potential candidate.
00:53:59.980 He educated the masses that NFTs are like trading cards.
00:54:05.580 Exactly.
00:54:06.720 That's exactly what he did.
00:54:08.260 Right.
00:54:08.880 I'll bet you there's a whole segment of the population who goes, oh, I get it.
00:54:12.340 They're digital.
00:54:13.300 They're like trading cards.
00:54:14.720 And the first thing they ask is, why can't I just make my own copy?
00:54:18.120 And then somebody says, well, it's something about the blockchain.
00:54:20.940 You go, okay, I don't need to understand that, but I get it.
00:54:26.280 All right.
00:54:29.980 And also, I didn't mind that he was pitching a product because he's a salesperson.
00:54:37.360 Isn't Trump the ultimate salesperson, which is what he calls himself?
00:54:41.180 He basically says, I'm going to try to sell the country.
00:54:44.460 I sell stuff.
00:54:45.780 And then he goes and he sells stuff for his campaign.
00:54:49.120 I don't know.
00:54:49.500 To me, it was right on brand.
00:54:52.100 The perfect Trump brand, I say, is when he creates all this attention.
00:54:59.980 Even negative attention.
00:55:01.760 But when it all settles, you say to yourself, oh, that was a lot better than I expected.
00:55:07.960 Right?
00:55:08.540 How often do you see that pattern where you think he's really stepped in it now?
00:55:13.220 And when the dust clears, you're like, oh, that wasn't so bad after all.
00:55:17.200 All right.
00:55:19.840 Question.
00:55:20.780 I see that Jordan Peterson is warning Western countries that a totalitarian social credit system
00:55:29.020 is highly probable in their societies.
00:55:33.800 So Jordan Peterson says it's highly probable we'll have a totalitarian social credit system.
00:55:39.960 Agree or disagree?
00:55:41.800 Highly, highly probable?
00:55:44.540 Yes or no?
00:55:45.520 So who already has one?
00:55:50.940 What country already or countries already have?
00:55:54.440 Now, China is the one we know of.
00:55:56.120 Is anybody else doing it?
00:55:57.280 Is there any other country on Earth besides China?
00:56:01.960 You say the UK, but I don't believe so.
00:56:04.700 That might be just for vaccines, right?
00:56:07.600 No.
00:56:08.900 No.
00:56:09.340 I mean, there's talk about vaccine passports.
00:56:11.300 But in terms of a total social credit system, nobody else is doing it, right?
00:56:17.180 You're saying Canada?
00:56:18.100 But not a full credit system.
00:56:22.600 They're doing things that look like they're moving in that direction, right?
00:56:25.300 All right, here's my preliminary take.
00:56:32.220 Preliminary take, number one.
00:56:33.860 There are a lot of things that are not social tracking systems, but they rhyme with one, feel like one, and seem to be in the same domain.
00:56:44.060 I think you have to treat them differently, right?
00:56:46.320 I believe that in the United States, and there's no way to prove this, because I know there's plenty of pattern recognition that goes against it.
00:56:54.240 But in my opinion, in the United States, if we had implemented vaccine passports, and I'm not arguing we should have.
00:57:01.540 I'm not pro-passport.
00:57:03.380 But if we had, do you think that we would not, as a public, have been able to get rid of it?
00:57:08.040 Do you think once it was in there, it would just take root, and it would just expand to a total full system?
00:57:16.460 So a lot of you say, yes, that's exactly what the government does.
00:57:19.980 When the government starts a program, it never stops it, right?
00:57:24.700 That's the pattern.
00:57:26.100 Because the government always wants more power, and if they get your money, they never say, we don't need it anymore.
00:57:33.100 So whatever grows just keeps growing.
00:57:35.140 All right, this is where we disagree.
00:57:38.040 I'm not going to say that your reasoning is flawed.
00:57:42.300 It's not.
00:57:43.600 You do not have flawed reasoning.
00:57:45.460 Because your reasoning is based on pattern.
00:57:48.060 The pattern is well-established.
00:57:50.360 Would you agree?
00:57:51.600 It's a well-established pattern that this is exactly the sort of thing you don't want to trust your government with.
00:57:57.860 Right?
00:57:58.720 But here's where I disagree.
00:58:01.140 If it were unimportant, they could certainly get away with it, if we weren't watching and stuff.
00:58:06.460 But I think that the United States, and I won't speak for other Western countries, but I think the United States wouldn't put up with it.
00:58:13.820 So I think you need a totalitarian government to do it, and I think we don't have one.
00:58:23.680 I think the base requirement of a social credit system is a totalitarian government, and we don't have one.
00:58:29.780 Now, does that mean the government would not try to implement it?
00:58:34.880 No.
00:58:35.600 I'm not saying that.
00:58:37.940 Definitely the government would try to get away with anything it could, but I don't think the citizens would allow it.
00:58:45.340 Right?
00:58:45.900 I don't think so.
00:58:46.860 And it might be, you know, there might be a, you can imagine a number of ways that would work.
00:58:54.240 All right, here's one way.
00:58:57.140 Embrace and amplify.
00:58:59.720 All you'd have to do is say, if you're going to give us a social credit system, we want to be able to track the social credit system of Congress.
00:59:07.360 And that's the end of it.
00:59:09.920 That's the end of it.
00:59:11.180 Because there's no way that Congress could impose social tracking on the public and say we're exempt.
00:59:20.180 Let me say this as clearly as possible.
00:59:22.980 If Congress ever said, we're going to put a social tracking system on the public that will not apply to us in Congress,
00:59:30.660 I will be with you to take over the Capitol building.
00:59:34.900 I will join an insurrection.
00:59:39.000 I will look to overthrow the government of the United States if they were to do that.
00:59:45.160 That's why they won't do it.
00:59:47.240 That's why I feel safe.
00:59:49.040 Because that's a red line of red lines.
00:59:52.440 I mean, no.
00:59:54.260 No, you're not going to treat us as second-class citizens while you're exempt from your own rule.
00:59:59.880 And do you think that they would say, all right, well, we don't like it applying to us in Congress,
01:00:05.460 but it's so beneficial we're going to apply it to the people?
01:00:09.540 Nope.
01:00:10.460 Nope.
01:00:11.060 There's no way that would happen.
01:00:12.700 There's no way Congress would allow that to happen to themselves.
01:00:16.300 And I think every one of you would confirm if they were going to, if they try to impose that on us, the public,
01:00:25.240 and make themselves exempt, we will sweep them out of office with whatever means necessary.
01:00:32.720 Violence is allowed.
01:00:35.020 Violence is allowed under that condition.
01:00:37.440 So I'm not recommending any violence in any current situation.
01:00:40.580 The whole context is we will never be in that situation.
01:00:45.300 So in case you're monitoring me on YouTube for inciting violence, no.
01:00:50.140 This is the opposite of inciting violence.
01:00:52.340 I'm saying there's no possibility of inciting violence because the United States will never get to that point
01:00:58.260 because Congress will never impose it on themselves.
01:01:03.820 That's it.
01:01:04.540 So can I be sure that my predicted future is more accurate than the predictions of those of you using pattern recognition,
01:01:14.920 which is operating quite well right now?
01:01:17.520 Your pattern recognition is on point.
01:01:19.460 Because if you say, this looks familiar, you're right.
01:01:24.560 But you know what else it would be?
01:01:29.100 Do you know what pattern recognition is?
01:01:31.740 What's another word for pattern recognition?
01:01:34.540 An analogy.
01:01:40.720 Yeah.
01:01:41.220 Your pattern recognition, they're all analogies.
01:01:44.260 It happened in this other case, completely different case.
01:01:47.680 So I think it'll happen with this one.
01:01:50.180 You're not wrong.
01:01:51.860 You're not wrong.
01:01:53.340 That pattern is a very well-established pattern.
01:01:55.840 But the trouble is, if you don't look for what's different about each situation,
01:02:00.480 you might imagine that the pattern can work in every situation,
01:02:04.160 where it might be a pattern that is limited to some kind of domain.
01:02:09.060 You have to look at the specifics.
01:02:11.060 That's why analogies fail.
01:02:14.320 You reciprocate this.
01:02:15.380 Thank you.
01:02:16.960 Would you stipulate that this is true, that I'm right?
01:02:20.620 Will you stipulate that patterns are really analogies?
01:02:24.400 And if you don't look at the specific situation, you don't really know what's going to go on.
01:02:30.840 Many of you just love your pattern recognition and will never change.
01:02:36.220 It works most of the time.
01:02:38.840 All right.
01:02:39.160 So I'm not terribly worried about that, but it's not a zero.
01:02:45.740 Musk said maybe yesterday morning or the day before that the coup de grace was still coming.
01:02:55.200 Was the coup de grace what we saw about the FBI?
01:03:00.580 Or is there something we haven't seen yet?
01:03:03.380 Does anybody know?
01:03:06.320 You're just guessing, though, right?
01:03:07.740 You're guessing it's the COVID files or you're guessing it's Fauci.
01:03:11.640 But you're only guessing, right?
01:03:15.320 Yeah, the coup de grace.
01:03:21.720 All right.
01:03:22.620 But we think it's still coming.
01:03:25.100 I guess there's still a question whether there's something big coming or not.
01:03:28.640 All right.
01:03:29.080 Is there anything else that's coming?
01:03:33.740 Anything else I missed, at least today?
01:03:35.660 Are you all having trouble on YouTube?
01:03:38.880 Dead feed, buffering.
01:03:41.380 Yeah, it looks like the YouTube feed is totally dead.
01:03:44.160 Now, keep in mind...
01:03:46.660 Now, can you hear me at least?
01:03:52.120 Can you hear me on YouTube?
01:03:55.200 Give me a yes or no.
01:03:56.360 Can you hear me?
01:03:57.740 Because I can't tell if they have audio.
01:03:59.360 They're all talking, but they're not...
01:04:00.660 They're not responding to me, so I think that they can't hear me.
01:04:06.020 Oh, it's good now.
01:04:07.700 All right.
01:04:10.860 So you should know that...
01:04:13.260 Can I just...
01:04:14.800 Let me do a quick check.
01:04:16.940 So I've got two feeds going on two new iPads, both of them on the same Wi-Fi.
01:04:22.320 The locals' feed is working perfectly, right?
01:04:25.280 Or are you having troubles on this one as well?
01:04:29.240 Oh, they are...
01:04:30.400 Well, I'm getting opposite.
01:04:31.640 Some say yes, some say no.
01:04:36.700 Okay, the sound is good.
01:04:38.060 Yeah, I'm getting different responses.
01:04:43.760 So I don't think it's my Wi-Fi.
01:04:46.880 Would you agree?
01:04:47.920 It's not my Wi-Fi, because people are getting different outcomes,
01:04:51.600 and I have the same Wi-Fi for everybody, of course.
01:04:54.900 So it's not in my end.
01:04:57.580 It can't be.
01:05:00.020 And it's not localized...
01:05:02.060 It's not localized just to YouTube.
01:05:04.740 So it might have something to do with just traffic or something.
01:05:10.420 I don't know.
01:05:11.100 So I guess that's another mystery.
01:05:12.920 Now, do you assume that what we found out about the FBI and Twitter,
01:05:19.160 do you assume that the FBI has equal operations in the other platforms?
01:05:25.480 Do you think Google and YouTube and Facebook all have their FBI teams?
01:05:32.720 Of course.
01:05:33.340 Of course.
01:05:36.840 Is there any chance they don't?
01:05:41.240 Like, why wouldn't they?
01:05:42.920 Now, the only way I could see that making sense
01:05:45.340 is if they imagined that Twitter was sort of the tail that wags the dog,
01:05:50.600 and if you get Twitter right, everything else ends up right, but I doubt it.
01:05:54.320 Because a bureaucracy would never ignore Facebook, would it?
01:05:58.540 Could you imagine the FBI bureaucracy?
01:06:00.220 All right, we're going to put 80 people into Twitter,
01:06:03.440 but Facebook said no, so we won't have anything over there?
01:06:08.180 That doesn't sound like anything that happens in the real world, does it?
01:06:11.240 Because who doesn't take a meeting with the FBI?
01:06:14.120 If the FBI says we want a meeting with you, and you're a business, and you're not in trouble,
01:06:24.500 you're not in trouble, they want to work with you to make your job better.
01:06:29.220 Who doesn't take that meeting?
01:06:32.240 Everybody takes that meeting.
01:06:34.200 Everybody.
01:06:35.320 Because they're just offering you a service.
01:06:36.940 We have some information.
01:06:38.540 We can help you monitor the bad people and the election interference.
01:06:41.920 We'll do it for free.
01:06:43.980 And you just have to give us a little, like, access or something.
01:06:46.440 Who would say no to that?
01:06:49.340 Nobody.
01:06:50.500 Nobody would say no to that.
01:06:52.800 This is the same argument I make about Don Jr.'s famous meeting with those Russian-related people
01:06:59.380 in the last election, the first election for Trump.
01:07:04.440 Everybody would take that meeting.
01:07:06.440 It was just downstairs.
01:07:07.480 And sure, they promised something they didn't have, which was some dirt on Hillary.
01:07:14.740 Are you telling me that if you were part of a major campaign, and somebody that you knew,
01:07:20.140 it's not a stranger, it's somebody you know personally.
01:07:23.340 Somebody you know personally says, I have something that will change everything, some dirt on Hillary.
01:07:28.240 All you have to do is come down two floors and sit in this room for five minutes, and you'd have it all.
01:07:34.320 100% of the world takes that meeting.
01:07:36.220 Now, if something came up in the meeting, there was, like, a national security concern,
01:07:42.440 should they alert the FBI after the meeting?
01:07:44.840 Well, yes.
01:07:45.900 No, that didn't happen.
01:07:46.960 It was just a sort of a nothing meeting.
01:07:49.480 But you don't alert the FBI first.
01:07:54.160 That's stupid.
01:07:55.740 Because there's no indication of a crime.
01:07:58.140 Why do you get the FBI involved when there's no indication of a crime?
01:08:01.300 There's just somebody who says they know something, and most people are lying when they say that.
01:08:05.120 I mean, usually it's hyperbole, usually it's not what they say.
01:08:09.420 First, you listen.
01:08:11.140 Does anybody disagree with that?
01:08:13.260 First, you listen, and then you decide.
01:08:15.740 Does that sound sketchy?
01:08:16.720 If it does, then the second thing you do is call in the FBI.
01:08:20.620 And everybody acts like it should have been the other way around.
01:08:23.960 Like, immediately calling the FBI.
01:08:26.620 For what?
01:08:27.200 For what?
01:08:29.220 The guy you know wants to talk to you?
01:08:34.540 All right.
01:08:40.820 Soccer ball with a microphone?
01:08:42.940 What?
01:08:45.180 Yeah, you have to pass the bill to see what's in it.
01:08:47.200 She met with Fusion GPS before and after the meeting.
01:08:53.100 Is that true?
01:08:54.360 I never heard of that.
01:08:56.720 Yeah, I saw the memes about Sam Brinton stealing Santa's bag.
01:09:01.560 That was pretty funny.
01:09:06.820 Trailers on the border?
01:09:07.960 I don't know about that.
01:09:08.980 All right.
01:09:15.180 Thank you for saying I'm correct.
01:09:17.720 Yeah, the omnibus bill, I don't know.
01:09:20.720 That bores me.
01:09:22.000 Budgeting.
01:09:24.880 Don't forget, Jack doesn't know anything, Flower Girl says.
01:09:28.180 All right, are you surprised that we've reached this point in the process
01:09:31.380 about looking into the Twitter file
01:09:33.060 where Jack is still not implicated directly?
01:09:38.980 You know we're at that point, right?
01:09:41.700 Did anybody besides me say this would happen?
01:09:45.340 That we would get to all the way to here
01:09:47.360 and there's no direct information whatsoever
01:09:50.180 that Jack was directly involved in anything bad?
01:09:56.340 Now, you could argue he should have known, right?
01:09:58.660 That's a separate argument.
01:10:00.180 Yeah, the argument of whether he should have known,
01:10:02.600 totally good argument.
01:10:06.920 Right.
01:10:07.400 And by the way, I imagine he would agree with that.
01:10:12.240 I don't know.
01:10:13.240 But I think if you ask Jack, you know, was it like your job?
01:10:16.540 Should you have known what was going on, you know, in greater detail?
01:10:19.820 He probably would say yes.
01:10:21.540 Probably would say yes.
01:10:22.400 Because he's been pretty transparent about everything, I think.
01:10:27.340 You say he knew, but there's no evidence to that.
01:10:30.320 In fact, the evidence strongly suggests the opposite.
01:10:34.280 Strongly suggests.
01:10:35.080 You disagree?
01:10:39.920 Well, I'm only talking about what the evidence is, not what the truth is.
01:10:45.260 The truth, maybe you never know.
01:10:47.380 But the evidence suggests exactly what I predicted.
01:10:50.660 I mean, I had exactly zero people agreeing with me.
01:10:55.220 I'm not wrong about that, right?
01:10:57.220 I think zero people agreed with me when I said, I don't think he was involved directly.
01:11:03.100 Right?
01:11:03.580 Zero people.
01:11:06.780 Now, I could still be wrong.
01:11:09.100 You know, tomorrow we could find out something where I'm wrong.
01:11:11.540 But the fact that we got this far with my hypothesis that he wasn't involved, and it's still active, still a good hypothesis this far in, that's surprising to some of you.
01:11:25.420 Yeah, the simplest explanation, you're right.
01:11:31.660 Scott doesn't know what evidence is.
01:11:34.440 You think that's the problem?
01:11:35.640 Well, for those of you who are saying, I hear you saying that Jack was a bad manager, blah, blah, blah, because he didn't do it.
01:11:48.500 But let me give you a different frame on this.
01:11:52.920 Jack's management of Twitter did not begin and end with his role as CEO.
01:11:59.540 It continues now.
01:12:01.580 If you expand your frame, Jack managed it properly by selling it to or supporting the sale to Musk.
01:12:09.460 Because selling it to Musk was what solved the problem.
01:12:12.740 And Jack said that directly.
01:12:14.880 He said he's the right person to do this.
01:12:16.660 And I think that Jack knew that it would require devastating the company to fix it.
01:12:23.320 Like, you couldn't tweak it.
01:12:25.100 It wasn't like disciplining a few employees and moving some people around.
01:12:29.080 Like, you had to, like, take it down by its roots.
01:12:33.000 And I think that Jack probably knew Musk well enough, or at least his operating methods,
01:12:39.920 to know that Musk could potentially root it out at the, you know, at the grassroots and just pull it out of the ground.
01:12:46.140 But it was hard for the existing CEO to do anything like that.
01:12:50.100 There's a long history of when you need to do major layoffs and major restructure.
01:12:54.960 You usually bring in a new person.
01:12:57.400 That's usually the way, that's the right way to do it.
01:12:59.780 Because the person who's there has personal relationships that are going to influence whether you can do the hard stuff.
01:13:06.440 The person who's been working there has too many personal stakes.
01:13:10.860 People they put in a position, they don't want to fire their friend, that sort of thing.
01:13:15.920 So, if you say that Jack didn't do his job while he was in the job,
01:13:22.040 I would argue that that is refuted by the fact that he supported bringing in Musk,
01:13:29.880 which had to be very directly related to figuring out what was going on in a more aggressive way.
01:13:36.580 So, I feel like he maybe did what was right to compensate for whatever problems had grown up during his reign.
01:13:47.700 Jack knew that Twitter was full of leftists and the effect of that was to bias their operation.
01:14:01.400 Remember, he was very clear about that.
01:14:04.800 He's never argued that Twitter was a fair arbiter.
01:14:09.060 You know that, right?
01:14:10.380 Jack has never said Twitter's operating the way I want it to operate.
01:14:13.880 Everything's fair.
01:14:14.840 He never said that.
01:14:15.620 He actually agreed with the critics that Twitter was operating like a left-leaning organization
01:14:22.500 instead of a, you know, just a platform.
01:14:25.320 He said it directly.
01:14:28.980 You said Jack didn't know Twitter was actively throttling conservatives.
01:14:33.060 Correct.
01:14:34.480 And I still say that.
01:14:37.360 Because the examples given are individuals.
01:14:41.540 There's still not a...
01:14:43.400 That's the dog not barking.
01:14:45.620 The part we still don't know is if people like me were banned just for political reasons.
01:14:52.240 Now, we know there's a bias, which is that people are banning people for sketchy reasons that really, you suspect, might be political.
01:15:02.100 Am I right?
01:15:02.600 So there's the gray area, but I think Jack fully accepted that the gray area was exactly what you thought it was.
01:15:10.320 That if you're staffed with leftists, even though what they're trying to do is just the right thing, like get rid of bad information and stuff, it will end up being focused on conservatives.
01:15:23.560 I feel like he was fully transparent about that, right?
01:15:28.620 It's not my job to defend him, but I'm just telling you what I saw.
01:15:31.900 So the part that he has denied is the part I said has not been demonstrated, that they had a policy or a programmatic policy, I guess, an algorithm, that would ban people like me who had no strikes.
01:15:50.520 And as far as I know, there was no conversation about me.
01:15:52.960 So did I get scooped up in some kind of business?
01:15:55.800 So that's the part that I think he says no to, and we do not have evidence of it.
01:16:01.980 So far.
01:16:03.580 I mean, to me it seems highly probable we will, but so far not.
01:16:10.260 And if we haven't seen it yet, it means Musk either hasn't seen it or for some reason needs to know more or something.
01:16:18.080 But that does suggest that you could be the head of the organization and not know exactly how that algorithm got tweaked.
01:16:30.900 Yeah, we all agree with the buck stops at the top, right?
01:16:34.880 Nobody's arguing responsibility.
01:16:37.700 You get that, right?
01:16:39.780 The responsibility is who's in charge.
01:16:41.900 It doesn't matter if they knew or didn't know.
01:16:43.860 That's just the way it has to work.
01:16:45.640 But I think Jack would agree with that.
01:16:47.540 I mean, I don't know, but everything suggests he would agree with you on that, that it was his responsibility.
01:16:59.200 You say, if Jack didn't drill down for the answers, that's unbelievable.
01:17:03.980 Let me fact check you on that.
01:17:06.340 Jack personally contacted me, you know, a few years ago, when I was complaining about being shadowbanned.
01:17:13.040 He personally introduced me to Del somebody, whose job it was to work in that area.
01:17:21.600 I personally worked with her and quickly determined that she was the problem and that she knew exactly what was going on because she sort of started ghosting me when I got too close.
01:17:33.340 Now, the fact that he connected me with her suggests that he couldn't penetrate her barrier any more than I did.
01:17:43.500 Now, he could have fired her, but that's why you need a musk, somebody who just says, that's not an answer, you're fired, right?
01:17:52.040 I feel like Jack was more of a, well, I'm not getting the answer I want, but I'll try harder.
01:18:00.800 Maybe I'll see if Scott talking to you gets the answer.
01:18:04.760 You know, he was like working around the edges.
01:18:07.480 But to say that he didn't drill down is inaccurate because I actually was part of a process of trying to drill down.
01:18:15.680 It's just I hit an employee wall, you know, a lower level employee was just a brick wall.
01:18:22.080 But don't you think he hit the same brick wall?
01:18:25.320 Let's say he had drilled down with her.
01:18:27.600 All right, Del, Scott has this complaint.
01:18:32.260 And she would say, I see no evidence that that's happening.
01:18:35.360 Then what does he say?
01:18:36.180 Well, look harder.
01:18:38.580 Okay.
01:18:40.000 I'll look for a week.
01:18:40.980 I'll get back to you.
01:18:42.300 Yep, I looked for a week.
01:18:43.920 Nothing there.
01:18:45.920 Then what does Jack do?
01:18:46.900 Fire her?
01:18:48.440 He can't, because he doesn't know whether there's anything there.
01:18:52.200 Right?
01:18:52.820 He doesn't know.
01:18:54.400 So all he knows is what his employees tell him.
01:18:57.360 And when do employees ever tell the truth to the CEO?
01:19:02.140 Hardly ever.
01:19:03.440 Right?
01:19:04.120 Everybody's shading things when they tell the CEO.
01:19:06.180 They're telling him what they think is going to be good for them to hear.
01:19:11.640 Now, if somebody says, I looked and there's nothing there, how do you fire him?
01:19:16.700 Unless you know there's something there.
01:19:18.740 And how could he?
01:19:19.700 It was a much harder problem than you think.
01:19:22.780 The only way to solve it was mass firings.
01:19:26.760 The only way.
01:19:28.380 I don't think any level of management excellence could have kept
01:19:34.020 the same employees in place and then gotten to the bottom of it.
01:19:38.100 That wasn't going to happen.
01:19:39.600 You needed completely new people in there.
01:19:43.340 Some of them.
01:19:44.360 And I'm sure Musk has done that by now.
01:19:47.520 Yeah, the fact that Elon called Twitter a crime scene.
01:19:50.600 I don't know how literal that is, but certainly it feels right.
01:20:00.660 Okay, that's a long comment.
01:20:04.740 All right.
01:20:06.440 Tulsi Gabbard and Tucker Carlson.
01:20:08.320 I didn't see that.
01:20:10.060 Is Jack not technical?
01:20:11.680 Yes, but that's not going to help you.
01:20:12.960 Being technical doesn't mean he's going to go scour the algorithm on his own if he's the CEO.
01:20:21.700 All right.
01:20:23.080 That is all for now.
01:20:24.360 YouTube, thanks for joining.
01:20:25.860 Sorry about your bad connection.
01:20:28.220 Maybe you'll be better tomorrow.
01:20:31.020 Bye for now.