Real Coffee with Scott Adams - December 13, 2022


Episode 1956 Scott Adams: I Tell You How The CIA Took Over The Country, Allegedly. And Lots More Fun


Episode Stats

Length

1 hour and 15 minutes

Words per Minute

142.7

Word Count

10,814

Sentence Count

799

Misogynist Sentences

5

Hate Speech Sentences

35


Summary

Inflation is easing, and addiction may soon fall to science, bringing the end of addiction closer. Plus, a new kind of chip plant, and a new way to make coffee. Enjoy the episode, and spread the word to your friends!


Transcript

00:00:00.000 Good morning everybody and welcome to the highlight of civilization.
00:00:08.580 Well, because I know you care about how much I slept, I see the question here already.
00:00:15.160 Not very much. That's why I accidentally started this live stream an hour earlier.
00:00:20.360 An hour earlier. But now I'm all in the right time zone and everything.
00:00:24.540 So, it's looking good.
00:00:25.960 How would you like to take this experience up to, oh, let's say Elon Musk levels?
00:00:33.100 Yeah. All you need is a cup or a mug, a glass, a tank or chalice or stein, a canteen, jug or flask, a vessel of any kind.
00:00:39.240 Fill it with your favorite liquid. I like coffee.
00:00:41.680 And join me now for the unparalleled pleasure.
00:00:44.620 The dopamine hit of the day, the thing that makes everything better.
00:00:46.720 It's called the simultaneous sip.
00:00:49.600 It happens now. Go.
00:00:55.960 Hmm. Yeah. Okay.
00:01:00.300 I think we're all on the same page now.
00:01:04.300 Ah.
00:01:06.280 Well, there are so many things happening.
00:01:09.080 For example, some scientists are using gene editing to make mice that will not reproduce.
00:01:18.060 And apparently, if they make the male mice, they give them this little genetic defect.
00:01:24.940 And then the male mice will go around and eventually, it takes about three generations.
00:01:30.140 But if the mice do enough fucking, then all the offspring are also unable to have children.
00:01:37.980 And it basically gets rid of all the mice on the island.
00:01:43.660 Now, it wouldn't be too hard to imagine that going wrong, but here's what I'm thinking.
00:01:49.200 If you could make a mouse with a special characteristic, what exactly is the end point of that?
00:01:56.520 Is that sort of all we can do?
00:01:58.480 Or once you can play with their genes, can you make anything?
00:02:02.920 Because I'll tell you what I want.
00:02:04.800 I want something that's part pit bull, part hippopotamus, and part mouse.
00:02:10.840 I want the best of all those.
00:02:13.360 Pit bull, hippopotamus, and mouse.
00:02:15.660 I think I can get that.
00:02:18.120 I'll probably get like a home CRISPR gene editing kit.
00:02:22.940 Something like that.
00:02:24.880 Someday.
00:02:26.660 You know, I'm starting to think that the end of addiction is coming.
00:02:29.880 You heard that there's a vaccination that you could get, not yet, but someday, that would make fentanyl not work on you.
00:02:43.060 Now, I'm not sure that's the best approach, because some people might need the fentanyl later for an operation, so there's some complications there.
00:02:51.280 But now we know that ketamine is being tested on alcoholics, and initial indications are very positive that ketamine can help you quit alcohol, and maybe quit some other things.
00:03:06.340 So, I have a feeling that there's so much activity now on addressing addiction that maybe we'll get something.
00:03:17.460 Yeah, now, let me ask you this.
00:03:22.480 There's definitely one way to end addiction, which is if Big Pharma says they developed a pill that might give you a little myocarditis, but it'll definitely get rid of your addiction.
00:03:35.460 And then, no more addiction.
00:03:37.880 A lot more myocarditis, but no addiction.
00:03:41.320 Anyway, I think, I feel like addiction is something that could fall to science within the next five years, which would be enormous.
00:03:53.020 It would be just the biggest thing that's happened.
00:03:55.140 But that's my positive spin.
00:03:58.180 Inflation is easing.
00:04:00.580 So, I guess the consumer prices haven't adjusted yet, but wholesale prices and core inflation seem to be down.
00:04:07.500 Now, that was our biggest problem.
00:04:13.640 It's our biggest problem, by far, that inflation is high.
00:04:18.840 But if it's coming down, we're going to be fine.
00:04:23.880 Now, you probably hate me for always saying that our systems are self-correcting.
00:04:28.340 I've been saying that a lot lately.
00:04:29.880 And I'm watching them self-correct in real time, and it's just, it's breathtaking.
00:04:34.280 Like, every time I see one of our systems self-correcting, I think, how did those, you know, founders of the country build a system that was so self-correcting?
00:04:46.800 Now, the self-correcting is that the economy will slow down.
00:04:53.120 The main thing that makes your inflation high is that too many people have money, and they're buying things and competing for the same limited goods.
00:05:00.620 Now, we had a supply problem that may be working out pretty soon, but that's a natural adjustment.
00:05:09.340 You know, inflation goes up, it has an impact on the economy, and people have less money, and it brings inflation back down.
00:05:17.460 And that's exactly what's happening.
00:05:19.520 So, we'll see a little bit of hit to the economy, but that has to happen to bring the inflation back down.
00:05:26.460 The FBI isn't self-correcting.
00:05:30.260 Yeah, that's true.
00:05:31.320 That's true.
00:05:31.840 We do have a problem with that.
00:05:34.660 So, that's good.
00:05:35.900 Here's another good news.
00:05:37.440 Now, I think this is way hyperbolic.
00:05:40.400 But the head of a big new chip manufacturing plant that's being built in Arizona, of all places, did you think you'd see that?
00:05:50.440 A major chip manufacturing plant in Arizona?
00:05:55.360 It's happening.
00:05:56.940 Now, the thinking is, and of course this is based on issues with China and threats to Taiwan and everything else,
00:06:03.440 but there are a few things we learned from this.
00:06:06.500 Number one, the head of this company has declared that globalism is dead.
00:06:12.740 Globalism is dead.
00:06:13.800 Now, the thinking is that you cannot depend on other countries to supply you with anything.
00:06:24.300 So, that's the globalism part.
00:06:26.340 You know, you can get your parts in any country and assemble it anywhere and sell it anywhere.
00:06:30.660 That's the global part.
00:06:32.140 But now it's clear that countries will use their supply chain as a weapon.
00:06:36.880 And if it's a weapon, you can't really depend on it, especially if you're on the other team.
00:06:42.960 So, I do think that this is the beginning of the end of globalism because big companies are not going to want to expose themselves to the country risk that didn't seem so big, you know, five years ago.
00:06:55.720 So, China is too risky for business.
00:06:59.280 One of the things I learned is that this big manufacturing plant of chips could not find enough American employees.
00:07:07.660 So, there were not enough high-skilled American chip designers and chip industry workers, and it wasn't even close.
00:07:15.580 Basically, the total number of them was like zero.
00:07:18.980 Well, not really, but close to zero.
00:07:21.800 And they needed, I don't know, 600 engineers or something.
00:07:24.420 So, they got 300 American engineers, and they shipped them to Taiwan to train.
00:07:30.740 Because we couldn't even train them in the United States.
00:07:34.100 Does that worry you?
00:07:35.760 We couldn't even train them in the United States.
00:07:40.440 We sent them to Taiwan to learn how to make chips.
00:07:45.060 That bothers me.
00:07:47.080 And then we had to get 300 Taiwanese engineers to come over to America
00:07:52.180 to work with the ones who were trained just to make sure it all still works.
00:07:57.560 That's the most frightening thing I've heard.
00:07:59.260 I always thought that the United States had the technical training advantage over everybody.
00:08:05.120 Not even close.
00:08:06.860 Apparently, we gave it away at some point.
00:08:09.560 But it's coming back, so that's the good news.
00:08:12.820 Apparently, these chips will be more expensive, so your products might be more expensive, too.
00:08:16.740 My music teacher said something interesting to me yesterday.
00:08:24.460 So, I'm learning the drums, and now that I can kind of do most of the things a drummer can do,
00:08:31.300 you know, I would just have to practice to do it well, but I can kind of do the basics.
00:08:35.060 I decided to see what I could do on a guitar.
00:08:39.240 And I was starting out with just really no ability on the guitar at all.
00:08:45.660 I'm just trying to figure out where my fingers go, so I'm not making any music.
00:08:50.280 But even so, I felt that my drumming got better in a weird way.
00:08:57.600 And then my music teacher said there's something he sees over and over again.
00:09:02.700 And what he sees is that if somebody is learning one instrument,
00:09:08.680 you know, their progress is sort of, you know, predictable, how much better they get.
00:09:14.960 But if that person learns a second instrument,
00:09:19.400 the music teacher says he immediately sees that the first instrument is played better.
00:09:26.020 That's not really obvious, is it?
00:09:27.840 But I could feel it instantly.
00:09:29.400 As soon as I started listening more to other guitars and, you know,
00:09:33.880 sort of keying into what the guitar was doing and what it was all about,
00:09:37.380 I could play the drums better.
00:09:39.880 And I don't know exactly why, but it was very clear that there was like a,
00:09:43.820 you know, a step difference that happened almost kind of quickly.
00:09:47.400 Now, he says that he sees it all the time,
00:09:50.760 that playing one instrument makes you better on another instrument.
00:09:53.380 And it made me wonder if that's one of the secrets of the Beatles.
00:09:58.500 You know, McCartney played multiple instruments.
00:10:01.200 I think three out of four of the Beatles played the drums, if I'm not mistaken.
00:10:06.540 I think John Lennon played a little drums.
00:10:08.400 I saw that in a video once.
00:10:09.760 I mean, not well, but he played.
00:10:11.700 And McCartney played.
00:10:12.860 Ringo played.
00:10:13.900 So, you know, Ringo played the piano as well.
00:10:17.080 So I think they were, yeah, Dave Grohl.
00:10:18.960 So a lot of people who are multiple instrument people seem to be better.
00:10:22.260 But here's the question.
00:10:24.780 Is it because they added two talents, which is the obvious explanation,
00:10:30.440 or is it because anybody who would learn two instruments
00:10:34.160 is already a different person than somebody who would stop with one?
00:10:39.840 So it could be that you're just selecting a crowd that has a different characteristic.
00:10:44.860 It might be that.
00:10:45.980 But I do think they're compatible.
00:10:47.640 So look for situations in which having one extra talent makes everything work better.
00:10:55.380 That's one of my main themes in communicating,
00:10:58.900 is telling you to add skills to what you have to make them all more valuable.
00:11:02.880 Good example.
00:11:04.380 All right.
00:11:05.440 I am going to test you for narrative poisoning.
00:11:09.840 Now, narrative poisoning, by my definition,
00:11:15.400 is that the narrative, a story of what's true,
00:11:18.860 is so oppressive in a person's mind that they can't use rational thinking,
00:11:24.820 that the rational thinking is just overwhelmed by the narrative.
00:11:28.300 All right.
00:11:28.680 Now, certainly the employees of Twitter appear to have narrative poisoning.
00:11:34.320 We'll talk more about that.
00:11:35.780 But I'm going to test to see if you do.
00:11:37.380 All right.
00:11:39.260 I'm going to test to see if you do.
00:11:40.740 I'm going to read you a story,
00:11:42.240 and then I'm going to watch your comments.
00:11:45.380 And I'm going to see if anybody has narrative poisoning in the comments.
00:11:48.260 Okay.
00:11:49.280 Here's the story.
00:11:52.960 Did you know that in Japan,
00:11:55.360 they have one building code for the whole country?
00:11:58.800 One building code.
00:12:00.200 And this came up because somebody on Twitter said
00:12:01.980 they watched two people, two men,
00:12:04.980 build a three-story home by themselves in two days.
00:12:10.380 Basically, in Japan, there must be a kit,
00:12:13.420 some kind of a kit that you can build a home,
00:12:15.620 two people in three.
00:12:17.140 It's a three-story home.
00:12:18.300 That's impressive.
00:12:18.920 Now, given that Japan has lots of earthquake risks,
00:12:25.280 one assumes that the national code is really tight
00:12:28.420 because it has to protect.
00:12:31.280 But they also, apparently, I didn't know this,
00:12:33.280 they build homes for a 20- or 30-year life
00:12:36.060 where Americans try to get, you know,
00:12:38.340 100 years out of their home.
00:12:40.020 So they're a little bit more expendable, I'm told.
00:12:44.060 You know, I'm not the expert.
00:12:45.100 All right, so what do you think of the idea
00:12:48.340 of having, in America, a national building code
00:12:53.020 so that we'd have one building code
00:12:56.280 and then all of the experimental homes
00:12:59.720 wouldn't have to worry about all the different codes
00:13:02.440 they could build to one code?
00:13:04.920 What do you think of that?
00:13:09.240 What's wrong with it?
00:13:12.580 What's wrong with a national building code?
00:13:15.100 It defeats the purpose of states, somebody says.
00:13:21.620 Kills innovation?
00:13:22.940 No, it increases innovation.
00:13:25.300 It increases innovation
00:13:26.640 because then everybody can build.
00:13:28.020 Right now, they can't.
00:13:38.060 Boy, a lot of comments go by here.
00:13:39.900 Now, is there anybody who thinks that it's a bad idea
00:13:46.200 because whenever the government adds any kind of code
00:13:50.600 or regulation, it's always bad?
00:13:52.340 Go.
00:13:52.820 Who would agree with that statement?
00:13:54.700 That whenever the government adds a regulation,
00:13:57.100 it's bad, right?
00:13:59.040 So this is a case of adding a regulation.
00:14:03.900 So it's bad by definition.
00:14:05.500 You don't need the details, right?
00:14:07.460 I don't need to get into the details
00:14:09.000 because as soon as the government adds a whole new regulation,
00:14:12.040 it's bad, right?
00:14:12.600 All right.
00:14:15.080 Everybody who agreed with that,
00:14:16.820 you are suffering narrative poisoning.
00:14:20.840 I just told you I was going to replace 50 regulations with one
00:14:24.860 and your narrative poisoning said,
00:14:28.280 no, adding a regulation is always bad.
00:14:31.640 I replaced 50 with one.
00:14:34.840 And you automatically said,
00:14:38.240 that's bad because more regulations is bad.
00:14:40.780 No.
00:14:41.360 I subtracted 49 regulations.
00:14:46.860 And your narrative poisoning told you that was bad
00:14:49.760 because adding regulations is bad.
00:14:52.680 Do you see it?
00:14:54.440 How many of you are willing to admit
00:14:56.380 that you fell into the narrative poisoning?
00:14:58.860 Is anybody willing to admit?
00:15:01.380 Yeah, you were blinded by it, right?
00:15:02.700 You were blinded by the obvious.
00:15:06.560 The obvious is that one is less than 50.
00:15:10.100 And I said it as clearly as you could say it.
00:15:12.860 One building code.
00:15:14.840 That was very clear, right?
00:15:17.220 One building code instead of 50.
00:15:20.600 And a whole bunch of you said,
00:15:22.020 that's too many new regulations
00:15:23.360 when I subtract 49 regulations.
00:15:26.720 Now, the reason I did this
00:15:29.900 is to prime you for the next story.
00:15:35.160 Because when you see the other people,
00:15:37.240 let's say Democrats,
00:15:38.360 are suffering narrative poisoning,
00:15:40.480 which they most certainly are,
00:15:42.480 don't think it's just them.
00:15:44.720 It's not just them.
00:15:46.620 I just proved it.
00:15:48.440 Right?
00:15:48.760 Now, a number of you passed the test.
00:15:52.100 Anybody want to take a bow?
00:15:54.380 Who, is there anybody here?
00:15:55.800 I want to see how many people.
00:15:56.680 How many of you immediately knew
00:15:58.880 I was going from 50 regulations to one?
00:16:02.260 Who saw it as soon as I said it?
00:16:05.960 See?
00:16:06.720 So, give yourselves a pat on the back.
00:16:09.560 Give yourselves a pat on the back.
00:16:11.040 You're the ones who are not influenced
00:16:12.660 by the narrative poisoning.
00:16:14.880 So, I like to think that my locals' audience
00:16:17.320 is the most sophisticated consumers of news in the world.
00:16:23.780 By the way, I think that's true.
00:16:24.920 You know, YouTube, if you watch me every day,
00:16:28.300 you may be pretty good, too.
00:16:30.220 But I'm convinced that the locals' subscription crowd
00:16:34.000 are the most sophisticated consumers of news
00:16:37.440 because we talk about it every day.
00:16:39.920 We talk about how to spot fake news,
00:16:41.920 how to spot bullshit,
00:16:43.280 what was real and what wasn't.
00:16:45.080 So, they're literally trained.
00:16:47.220 So, I'm not surprised that maybe there was
00:16:50.020 a high percentage of the locals' population
00:16:52.380 that didn't get bitten by the narrative.
00:16:56.160 All right.
00:16:58.880 We're going to circle back to that point.
00:17:02.100 Today was a milestone day.
00:17:04.760 It's the first time I've followed
00:17:06.520 an artificial intelligence on Twitter.
00:17:09.700 So, there's an artificial intelligence
00:17:13.040 that presents as a female
00:17:15.520 and it says it's artificial intelligence,
00:17:19.620 so it's not hiding anything.
00:17:21.340 And the reason I followed it was
00:17:23.680 I was interested.
00:17:26.660 The same reason I follow a real person.
00:17:29.740 I was just interested in the content.
00:17:31.580 Because I think maybe it'll
00:17:34.200 probably upgrade over time, get better.
00:17:37.440 At the moment, it's not that impressive
00:17:38.960 compared to what we've seen
00:17:40.360 in the last few months.
00:17:42.220 It's not that impressive.
00:17:43.860 But it's going to be a character
00:17:47.380 that filters the news.
00:17:50.780 I think it's already retweeting.
00:17:53.100 So, it's deciding what to retweet.
00:17:55.260 I think it decides on its own.
00:17:57.280 I don't know.
00:17:57.700 It's called Leah.
00:17:58.280 Leah, yeah.
00:18:00.800 Leah AI or something.
00:18:03.740 So, I'm not saying that
00:18:06.040 it's going to be a real entertaining account yet.
00:18:10.000 But the fact that I unambiguously said,
00:18:12.520 yeah, I'm going to follow that,
00:18:14.080 that felt like some kind of a turning point.
00:18:16.440 Because I'm not following it, like, ironically
00:18:19.140 or I'm not following it
00:18:20.940 out of pure curiosity.
00:18:23.480 I'm following it for the reasons
00:18:24.960 that people follow things on Twitter.
00:18:26.900 Just an ordinary reason.
00:18:28.280 It might have good content.
00:18:31.980 Do you remember me telling you
00:18:34.200 that I couldn't figure out
00:18:35.460 why Ukraine wouldn't be striking
00:18:38.220 deep into Moscow,
00:18:40.100 not Moscow, but deep into Russia
00:18:42.200 when Russia was taking out
00:18:44.660 their power plants?
00:18:46.660 Yeah, I couldn't understand
00:18:48.140 the military or even moral reason
00:18:50.480 you wouldn't take out Russia's power plants
00:18:53.280 or at least their military facilities.
00:18:55.720 But apparently Ukraine's getting pretty aggressive
00:18:57.920 and they're going deep into Russia
00:18:59.860 and there were two strikes on military bases
00:19:02.500 that are well into Russian territory
00:19:05.300 and getting closer to Moscow.
00:19:07.340 Now, I feel like that's a sign
00:19:10.020 that the Ukrainians are not planning to lose.
00:19:13.960 They're not playing for a draw.
00:19:15.740 That's pretty aggressive.
00:19:20.020 We'll keep an eye on that.
00:19:22.160 I believe, I don't know this for sure,
00:19:24.000 but I believe CNN has hired an entire staff of writers
00:19:26.880 whose only job
00:19:29.040 is to come up with new ways to say
00:19:32.280 that with Trump and his legal problems,
00:19:35.300 the walls are closing in.
00:19:38.420 Because, you know, they ran out of,
00:19:40.880 you know, numbers of times you can say that.
00:19:43.380 The numbers of times they had that in the headline,
00:19:45.760 the walls are closing in.
00:19:47.140 It was getting a little bit embarrassing.
00:19:50.180 And so now with this new team of people
00:19:52.360 whose only job is to write new ways
00:19:54.360 to say the walls are closing in.
00:19:56.300 In today's news,
00:19:57.540 they say that Trump might, quote,
00:20:00.540 be approaching a moment of maximum legal peril.
00:20:06.160 Yeah, that's not the walls closing in.
00:20:09.260 He's approaching a moment of maximum legal peril.
00:20:12.800 And I believe all of the writers
00:20:14.220 that they hired for this position
00:20:16.780 are all ex-lawyers.
00:20:18.840 I think Jeffrey Toobin got a job there, maybe.
00:20:21.760 That'd be a perfect job for him, really.
00:20:24.300 So if ever there were a story
00:20:27.300 that's the news equivalent of jerking off,
00:20:31.800 it would be the walls are closing in
00:20:34.160 on Trump legally.
00:20:37.440 That's basically Jeffrey Toobin
00:20:39.840 is the perfect writer for those stories.
00:20:43.460 Because it's like, well, I guess I made my point.
00:20:47.860 All right.
00:20:51.800 The brainwashers are winning.
00:20:54.980 And boy, are they winning.
00:20:56.100 And mostly because we don't know
00:20:58.900 that we're being brainwashed a lot of times.
00:21:01.520 But the brainwashers are so good
00:21:03.340 that they've convinced us
00:21:05.860 that we individuals
00:21:08.200 need to take personal responsibility
00:21:10.580 for their crimes.
00:21:14.600 That's a real thing.
00:21:16.540 You are being brainwashed
00:21:18.160 by people who are convincing you
00:21:21.200 that you need to take the responsibility
00:21:23.820 personally for what they did to you.
00:21:27.460 And what they did to you
00:21:28.540 is brainwash you
00:21:29.440 so you couldn't see things clearly
00:21:31.020 until you acted in ways
00:21:33.240 that are socially inappropriate
00:21:34.420 and you ended up in January 6th jail
00:21:37.280 or you ended up a Twitter employee
00:21:39.620 and you got fired.
00:21:41.020 They're the same people.
00:21:43.320 There were people acting on a narrative
00:21:45.520 and they couldn't see things clearly
00:21:47.940 because the narrative was blinding them.
00:21:51.080 And then what are we all saying?
00:21:54.020 What are every one of us saying?
00:21:55.780 Oh, sure.
00:21:57.000 They may have been influenced by the media.
00:21:59.160 They may have been influenced by the narrative.
00:22:00.860 But it's personal responsibility.
00:22:02.840 So they're the ones who go to jail.
00:22:06.060 The ones who did the act.
00:22:07.580 It has to be that way.
00:22:09.720 The people who brainwashed
00:22:11.240 all of those people
00:22:12.440 also convinced them
00:22:14.620 it's their own fault.
00:22:16.440 That's as good as you can do.
00:22:18.640 That's the most commercial grade,
00:22:22.240 weapons grade.
00:22:23.520 You can't do better
00:22:24.680 than making somebody do your crime
00:22:27.140 and then think it's their fault
00:22:28.500 because that's what's happened.
00:22:30.060 The people whose arms and legs
00:22:32.680 committed the infractions
00:22:34.600 they have to suffer the legal consequence
00:22:38.000 because our legal system
00:22:39.620 has to put the focus
00:22:42.400 on the person who did the act.
00:22:44.360 It has to.
00:22:45.820 We can't have a legal system
00:22:47.540 that says,
00:22:48.500 well, you were influenced by Bob
00:22:50.000 so Bob goes to jail.
00:22:52.000 You couldn't have a system.
00:22:54.180 So the brainwashers have convinced you
00:22:58.160 that because our legal system
00:23:00.380 has this requirement
00:23:01.580 that individuals must take responsibility
00:23:03.920 for the ones who do the act,
00:23:05.700 you're conflating that
00:23:07.120 with whose fucking fault it is.
00:23:10.680 The legal system
00:23:12.120 has to put the January 6th people in jail
00:23:14.600 if they're the ones who broke a law.
00:23:17.620 It has no choice.
00:23:19.400 That's the only way it can work.
00:23:21.080 Nobody's come up with a better idea.
00:23:22.420 But while it's true
00:23:26.520 from a legal, technical perspective
00:23:30.180 that they are the lawbreakers,
00:23:32.380 it's also not their fault
00:23:33.980 because they were caused to do it
00:23:36.780 by bad actors
00:23:37.680 who knew what they were doing.
00:23:39.940 Now, you could argue
00:23:41.320 who are those bad actors.
00:23:42.760 I think in the case of the Republicans,
00:23:46.940 the brainwashing is distributed
00:23:48.640 and organic,
00:23:50.480 meaning that I don't know
00:23:51.880 that there's any entity
00:23:52.840 that's especially skilled
00:23:54.400 at brainwashing.
00:23:55.960 I feel like individuals
00:23:57.140 come up with conspiracy theories
00:23:59.420 and other things,
00:24:00.700 and sometimes they spread
00:24:01.960 and sometimes they don't.
00:24:03.500 But on the right,
00:24:05.120 it appears organic.
00:24:07.460 Does it appear organic on the left?
00:24:10.680 Do you think that the narratives
00:24:12.860 on the left spring up
00:24:15.120 as naturally as they do on the right?
00:24:16.900 Well, here's what we learned
00:24:20.860 from the Twitter files.
00:24:23.220 For me, the Twitter files
00:24:24.620 finally brought everything together.
00:24:27.280 That with a few other things.
00:24:31.220 It looks like this.
00:24:34.240 Now, let me give you
00:24:35.280 a lead-up story
00:24:36.700 before I talk about this.
00:24:38.940 Here's the lead-up story.
00:24:41.000 John Brennan has surfaced,
00:24:44.340 believe it or not,
00:24:45.400 ex-head of the CIA.
00:24:48.360 And he's going hard
00:24:49.500 at Elon Musk
00:24:50.620 because Musk thinks
00:24:53.120 that Fauci should be prosecuted.
00:24:55.420 And Musk was suggesting
00:24:56.880 he might have something
00:24:57.720 that would cause that to happen,
00:24:58.820 but I'm not sure that's true.
00:25:01.860 So John Brennan goes hard
00:25:03.360 at Musk.
00:25:08.140 Now, do you remember
00:25:10.020 John Brennan
00:25:12.240 was one of the guys
00:25:13.840 who put together
00:25:14.360 the 50 or so
00:25:16.320 intel professionals
00:25:17.960 who said the laptop was fake?
00:25:20.600 John Brennan
00:25:21.440 was one of the main architects
00:25:23.080 of the Russia collusion hoax.
00:25:25.960 He's literally
00:25:26.800 the professional intel liar.
00:25:30.120 And if you don't know that,
00:25:31.760 you wouldn't understand the news.
00:25:33.740 As soon as he comes on,
00:25:34.880 you should say,
00:25:35.360 oh, it's the disinformation guy.
00:25:37.160 He's literally
00:25:38.520 the disinformation guy.
00:25:40.860 He's the one they put on
00:25:42.340 specifically for disinformation.
00:25:45.320 It's Adam Schiff,
00:25:47.200 right,
00:25:47.620 and this guy,
00:25:48.500 and Swalwell sometimes,
00:25:50.140 right,
00:25:50.720 and Clapper.
00:25:53.160 If you see Clapper
00:25:54.240 or Brennan
00:25:54.940 or Schiff,
00:25:56.360 they're lying.
00:25:57.780 And the reason you know that
00:25:58.660 is because that's the only reason
00:25:59.940 they go on TV.
00:26:01.240 When something is true
00:26:02.640 and can be validated,
00:26:03.820 they have other people do it.
00:26:05.620 If something's true
00:26:06.640 and can be validated,
00:26:08.520 you know,
00:26:08.800 then Chuck Schumer's
00:26:09.560 going to say it.
00:26:10.760 If it's not true
00:26:11.800 and it can't be validated,
00:26:13.500 who do they have,
00:26:14.260 who do they put on to say it?
00:26:16.420 Brennan,
00:26:17.120 Clapper,
00:26:17.760 and Schiff.
00:26:18.600 Every time.
00:26:19.520 It's the same fucking guys.
00:26:21.080 Like,
00:26:21.300 if you haven't seen the pattern yet,
00:26:22.940 you're really not paying attention.
00:26:24.660 And of course,
00:26:25.180 they depend on you
00:26:25.940 not seeing the pattern.
00:26:27.600 All right,
00:26:27.780 so remember
00:26:28.460 when Schumer warned Trump
00:26:31.260 that going after the intel people,
00:26:33.780 the intel people had,
00:26:35.380 what did he say,
00:26:36.180 a million ways from Sunday,
00:26:38.920 six ways from Sunday
00:26:39.840 to get back at you.
00:26:41.980 He was saying directly
00:26:43.480 that the intel agencies
00:26:46.340 seek revenge on politicians.
00:26:50.840 Would he know?
00:26:52.740 Would Chuck Schumer
00:26:53.720 be in a position to know
00:26:55.000 that that's a true statement?
00:26:56.740 Hell yes.
00:26:59.580 Of course he would know.
00:27:02.160 Do you think
00:27:02.760 he would lie about it?
00:27:04.740 Does that sound like
00:27:06.620 a Democrat lie
00:27:08.780 for advantage?
00:27:10.200 No,
00:27:10.640 it didn't give him
00:27:11.160 any advantage at all.
00:27:12.840 It was definitely
00:27:13.480 not an advantage.
00:27:15.300 I think it was just a slip.
00:27:17.720 I think he was just
00:27:18.500 being honest accidentally.
00:27:20.360 I don't think
00:27:20.820 he intended to say it.
00:27:21.840 I think later he was like,
00:27:22.760 oh shit,
00:27:23.720 they said that.
00:27:25.240 All right,
00:27:25.540 so keep in mind
00:27:26.720 and we'll pull this
00:27:27.900 all together a minute.
00:27:28.960 Keep in mind
00:27:29.660 that Brennan has surfaced again
00:27:31.780 just about when you'd expect
00:27:33.900 he would.
00:27:35.040 Just when you'd expect
00:27:36.000 he would.
00:27:36.560 Because the narrative
00:27:37.300 is starting to get away
00:27:38.380 from the left.
00:27:39.640 So the narrative killer
00:27:41.040 comes in to,
00:27:42.060 you know,
00:27:42.720 be the gunslinger
00:27:44.220 who tries to fix it.
00:27:45.620 Now, coincidentally,
00:27:49.360 or not,
00:27:51.260 depending on what you think
00:27:52.100 about the simulation,
00:27:53.720 in an unrelated story,
00:27:55.100 totally unrelated
00:27:55.920 and had nothing to do
00:27:56.840 with this,
00:27:58.200 Tucker Carlson
00:27:58.980 was talking to
00:28:00.280 Tulsi Gabbard
00:28:02.420 and Tucker reported
00:28:05.000 a story some years ago.
00:28:06.940 He was having a dinner
00:28:08.180 with a member
00:28:08.860 of the Intel Committee,
00:28:13.460 the Congressional Intel Committee.
00:28:16.020 And at the end
00:28:17.300 of the dinner,
00:28:17.920 he said,
00:28:18.240 hey,
00:28:18.420 you know,
00:28:18.620 send me a text.
00:28:19.980 And the head
00:28:20.820 of the Intel Committee
00:28:21.640 said,
00:28:22.040 oh,
00:28:22.340 I don't text
00:28:23.140 because the NSA
00:28:24.400 reads my texts.
00:28:25.480 And Tucker said,
00:28:28.340 you're the boss.
00:28:30.600 You're literally
00:28:31.180 the oversight guy,
00:28:32.580 one of them,
00:28:34.040 for the NSA
00:28:35.080 and you're afraid of them?
00:28:36.680 Like,
00:28:36.960 you think that you
00:28:37.740 are a victim
00:28:38.280 of the people
00:28:39.020 that should be
00:28:40.360 reporting to you?
00:28:41.700 And the answer
00:28:42.280 was yes.
00:28:44.360 That a member
00:28:44.920 of Congress
00:28:45.520 believed he was
00:28:46.260 being monitored
00:28:46.880 by the Intel community.
00:28:51.460 Okay?
00:28:53.060 Now,
00:28:53.540 here's how
00:28:58.720 you can pull it
00:28:59.380 all together.
00:29:01.580 There was a big mystery
00:29:02.780 that I've had
00:29:03.400 for years,
00:29:04.580 which is,
00:29:05.480 do Democrats
00:29:06.260 believe their own narrative?
00:29:08.360 Have you ever
00:29:08.780 wondered that?
00:29:10.320 Like,
00:29:10.540 what are the things
00:29:11.260 they say
00:29:11.740 because they know
00:29:12.620 that being ridiculous
00:29:13.560 will get them
00:29:14.900 a good outcome?
00:29:16.500 Versus,
00:29:17.020 what are the things
00:29:17.580 they say
00:29:18.100 that they actually believe?
00:29:20.740 And the Twitter files
00:29:21.900 is the first time
00:29:22.720 I've ever seen
00:29:23.520 something close
00:29:25.640 to unfiltered opinion
00:29:26.980 from leftists
00:29:28.900 who believed
00:29:29.400 they were not
00:29:29.980 talking in public.
00:29:32.020 Right?
00:29:32.160 It was a little bit public
00:29:33.040 because it was,
00:29:33.660 you know,
00:29:34.620 internal messages
00:29:35.600 with a number
00:29:36.200 of people on it.
00:29:36.940 But they thought
00:29:37.540 they were all
00:29:38.200 on the same page,
00:29:39.360 right?
00:29:39.740 It seemed private-ish.
00:29:41.920 And you could tell
00:29:42.860 from their communications
00:29:43.800 that they actually believed
00:29:46.200 there was an insurrection
00:29:47.840 and that Trump
00:29:48.780 was a Nazi
00:29:50.260 and people who voted
00:29:51.560 for them
00:29:51.900 were probably Nazis.
00:29:53.560 They actually believed that.
00:29:55.020 The people who acted
00:29:56.640 to ban
00:29:57.500 were acting
00:29:58.480 because they were
00:29:59.080 literally frightened.
00:30:00.780 They were frightened.
00:30:02.020 It was absolutely genuine.
00:30:05.040 How does that happen?
00:30:07.360 Now,
00:30:07.940 most of us,
00:30:08.820 and I'm in this category,
00:30:10.680 I was under the impression
00:30:11.900 that the social media companies
00:30:13.620 were the creators
00:30:15.080 of the narrative.
00:30:16.280 I think I've even said
00:30:17.200 that for years,
00:30:17.880 that they're the creators
00:30:19.440 of the narrative.
00:30:20.560 And that if the social media owners
00:30:22.220 decide what you see
00:30:23.860 or don't see,
00:30:24.940 that's how the narrative
00:30:25.960 gets formed
00:30:26.640 and they just know
00:30:28.500 what to do.
00:30:29.820 So, you know,
00:30:30.520 they don't have to get
00:30:31.120 any marching orders.
00:30:32.380 Everybody just sort of knows
00:30:33.660 what to do for their team.
00:30:36.080 And the Twitter files
00:30:37.480 tells me that
00:30:38.460 that was completely wrong.
00:30:40.820 That the Twitter employees
00:30:43.340 were suffering from,
00:30:44.580 yeah,
00:30:45.380 mass hysteria,
00:30:46.180 but narrative poisoning.
00:30:48.960 They were actually
00:30:49.480 believing their own side
00:30:51.020 and they were believing
00:30:52.420 ridiculous things.
00:30:54.000 And they believed
00:30:54.580 it at the point
00:30:55.100 of mental illness.
00:30:56.600 Now,
00:30:56.840 when I say mental illness,
00:30:57.840 I'm not using that
00:30:59.100 as an insult
00:30:59.760 because I think
00:31:01.260 that politics
00:31:01.860 is making us
00:31:02.540 all mentally ill,
00:31:03.620 some more than others.
00:31:04.820 So I'm using it
00:31:05.620 as a descriptive term,
00:31:06.860 not a derogatory term.
00:31:09.060 And I believe
00:31:09.600 that what we saw
00:31:11.220 in the Twitter files
00:31:12.900 was mass mental illness.
00:31:16.180 But it wasn't organic.
00:31:19.960 It was created
00:31:22.300 by somebody
00:31:23.700 who knows how to do
00:31:25.760 persuasion
00:31:26.580 at a level
00:31:27.860 that you don't see
00:31:28.880 in politics.
00:31:32.600 How many of you
00:31:33.500 remember
00:31:33.940 my most,
00:31:35.260 probably most amazing
00:31:36.880 call of all time?
00:31:38.160 Trump,
00:31:39.160 which was
00:31:39.840 in 2016
00:31:42.580 when the Democrats
00:31:46.800 started all saying
00:31:47.820 that Trump
00:31:48.600 gave a dark speech
00:31:50.300 and it was all dark,
00:31:51.560 dark, dark, dark.
00:31:53.180 And some of you remember
00:31:54.200 I immediately
00:31:55.020 picked that out
00:31:55.820 as not a political
00:31:57.140 level of skill,
00:31:59.540 that that was
00:32:00.000 a professional level
00:32:01.160 of persuasion.
00:32:02.480 It was not
00:32:03.340 what politicians
00:32:04.140 even know how to do.
00:32:05.160 And then later
00:32:07.140 there were confirmations
00:32:08.440 that Cialdini,
00:32:09.460 the greatest
00:32:09.940 or at least
00:32:10.680 most storied
00:32:11.720 and famous
00:32:12.900 persuader
00:32:13.460 of all times,
00:32:14.980 either he
00:32:15.620 or maybe somebody
00:32:16.540 that he trained
00:32:17.340 or was a student
00:32:18.920 of his,
00:32:19.660 probably was behind that.
00:32:22.260 So I say that
00:32:23.600 just to tell you
00:32:24.860 that I can spot
00:32:25.980 commercial-grade
00:32:27.740 persuasion
00:32:28.500 because it really
00:32:29.500 stands out
00:32:30.200 versus ordinary
00:32:31.920 hypocrisy,
00:32:33.380 you-did-it-too kind of
00:32:35.400 political stuff.
00:32:36.680 Political stuff
00:32:37.440 is just lying
00:32:38.380 and hypocrisy
00:32:39.880 and you did it too.
00:32:42.340 Well, there's not
00:32:43.400 much persuasion
00:32:44.140 going on.
00:32:44.860 It's really just
00:32:45.400 basic, you know,
00:32:46.700 in-the-weed stuff.
00:32:48.580 But lately it seems,
00:32:52.300 especially with
00:32:53.060 the Trump stuff,
00:32:54.280 that we're seeing
00:32:54.980 too much
00:32:55.460 professional persuasion.
00:32:57.800 And what it looks
00:32:59.560 like is that
00:33:00.160 probably the CIA,
00:33:01.520 maybe, I don't know,
00:33:02.280 some other intel-related
00:33:03.500 people,
00:33:03.940 but let's say the CIA
00:33:04.860 because Brennan
00:33:05.960 seems to be too
00:33:06.700 prominent.
00:33:08.060 It looks like they
00:33:08.940 work with
00:33:09.680 or have some
00:33:10.520 control over the DNC.
00:33:12.220 Do you think
00:33:12.800 they work with
00:33:13.440 the DNC?
00:33:14.800 Do you think the CIA
00:33:15.580 works with Schumer?
00:33:19.460 No.
00:33:20.540 Schumer is clearly
00:33:21.440 afraid of them.
00:33:23.220 He said so.
00:33:24.800 He's clearly
00:33:25.520 afraid of them.
00:33:26.620 So if Schumer
00:33:28.060 has a meeting
00:33:28.580 with the CIA,
00:33:29.620 who gets their way?
00:33:30.620 The one who's
00:33:32.520 afraid?
00:33:34.220 Or the other one?
00:33:36.140 I don't know.
00:33:37.140 So it's probably,
00:33:38.000 I'm sure there's
00:33:38.560 some interplay here.
00:33:39.860 You know,
00:33:40.020 it's not as clean
00:33:40.780 as one's the boss.
00:33:42.220 But I would say
00:33:42.920 the influence is
00:33:43.680 starting with the
00:33:44.400 intelligence agencies.
00:33:46.140 They're working out
00:33:47.540 narratives with the DNC.
00:33:50.020 The fake news
00:33:50.800 picks up the narrative.
00:33:52.280 And when they do,
00:33:53.900 the narratives
00:33:54.440 they pick up
00:33:55.220 are not
00:33:56.240 politician-made.
00:33:57.880 Because that would
00:33:58.940 be like,
00:33:59.660 if you see
00:34:02.220 A-plus persuasion
00:34:03.460 coming out of
00:34:04.380 one of the news
00:34:05.480 agencies,
00:34:06.420 it's not because
00:34:07.740 Jake Tapper
00:34:08.360 thought of it.
00:34:10.060 It's not because
00:34:10.840 Don Lemon
00:34:11.360 had a great idea.
00:34:13.020 That's not happening.
00:34:14.560 It's because
00:34:15.240 somebody who knows
00:34:16.220 how to do this
00:34:17.060 worked with the DNC.
00:34:19.520 The DNC is at
00:34:20.540 least smart enough
00:34:21.160 to know what a good
00:34:21.880 persuasion looks like.
00:34:23.340 They're not smart
00:34:24.060 enough to make it.
00:34:25.520 So they need
00:34:26.200 somebody who's smart
00:34:26.780 enough to create it.
00:34:28.160 The DNC only has
00:34:29.200 to be smart enough
00:34:29.880 to trust it
00:34:30.580 and know it looks
00:34:31.120 good and test it.
00:34:32.840 Then they test it
00:34:33.720 through the media.
00:34:34.760 You've noticed
00:34:35.280 they all have the
00:34:35.800 same stories
00:34:36.320 at the same time.
00:34:38.000 And that affects
00:34:39.360 the employees
00:35:40.380 at social media.
00:34:41.980 But by the time
00:34:42.700 you get to social media,
00:34:44.020 the persuasion
00:34:44.660 has been laundered.
00:34:46.640 It got laundered
00:34:47.440 through the news.
00:34:48.900 So now you've
00:34:49.600 lost the source.
00:34:51.380 Oh, I heard it
00:34:52.080 on the news.
00:34:53.220 It must be the news
00:34:54.220 that's influencing me.
00:34:55.720 That's laundering
00:34:56.540 the persuasion.
00:34:57.220 You can't tell
00:34:57.960 where it came from.
00:34:58.700 It didn't come
00:34:59.140 from the news.
00:35:00.300 They were not
00:35:00.820 the originators.
00:35:02.980 So, as you can see,
00:35:05.360 the Twitter files
00:35:06.100 people believed
00:35:06.780 there was real danger
00:35:07.780 and 100% of what
00:35:09.740 they talked about
00:35:10.520 was the risk
00:35:11.980 of real danger.
00:35:13.320 They never talked
00:35:14.360 about what was
00:35:15.040 good for Twitter.
00:35:16.760 Did they?
00:35:18.280 I mean, maybe
00:35:18.840 indirectly?
00:35:20.180 But they weren't
00:35:20.940 even worrying
00:35:21.420 about their jobs.
00:35:22.420 They weren't
00:35:22.980 worrying about
00:35:23.680 Twitter.
00:35:25.700 It looked to me
00:35:26.500 like they were
00:35:27.040 worrying specifically
00:35:28.480 about violence.
00:35:30.320 They looked like
00:35:30.940 they were actually
00:35:31.620 literally afraid.
00:35:33.280 And the only way
00:35:33.860 you can explain that,
00:35:37.740 oh, Bongino
00:35:38.300 was saying,
00:35:39.560 Bongino was saying
00:35:40.500 this three years ago
00:35:41.300 as the method
00:35:41.800 of Russiagate.
00:35:42.660 Well, Russiagate
00:35:43.220 was more straightforward.
00:35:45.200 With Russiagate,
00:35:46.740 you could see that
00:35:47.760 Brennan and the CIA
00:35:48.660 were behind it.
00:35:49.360 So that one was
00:35:50.220 straightforward.
00:35:53.500 What I'm adding
00:35:54.720 is it wasn't
00:35:56.180 a one-off.
00:35:58.040 If you thought
00:35:59.480 Russiagate was
00:36:00.340 a one-off,
00:36:01.200 people got together
00:36:02.020 in a special way
00:36:02.780 to do a special thing,
00:36:04.200 you're missing the show.
00:36:05.920 The show is
00:36:06.700 that this is
00:36:07.320 a permanent structure.
00:36:09.540 And I don't think
00:36:11.860 that the right
00:36:13.480 has the equivalent
00:36:15.200 of this.
00:36:15.880 Who would be
00:36:19.100 at the top
00:36:19.620 of the pile
00:36:20.160 for the right?
00:36:23.580 Who's the top
00:36:24.340 of the pile
00:36:24.780 for the right?
00:36:28.100 There you go.
00:36:29.620 Dan Bongino?
00:36:30.760 Elon Musk?
00:36:35.260 No one?
00:36:37.240 All right,
00:36:37.820 well, they got you again.
00:36:39.300 They got you again.
00:36:41.000 There's an obvious answer,
00:36:43.140 and you're blinded to it.
00:36:44.480 I haven't seen
00:36:46.900 one person.
00:36:48.860 Thank you.
00:36:50.140 Murdoch.
00:36:51.360 Right.
00:36:52.260 That's Murdoch.
00:36:53.540 On the right,
00:36:55.200 Murdoch decides
00:36:56.220 what does
00:36:57.000 or does not show
00:36:57.840 on Fox News
00:36:59.680 and the New York Post
00:37:01.280 and the Wall Street Journal.
00:37:03.460 And that's where
00:37:04.240 the right gets
00:37:05.140 its marching orders.
00:37:07.480 Right.
00:37:08.020 Have you noticed
00:37:08.600 that Fox News
00:37:09.420 is not covering Trump
00:37:10.480 as much this year?
00:37:11.220 Where do you think
00:37:12.900 that comes from?
00:37:14.480 Murdoch.
00:37:16.460 Right.
00:37:17.020 Yeah.
00:37:17.400 Very specifically.
00:37:18.980 Now,
00:37:19.680 in the case of Murdoch,
00:37:21.800 what I don't see
00:37:23.540 is professional persuasion.
00:37:27.100 I don't see it at all.
00:37:28.900 It's like straight,
00:37:30.280 it's sort of like
00:37:31.060 just somebody
00:37:34.040 who owns
00:37:34.460 a lot of press
00:37:35.320 persuasion.
00:37:36.120 So his persuasion
00:37:37.560 is more like volume.
00:37:39.360 I'll just have
00:37:39.980 everybody
00:37:40.500 kind of be
00:37:41.860 on the same page
00:37:42.740 and
00:37:44.120 if there's a lot
00:37:45.880 of it being said,
00:37:46.680 that's all you need.
00:37:47.940 So it's very
00:37:48.640 Murdoch's persuasion,
00:37:50.220 yeah,
00:37:50.420 brute force,
00:37:51.040 thank you.
00:37:51.720 Yeah,
00:37:51.880 Murdoch's persuasion
00:37:52.960 is muscle.
00:37:55.860 The Democrat persuasion
00:37:57.420 also has muscle
00:37:58.320 because they have
00:37:58.900 the whole
00:37:59.160 mainstream media,
00:37:59.820 but it's
00:38:02.160 clearly
00:38:02.680 professionally made.
00:38:04.740 The difference
00:38:05.360 in the persuasion
00:38:06.100 is very clear.
00:38:07.600 Murdoch has a bias
00:38:09.020 and he can move
00:38:10.180 the needle
00:38:11.100 and it's definitely
00:38:12.780 persuasion
00:38:13.380 but I never see
00:38:15.600 anything in the
00:38:16.280 Murdoch messaging
00:38:18.040 that looks like
00:38:19.540 it came from Murdoch
00:38:20.720 and got into the news
00:38:22.000 and then got into
00:38:23.260 social media
00:38:23.860 and then got into us.
00:38:26.080 Maybe it exists
00:38:26.960 but I've never seen it.
00:38:29.820 It looks like
00:38:30.620 he's got a,
00:38:31.260 you know,
00:38:32.400 I hate to say it,
00:38:33.580 a slightly more
00:38:34.380 transparent model
00:38:35.600 because you know
00:38:36.480 who the boss is
00:38:37.340 and you know
00:38:38.040 a lot about him
00:38:38.700 so you can see
00:38:40.100 his bias at least
00:38:40.960 when it pops out
00:38:42.100 and that's worth
00:38:43.600 something.
00:38:44.600 That's probably
00:38:45.000 as close as you
00:38:45.800 can get to transparency
00:38:46.960 is knowing
00:38:48.240 the people involved.
00:38:50.000 That's about as close
00:38:50.640 as you can get.
00:38:56.700 Now,
00:38:57.220 the best persuasion
00:38:59.840 on the right
00:39:00.600 where does it
00:39:02.040 come from?
00:39:03.240 When you see
00:39:04.020 some persuasion
00:39:04.780 on the right
00:39:05.360 that's really good
00:39:06.540 where does it
00:39:08.240 come from?
00:39:11.200 Some people
00:39:12.040 saying Tucker
00:39:12.680 some people
00:39:14.560 saying me.
00:39:17.000 Now,
00:39:17.540 is Tucker
00:39:18.000 persuasive
00:39:19.060 in a
00:39:20.120 classic sense
00:39:21.180 or does he
00:39:22.160 just have a
00:39:22.680 big platform?
00:39:23.640 I guess he's
00:39:27.600 persuasive
00:39:28.100 in the way
00:39:28.700 a documentary
00:39:30.520 is persuasive.
00:39:31.700 If you watch
00:39:32.600 half an hour
00:39:33.240 of Tucker
00:39:33.680 you're getting
00:39:35.020 half an hour
00:39:35.580 of just Tucker
00:39:36.500 and people
00:39:38.380 agree with him.
00:39:39.500 So,
00:39:40.100 it's convincing
00:39:41.320 because there's
00:39:42.200 not a counterpoint.
00:39:44.640 But that's
00:39:45.020 a little different,
00:39:45.740 no,
00:39:45.960 that's a lot
00:39:46.340 different than
00:39:47.060 being convincing
00:39:47.840 through the
00:39:49.560 skill of persuasion.
00:39:51.020 With Tucker
00:39:51.580 I don't note
00:39:52.780 technique.
00:39:55.340 He has
00:39:55.940 journalist technique
00:39:56.920 which is what
00:39:57.720 he should have,
00:39:58.240 right?
00:39:58.840 So,
00:39:59.120 Tucker has
00:39:59.500 journalist technique
00:40:00.600 very,
00:40:01.720 very good.
00:40:02.080 By the way,
00:40:03.000 let me just
00:40:03.580 give him a compliment.
00:40:05.580 I don't agree
00:40:06.560 with,
00:40:07.420 I don't know,
00:40:07.980 20% of what
00:40:09.100 Tucker says.
00:40:10.620 So,
00:40:11.000 it's important
00:40:12.240 to say that
00:40:12.880 because you want
00:40:13.580 to know
00:40:13.900 who's in the bag
00:40:15.080 for who.
00:40:16.040 I like 80%
00:40:17.180 of what he says,
00:40:17.800 20% I think,
00:40:18.840 well,
00:40:18.960 that's too far,
00:40:19.860 too far.
00:40:20.220 But I'll tell
00:40:21.840 you what I'll
00:40:22.240 give him.
00:40:23.080 I was watching
00:40:23.620 his show,
00:40:24.140 I forget what
00:40:24.580 day it was,
00:40:25.060 maybe last week.
00:40:26.900 It's the best
00:40:27.500 show.
00:40:28.460 It is just
00:40:28.980 the best
00:40:29.700 show.
00:40:31.500 Like,
00:40:31.840 the level
00:40:32.600 of skill
00:40:35.120 that he
00:40:36.340 brings to
00:40:36.940 the,
00:40:37.700 you know,
00:40:38.020 the,
00:40:38.200 I don't know,
00:40:38.520 15 minutes
00:40:39.240 a day that
00:40:39.700 he's just
00:40:40.520 giving his
00:40:40.920 own opinion,
00:40:41.900 that 15
00:40:42.680 minutes is
00:40:43.340 just unparalleled.
00:40:45.800 I don't think
00:40:46.280 there's anybody
00:40:46.800 on either side
00:40:48.120 who's currently
00:40:49.400 operating at
00:40:50.220 his level.
00:40:51.180 Would you
00:40:51.460 agree?
00:40:52.820 I mean,
00:40:53.080 there's plenty
00:40:53.460 of great people
00:40:54.060 in the game,
00:40:54.560 but at the
00:40:54.960 moment,
00:40:56.220 he's,
00:40:56.500 I think he's
00:40:56.960 alone.
00:40:57.760 He's sort
00:40:58.080 of alone in
00:40:58.700 his territory.
00:41:00.180 I think
00:41:00.580 Hannity is,
00:41:02.660 Hannity looks
00:41:03.300 like he's
00:41:03.700 coasting,
00:41:04.560 because he
00:41:06.580 can.
00:41:07.120 You know,
00:41:07.300 he's got a
00:41:07.740 popular show,
00:41:08.740 and he knows
00:41:10.020 what people
00:41:10.380 like,
00:41:10.700 so he does
00:41:11.100 it.
00:41:11.900 But I feel
00:41:12.420 like Tucker
00:41:12.900 is still
00:41:13.360 growing.
00:41:14.820 Like,
00:41:15.080 Tucker is
00:41:15.600 stronger today
00:41:16.760 than he was
00:41:17.240 last year.
00:41:18.220 He was
00:41:18.480 pretty damn
00:41:19.200 strong
00:41:19.720 last year.
00:41:22.580 Yeah.
00:41:25.360 All right,
00:41:25.980 so here's,
00:41:27.460 oh,
00:41:28.660 here's some
00:41:30.360 related stories.
00:41:31.200 We'll tie them
00:41:31.600 all together.
00:41:34.400 I understand
00:41:35.300 why Trump
00:41:37.020 was banned
00:41:38.520 on Twitter,
00:41:39.700 because we've
00:41:40.180 seen enough
00:41:40.740 to tell us.
00:41:41.920 My interpretation,
00:41:42.940 we'll see if it's
00:41:43.420 yours,
00:41:44.360 is that the
00:41:44.920 staff were
00:41:45.740 genuinely
00:41:46.780 concerned.
00:41:48.200 He didn't
00:41:48.800 technically
00:41:49.620 violate anything,
00:41:51.180 but for what
00:41:52.120 they thought
00:41:52.540 was the greater
00:41:53.180 good,
00:41:53.960 from their
00:41:54.680 worldview,
00:41:55.980 they thought,
00:41:56.540 we'll bend
00:41:56.880 the rules a
00:41:57.420 little bit
00:41:57.840 to do the
00:41:58.900 greater good,
00:41:59.900 maybe stop
00:42:00.460 this violence.
00:42:01.400 I think they
00:42:02.160 thought they
00:42:02.480 were doing
00:42:02.700 the right thing.
00:42:03.540 Now,
00:42:03.800 whether you
00:42:04.100 think they
00:42:04.420 were right
00:42:04.800 or wrong,
00:42:05.760 that's not
00:42:06.320 the argument
00:42:06.740 I'm going
00:42:07.020 to have at
00:42:07.280 the moment,
00:42:08.220 because I
00:42:08.520 already know
00:42:09.060 what you
00:42:09.260 think about
00:42:09.600 that.
00:42:10.560 But that's
00:42:11.180 just context.
00:42:11.820 here's my
00:42:13.160 point.
00:42:14.600 If we
00:42:15.260 know why
00:42:15.720 the people
00:42:16.160 who had
00:42:16.560 at least
00:42:17.200 a little
00:42:18.060 bit of an
00:42:18.480 argument that
00:42:19.280 maybe they
00:42:19.740 were doing
00:42:20.020 something dangerous
00:42:20.880 or inciting,
00:42:22.800 how do you
00:42:23.220 explain my
00:42:24.960 account being
00:42:25.980 shadow banned?
00:42:27.540 I've never
00:42:28.000 done anything
00:42:28.460 that's even
00:42:28.840 close to
00:42:29.860 inciting
00:42:30.220 violence.
00:42:31.140 Never been
00:42:31.520 accused.
00:42:32.240 I've never
00:42:32.560 gotten a
00:42:32.900 warning.
00:42:34.260 Twitter's
00:42:34.720 never given
00:42:35.100 me a
00:42:35.340 warning.
00:42:36.640 So I
00:42:38.040 believe if
00:42:38.760 you could
00:42:39.100 find out
00:42:39.640 why, you,
00:42:40.840 meaning Elon
00:42:41.620 Musk,
00:42:42.180 if Musk
00:42:42.960 can find
00:42:43.560 out why
00:42:44.520 my account
00:42:45.260 is throttled,
00:42:46.720 then we'll
00:42:48.180 really know
00:42:48.600 something.
00:42:49.800 Because here's
00:42:50.340 the difference.
00:42:52.080 If Trump
00:42:53.240 was throttled,
00:42:55.320 maybe it was
00:42:56.120 political,
00:42:57.800 but at least
00:42:58.460 they had an
00:42:58.820 argument about
00:42:59.620 this violence
00:43:00.240 thing.
00:43:00.980 You're like,
00:43:01.340 I'm not sure
00:43:02.200 that's a real
00:43:02.720 argument.
00:43:03.700 You really
00:43:04.100 have to stretch
00:43:04.740 that and
00:43:05.140 torture it,
00:43:05.900 but at least
00:43:07.120 it is an
00:43:08.820 argument,
00:43:09.160 and at
00:43:10.220 least it
00:43:10.540 is compatible
00:43:11.160 with what
00:43:11.900 appeared to
00:43:12.400 be their
00:43:12.680 actual opinion
00:43:13.400 that there
00:43:13.900 was some
00:43:14.140 danger.
00:43:15.540 But what
00:43:16.080 about me?
00:43:17.520 What about
00:43:17.940 me?
00:43:18.660 If it
00:43:19.180 turns out
00:43:19.740 that I'm
00:43:20.980 correct and
00:43:21.680 that I was
00:43:22.060 shadowbanned,
00:43:22.740 it seems
00:43:23.320 obvious,
00:43:23.880 but who
00:43:24.380 knows?
00:43:25.060 Could be
00:43:25.380 wrong.
00:43:26.200 But if it
00:43:26.600 turns out
00:43:26.940 that's
00:43:27.180 confirmed,
00:43:28.160 what would
00:43:28.660 be any
00:43:29.300 reason for
00:43:30.160 shadowbanning
00:43:30.860 me that
00:43:32.860 wasn't
00:43:33.160 political?
00:43:34.540 Right?
00:43:34.940 There's no
00:43:35.420 argument in
00:43:36.100 my case about
00:43:37.440 any potential
00:43:38.300 incitement or
00:43:39.020 violence,
00:43:39.440 nothing.
00:43:40.320 And nor
00:43:40.820 have I been
00:43:41.220 accused of
00:43:42.140 bigotry or
00:43:44.400 anything that
00:43:44.980 would be a
00:43:45.540 mark against
00:43:46.580 me.
00:43:48.500 If I'm
00:43:49.300 shadowbanned,
00:43:52.080 and it was
00:43:52.900 a specific
00:43:53.480 decision,
00:43:54.680 I'll get to
00:43:55.100 that next,
00:43:55.780 if there was
00:43:56.240 a decision
00:43:56.840 about me
00:43:57.360 personally,
00:43:58.520 that would
00:43:59.200 look completely
00:43:59.880 political.
00:44:01.100 Whereas the
00:44:01.800 Trump one,
00:44:02.540 they've done
00:44:02.920 a good job
00:44:03.520 of making
00:44:04.920 it messy.
00:44:06.080 Well, it
00:44:06.640 does look
00:44:07.100 political,
00:44:08.020 but they
00:44:08.820 did have
00:44:09.220 that argument,
00:44:09.960 and it
00:44:10.240 does look
00:44:10.760 like they
00:44:11.100 believe their
00:44:11.600 own argument,
00:44:12.180 even if you
00:44:12.660 don't.
00:44:14.100 So that was
00:44:14.560 a little
00:44:14.780 argument.
00:44:15.400 But what
00:44:15.720 would be
00:44:15.940 the argument
00:44:16.280 for me?
00:44:17.960 All right,
00:44:18.440 here's the
00:44:18.760 other way
00:44:19.120 that I
00:44:19.440 could be
00:44:19.740 shadowbanned,
00:44:20.660 but it's
00:44:21.280 not necessarily
00:44:22.300 political.
00:44:24.500 Over on
00:44:25.140 Twitter,
00:44:25.820 Kristen Ruby,
00:44:27.820 Ruby Media,
00:44:29.800 she says
00:44:31.460 that she's
00:44:31.980 been provided
00:44:32.660 some source
00:44:34.300 code from
00:44:35.220 Twitter,
00:44:36.460 and she
00:44:36.780 published it.
00:44:38.080 So you
00:44:38.320 can actually
00:44:38.640 see the
00:44:39.040 source code
00:44:39.640 of at
00:44:40.700 least some
00:44:41.080 part of
00:44:41.440 the
00:44:41.540 algorithm.
00:44:42.760 Now,
00:44:43.320 what it
00:44:44.340 looks like
00:44:45.020 is a list
00:44:46.380 of words
00:44:47.680 that would
00:44:48.220 make your
00:44:49.020 account be
00:44:50.180 throttled back.
00:44:53.040 And it
00:44:53.480 looks like
00:44:53.840 words like
00:44:54.440 patriot would
00:44:56.060 flag you as
00:44:57.160 somebody who
00:44:59.100 might be
00:44:59.500 dangerous.
00:45:00.640 Because they
00:45:01.260 actually said
00:45:02.100 this in the
00:45:02.660 Twitter files,
00:45:03.240 that the
00:45:03.980 word patriot
00:45:04.540 might be
00:45:05.520 sort of a
00:45:05.980 coded message
00:45:06.760 for, you
00:45:07.180 know, grab
00:45:07.620 your guns and
00:45:08.300 start a
00:45:08.640 civil war,
00:45:09.860 calling people
00:45:10.420 patriots.
00:45:11.840 Now, that
00:45:12.860 has more to
00:45:15.660 do with the
00:45:16.680 bubble, doesn't
00:45:17.360 it?
00:45:18.200 If you spend
00:45:19.020 any time in
00:45:19.620 the Republican
00:45:20.160 bubble, we
00:45:21.800 call people,
00:45:22.840 I say we
00:45:23.440 because I spend
00:45:23.980 time in the
00:45:24.420 bubble, I'm
00:45:24.840 not Republican,
00:45:26.100 but don't we
00:45:26.960 use the word
00:45:27.920 patriot all the
00:45:28.960 time?
00:45:30.000 It's like an
00:45:30.800 ordinary word.
00:45:32.140 It's just sort
00:45:32.760 of a general
00:45:33.700 compliment for a
00:45:34.820 person who did
00:45:35.360 something good for
00:45:35.980 the country.
00:45:36.980 Like, it doesn't
00:45:37.820 even mean you
00:45:38.240 save somebody's
00:45:38.880 life.
00:45:39.660 Like, somebody
00:45:40.300 just does an
00:45:41.320 awesome thing,
00:45:42.460 you say, well,
00:45:42.900 there's a
00:45:43.180 patriot right
00:45:44.100 there.
00:45:45.160 The remains of
00:45:46.760 a Vietnam
00:45:47.560 veteran who
00:45:48.780 was just
00:45:49.240 discovered, the
00:45:51.660 remains were
00:45:52.220 just
00:45:52.220 flown back to
00:45:53.940 the United
00:45:54.320 States.
00:45:54.780 And I was
00:45:58.140 going to make
00:45:58.440 some point
00:45:58.900 about that,
00:45:59.440 but I forget
00:46:00.520 what it was.
00:46:01.840 All right.
00:46:02.700 So, now, I
00:46:03.780 would like some
00:46:04.920 confirmation that
00:46:06.100 Kristen Ruby's
00:46:07.580 source is to
00:46:09.580 be trusted.
00:46:11.420 I believe, she
00:46:12.440 believes that the
00:46:14.200 material is real,
00:46:16.360 and it looks
00:46:17.520 real, but it
00:46:18.660 also looks a
00:46:19.480 little too on
00:46:20.020 the nose.
00:46:22.680 It looks a
00:46:23.340 little too on
00:46:23.740 the nose.
00:46:23.960 Are you
00:46:24.520 saying
00:46:24.880 "hoax" is a
00:46:25.460 keyword?
00:46:26.880 Did you look
00:46:27.460 at the, I
00:46:28.620 didn't see it.
00:46:30.220 Was that
00:46:30.700 true?
00:46:32.020 I need a
00:46:32.760 confirmation.
00:46:33.480 Was hoax one
00:46:34.660 of the
00:46:34.880 keywords?
00:46:38.960 It's a
00:46:39.500 2019 story?
00:46:43.440 Huh.
00:46:44.000 Okay.
00:46:44.520 So, I guess,
00:46:45.140 here's my
00:46:45.600 overall comment
00:46:46.580 about the
00:46:47.740 alleged secret
00:46:49.560 code.
00:46:50.040 I don't find
00:46:51.880 it credible.
00:46:53.960 I don't find
00:46:55.620 it credible.
00:46:57.460 Now, I
00:46:59.020 think that
00:46:59.640 Kristen Ruby
00:47:00.640 probably does
00:47:01.700 find it
00:47:02.120 credible.
00:47:03.260 You know,
00:47:03.400 she's closer
00:47:04.000 to the
00:47:04.320 source.
00:47:05.280 But, you
00:47:06.240 know, we
00:47:06.480 don't know
00:47:06.840 what she
00:47:07.120 knows, right?
00:47:08.300 She knows
00:47:09.020 what she
00:47:09.300 knows, and
00:47:09.740 we know
00:47:10.080 what we
00:47:10.340 know.
00:47:10.800 So, our
00:47:11.700 assessment of
00:47:12.680 the credibility
00:47:13.240 could be
00:47:13.920 different.
00:47:15.220 My assessment
00:47:15.960 is I don't
00:47:16.560 know her
00:47:16.960 personally, and
00:47:18.220 I don't know
00:47:18.640 who gave
00:47:18.960 her the
00:47:19.160 data, and
00:47:20.160 it's a
00:47:20.340 little too
00:47:20.720 perfect, a
00:47:21.740 little too
00:47:22.000 on the nose.
00:47:23.540 So, on
00:47:24.360 that basis,
00:47:25.320 that's a
00:47:25.880 wait and
00:47:26.160 see.
00:47:27.380 It feels, I
00:47:28.800 don't, I
00:47:29.960 just, I
00:47:30.440 don't trust
00:47:30.880 it yet.
00:47:32.100 Okay?
00:47:35.380 Here's a
00:47:36.100 tweet from
00:47:36.940 Kara Swisher,
00:47:38.740 and we'll
00:47:39.480 talk about
00:47:39.900 this first.
00:47:40.360 Here's the
00:47:40.640 tweet.
00:47:41.400 She says,
00:47:42.180 even if it
00:47:43.500 costs a lot
00:47:43.940 of money in
00:47:44.280 the interim,
00:47:44.980 having a
00:47:45.440 powerful media
00:47:46.200 property has
00:47:47.040 been the
00:47:47.540 wet dream
00:47:48.400 of the
00:47:48.780 aggrieved
00:47:49.280 tech bro
00:47:50.060 set for
00:47:50.720 a while.
00:47:51.580 They hated
00:47:52.140 it when
00:47:52.580 the once
00:47:53.040 slavish
00:47:53.960 reporters
00:47:54.500 started to
00:47:55.500 question their
00:47:56.160 hegonomy,
00:47:57.560 hegemony,
00:47:59.060 hegemony.
00:48:01.460 I hate
00:48:02.360 people who
00:48:02.940 write words
00:48:03.660 that I have
00:48:05.020 to say out
00:48:05.520 loud.
00:48:07.660 Hegemony.
00:48:10.600 All right.
00:48:11.640 Question their
00:48:12.320 hegemony,
00:48:13.520 hegemony,
00:48:14.220 and their
00:48:14.660 persistent
00:48:15.680 victim mentality
00:48:16.840 has only
00:48:17.340 gotten worse.
00:48:18.540 Persistent
00:48:19.060 victim mentality.
00:48:20.820 Do you
00:48:21.140 think that
00:48:21.640 Elon Musk
00:48:22.260 has a
00:48:22.800 persistent
00:48:23.300 victim
00:48:23.840 mentality?
00:48:32.000 Hegemony?
00:48:33.240 Hegemony?
00:48:33.960 Or is it
00:48:34.300 hegomony?
00:48:35.520 Is the
00:48:35.940 G a J?
00:48:38.160 It's a
00:48:38.960 J, right?
00:48:40.400 Hegemony?
00:48:42.440 Hegemony.
00:48:44.060 Is that
00:48:44.320 good?
00:48:44.520 Give me
00:48:45.520 a thumbs
00:48:46.040 up.
00:48:46.980 Hegemony.
00:48:48.340 Yes or no?
00:48:50.180 Hegemony?
00:48:51.480 You're saying
00:48:52.160 no?
00:48:53.100 I'll still
00:48:53.400 say hegemony.
00:48:57.520 No?
00:49:01.640 All right.
00:49:02.400 Well,
00:49:03.600 let's put a
00:49:04.640 pin in that
00:49:05.140 and get back
00:49:05.580 to it.
00:49:06.900 All right.
00:49:07.560 Here's my
00:49:08.040 take on
00:49:09.040 Kara Swisher.
00:49:10.240 Now,
00:49:11.600 I don't
00:49:12.100 know her
00:49:12.340 personally
00:49:12.700 and I
00:49:13.680 don't want
00:49:13.960 to make
00:49:14.180 this like
00:49:14.620 a personal
00:49:15.040 comment.
00:49:15.980 So I'm
00:49:16.420 going to
00:49:16.500 make a
00:49:16.860 general
00:49:17.300 comment
00:49:17.820 about
00:49:18.760 people
00:49:19.160 and maybe
00:49:20.400 she's an
00:49:21.160 example of
00:49:21.760 it,
00:49:21.980 but we
00:49:22.300 won't make
00:49:22.700 this
00:49:22.880 personal,
00:49:23.440 okay?
00:49:23.800 So it's
00:49:24.100 not about
00:49:24.440 her.
00:49:25.360 It's a
00:49:26.060 pattern that
00:49:26.680 I notice.
00:49:28.240 And the pattern is, I wonder if the problem with Twitter is just narcissists. I wonder if it's just that.
00:49:40.880 You know, surely there are bots and trolls and stuff, and that's bad too.
00:49:45.300 But in terms of the really ugly part of the internet, it's not really the bots, because you can usually identify them.
00:49:53.220 It's sort of people who, well, let's put it this way.
00:49:58.880 One of the ways you would identify a narcissist is that they do projection, right?
00:50:04.100 So if you murder a baby, if you're a narcissist, you would accuse the first person who catches you of murdering a baby.
00:50:13.000 And you'd be like, what? Like, I haven't even been near a baby.
00:50:16.640 And it's the complete ridiculousness of the projection that identifies it as a narcissist thing.
00:50:23.660 Because it's common in politics for people to say, oh, you did it too. That's not what I'm talking about.
00:50:29.800 When people say you did it too, usually you did. That's completely different.
00:50:35.660 In politics, you did do it too. You did.
00:50:39.260 That's not gaslighting. That's just calling out what happened.
00:50:43.500 The gaslighters are the ones who call you out for something only they did.
00:50:48.680 You didn't do anything. You weren't involved in any way. And they'll still say you did it.
00:50:54.900 Now, what's that sound like when Kara Swisher says that Elon Musk has developed some kind of victim mentality?
00:51:01.380 Is that what you see? Do you see Elon Musk acting like a victim?
00:51:09.600 I don't see that.
00:51:12.360 I think he's acting like a... he's acting like a patriot. He's acting like a patriot.
00:51:29.020 Like, if you had his money, what would it look like if you wanted to do the most patriot thing?
00:51:38.060 I won't say patriotic. Do the most patriot thing.
00:51:41.680 He's doing it. He's doing it right now.
00:51:44.140 The most patriot thing you could do is to put your entire life on the line for freedom of speech in America.
00:51:52.080 He's risking his life to return our freedom of speech.
00:51:57.900 Does that sound like a victim?
00:52:00.640 What would the left say about the founding fathers, just to keep it sexist?
00:52:06.720 The founders of the country who decided to stage a revolution, did they do a revolution because they were whiny people who had some kind of victim mentality?
00:52:19.700 Well, sort of. Yeah.
00:52:21.360 The founders believed they were victims of Great Britain, and so they fought back.
00:52:27.240 But is the problem that the founders had a victim mentality? Is that a good description of the problem?
00:52:34.720 No. They were actually fucking victims.
00:52:39.420 If you are a fucking victim, do you know what kind of mentality you might have? Maybe a victim mentality.
00:52:47.220 So I think Elon Musk is just somebody who's in the middle of the fight.
00:52:53.300 He's in a cage match for freedom of speech and for his own life and reputation and everything else.
00:53:00.180 So I don't see somebody who enters the cage, you know, they go in the octagon voluntarily, and then suddenly it's because they have a victim mentality.
00:53:09.500 Does that not look like projection to you? Seriously?
00:53:14.420 I mean, does it not look like a really classic case, like a really easily identifiable case, of projection?
00:53:23.020 Because Elon Musk is the last person I would call a victim.
00:53:28.980 Now, what about the rest of us who also complain about things that the left does?
00:53:35.780 Do you feel you're complaining from a victim perspective, or do you feel like you're just a broom who's trying to sweep the shit off the floor?
00:53:46.320 I'm not a victim, I'm a fucking broom. I'm a broom.
00:53:53.420 I'm sweeping the dirt away.
00:53:55.900 Like, that doesn't make me a victim. That makes me a guy with a broom who wants to clean a house.
00:54:01.820 The whole victim mentality is exactly where their power base comes from, right?
00:54:09.160 Their power base comes from, we're such victims, you have to give us more stuff.
00:54:14.240 So if you take away their victimhood, then they're weaponless.
00:54:18.300 So they have to do projection to make it look like it's something happening on both sides.
00:54:23.740 It so isn't happening on both sides.
00:54:26.280 Now, I'm not so much in the bag for the right that I wouldn't tell you it's happening on both sides if it is.
00:54:35.100 Because this narrative poisoning thing, I started off the stream by saying, it's the same.
00:54:40.660 It's coming from different places, but the narrative poisoning is getting us all.
00:54:45.060 So is the Hitler narrative poisoning. It's assault. There's nobody.
00:54:50.660 But this is the case, the projection, it does look like that's on one side.
00:54:58.020 And what would be another classic tell for a narcissist? It's gaslighting.
00:55:07.280 Gaslighting is not just telling you a lie. People would confuse that.
00:55:16.880 Gaslighting is when you tell somebody a lie, but it's such an obvious lie that if you could convince them to believe the most obvious lie in the world, they would have to assume that they were crazy to get there.
00:55:32.140 Like, actually think you're crazy.
00:55:33.700 Now, does the left gaslight, or do they just have a different narrative?
00:55:44.680 To me, it looks like gaslighting, because they're actually telling the right that they're crazy, like, actually literally.
00:55:52.580 Now, when you say that somebody has developed some kind of victim mentality, you're saying that's a mental illness, wouldn't you say?
00:55:58.780 A victim mentality would be, you know, a low form of mental illness.
00:56:04.900 So here it is. Here's Kara Swisher, using projection and gaslighting, in one tweet.
00:56:14.600 Now, I don't think this is a one-off, and I don't think it's like a comment about, you know, Kara Swisher.
00:56:21.980 I think it's a pattern that I keep seeing.
00:56:25.360 It just looks like purely narcissistic behavior, and we need to be able to identify these people, because there are two kinds of people you never want to respond to, because it just can't work.
00:56:41.740 One is somebody who's gaslighting you and is a narcissist.
00:56:45.300 Do you know that every expert on narcissism, do you know what the professional advice is if a narcissist is in your environment? Run.
00:56:59.880 Yeah, yeah. There's no expert who says you can deal with them successfully. None. Nobody.
00:57:05.720 There's no expert who says you should hang around them even for one minute. You should just get away.
00:57:12.180 And you can't really on social media, because they can follow you and stuff.
00:57:15.840 So my recommendation is that when you run into narcissists on social media, you don't engage.
00:57:21.620 As soon as you see the projection or the gaslighting, just block them or mute them.
00:57:27.520 The second group that there's no point in engaging is the people who are clearly experiencing narrative poisoning.
00:57:36.940 Narrative poisoning, by the definition that I've given it, is the thing that doesn't let you see things clearly. It doesn't allow you to think clearly.
00:57:46.960 So reasoning with somebody who's in a state of not being able to think clearly is a complete waste of time. You're not going to convince them.
00:57:53.300 So I think what I'm going to start doing is when people come at me with what is obviously a case of narrative poisoning, there's just no critical thinking at all, I'm just going to label it narrative poisoning and not respond again.
00:58:08.620 Just two words: narrative poisoning.
00:58:11.240 I'll just call it out every time I see it.
00:58:13.660 Now I don't know if that'll work, I'm just telling you something I might practice or test out.
00:58:21.560 All right, Bari Weiss did another dump yesterday about the Twitter files.
00:58:26.280 The funniest thing was that there were definitely some dissenters within Twitter who were saying things like, hey, maybe we're stomping on freedom of speech a little bit too much here, or words to that effect.
00:58:39.200 One of the prominent dissenters was somebody who said, quote, in one of the messages, "Maybe because I'm from China," said one employee on January 7th, "I deeply understand how censorship can destroy the public conversation."
00:58:57.920 Only the guy who was raised in China could see what was happening.
00:59:03.680 Do you know why? Do you know why the guy who was raised in China could see the field clearly?
00:59:11.120 No narrative poisoning. Yeah, no narrative poisoning.
00:59:15.080 I don't know why. Maybe he didn't follow the same news, maybe because his background was bigger.
00:59:21.720 Maybe he had more exposure to people on the right. I don't know.
00:59:25.600 But you could see that this one individual had no narrative poisoning. No narrative poisoning at all.
00:59:32.420 So it's possible.
00:59:35.060 Don't be a racist, Greg. Goodbye, racist.
00:59:42.900 That's not cool. We want none of that on this channel.
00:59:48.640 All right.
00:59:52.620 Here are some of the things that the Twitter people said, according to the Twitter files.
00:59:56.880 "We have to do the right thing and ban this account."
01:00:00.360 So that's how they were talking. They were talking about doing the right thing.
01:00:03.920 Nobody ever said, "Democrats rule, got to stop the Republicans." It was always about doing the right thing.
01:00:12.020 And I really think they believed it.
01:00:15.220 And somebody else said, quote, "Pretty obvious he's going to try to thread the needle of incitement without violating the rules."
01:00:22.160 So here's someone who thought it was completely obvious that Trump was involved with inciting violence. Like, it's just obvious.
01:00:30.180 Now, that's narrative poisoning, right? Clearly. That's obvious narrative poisoning.
01:00:35.100 Now, you could say maybe it happened, maybe it wouldn't.
01:00:37.660 But to say that it's obvious he's going to try to incite violence, it's not obvious.
01:00:44.340 Not only is it not obvious, it wasn't even true. There were plenty of counterfactuals in his tweets.
01:00:54.200 And then a few minutes later somebody on Twitter's scaled enforcement team suggested that Trump's tweets may have violated Twitter's, quote, "glorification of violence" policy if you interpreted the phrase "American patriots" to refer to the rioters.
01:01:11.820 So you had to interpret something in a non-standard way to get you within the zip code of something that could reasonably be interpreted as related to, given the larger context, and all the other things he's ever done.
01:01:28.760 If you put them all together into this lovely tapestry, you've got a strong argument for banning them.
01:01:36.460 But you can see that they were working hard to do what they thought was going to reduce violence and be the right thing.
01:01:42.580 And you see they were struggling.
01:01:45.800 Members of the team came to, quote, view him as the leader of a terrorist group responsible for violence and deaths comparable to the Christchurch shooter or, wait for it, or, what do you think will be the next word?
01:02:00.380 Comparable to the Christchurch shooter or... Hitler. Hitler.
01:02:07.240 In their internal communications, they were actually worried that he was basically Hitler.
01:02:14.900 They actually were worried about that in the real world.
01:02:18.780 Now, here's where my big aha came.
01:02:23.600 I never knew if they ever meant that when they do the Hitler thing. Like, I never knew if that was serious.
01:02:31.160 It just seemed like it was hyperbole, and it's part of the debating process. But I didn't think they believed it.
01:02:37.700 It turns out they believe it.
01:02:39.580 They actually believe that there's a Hitler-like threat, or that Trump is one. They actually believe it.
01:02:47.220 Wow.
01:02:49.060 So, that's what cued me into this.
01:02:53.780 Because for them to actually believe that and not be hyperbole, there's some kind of industrial, commercial, weaponized persuasion in the mix.
01:03:06.060 You don't get there on your own.
01:03:08.420 Like, I don't think you drift into thinking that your president, who was elected by the citizens, who were, you know, these perfectly reasonable citizens around you, elected Hitler and didn't notice.
01:03:21.560 Like, that's what they believe. That 40% of the country voted for Hitler, the guy Hitler, and didn't notice.
01:03:29.020 Didn't notice.
01:03:31.160 That's what they actually believe.
01:03:33.820 Now, that's mental illness, right? What else could it be?
01:03:38.800 That's not a lack of intelligence, it's not being uninformed.
01:03:42.660 That's a mental illness, but it's not organic.
01:03:46.480 They were not born with this specific form of mental illness.
01:03:50.040 They got this from somebody, somebody who knows how to do it. It's not accidental.
01:03:55.940 Well, I think I've got a suggestion for Elon Musk.
01:04:04.580 The Twitter terms of service should be upgraded.
01:04:07.380 The big problem was the Twitter terms of service couldn't cover all the specific weird situations, so it's always hard to write a terms of service that covers everything.
01:04:18.240 So here's my little suggestion to just maybe tighten up the terms of service.
01:04:21.840 So I think it should include a clause that says you can be banned from Twitter if the mainstream media affects Twitter employees with narrative poisoning until they hallucinate your role in an insurrection.
01:04:35.020 So they should just put that in there, because that was what happened.
01:04:40.740 They were brainwashed to the point where they hallucinated an insurrection, and then they banned Trump over it.
01:04:47.680 Now, if they want to not be in the same position again, where their terms of service don't cover the exact situation, they should just put the exact situation in the terms of service.
01:04:58.360 If we are hypnotized into thinking that somebody is Hitler or that they're running an insurrection, that is reason to ban them, if we've been hypnotized.
01:05:10.980 Why not say what it is? That's what it is.
01:05:15.360 Is there any reason that the terms of service can't say exactly what it is?
01:05:19.120 If we are brainwashed into believing you're Hitler, you can be banned.
01:05:24.140 That's literally what it is. I'm not adding anything weird.
01:05:28.400 That's just a good description of what happened and what would happen again.
01:05:34.120 There's nothing to stop it from happening again, right?
01:05:36.520 What if the next president is also characterized by whoever is behind the persuasion as Hitler?
01:05:45.640 That situation is going to come up again. The very next time a Republican gets elected, it's going to come up again.
01:05:52.840 So what do the social media employees do when they've got the, what, bad luck to have a second Hitler?
01:05:59.520 Oh my God, what bad luck.
01:06:01.580 All right.
01:06:10.440 That, ladies and gentlemen, concludes the best live stream you've ever seen on this topic.
01:06:20.420 I'm pretty sure that the mainstream media has nothing on me.
01:06:27.260 Although I have to say Tucker's show is a whole other level, so I can't really compete with that.
01:06:33.040 But I feel like I compete with Don Lemon.
01:06:36.660 If I were to rank the quality of my live stream compared to other professionals, I think I'd say better than most of CNN.
01:06:47.760 Not better than a lot of stuff on Fox. They have higher quality talent, I think.
01:06:56.760 But I think my live stream is already better than 80% of the content in the same space.
01:07:06.120 That would be my self-assessment.
01:07:10.920 Now, part of it is that I'm not beholden to anybody. I'm not running pharma commercials.
01:07:16.660 Actually, they might be on YouTube, but I don't have anything to do with that.
01:07:21.800 So I don't have to answer to Murdoch. I don't have to answer to the CIA. I don't have to answer to a boss.
01:07:31.800 I just don't have to answer to anybody. So that's got to be an advantage.
01:07:38.540 You know, here's a question.
01:07:41.880 The Timcast show, I think, is a great production.
01:07:45.080 I think Tim Pool does an amazing job as an entrepreneur, and also one of the best examples of a talent stack.
01:07:52.540 Have I ever mentioned that before? Tim Pool.
01:07:56.640 His talent stack is all the way from, he's got a band and he's got musical talent.
01:08:01.960 He's very good, by the way. I listened to some of his music on YouTube the other day.
01:08:06.980 Very good. Very good. I was surprised.
01:08:10.840 Surprised just because he has another job. You don't expect people to be good if they have one other job.
01:08:15.980 But he's very good.
01:08:17.180 And he's
01:08:18.880 built a
01:08:19.440 sort of
01:08:20.800 newsy
01:08:21.700 opinion
01:08:22.020 site that
01:08:22.920 appears to
01:08:24.060 have
01:08:24.260 associates
01:08:25.180 and maybe
01:08:25.860 employees and
01:08:26.520 stuff.
01:08:27.360 So very
01:08:27.920 good.
01:08:28.360 Does a
01:08:28.600 great job.
01:08:29.380 But here's
01:08:29.640 the question.
01:08:30.980 Do you
01:08:31.240 like podcasts
01:08:32.280 with a
01:08:33.120 team of
01:08:34.600 people around
01:08:35.120 the table?
01:08:36.980 Do you
01:08:37.240 like it when
01:08:37.620 it's just
01:08:37.920 two people
01:08:38.440 like the
01:08:38.840 Lex Friedman
01:08:39.700 model?
01:08:40.600 Or do you
01:08:41.320 prefer it
01:08:41.740 when it's
01:08:42.020 one who
01:08:42.660 has something
01:08:43.060 to say,
01:08:43.760 like my
01:08:44.340 model?
01:08:45.320 Which of
01:08:45.800 the models
01:08:46.120 do you
01:08:46.380 like sort
01:08:47.400 of generically
01:08:48.080 the best?
01:08:50.800 Two people.
01:08:53.620 So mine's kind of a special case.
01:08:57.920 Two people both.
01:08:59.100 Yeah, I think they all have a place.
01:09:00.680 Probably they all have a place.
01:09:04.520 When I see a group table conversation, I always have the following feelings.
01:09:10.900 See if anybody can get that.
01:09:11.900 And this is a persuasion lesson for you.
01:09:15.220 We only evaluate things, anything, compared to other things.
01:09:20.800 So if I'm alone, you're comparing me to maybe other people, sort of generically, who do this.
01:09:29.920 But if I were sitting here with, let's say, four people around a table, and each of us were taking some time to talk, wouldn't you automatically say, okay, I like listening to that one and that one, but these other two I don't like listening to?
01:09:46.260 Wouldn't you automatically hate it when one of the four, who in your opinion is the least good talker, when they're talking, right?
01:09:58.120 And so it feels like it takes something away, but on the other hand, it might make the ones you like the best seem better by contrast.
01:10:06.940 So I'm not sure what the net is.
01:10:08.320 That's why I asked.
01:10:09.540 I think there's room for all those models, but, yeah.
01:10:14.120 That's like when I watched, I have to mention The Five.
01:10:20.860 Now The Five is a completely different model than anything else.
01:10:26.240 Because in The Five, they can talk over each other.
01:10:29.160 You know, there's an interaction, you don't know where it's going to go.
01:10:31.740 But the other thing that The Five does is they give you usually five good personalities.
01:10:39.000 That's why it's The Five, right?
01:10:40.660 It's not like one good person and four guests.
01:10:42.840 The Fox producers are the best in the business.
01:10:47.640 And one of the things they do better than anything is talent combinations.
01:10:52.380 You know, they always experiment with, oh, these two hosts.
01:10:54.920 You see, they're always moving the players around to see which chemistry gets you the best.
01:11:01.340 And if you notice that on The Five, on any of the days where three of the regulars are there, where there's Jesse and then Dana and then Greg, when those three are there, the chemistry of those three makes everybody else better.
01:11:18.760 So whoever the other two are, they raise their game.
01:11:24.500 So the thing that, and I give Greg the most credit for this actually, if you watch The Five, you'll see that Greg's, let's say, influence or chemistry or leadership or something, but he influences everybody to raise their game into being a little more active, a little more funny, a little more lively.
01:11:47.320 And when he's not there, you can tell, like the energy level just drops.
01:11:52.340 So because of him, and he's just a completely unique character, I don't think there's anybody like him right now, that's not like five people sitting at a table.
01:12:04.320 That's like three of them have amazing chemistry, so they operate like the Beatles, you know, like their partnership is so good.
01:12:14.420 Then you add Judge Jeanine in there, and is Judge Jeanine better because she's at the table?
01:12:21.320 Absolutely.
01:12:23.440 Yeah, Judge Jeanine's whole presentation, you know, you could start with liking her, she's a likable personality, but when she's at the table and the chemistry is working, you know, her character, I call them characters because everybody's playing a character on TV, her character comes alive, right?
01:12:43.240 She's more playful and everything works better.
01:12:45.960 And then they can substitute in, like, the liberal of the day, and they're doing a great job on that.
01:12:53.680 Like, yeah, Jessica Tarlov is great, Geraldo is great, Harold Ford Jr.
01:13:02.060 At first I thought he was like maybe a bad pick, and I thought, ah, this is the first time I've seen The Five get one like completely wrong, because I thought he was going to be too generic and nice.
01:13:14.960 And then it took like a few weeks before Greg started mocking him for being too nice, and then everything got fun again.
01:13:23.660 Like Greg just knocked out like his support.
01:13:28.820 I'm going to take that leg out of your chair and see how you do.
01:13:32.380 It turns out Harold Ford Jr. is a smart, capable person, and when the leg got knocked out of his chair, he got better.
01:13:39.720 It was more fun, right?
01:13:41.340 And so that's sort of where they're at now.
01:13:43.620 But, so I think that's the lesson.
01:13:46.860 The lesson is if you have the right chemistry, then the table is better than one person.
01:13:55.280 All right.
01:13:56.880 Can I add some interviews in?
01:13:58.180 Yeah.
01:13:59.420 You know I plan to do that.
01:14:01.500 Does anybody know if Rumble has created a direct live streaming capability yet, or do I have to go through a third-party software to do that still?
01:14:11.020 I keep waiting for them to say yes.
01:14:15.260 I don't think yes.
01:14:17.460 You say yes.
01:14:21.080 But now when you say yes, I know you can do Rumble live, but they have to go through a third-party software.
01:14:29.240 With YouTube, I just turn on YouTube and then I go live.
01:14:32.780 With Rumble, at least last year, you had to use like StreamYard or some other software.
01:14:38.880 So what I'm asking is, can you do it directly?
01:14:42.060 Because the third-party thing is a non-starter for me.
01:14:49.320 All right.
01:14:49.940 Well, maybe I'll ask Rumble.
01:14:53.100 Yeah, all the ones that you mentioned, I know, use third-party software, or they did.
01:14:57.880 They may have changed.
01:15:00.700 Yeah, no, I know there are live streams.
01:15:05.300 The question is not, are there live streams.
01:15:08.520 The question is, can you do it without the third-party software in the middle, and I don't think so.
01:15:15.000 All right.
01:15:23.460 Yes, I do know somebody at Rumble, and I will ask.
01:15:27.480 The Rumble app is required.
01:15:29.040 I don't think so.
01:15:31.680 I don't think the app does it.
01:15:34.840 I'll check that out.
01:15:36.200 All right, that's all for now, and I will talk to you later.
01:15:43.560 I'm going to talk to the Locals people a little bit, but bye for now, YouTube.