The Megyn Kelly Show - April 27, 2022


Elon Musk's Twitter Takeover, From All Angles, with Seth Dillon, Vivek Ramaswamy, and Peter Schiff | Ep. 309


Episode Stats

Length

1 hour and 36 minutes

Words per Minute

200.2

Word Count

19,224

Sentence Count

1,093

Misogynist Sentences

33

Hate Speech Sentences

25


Summary

Elon Musk is the CEO of Tesla and the richest man in the world, and now he's the new owner of one of the world's most popular social media platforms, Twitter. But what does this mean for the future of the site? And what will happen to the people who have been banned?


Transcript

00:00:00.500 Welcome to The Megyn Kelly Show, your home for open, honest, and provocative conversations.
00:00:11.620 Hey everyone, I'm Megyn Kelly. Welcome to The Megyn Kelly Show.
00:00:14.900 Today we're taking a deep dive into Elon Musk's eventual takeover of Twitter.
00:00:19.840 Everything you need to know about this.
00:00:21.600 What's happening inside the social media platform?
00:00:24.220 Who are the big winners and losers?
00:00:26.520 The big winners and the big whiners?
00:00:28.440 What's left or what's behind, I should say, the left's total meltdown over this.
00:00:34.100 My God, get yourselves together, people.
00:00:36.400 And what does it mean for banned conservatives, including former President Trump?
00:00:41.460 God, my team gave me the list of people who've been banned.
00:00:43.760 I forgot how long it is.
00:00:45.820 Do you know, we're going to talk about James O'Keefe today and Project Veritas
00:00:49.120 and this tape that they've gotten their hands on.
00:00:51.100 He's banned. Project Veritas is banned.
00:00:54.020 So many people are banned.
It's like they get banned and then you kind of forget about them. Marjorie Taylor Greene,
she's banned.
00:00:58.980 Trump's banned.
00:00:59.660 Anyway, we'll go.
00:01:00.420 We'll get into it.
00:01:01.120 What happens to them?
00:01:02.160 Do they come back to life?
00:01:03.040 Is there a resurrection?
00:01:04.920 And speaking of people who have been banned on Twitter,
00:01:07.300 we're very excited to have one of the key players in this story.
00:01:10.260 Just over a month ago, the right-leaning satirical news site,
00:01:13.700 the Babylon Bee, was suspended by Twitter after it named a transgender member of the Biden administration
00:01:20.560 as its man of the year.
00:01:22.520 It was a joke, you see, because this transgender woman was named one of the women of the year
00:01:29.560 recognized by USA Today.
00:01:31.060 And Rachel Levine spent almost her entire life as a man.
She didn't even transition until within the past 10 to 15 years.
00:01:39.520 And they literally want us to celebrate her as a feminine icon.
00:01:43.700 Meanwhile, all of her accomplishments came when she was a man.
00:01:46.380 And only in the last decade and a half has she switched over to being a woman.
So the Bee was having some fun with it.
00:01:53.080 And they've been shut down and they haven't been resurrected.
00:01:56.500 And they won't be under the old Twitter management,
00:01:59.240 or they were told they wouldn't until they took down that tweet.
00:02:02.320 They refused.
00:02:03.780 And they've been sitting there in Twitter jail ever since.
00:02:07.400 Well, we should all be very glad this happened in a way,
00:02:10.040 because that story caught the attention of one Elon Musk,
00:02:13.400 who happens to be the richest man in the world,
00:02:15.260 who then happened to reach out to the company to confirm
00:02:18.520 that the Babylon Bee had, in fact, been suspended
00:02:21.620 prior to making his bid for Twitter, which was successful.
00:02:32.060 So my first guest today is the CEO of the Babylon Bee, Seth Dillon.
00:02:37.120 Welcome back to the show.
00:02:38.020 Great to have you.
00:02:39.020 Great to be back on the show.
Thanks, Megyn.
00:02:41.160 It's such an awesome story.
00:02:42.640 You've got to be feeling pretty good today.
00:02:46.020 Well, I mean, the story's not over.
00:02:48.800 We are still locked out of Twitter.
00:02:50.760 The deal right now, you know, Twitter has accepted Musk's bid,
00:02:54.180 but the deal hasn't gone through yet.
00:02:55.900 So it's not a done deal.
00:02:58.360 Well, we'll see when all the ink is dry and the funds have been transferred
00:03:01.760 and ownership has exchanged hands and all of that,
00:03:04.880 what actually ends up happening.
00:03:07.160 As of this moment, we're still locked out of Twitter.
00:03:09.740 Yeah, right.
00:03:10.400 Because you won't take down your tweet.
00:03:12.080 Although what's happening?
00:03:13.500 Because I've seen a lot of people in the past couple of days since Elon,
00:03:17.480 since his offer was accepted, tweeting out,
00:03:21.600 Rachel Levine is a man and things like that.
00:03:24.660 And they're not being banned.
00:03:26.420 So what's the deal?
00:03:28.340 Like, have you noticed that?
00:03:29.460 And why the difference in treatment?
00:03:32.080 I have noticed that.
00:03:33.820 It's risky behavior.
00:03:35.700 Hats off to them for being willing to do that.
00:03:38.720 It could be that, you know,
00:03:39.900 they've put a freeze on some of their enforcement policies
00:03:42.840 while the deal is going through,
00:03:45.220 maybe to minimize the likelihood of some kind of controversy
00:03:48.640 coming up as a result of it
00:03:50.040 and the distraction that might result from that.
00:03:52.660 I'm not sure what's going on.
00:03:54.120 You also have this weird thing where
00:03:55.560 people's followings are jumping dramatically.
00:03:58.440 I don't know what yours looks like over there,
but mine has been climbing at record numbers.
00:04:03.660 I think I gained like 15,000 followers
00:04:05.700 in the last 24 hours or something like that.
00:04:08.100 So there's been a lot of people gaining followers
00:04:09.920 and they're speculating about what the reason for that is.
00:04:13.560 I mean, obviously Twitter's in the news right now
00:04:15.400 and a lot of people who left Twitter before
00:04:17.240 for other platforms are possibly coming back to it
00:04:19.880 as a result of the change in ownership that's forthcoming.
00:04:24.460 But it could be more than that.
00:04:25.860 Some people are speculating that there's been a lift
00:04:29.060 on some of this shadow banning that's been happening
00:04:31.160 and that it's opening the floodgates to new followers.
00:04:33.920 I don't know.
00:04:34.360 I think that a lot of that is speculation,
00:04:35.940 but there is some crazy stuff happening on there right now.
00:04:38.560 All I know is I have 2.5 million followers.
I never click on where it says 2.5 to see,
you know, how close am I to 2.6?
00:04:48.880 Yeah, so I have no idea.
00:04:50.340 I definitely did not go from 2.5 to 2.6.
00:04:52.740 So we know I didn't gain 100,000 overnight.
Although Ben Shapiro was just saying on his show
he gained, he said, quote,
something like 200,000 overnight.
00:05:01.520 So I do believe there's some messing with the algorithm
00:05:04.380 and sort of a, you know, it's a cleanup.
00:05:06.640 It's a cleanup in aisle seven before the new boss comes in.
00:05:09.400 And then he can see all that they've been doing
00:05:11.700 and it's not going to work because he's going to see it.
00:05:14.300 That could be the case.
00:05:15.260 I don't know.
00:05:15.840 I don't want to, I don't want to speculate
00:05:17.400 because I don't know what's going on behind closed doors,
00:05:19.180 but a lot of people are speculating
00:05:20.200 that there's, you know,
00:05:20.800 a lot of documents being shredded right now
00:05:22.900 and a lot of coverup that's happening
00:05:25.600 and things that they don't want to be exposed.
00:05:29.060 Ultimately, you know,
00:05:29.660 they're not going to be able to hide all of that.
00:05:31.180 And it does sound like Musk wants
00:05:32.880 to make the algorithm open source,
00:05:34.480 which is really interesting.
00:05:35.980 But yeah, I mean, back to what you're saying.
00:05:37.560 I mean, it was a good day for free speech.
00:05:40.020 I think that a lot of people,
00:05:41.360 people who love free speech,
00:05:42.640 people who love this idea
00:05:43.920 that we should be able to express ourselves openly
00:05:47.540 and freely and not have restrictions.
00:05:50.140 You know, people who agree
00:05:51.520 with Twitter's terms of service, by the way,
in their mission statement,
00:05:55.400 which says that they don't want
00:05:56.580 any barriers to free expression.
00:05:58.840 If you don't want barriers to free expression,
00:06:00.300 then it was a good day
00:06:01.200 because it sounds like Musk
00:06:02.300 is actually going to make sure
00:06:03.200 they live up to that statement.
00:06:04.520 And I don't think they have been up to this point.
00:06:06.820 So can we jump back?
00:06:07.780 So you got banned on Twitter
00:06:10.040 and not banned,
00:06:11.520 but I guess your account's been suspended.
00:06:13.480 So you're not able to tweet out
the funny things that we love the Babylon Bee for.
00:06:19.560 And it's conditional, right?
00:06:21.660 They've got you like a hostage situation
00:06:24.020 saying you will delete the tweet
00:06:26.100 or your account will remain our hostage.
00:06:29.800 And you refuse to delete the tweet.
00:06:32.000 So good for you, by the way.
00:06:33.220 Love that.
00:06:33.660 It happened to Charlie Kirk, too,
00:06:39.560 who said he was making a similar point
00:06:41.620 about Rachel Levine saying
00:06:42.800 she's actually a biological man
00:06:44.920 and he got locked out as well.
So, like, they're big on... I don't know.
I don't know if it's because
00:06:51.780 you named her specifically.
00:06:53.220 Like they're not actually that clear
00:06:54.920 about how they apply their policies.
00:06:59.460 But you get suspended.
00:07:01.060 So what was the next iteration of that
00:07:03.640 as far as Elon Musk goes?
00:07:06.160 Well, the next iteration of the suspension,
00:07:08.880 you know, where is it going from there?
00:07:10.720 No, like what?
00:07:11.860 So then you find out that Elon Musk
00:07:13.800 has taken an interest in your suspension.
00:07:17.300 He only follows like 100 accounts
00:07:19.120 and you guys are one of them.
00:07:20.460 So you must have known that.
00:07:21.680 But like, it's quite different
00:07:22.920 when you find out, wait a minute,
00:07:24.100 he's actually personally taking
00:07:25.580 an interest in this somehow.
Yeah, well, you know,
I think that he likes our content.
You know, he's interacted
with our content quite a bit.
And he noticed
that we'd stopped posting.
And this is the crazy thing,
by the way, this idea
that we need to delete this tweet
00:07:40.120 to get back on.
00:07:40.960 Why do we have to delete the tweet?
00:07:42.720 Why?
00:07:43.420 I mean, if Twitter doesn't like the tweet
00:07:45.360 and they want to take it down,
00:07:47.260 then they can do that.
00:07:48.180 They have the power
00:07:48.760 to take the tweet down themselves.
It seems to me
that it's a submission thing.
00:07:52.700 They just want you to like bend the knee
00:07:54.300 and do what they want you to do
00:07:55.920 and acknowledge that you engaged
00:07:57.380 in hateful conduct.
00:07:58.440 And it is unclear.
00:08:00.580 I thought of that.
00:08:01.440 You're exactly right.
00:08:02.360 God, that's annoying.
It takes it a step further
than just simply,
you know, if they were just
to take our tweet down
and delete it themselves,
then it would be annoying
that they did that,
that they censored us.
But at least they wouldn't be going
that extra step
of making us do it ourselves.
I think that's taking it
a little bit
further than it needs to go.
00:08:21.960 And it is unclear,
00:08:23.620 you know, what it is
00:08:24.580 that prompts these things.
00:08:25.680 I think right now
00:08:26.620 you're safe calling
00:08:27.680 a trans person
00:08:28.960 a biological male
00:08:30.740 if their sex is male.
00:08:33.100 But you're, you know,
00:08:34.100 you're not supposed to refer
00:08:34.900 to them as a man.
00:08:36.340 And so it really comes down
00:08:37.720 to the gender sex distinction
00:08:39.020 that they're trying to make.
And our joke
referred to an individual
who is transgender
as a man
rather than as a biological male.
00:08:47.840 And so that runs afoul
00:08:48.700 of the hateful conduct policy.
I think that's
the sticking point there.
But yeah, I mean,
Musk is a big fan of the Bee.
00:08:55.520 He's been following us
00:08:56.240 for some time.
00:08:56.900 And so he noticed
00:08:57.520 when we were missing.
00:08:58.760 And I think it's just,
00:08:59.580 you know, a lot of people
are giving the Bee credit
00:09:02.660 as if we are the reason
00:09:03.860 why Musk decided to buy Twitter.
00:09:05.400 I think he had it in his head
00:09:06.380 for quite some time
00:09:07.560 leading up to this.
00:09:08.620 Maybe it was just
00:09:09.100 a contributing factor.
00:09:09.980 Maybe it was the straw
00:09:10.680 that broke the camel's back.
00:09:11.720 I'm not sure.
But clearly he saw
00:09:13.920 that there was an issue
00:09:14.640 with heavy handed censorship
00:09:15.700 and wanted to do something about it.
00:09:17.460 And I think it's really
00:09:17.980 interesting, too,
00:09:18.700 because you have
00:09:20.220 so many people
00:09:20.780 who call themselves progressives.
00:09:23.320 And I have a different idea
00:09:24.300 of what progress means.
00:09:25.940 I think there's like
00:09:26.480 a really great quote
00:09:27.480 that C.S. Lewis had said
00:09:28.520 at one point about how
00:09:29.460 if you're on the wrong road,
00:09:31.340 then progress means
doing an about-turn
00:09:33.860 and going back
00:09:34.520 to the right road.
00:09:35.940 And the man who turns back soonest
00:09:37.920 in that context
00:09:38.760 is the most progressive.
00:09:40.340 Musk is just trying to take us
00:09:41.480 back to the right road.
00:09:42.640 You know, these things started out.
00:09:43.860 These platforms were pitched
00:09:44.960 as free speech platforms
00:09:46.240 when they were first created.
00:09:47.740 There was not supposed
00:09:48.620 to be any barriers
00:09:49.520 to free expression.
00:09:50.660 He's trying to take us
00:09:51.520 back to that road
00:09:52.560 to get us on the right road.
00:09:53.700 And so in that sense,
00:09:54.400 I think this is a progressive move.
00:09:56.540 I agree with that.
00:09:58.340 So without asking you
00:09:59.760 about your private conversations
00:10:00.880 with Elon Musk,
because I know that's
between the two of you.
00:10:03.860 Did he call you personally?
00:10:05.480 Like, were you,
00:10:06.580 you know,
00:10:07.200 was there a moment
00:10:07.960 where you were like,
00:10:08.640 oh, my God,
00:10:09.300 Elon Musk is on the phone for me?
We had a couple of moments
like that initially
when we were
trying to book him
for an interview
00:10:17.440 for our podcast.
00:10:19.300 You know,
00:10:19.840 there was no expectation
00:10:20.860 that he would respond
00:10:21.760 to a DM on Twitter.
00:10:23.220 And then all of a sudden he did.
00:10:24.740 And he was talking to us
00:10:25.600 about why in the world
00:10:26.440 do you have an office
00:10:27.160 in California?
You know, it's nice weather,
00:10:29.940 but it's the most expensive
00:10:30.800 weather in the world.
00:10:32.240 You know,
00:10:32.680 there was this like banter
00:10:33.680 that was going back and forth
00:10:34.780 all of a sudden.
00:10:35.300 It's like, OK,
00:10:35.860 we've got Elon Musk on the line.
00:10:37.400 What do we do?
How do we get him on our show?
So, yeah,
I mean, he reached out.
00:10:43.000 He just wanted to confirm
00:10:43.920 that we had been suspended.
00:10:45.800 And so, you know,
00:10:46.860 he wanted to have
00:10:47.620 an actual conversation
00:10:49.280 to confirm that
00:10:50.200 instead of just, you know,
00:10:51.240 reading about it online
and trying to figure out
00:10:53.840 if it was actually true.
OK, I mean,
it's spiraled
00:10:58.100 to such a crazy place
00:11:00.020 since then.
00:11:01.380 When you saw that
00:11:02.420 he had acquired
00:11:03.000 the nine percent
00:11:03.760 of Twitter stock,
00:11:04.780 did you think,
00:11:05.600 oh, my God,
00:11:06.560 he's going to buy Twitter?
00:11:09.420 That is what we thought.
00:11:11.020 I mean, it was
00:11:11.920 he was giving
00:11:12.540 all of these public statements
00:11:13.840 that were kind of
00:11:14.560 just leading people
00:11:15.360 to believe that
00:11:15.980 he was really going
00:11:16.660 to take action.
00:11:17.280 That poll that he initially ran
00:11:18.740 where he said,
00:11:19.460 do you believe
00:11:19.900 that free speech
00:11:20.740 is a principle
00:11:21.760 that Twitter really adheres to?
And he warned people.
He's like,
00:11:25.560 there will be important
00:11:26.340 consequences for this poll.
00:11:27.820 So vote accordingly.
00:11:30.180 You know,
00:11:30.780 I think he was making it clear.
00:11:32.080 He told us
00:11:32.640 when we interviewed him
00:11:33.600 a few months ago,
00:11:34.440 he told us,
00:11:35.300 you don't have to read
00:11:36.420 between the lines with me.
00:11:37.440 Just read the lines.
00:11:38.640 I'll tell you
00:11:39.500 what I'm thinking
and I'm not going
to try to mislead you.
Whether or not
you could say
that's been
consistently true
for him throughout
the past,
you know,
if you apply it
in this case,
00:11:49.440 you just take him
00:11:50.020 at face value.
00:11:51.140 It was very clear
00:11:51.820 that he wanted
00:11:52.260 to get involved
00:11:52.880 in a big way
00:11:53.540 in Twitter
00:11:53.940 very early on
00:11:55.040 with his public statements.
00:11:56.780 Wow.
And have you
00:11:58.640 spoken with him
00:11:59.600 at all this week?
00:12:01.520 No, we're not
00:12:02.140 talking to him
00:12:02.920 on a day to day basis.
00:12:04.820 We certainly
00:12:05.520 don't want to bother him
00:12:06.380 while he's really busy.
He's got to do
this other thing.
It has to complete.
00:12:10.840 Yeah, we'd love
00:12:11.540 to ask him
00:12:12.080 to bust us
00:12:12.660 out of Twitter jail.
00:12:14.000 But, you know,
00:12:14.780 we'll get there
00:12:15.500 when the time comes.
00:12:16.560 I don't think
00:12:16.900 he's in a position
00:12:17.460 to do that
00:12:17.880 at this moment.
00:12:18.740 Wait, that reminds me
00:12:19.640 of the Kyle Mann skit
00:12:20.640 that you guys did.
00:12:21.640 So Kyle is the one
00:12:23.120 who does a lot
00:12:23.640 of your writing
00:12:24.100 as I understand it.
00:12:25.460 And yeah,
00:12:26.200 and and you guys
00:12:27.060 did this little skit
00:12:27.840 sort of well,
00:12:29.000 I'll just let it play out
00:12:30.300 and let the audience
00:12:30.840 listen to it
00:12:31.480 and watch it.
00:12:32.320 It's soundbite one.
00:12:33.840 We got thrown in here
00:12:34.500 for a pretty terrible crime.
00:12:35.800 We gave Rachel Levine
00:12:37.440 our Man of the Year award.
00:12:39.220 We were just trying
00:12:39.900 to honor such a great man.
00:12:41.900 Pretty heinous, right?
00:12:43.260 But we're making do.
00:12:44.900 You know,
00:12:45.240 we've even been able
00:12:45.900 to continue publishing articles
00:12:47.320 with this laptop
00:12:48.520 that someone left here
00:12:49.840 in Twitter jail.
00:12:50.440 Oh, this is Hunter Biden.
00:12:51.440 The showers, of course,
00:12:52.080 they're a little scary.
00:12:53.420 I've been really careful
00:12:54.820 not to drop the cell.
00:12:55.840 Hey!
00:12:56.520 No homophobic comments!
00:12:58.320 Oh, yeah.
00:12:59.360 I can't say stuff
00:13:00.180 like that in here.
00:13:01.740 But don't worry.
00:13:02.780 I've got a plan to get out.
00:13:04.300 I'm digging a tunnel
00:13:05.140 behind this poster.
00:13:06.480 As soon as I finish it,
00:13:07.740 I'll be as free
00:13:08.440 as the Taliban,
00:13:09.600 Vladimir Putin,
00:13:10.380 Kathy Griffin,
00:13:11.340 the Chinese government,
00:13:12.040 and all the other
00:13:12.680 wonderful people
00:13:13.320 who are still out on Twitter.
00:13:17.780 What's that?
00:13:18.940 I'm Elon Musk.
00:13:19.640 I'm here to rescue you.
00:13:22.380 It's great.
For the listening audience,
Trump is in the background
00:13:25.860 just, like, filing his cards
00:13:27.900 because he's also in Twitter jail.
00:13:29.980 So I think that's actually
00:13:31.360 going to happen.
00:13:32.180 There's zero doubt
00:13:33.060 that the Babylon Bee
00:13:34.240 will come back
00:13:35.000 if this sale concludes
00:13:37.480 and probably even short of that
00:13:39.240 because I think Twitter
00:13:40.020 is being just publicly dragged
00:13:41.460 right now
00:13:42.040 in such a way
00:13:43.780 that it's going to have to,
00:13:44.980 even if this falls through,
00:13:46.360 they're going to have to reexamine
00:13:47.420 the way they've been
00:13:48.260 approaching their business model
00:13:50.180 because there's just,
00:13:51.220 there's too much anger at them
00:13:53.240 over the way it's been.
00:13:55.040 Don't you think?
00:13:55.520 I mean, this whole thing
00:13:56.620 has really served to expose
00:13:58.360 just how biased
00:13:59.540 against conservatives they are.
00:14:02.080 I don't think that would result
00:14:03.560 in them backing down on it, though.
00:14:05.380 I mean, look how they're reacting to Musk.
00:14:06.980 It's the most interesting thing
00:14:08.220 that they're all up in arms,
00:14:10.820 that they're not going to be able
00:14:11.720 to censor everyone
00:14:12.820 as much as they have been.
00:14:13.960 And it's a funny thing, too,
00:14:14.960 coming from people
00:14:15.540 who have been assuring us
00:14:16.400 that they haven't been censoring us
00:14:17.840 up to this point.
00:14:18.900 So on the one hand,
00:14:19.540 they're like,
00:14:19.840 no, we've never censored anyone.
00:14:21.240 This is a fair platform.
00:14:22.440 You know, we enforce the rules
00:14:23.400 really fairly.
00:14:24.480 And then all of a sudden,
00:14:25.080 Musk says,
00:14:25.560 I'm going to open this up
00:14:26.380 and make it a free speech platform.
00:14:27.640 And they freak out
00:14:28.580 like that's a big deal.
00:14:29.780 Well, why is that a big deal
00:14:30.760 if that's what you've been doing
00:14:31.620 the whole time
00:14:32.140 if you haven't been censoring?
00:14:33.320 So it's really interesting
00:14:34.540 to see the reaction.
But I don't think that
if the deal, for example,
were to fall through
or something,
then Twitter would of their own accord
try to fix these problems.
00:14:43.740 I don't really think that they care.
00:14:45.880 They're really forceful
00:14:47.560 in trying to get you
00:14:48.460 to adopt their ideology,
00:14:49.960 their way of thinking.
00:14:51.300 They unapologetically
00:14:52.500 will try to say
00:14:53.500 they know what's real,
00:14:55.200 what's fake information.
They know what's good
and acceptable speech,
and what's hate speech
and what's unacceptable.
00:15:03.260 And they're happy
00:15:03.960 to shove that down our throats.
00:15:05.940 All right.
00:15:06.080 So this may sound
00:15:06.720 like a weird question,
00:15:07.620 but explain why
00:15:08.600 you think it's important
00:15:09.760 for the Babylon Bee
00:15:10.980 to be able to say,
00:15:12.100 Rachel Levine
00:15:13.200 is our man of the year.
00:15:15.920 Where to begin with that?
00:15:17.180 I mean, I think it's
00:15:17.720 a basic fundamental truth.
It's like,
why is it important
00:15:21.380 to be able to say
00:15:22.060 that two plus two equals four
00:15:23.600 when people are trying
00:15:25.040 to tell you
00:15:25.460 that two plus two equals five
00:15:27.040 and that you have to say
00:15:28.300 that two plus two equals five?
00:15:30.080 I mean, I think that's just
00:15:31.080 what free speech is.
You know, it's like
that Orwell quote.
If you're allowed to say
that two plus two equals four,
then everything else
follows from there.
That's freedom.
00:15:42.340 I think that's just
00:15:42.960 at bedrock,
00:15:43.860 that's what freedom is.
And I don't think
that speaking
the truth, or
stating simple
biological facts,
can possibly
be hate speech.
You know,
a fact is just a fact.
It's just a brute fact,
and you can't try
to get at the motives
of someone who's
willing to state a fact.
So I don't know about
this whole idea
that we have
to buy into this stuff,
you know, and this is
one of the issues
that I have with Twitter
right now: their
own ideology
is baked into their
terms of service.
They have this ringing
tribute to free speech
on the front end
of their
mission statement.
And then you dig down
into the terms of service,
and they have a
hateful conduct policy
that includes all kinds
of things like deadnaming
and misgendering
as hateful conduct.
And if you disagree,
you know, there's
a lot of people
who think it's misgendering
to call someone
who's biologically male
a woman.
00:16:34.580 They consider that
00:16:35.600 misgendering,
00:16:36.360 but we can't even have
00:16:37.240 a debate or discussion
00:16:38.200 about that.
00:16:39.060 And I think that's
ultimately what it comes
down to.
00:16:40.520 If you're going to say
00:16:41.100 that you're a free speech
00:16:41.860 platform with no barriers
00:16:43.180 to free expression,
00:16:44.020 then you have to let
00:16:45.000 people debate those
00:16:45.920 things.
And what's the harm
in letting them
talk it out and try
to convince each other?
00:16:50.580 Why do you have to tell
00:16:51.460 them that they must only
00:16:53.100 say or think one thing?
Vivek Ramaswamy
is coming out a little
later, tech executive
and philanthropist,
and, you know, knows a lot
about this world.
00:17:03.140 And one of the things
00:17:04.620 he says, and I think
00:17:05.760 this is exactly right,
is they smuggle in
their viewpoint
discrimination through
that part of the
policy you mentioned at
the bottom, speaking to
hate speech, right?
00:17:19.200 So who decides what is
00:17:21.120 hate speech?
How is it hateful for
you to push back on the
narrative by pointing out that Rachel
Levine, who lived the
vast majority of her life
as a man, is in
fact a man?
And you're basically
saying she shouldn't be
honored as one of USA
Today's women of the
year.
That's a view.
That's a
legitimate point of view
that a lot of people
share.
And so the way they
smuggle in, to steal
Vivek's turn of
phrase, their
viewpoint discrimination
against you is by
calling it hate speech,
by labeling it.
Now, I mean, it's
not like somebody's on
there dropping the N-word.
There are certain
things we can all
agree on, I
mean, I would
agree, although if I listen
to Spotify, I hear that
word every two
seconds.
But there are
certain words that
most of us agree
should not be bandied
about with impunity
on a forum like Twitter,
which isn't run by the
government but by a
private company. But they
draw the line at
anything that challenges
liberal orthodoxies.
00:18:23.860 Yeah.
And not
just that, but it also
comes down to the COVID
misinformation stuff, you
know, and these things are
moving targets.
They're constantly
changing.
It's not like they have
some fixed,
rigid policy that's just,
you know, rooted
in reality and
you have to deal with that.
They have these
moving targets everywhere
that you have to keep up
with.
And what's
misinformation one day
is a fact the next day.
And then, you know,
what's hate
speech today
wasn't hate speech
yesterday.
These things are changing
all the time.
It's hard to keep up
with it, but it
is smuggled in.
It is the
problem that the terms of
service themselves have
ideology, progressive
ideology, baked into them,
so that you have to
affirm that ideology, or
at the
very least just
not even speak about
these issues and
refrain from opining
on them at all in
order to stay on the
platform.
00:19:14.120 And I don't think
00:19:14.820 that's healthy for
00:19:15.420 democracy, especially if
00:19:16.620 the platform is so large
00:19:18.080 that you have tens of
00:19:19.120 millions of active users.
00:19:20.460 It's where most of the,
00:19:21.600 of the public discourse
00:19:22.680 takes place, not just
00:19:24.000 between private
00:19:24.600 citizens, but between
00:19:25.520 government officials and
00:19:26.560 private citizens, when you
00:19:27.940 have a platform at that
00:19:29.040 level and it's, and it's
00:19:30.720 that influential in terms
00:19:32.020 of shaping public
00:19:32.860 thought, uh, and, and in
00:19:34.420 terms of disseminating
00:19:35.480 information out to the
00:19:36.580 masses, um, you have to be
00:19:38.380 willing to have debate on
00:19:39.960 that platform, good faith
00:19:41.160 debate.
00:19:41.500 And so much of this is done
00:19:43.560 in good faith.
00:19:44.080 And a lot of people would
00:19:44.860 argue, you know, from the,
00:19:46.120 from the other side, the
00:19:46.880 conservative side of this
00:19:47.860 is that when you
00:19:49.580 confront people with
00:19:51.260 the facts on these issues,
00:19:52.620 that's not hate, you know,
00:19:54.340 telling the truth is a
00:19:55.520 loving thing to do.
00:19:56.520 It's a compassionate thing
00:19:57.620 to do.
00:19:58.040 You know, affirming a lie is,
00:19:59.980 is not a compassionate or
00:20:01.200 loving thing to do.
00:20:02.140 And I think there's a
00:20:02.980 discussion to be had about
00:20:03.940 that.
00:20:04.300 What constitutes actual
00:20:05.400 compassion, love? Um, you
00:20:07.420 know, if it's defined as hate
00:20:08.820 speech to tell the truth,
00:20:10.040 uh, I think that's a
00:20:11.220 problematic position to
00:20:12.220 take.
00:20:12.500 And I think it's, it's
00:20:13.340 certainly, it's certainly
00:20:14.420 something that we should be
00:20:15.160 able to debate.
00:20:16.840 You know, I think about
00:20:18.780 this sometimes. Remember
00:20:20.580 Dr. Keith Ablow? He's gotten
00:20:22.960 into all sorts of trouble
00:20:23.800 since back when we
00:20:25.280 used to have him on a Fox
00:20:26.480 during the day, I haven't
00:20:28.000 followed it that closely,
00:20:28.840 but he did make a point
00:20:30.120 that there is some, there's
00:20:32.480 a mental disorder in the
00:20:34.520 psychiatric field where some
00:20:36.120 people feel like they have
00:20:37.600 to cut off their arm and
00:20:39.920 they feel really strongly
00:20:41.460 that their arm must come
00:20:43.120 off or they're going to
00:20:44.340 die or they can't live
00:20:45.300 or something.
00:20:46.520 And he said, if somebody
00:20:47.800 like that comes into my
00:20:48.720 office, I say, no, you're
00:20:51.420 not right.
00:20:52.420 We're not going to cut off
00:20:53.540 your arm. You're not
00:20:55.060 going to die if you
00:20:56.100 don't cut that arm off.
00:20:57.180 And he was likening that to
00:20:59.180 the way he would treat
00:21:00.600 somebody who thinks they're
00:21:01.600 trans, um, coming into his
00:21:03.840 office.
00:21:04.040 Now I realize today people
00:21:05.440 freak out.
00:21:06.260 They freaked out back then
00:21:07.160 from him saying
00:21:08.200 that, but certainly by
00:21:10.120 today's standards, he'd
00:21:11.100 get in trouble for saying
00:21:12.320 something like that.
00:21:13.500 But, you know, there is a
00:21:15.180 real question about whether
00:21:16.520 we should be able to push
00:21:17.780 back on people who say, I
00:21:21.060 have this intense need to
00:21:24.000 chop off my genitals, right?
00:21:26.960 Like, why can't there be
00:21:28.680 more of a community saying,
00:21:30.000 no, you are wrestling with
00:21:32.320 something inside of you that
00:21:34.100 may not be real, that may be
00:21:36.200 curable.
00:21:37.140 It's not gay conversion
00:21:38.500 therapy.
00:21:38.900 It's making sure this person
00:21:40.940 really is suffering from the
00:21:42.620 thing he thinks he's
00:21:43.440 suffering from and sparing
00:21:45.120 them, sparing them a lifetime
00:21:46.800 of agony if they make a
00:21:48.740 terrible decision the wrong
00:21:50.800 way.
00:21:52.380 Right.
00:21:53.120 Yeah.
00:21:53.680 It's not like these are all
00:21:54.900 just, uh, happy success
00:21:56.720 stories where after these
00:21:58.000 transitions, you know, there's
00:21:59.220 no issues after that.
00:22:00.280 Everyone just feels, uh, uh,
00:22:02.200 perfectly, uh, fine and
00:22:04.220 content and happy in their
00:22:05.280 new identity.
00:22:06.640 Um, but, but even if you
00:22:08.160 just bring it down to the
00:22:09.840 fact that these are adults
00:22:11.180 making their own decisions
00:22:12.320 and you say, okay, well,
00:22:13.380 you know, someone can do
00:22:14.180 what they want to do and
00:22:14.920 identify how they want to
00:22:16.020 identify.
00:22:16.700 You still have the issue of
00:22:18.060 compelled speech.
00:22:19.000 You still have the issue of,
00:22:20.700 well, you're forcing other
00:22:21.500 people to acknowledge that
00:22:22.920 and, uh, agree with
00:22:24.440 that and affirm that or not
00:22:26.220 speak at all.
00:22:27.040 And I think that that's a
00:22:28.160 problem, even if you're fine
00:22:29.720 with, um, the fact that
00:22:31.220 people can identify with
00:22:32.280 however they want or have
00:22:33.220 whatever surgeries they want
00:22:34.420 to have, um, you know, the
00:22:36.380 issue of compelled speech I
00:22:37.640 think is
00:22:38.740 a serious issue right
00:22:40.060 now.
00:22:40.520 In the Ablow discussion,
00:22:41.940 it would be the equivalent
00:22:42.940 of, you know, the person
00:22:44.420 says, I know I don't have a
00:22:45.640 left arm.
00:22:46.160 I want it off.
00:22:46.900 Why is it still there?
00:22:47.660 Sometimes I know I'm missing
00:22:48.700 my left arm and
00:22:50.020 the left arm needs to come
00:22:50.840 off.
00:22:51.140 It'd be the same as Twitter
00:22:52.080 saying, you must refer to
00:22:53.560 that person as a one-armed
00:22:54.620 man.
00:22:55.080 You must.
00:22:55.900 And you're like, I can see
00:22:57.140 two arms.
00:22:57.840 And they're like, it's
00:22:58.980 disrespectful and it's
00:23:00.320 hateful for you to call him
00:23:01.780 anything other than a
00:23:02.540 one-armed man.
00:23:03.260 I mean, that's effectively
00:23:03.960 where we are right now.
00:23:05.160 And the rest of us are
00:23:06.040 standing there saying, I
00:23:07.020 know what my eyes can show
00:23:08.720 me.
00:23:08.940 I see two arms.
00:23:10.900 Well, and it also comes
00:23:11.840 down to, you know, the
00:23:13.300 person who is offended by
00:23:15.400 what you have to say, let's
00:23:16.780 say, you know, somebody
00:23:17.800 gets really upset that
00:23:18.780 they've been misgendered
00:23:19.840 because it, it hurts their
00:23:20.940 feelings or it upsets them.
00:23:22.260 It makes them angry.
00:23:23.080 You know, it's
00:23:24.120 not how they want
00:23:25.000 you to identify them.
00:23:26.260 Well, I mean, there's plenty
00:23:28.720 of cases where
00:23:31.060 people on the left say
00:23:32.320 hurtful things about people
00:23:33.880 on the right, call them
00:23:34.720 Nazis, call them racists,
00:23:36.320 call them horrible names.
00:23:37.380 I've been called horrible
00:23:38.360 names.
00:23:38.720 I'm sure you've been called
00:23:39.660 terrible names.
00:23:41.120 And that, and that hurts our
00:23:42.420 feelings, right?
00:23:43.080 It's not nice to
00:23:44.220 hear when people
00:23:45.460 say mean things about you.
00:23:47.820 They have a right to say
00:23:49.020 them.
00:23:49.260 Sometimes they do say them
00:23:50.340 from a place of hate as
00:23:51.640 well.
00:23:51.960 And sometimes it's because
00:23:53.360 they just don't like us as a
00:23:54.640 person.
00:23:54.940 And it's not just that
00:23:55.920 they're trying to speak
00:23:56.660 truth to us because they
00:23:57.820 love us.
00:23:59.200 But, you know, there's,
00:24:00.540 there's plenty of examples
00:24:01.800 of speech like that, that
00:24:03.180 isn't deemed problematic on
00:24:04.700 these platforms.
00:24:05.220 So they're very specific
00:24:06.460 about what they hone in on
00:24:07.680 and consider an offense
00:24:09.720 that, you know, you can't
00:24:10.620 come back from, that results
00:24:11.700 in a permanent suspension.
00:24:13.540 Yeah.
00:24:14.000 The trans issue in
00:24:14.900 particular is very dicey for
00:24:16.320 them.
00:24:16.660 They, they are very
00:24:17.480 protective on this issue.
00:24:18.680 And that's because trans
00:24:19.920 lobbyists are so damn loud.
00:24:21.520 And again, I've said this
00:24:22.800 many times, do not represent
00:24:24.000 the majority of trans
00:24:25.140 people who are not these
00:24:27.560 crazy, like far left
00:24:29.740 lunatics that their
00:24:30.780 advocates are.
00:24:31.740 I mean, it's just crazy.
00:24:32.600 Like Katie Herzog's been
00:24:33.840 saying, get better people,
00:24:34.920 get better representatives.
00:24:36.380 But here is some of it was
00:24:37.880 encapsulated in the
00:24:38.760 discussion that they had on
00:24:39.860 Joe Rogan.
00:24:40.660 This is 2019.
00:24:42.280 And you remember this, we
00:24:43.480 had Tim Pool on the show
00:24:44.420 and we talked about this
00:24:45.220 moment, this incredible
00:24:46.000 moment where Jack Dorsey,
00:24:47.880 you know, the
00:24:49.380 founder and
00:24:50.320 then CEO of Twitter, and his
00:24:52.520 chief legal officer.
00:24:53.500 I think she's general
00:24:54.160 counsel, went on to the
00:24:56.000 Joe Rogan show and Joe had
00:24:58.120 Tim Pool come on, who was
00:25:00.100 really well versed in all of
00:25:01.520 this, to kind of cross-examine
00:25:03.340 these guys on their
00:25:04.800 censorship policy.
00:25:06.040 We have a little bit of
00:25:06.760 that queued up as
00:25:07.860 soundbite four.
00:25:09.320 So your platform
00:25:10.780 restricts speech.
00:25:13.020 Our platform promotes
00:25:14.040 speech unless people
00:25:15.280 violate our rules.
00:25:16.880 And in a specific
00:25:17.500 direction.
00:25:18.620 In any direction.
00:25:19.700 But uncle, I don't want to
00:25:20.860 say his name.
00:25:21.380 The guy who calls for
00:25:22.160 death gets a suspension.
00:25:23.040 The guy who insinuates
00:25:24.320 death gets a permanent
00:25:25.000 ban.
00:25:25.460 But Tim, you're
00:25:26.540 misinterpreting what I'm
00:25:27.660 saying.
00:25:28.060 And I feel like you're
00:25:28.760 doing it deliberately.
00:25:29.700 It's not about one
00:25:30.580 particular thing.
00:25:31.300 It's about a pattern and
00:25:32.280 practice of violating.
00:25:33.060 And you have a pattern and
00:25:34.080 practice of banning only one
00:25:35.420 faction of people. An article
00:25:36.980 published recently
00:25:37.980 looked at 22
00:25:39.260 high-profile bannings from 2015
00:25:40.380 and found 21 of them were
00:25:42.240 only on one side of the
00:25:43.640 cultural debate.
00:25:44.200 But I don't look at the
00:25:45.360 political spectrum of people
00:25:46.480 when I'm looking at their
00:25:47.380 tweets.
00:25:47.600 Right.
00:25:47.860 You have a bias.
00:25:48.900 I don't know who they
00:25:49.120 are.
00:25:49.480 You're biased and
00:25:50.700 you're targeting specific
00:25:51.720 individuals because your
00:25:52.620 rules support this
00:25:53.700 perspective.
00:25:54.460 No, I don't agree with
00:25:55.960 that.
00:25:56.100 Well, so can you be clear
00:25:57.180 though?
00:25:57.480 And like what rules
00:25:58.480 support that perspective?
00:25:59.220 Specifically, the easiest
00:26:00.580 one is misgendering, right?
00:26:01.740 Because that's so clearly
00:26:02.720 ideological.
00:26:03.400 If you ask a conservative
00:26:04.660 what is misgendering, they'll
00:26:06.700 say if someone is
00:26:07.480 biologically male and you
00:26:09.240 call them, you know, a she.
00:26:13.060 That's misgendering.
00:26:13.800 That's a conservative view.
00:26:14.880 The progressive view is
00:26:16.720 inverted.
00:26:17.400 So now you actually have in
00:26:19.100 your policies a rule
00:26:21.220 against the conservative
00:26:22.660 perspective.
00:26:23.680 I have a rule against the
00:26:25.020 abuse and harassment of
00:26:26.260 trans people on our
00:26:27.240 platform.
00:26:27.700 Mm hmm.
00:26:29.800 Right.
00:26:30.200 Again, the smuggling
00:26:31.920 smuggling.
00:26:33.420 Right.
00:26:33.960 That's really what she's
00:26:34.900 saying.
00:26:35.220 And there's a report out
00:26:36.400 today from Politico saying
00:26:37.780 that woman, the general
00:26:39.060 counsel, cried on Monday
00:26:40.900 during a virtual meeting
00:26:42.360 with her team as she
00:26:43.340 expressed concerns about
00:26:44.960 how the company might
00:26:45.960 change under Elon Musk.
00:26:48.480 By the way, she is said to be
00:26:49.740 the person behind the
00:26:50.640 suspension of the New York
00:26:51.580 Post account and was
00:26:53.000 pivotal in Trump's ban
00:26:54.040 from Twitter.
00:26:55.000 I mean, this is why you
00:26:55.840 get people like Ben
00:26:56.620 Shapiro saying Elon Musk is
00:26:58.080 going to have to go in
00:26:58.740 there and clean house of
00:26:59.640 people who are not pro
00:27:01.160 free speech.
00:27:01.960 Doesn't matter.
00:27:02.640 Conservative liberal.
00:27:03.660 Are you pro free speech
00:27:05.040 or aren't you?
00:27:05.720 And not the fake kind,
00:27:07.100 the real kind.
00:27:09.600 Well, yeah, they'll say
00:27:10.760 they're pro free speech,
00:27:12.480 but they're just also in
00:27:14.240 favor of reasonable
00:27:15.600 content moderation.
00:27:16.540 This is just reasonable
00:27:17.760 content moderation, Megan.
00:27:19.320 You know, they're just
00:27:19.800 trying to apply good
00:27:21.760 standards that are that
00:27:22.760 are based in not being
00:27:23.840 abusive and hateful.
00:27:25.440 But yeah, the ideology
00:27:26.240 is smuggled into that.
00:27:27.780 And this
00:27:29.120 is not really an issue of
00:27:30.460 them having a
00:27:31.840 problem with hate speech.
00:27:32.900 It's that they actually
00:27:33.880 hate speech.
00:27:35.520 They hate the freedom to
00:27:36.500 disagree with them.
00:27:37.280 They want to enforce
00:27:38.100 their ideology.
00:27:39.160 And I think that she's
00:27:40.280 being disingenuous.
00:27:41.180 I think she knows exactly
00:27:42.160 what she's doing.
00:27:43.260 But the crying in the
00:27:44.700 meeting thing, I mean, it
00:27:45.560 just goes to show
00:27:46.420 you how emotional this is.
00:27:47.660 This is feeling based.
00:27:49.280 It's all based on feelings
00:27:50.640 and we need to have some
00:27:52.360 facts enter into the
00:27:53.380 discussion.
00:27:53.860 We need to let reality
00:27:55.020 drive some of the
00:27:56.260 conversation instead of
00:27:57.300 just feelings.
00:27:59.120 Is it so hard?
00:28:00.340 Is it so heartbreaking for
00:28:02.140 you that a man has taken
00:28:04.040 over the platform?
00:28:05.000 He says he's going to
00:28:05.680 improve it.
00:28:06.180 He's going to make it more
00:28:06.900 profitable for shareholders
00:28:08.420 and its employees who will
00:28:10.780 benefit from that as well.
00:28:12.120 But by making it a truly
00:28:14.040 open forum, by allowing
00:28:15.540 different viewpoints to air.
00:28:17.320 I mean, if that if you
00:28:19.040 have such a deep problem
00:28:20.020 with that, that it brings
00:28:20.600 you to tears, get out,
00:28:22.100 get out, right?
00:28:22.800 You're in the wrong job.
00:28:23.780 Quit.
00:28:24.100 You're a damn lawyer.
00:28:25.180 You go to law school to
00:28:26.340 learn about opposing
00:28:27.600 viewpoints and how to make
00:28:29.000 the better argument.
00:28:29.880 If you can't do that, leave.
00:28:32.020 You're shitty at your job.
00:28:34.460 True.
00:28:35.000 And I would also say this,
00:28:36.420 you know, most of this
00:28:37.240 stuff is geared towards
00:28:38.540 safety, right?
00:28:39.500 She mentions abuse and
00:28:41.060 and and threats and
00:28:43.460 things like that.
00:28:44.520 So we're
00:28:46.100 supposed to put these
00:28:47.060 policies in place for the
00:28:48.360 safety of the people on
00:28:49.540 the platform.
00:28:50.520 And that's another thing
00:28:51.860 that I would disagree with.
00:28:53.040 You know, I think Tim made
00:28:53.760 some great points there.
00:28:54.680 I mean, I make similar
00:28:55.460 points when I'm
00:28:56.880 responding on these things.
00:28:57.860 But I think this issue
00:28:59.700 of safety is an important
00:29:00.740 one.
00:29:01.020 You know, you can try to
00:29:02.200 create a safe environment
00:29:03.580 and in doing so
00:29:05.420 actually harm people.
00:29:07.360 And there's great examples
00:29:08.280 of this.
00:29:08.700 I think a really great
00:29:09.420 illustration of it is
00:29:10.720 this study that was done
00:29:12.560 on public playgrounds in
00:29:14.060 New York City that found
00:29:15.120 that the playgrounds had
00:29:15.980 been made too safe for
00:29:17.480 children and it was
00:29:18.540 actually harming the
00:29:19.340 children because it
00:29:19.980 wasn't teaching them
00:29:20.700 about risk.
00:29:21.920 And this is the thing
00:29:22.560 like when you go to
00:29:23.180 college, for example,
00:29:24.160 and they try to tell you
00:29:25.320 that you need to have a
00:29:26.260 safe space when you're
00:29:28.120 in an
00:29:29.160 environment where you're
00:29:29.840 supposed to be learning,
00:29:30.760 you know, encountering
00:29:31.840 other ideas,
00:29:33.500 encountering debates
00:29:35.120 and being on the
00:29:36.920 receiving end of
00:29:37.860 criticism, even harsh
00:29:39.000 criticism.
00:29:39.660 Sometimes that's how you
00:29:40.820 build resilience.
00:29:41.540 It's how you build
00:29:42.300 character.
00:29:42.800 You don't want to be
00:29:44.360 in a safe space where
00:29:45.460 you're just insulated
00:29:46.780 from all of that or
00:29:47.560 you're never going to
00:29:48.080 grow as a person.
00:29:49.300 You might never change
00:29:50.060 your views on things
00:29:50.980 where you should change
00:29:51.840 your views.
00:29:52.700 I mean, you're certainly
00:29:53.280 never going to be
00:29:53.800 challenged.
00:29:54.520 So this idea that we
00:29:55.760 need safety and
00:29:56.640 protection, you know,
00:29:57.860 we are adults and
00:29:59.340 freedom of speech is a
00:30:00.580 valuable thing.
00:30:01.840 And I think that it's
00:30:02.580 so condescending to act
00:30:03.860 like we need to protect
00:30:04.760 each other from ideas
00:30:05.860 that might harm us.
00:30:07.420 And instead, we should
00:30:09.020 be trying to protect
00:30:09.820 the freedom itself.
00:30:11.760 Absolutely.
00:30:12.820 All right.
00:30:13.220 So much more to go
00:30:13.880 over, including the
00:30:15.160 Babylon Bee now
00:30:16.380 reportedly hiring,
00:30:18.980 backing, helping the
00:30:20.720 woman behind Libs of
00:30:22.140 TikTok, who was outed
00:30:24.460 against her will by this
00:30:25.680 Taylor Lorenz, who is
00:30:27.240 again playing the victim.
00:30:29.800 I'll bring you up to
00:30:30.640 date.
00:30:30.900 We'll talk more about it
00:30:31.960 and Elon with Seth in
00:30:33.640 one second.
00:30:34.480 Seth Dillon is staying
00:30:35.180 with us.
00:30:35.500 One note of
00:30:42.320 clarification, not
00:30:43.640 from anything we've
00:30:44.360 reported, but from
00:30:45.440 something that's out
00:30:46.200 there.
00:30:46.880 There is a Twitter
00:30:48.760 executive, the CMO,
00:30:50.780 chief marketing
00:30:51.300 officer named Leslie
00:30:52.480 Berland.
00:30:53.860 And the reason we
00:30:54.920 know what happened at
00:30:55.660 this Twitter
00:30:57.940 internal meeting, a
00:30:58.960 company-wide call, on
00:31:00.180 Monday is Project
00:31:01.320 Veritas obtained a
00:31:02.440 45 minute recording of
00:31:03.520 it.
00:31:03.660 So people have been
00:31:05.380 reporting on it and
00:31:06.000 that's fine.
00:31:07.180 God bless James O'Keefe
00:31:08.460 because he gets his
00:31:09.060 hands on great stuff.
00:31:10.840 But this woman is
00:31:13.100 being, I think,
00:31:14.800 unfairly lumped in
00:31:15.980 with the chief legal
00:31:17.420 officer, the general
00:31:18.220 counsel. This Leslie
00:31:19.400 Berland, so far as I
00:31:20.660 can see, didn't do the
00:31:22.480 thing she's being
00:31:23.200 accused of on a lot
00:31:24.240 of websites and so on
00:31:25.560 today.
00:31:26.600 Well, let me put
00:31:28.400 it this way.
00:31:29.080 This is the soundbite of
00:31:30.360 her that's circulating.
00:31:31.860 OK, listen.
00:31:32.440 Elon made it clear in
00:31:34.320 public that a large
00:31:35.380 part of the reason he
00:31:36.340 bought the platform was
00:31:37.560 because of our
00:31:38.080 moderation policies and
00:31:39.400 disagreements in how
00:31:40.780 we deal with health.
00:31:41.820 This puts Twitter
00:31:42.600 service and trust and
00:31:43.720 safety as well as
00:31:44.800 anybody who cares about
00:31:45.980 health on the platform
00:31:47.040 in a very difficult
00:31:48.260 position.
00:31:50.160 OK, so she's being
00:31:51.280 slammed because she's
00:31:52.740 saying Elon made it
00:31:53.940 clear the reason he
00:31:55.260 bought it was he
00:31:56.600 wanted to change our
00:31:57.380 moderation policies.
00:31:58.320 And this puts us in a
00:32:00.240 very difficult position
00:32:01.120 if we care about
00:32:02.060 trust and safety.
00:32:03.180 The truth is she didn't
00:32:03.880 say that.
00:32:04.440 The truth is she was
00:32:05.500 repeating the questions
00:32:07.420 that had been asked by
00:32:08.520 the employees.
00:32:09.380 And here is the full
00:32:10.400 soundbite.
00:32:11.540 And back to you,
00:32:12.660 Parag.
00:32:13.580 Elon made it clear in
00:32:14.920 public that a large part
00:32:16.220 of the reason he bought
00:32:17.160 the platform was because
00:32:18.360 of our moderation
00:32:19.100 policies and
00:32:19.980 disagreements in how
00:32:21.380 we deal with health,
00:32:22.880 something that we
00:32:23.780 value very highly
00:32:25.020 within the company.
00:32:25.820 This puts Twitter service
00:32:27.740 and trust and safety
00:32:28.720 as well as anybody who
00:32:30.040 cares about health on
00:32:31.080 the platform in a very
00:32:32.620 difficult position.
00:32:34.740 Can you speak your
00:32:35.700 thoughts on this and how
00:32:37.140 those teams will be
00:32:38.520 supported?
00:32:39.840 All right.
00:32:40.420 So I just want to make
00:32:41.300 that clear.
00:32:41.800 She starts the
00:32:42.600 question with "back to you,
00:32:43.840 Parag."
00:32:44.120 She's the moderator.
00:32:45.200 She's bringing the
00:32:45.920 Twitter employees
00:32:46.900 questions to the CEO.
00:32:49.960 And she's basically the
00:32:51.140 one who has to read
00:32:51.900 them.
00:32:52.140 And she went on to
00:32:52.980 clarify that in response
00:32:54.080 to this.
00:32:54.780 But just, you know,
00:32:55.700 we have to make sure.
00:32:56.580 Forgive me, but like
00:32:58.340 this general
00:32:59.080 counsel is a problem
00:33:00.740 and should go.
00:33:01.680 But I don't I haven't
00:33:02.760 seen the evidence that
00:33:03.500 the CMO is in the same
00:33:05.280 boat as her.
00:33:06.420 So I will defend her
00:33:07.420 at least based on what
00:33:09.060 I've heard so far.
00:33:10.560 OK, so let's switch
00:33:13.220 gears because it's
00:33:14.660 sort of related, but
00:33:16.080 it's not totally related
00:33:17.240 to what we've been
00:33:19.720 discussing.
00:33:20.260 Libs of TikTok got
00:33:21.740 suspended on Twitter
00:33:23.160 as well last week.
00:33:24.560 And it was a very big
00:33:25.220 story.
00:33:26.400 And they
00:33:29.080 were the subject
00:33:30.120 of a long piece by
00:33:31.420 Taylor Lorenz, who
00:33:32.780 works for The
00:33:33.260 Washington Post, who
00:33:34.600 decided that they need
00:33:35.880 to be outed, that they
00:33:37.280 are also anti-trans and
00:33:39.440 contributing to, quote,
00:33:40.540 hate.
00:33:42.080 And so The Washington
00:33:42.820 Post decides to unleash
00:33:44.100 the Jeff Bezos
00:33:45.020 resources on Libs of
00:33:47.240 TikTok, which is run by
00:33:48.640 somebody who wishes to
00:33:49.440 remain anonymous,
00:33:50.520 though she did
00:33:51.380 give an interview
00:33:51.920 while masking her
00:33:54.580 identity to Tucker and
00:33:56.320 at least one other
00:33:56.920 source.
00:33:57.580 Was... I don't know.
00:33:58.600 Was it you guys?
00:33:59.140 I can't remember, but
00:33:59.780 at least one other.
00:34:01.820 And any post, I think
00:34:02.980 New York Post, New York
00:34:03.700 Post.
00:34:03.940 There we go.
00:34:05.180 So.
00:34:06.680 The right wing
00:34:08.100 got very upset with the
00:34:09.300 Taylor Lorenz reporting
00:34:10.640 because it was
00:34:12.200 unnecessary.
00:34:12.720 We didn't really need
00:34:14.500 to know who was behind
00:34:15.460 Libs of TikTok.
00:34:16.740 They don't do this kind
00:34:17.680 of reporting on
00:34:18.840 anonymous accounts
00:34:19.940 pushing BLM
00:34:20.900 narratives or trans
00:34:23.080 narratives.
00:34:23.840 It's just Libs of
00:34:25.640 TikTok, which outs
00:34:26.700 what Libs on TikTok
00:34:28.060 are saying about
00:34:29.040 usually LGBTQ
00:34:30.860 issues.
00:34:32.980 You have stepped in
00:34:34.520 in some way to back
00:34:36.320 the woman who's been
00:34:37.340 outed as the purveyor
00:34:38.520 of Libs of TikTok.
00:34:39.220 What have you done?
00:34:41.940 So, yeah, I mean,
00:34:43.380 this is a
00:34:44.420 crazy story because
00:34:45.540 there's all kinds of
00:34:46.920 stuff happening here.
00:34:47.680 You recapped a little
00:34:49.040 bit of it, but Libs of
00:34:50.240 TikTok had been suspended
00:34:51.380 twice on Twitter leading
00:34:52.760 up to this.
00:34:53.340 There were all these
00:34:53.820 mass reportings where
00:34:55.560 basically the account
00:34:56.820 was getting mobbed and
00:34:58.000 they were trying to get
00:34:58.720 it shut down.
00:34:59.520 So you had some
00:35:01.440 Antifa groups that
00:35:03.240 were leading this
00:35:03.900 charge and
00:35:05.720 they were just piling
00:35:06.460 on and thousands of
00:35:08.060 reports were streaming
00:35:08.840 into Twitter.
00:35:09.940 The safety team that
00:35:11.500 she was just talking
00:35:12.280 about, you know, got
00:35:13.220 reports of
00:35:14.660 harassment and abuse,
00:35:16.140 and the account
00:35:17.320 kept getting
00:35:17.920 locked.
00:35:18.380 So it got locked a
00:35:19.080 couple of times in a
00:35:19.760 row.
00:35:21.520 They required that a
00:35:22.860 couple of tweets be
00:35:23.580 deleted in order for the
00:35:24.520 account to be unlocked.
00:35:26.020 So that was kind of
00:35:27.140 leading up to this whole
00:35:28.100 thing where there was
00:35:28.800 this doxing story that
00:35:30.200 was put out by Taylor
00:35:31.020 Lorenz, and, you
00:35:33.100 know, the goal of
00:35:34.420 that was to expose who
00:35:35.680 this person behind Libs of
00:35:36.900 TikTok was.
00:35:38.280 And it's very funny when
00:35:39.060 you hear the interviews
00:35:39.840 that Taylor gives about
00:35:41.360 this.
00:35:41.640 You know, on the one
00:35:42.360 hand, she tries to say,
00:35:43.920 oh, we didn't expose any
00:35:45.020 information.
00:35:45.520 We didn't
00:35:46.560 reveal this person's
00:35:47.660 identity.
00:35:48.300 But then on the other
00:35:49.520 hand, a moment later,
00:35:51.720 she talks about how it
00:35:53.100 was important for us to
00:35:53.980 determine whether or not
00:35:54.700 this was a foreign actor
00:35:55.700 and reveal who this
00:35:56.600 person was.
00:35:57.280 So which is it?
00:35:58.000 Was she revealing the
00:35:58.780 person or not?
00:35:59.920 Let me play that.
00:36:00.680 OK, just pause your story
00:36:02.080 right there because we
00:36:03.000 have that on tape.
00:36:03.820 I mean, it's it's crazy
00:36:04.820 how often the left now
00:36:05.740 goes to foreign actor.
00:36:07.020 It was the Russians.
00:36:07.740 It was somebody foreign.
00:36:08.740 That's why I had to do
00:36:09.380 the reporting.
00:36:10.160 We had to know if it was
00:36:10.860 Vladimir Putin.
00:36:11.780 Here's soundbite five.
00:36:14.180 I think it's rare to see
00:36:15.940 an account gain so much
00:36:17.840 prominence so quickly and
00:36:19.560 be shaping these
00:36:20.480 narratives in such an
00:36:21.760 effective way, especially
00:36:22.600 against trans people.
00:36:24.320 So, I mean, my story
00:36:25.800 was kind of long, but I
00:36:26.700 really wanted to make the
00:36:27.660 case like why this
00:36:28.680 account mattered.
00:36:29.740 And I think it's
00:36:30.180 incredibly important, you
00:36:31.500 know, as someone that
00:36:32.200 covers the influencer
00:36:33.260 industry to know who is
00:36:35.140 exerting influence in
00:36:36.740 this way.
00:36:37.360 I mean, for all we knew,
00:36:38.120 this could have been a
00:36:38.740 foreign actor, right, or
00:36:39.940 someone we just didn't
00:36:41.440 know.
00:36:42.400 Oh, my God.
00:36:43.760 Keep going.
00:36:44.320 That's not.
00:36:44.660 No, we need to know.
00:36:47.400 So, you know, she felt
00:36:48.900 she's got moral reasons for
00:36:51.180 why she needs to expose
00:36:52.120 this person.
00:36:52.720 And it comes on the heels
00:36:54.360 of her literally just
00:36:55.540 weeks ago, weeping in
00:36:57.820 an interview and saying
00:36:59.260 that she had personal
00:37:00.580 information exposed and
00:37:01.720 how horrifying this was
00:37:02.760 and how traumatizing it was
00:37:03.940 for her personally to have
00:37:05.380 somebody, you know,
00:37:06.380 putting information out
00:37:07.480 there that she didn't want
00:37:08.500 out there.
00:37:09.240 But that's literally what
00:37:10.480 she does professionally.
00:37:11.660 And so the hypocrisy there
00:37:13.380 is just absolutely stunning.
00:37:14.960 It's breathtaking.
00:37:15.960 But, you know, I saw
00:37:17.760 this as an opportunity.
00:37:19.260 You know, Libs of
00:37:20.000 TikTok is a very impactful
00:37:21.140 account.
00:37:21.860 And I think the work that
00:37:23.200 they're doing is actually
00:37:24.040 very important because what
00:37:25.360 they're doing is they're
00:37:26.140 bringing to light a lot of
00:37:27.520 these things.
00:37:28.640 A lot of these videos are
00:37:30.100 videos of teachers
00:37:31.220 admitting what they're
00:37:32.840 talking to their students
00:37:33.940 about.
00:37:34.420 And sometimes they're
00:37:35.200 kindergarten teachers, you
00:37:36.480 know, and they're talking
00:37:37.100 to their kids about issues
00:37:38.380 related to sex and gender
00:37:39.820 and things like this.
00:37:40.720 So the kind of stuff that
00:37:42.260 DeSantis is dealing with
00:37:43.280 here in Florida, we're
00:37:44.260 based out of Florida here.
You know, these are the types of things that Libs of TikTok is bringing to the surface.
And by the way, these aren't private videos that are being leaked.
00:37:54.320 These are publicly posted
00:37:55.680 videos on social media
00:37:57.200 that are just having a
00:37:58.400 light shown on them and
00:37:59.580 being amplified.
00:38:00.820 And I think that's an
00:38:01.760 important point to make
00:38:02.600 too.
But I think it's important that we see what's going on. And I think Libs of TikTok is doing something really important and significant.
00:38:11.740 And so, you know, I
00:38:12.960 worked out a deal with
00:38:13.820 her where she can
00:38:14.680 continue to do this, even
00:38:15.780 if they try to cancel
00:38:16.580 her.
00:38:16.900 The whole point is they
00:38:18.300 can try to cancel her,
00:38:19.420 but I'm not going to
00:38:20.080 fire her for doing this
00:38:21.100 job.
00:38:21.800 So she works for me now.
We've worked out an arrangement.
00:38:24.580 It's not a deal between
00:38:25.460 the Babylon Bee and her,
00:38:27.240 but we've worked out a
00:38:28.740 deal where she can
00:38:29.460 actually run this
00:38:30.160 account, get paid a
00:38:32.180 salary to do it, and
00:38:33.500 we're going to turn it
00:38:34.060 into a media company.
00:38:35.480 That's amazing.
00:38:36.620 I love that.
00:38:37.880 Good for you.
You were her Elon Musk. On a much smaller scale, you were hers. There's an update to the story, which I kind of love.
00:38:49.460 I have to say, speaking
of Tim Pool, he
00:38:52.220 decided to take out a
00:38:54.200 billboard in Times
00:38:55.580 Square, calling her out
00:38:57.300 on some of her nonsense.
Our pal Jeremy Boreing of
00:39:00.000 The Daily Wire helped
00:39:00.900 and it's in Times
00:39:03.640 Square right now, I
00:39:04.660 think.
00:39:05.300 Right.
00:39:05.480 It's in Times Square
00:39:05.980 right now.
And it basically says, Taylor, I guess we'll show it and it'll come up, but it said that she doxed Libs of TikTok: "Taylor Lorenz doxed Libs of TikTok. @Timcast."
Hey, WaPo. There it is. OK. "Hey, WaPo. Democracy dies in darkness. Taylor Lorenz doxed Libs of TikTok." Which they totally did.
And then she comes back, and once again, oh, my God, this woman never misses a chance to play the victim.
She comes out and says, oh, Tim Pool and the CEO of
00:39:41.800 The Daily Wire did
00:39:42.520 this in an attempt to
00:39:43.820 discredit my reporting.
00:39:45.820 OK, so no.
00:39:47.340 Right.
00:39:47.680 No, they were really
00:39:48.620 just doing what you
00:39:49.860 just did, pointing her
00:39:50.760 out as a hypocrite.
00:39:52.000 She's out there crying
00:39:52.780 about getting doxed
00:39:53.560 and she's a doxer.
She says this billboard is, undeniably, "so idiotic it's hilarious. But don't forget that these campaigns have a much darker and more violent side."
00:40:03.300 OK, I await your
00:40:05.020 update on how your
00:40:06.560 life became violent
00:40:07.720 as a result of this.
00:40:09.000 She never provides
00:40:09.660 an example, blah, blah,
but, "I'm grateful to be in a newsroom that recognizes these bad faith, politically motivated attacks and has a strong security team."
00:40:16.880 I'll bet you she's got
00:40:17.540 twenty five security
00:40:18.400 officers around her
00:40:19.340 right now with her
00:40:19.940 fake claims of the
00:40:21.520 violence.
So Tim Pool responds by saying, I'm not discrediting your reporting.
00:40:26.600 I've repeatedly said
00:40:27.280 it was justified and
00:40:28.220 that publishing a name
00:40:29.440 is something we can
00:40:30.520 argue on the merits.
00:40:31.860 I'm calling you out
00:40:32.500 for lying when you
00:40:34.020 and The Washington
00:40:34.580 Post denied linking
00:40:35.860 to private details.
You published Libs of TikTok's private address.
00:40:41.420 Just own it.
He goes on to say, I'm glad to hear you are not upset and your friends and fans are also happy.
00:40:47.260 It's the way it should
00:40:47.980 be.
00:40:48.520 You make a statement.
00:40:49.320 I contest it.
00:40:50.120 All of our points are
00:40:50.800 heard.
You on CNN, me in Times Square.
00:40:53.780 Now move on.
00:40:55.260 Here she is.
00:40:56.740 My family and friends
00:40:58.060 are not happy.
00:40:59.860 They have been subject
00:41:00.780 to a nonstop stream
00:41:02.840 of hateful attacks,
00:41:04.620 doxing.
00:41:05.360 There she goes again
00:41:06.040 and ready.
00:41:08.220 Wait for it.
00:41:09.680 Violent attacks.
Name one. Challenge. I call bullshit. What violent attacks? You don't think Taylor Lorenz would have run to the airwaves to inform us all
00:41:21.740 if a violent attack
00:41:23.240 had happened against
00:41:23.960 her friends or family
00:41:24.960 as a result of
00:41:25.960 something she wrote.
00:41:27.000 Bull.
00:41:27.880 Happy to hear that
00:41:28.840 you're moving on and
00:41:29.740 on it goes.
I just... and with no compassion or even acknowledgement of the fact that she just did all of that same stuff, minus her fake violent attacks claim, to Libs of TikTok.
00:41:44.700 Yeah, it's like I said,
00:41:46.640 it's breathtaking.
00:41:47.700 I love how you read that,
00:41:48.800 by the way.
00:41:49.160 The dramatic reading is
00:41:50.060 great.
00:41:50.660 Thank you.
00:41:51.440 I mean, I wish you had
00:41:53.260 the clip of her sobbing
00:41:54.360 on TV.
00:41:55.200 I do.
00:41:55.900 Standby.
00:41:56.460 We never come to air
00:41:57.220 without it.
00:41:57.740 It's soundbite six.
00:41:59.040 I've had to remove
00:41:59.900 every single social tie.
00:42:02.340 I had severe PTSD
00:42:03.860 from this.
00:42:04.580 I contemplated suicide.
00:42:06.080 It got really bad.
00:42:07.780 You feel like any
00:42:08.640 little piece of
00:42:09.560 information that gets
00:42:11.760 out on you will be used
00:42:12.960 by the worst people on
00:42:14.140 the Internet to destroy
00:42:15.180 your life.
00:42:16.100 And it's so isolating
00:42:18.480 and terrifying.
00:42:20.900 It's horrifying.
00:42:24.140 I'm so sorry.
00:42:26.060 It's overwhelming.
00:42:27.500 It's really hard.
00:42:29.980 I'm sorry.
00:42:31.000 No, no, I know.
00:42:33.780 It's one thing, you know,
00:42:35.360 if she wasn't actually
00:42:36.360 doing this to other
00:42:37.140 people and she felt that
00:42:38.160 way, you could really
00:42:38.860 feel sorry for her.
But the fact is that the reason people come after her is because she comes after them. You know, it's just insane that she tries to play the victim like this.
But, you know, this is the point that I made when I publicly stated that I had worked out an arrangement with the creator of Libs of TikTok.
00:42:58.300 The effort to dox her
00:43:00.360 was not about journalism.
00:43:01.980 This wasn't about telling
00:43:03.300 a story that needed to be
00:43:04.540 told that was of great
00:43:05.480 public interest.
00:43:06.580 This was about trying to
00:43:07.700 intimidate and harass
00:43:08.960 this girl so that she
00:43:10.360 would be so afraid it
00:43:11.860 would raise the cost of
00:43:13.180 her doing her job to
00:43:14.240 the point where it
00:43:14.940 would become too
00:43:16.800 terrifying and risky to
00:43:18.060 do it.
00:43:19.040 And what I wanted to do
00:43:20.180 was step in and give her
00:43:21.240 the assurance that she
00:43:22.040 could continue to do
00:43:22.880 this.
00:43:23.120 We'll provide her
00:43:23.700 security if we have to
00:43:24.820 so that she can continue
00:43:26.020 to do what she's doing
00:43:27.000 and not have to have
00:43:28.080 those worries.
00:43:29.080 But that was the point
00:43:29.900 of this piece.
It was to make her feel the way Taylor was just saying she feels: terrified, horrified, and to make her crawl back into a hole and hide and stop doing what she's doing, because it's so effective.
00:43:41.700 I don't need
00:43:44.620 identifying information
00:43:46.040 on Libs of TikTok's
00:43:47.360 creator.
00:43:48.420 I know she's a woman.
00:43:49.340 I mean, I read her name.
00:43:50.280 Of course, we haven't
00:43:50.860 repeated it on this show.
00:43:51.760 But how old is she?
00:43:52.980 Like about what age range
00:43:54.140 is she?
00:43:55.140 She's in her 20s.
00:43:56.540 Oh, so she's young.
00:43:57.620 Right.
00:43:57.920 Of course, she's got
00:43:58.820 reason to be afraid
00:44:00.860 that there will be
00:44:01.980 penalties.
00:44:02.480 And she was, as I
00:44:03.800 understand it, from
00:44:04.860 Brooklyn.
00:44:05.400 It's like, OK, so nobody
00:44:06.980 in Brooklyn agrees with
00:44:07.860 her.
00:44:08.160 So there's a reason why
00:44:09.560 she's afraid to share
00:44:10.840 her identity because
the Taylor Lorenzes of the world have made
00:44:14.580 her opinions not
00:44:16.820 acceptable.
00:44:17.380 Right.
00:44:17.820 Like her pushback on
00:44:18.960 some of these narratives
00:44:19.600 and some of them are
00:44:20.380 insane that she posts
00:44:22.620 on her Twitter.
00:44:24.640 Any pushback?
00:44:25.760 Unacceptable.
00:44:26.500 You're in the same boat,
00:44:28.100 except thankfully you have
00:44:29.360 some job security and
00:44:30.380 some, you know, now
00:44:31.460 powerful friends.
And she didn't, when Taylor took a shot at her, when Jeff Bezos basically took a shot at her.
00:44:39.820 Yeah.
And they knew that.
And so I think that was the goal of the piece originally, just
00:44:44.740 to get her to basically
00:44:46.160 feel so afraid that she
00:44:47.420 would stop doing what
00:44:48.200 she's doing.
And she's very bold.
00:44:51.040 I mean, was she afraid?
00:44:53.000 Absolutely.
00:44:54.000 At the same time, you
00:44:55.260 know, she hasn't really
00:44:56.140 been deterred.
00:44:57.540 You know, she's if
00:44:59.020 anything, she does
00:44:59.840 realize that this means
00:45:01.000 that the work that she's
00:45:01.860 doing is effective.
And so her strong desire to keep going, I think, represents a boldness and courage.
I think she's actually very courageous, even though she is obviously, reasonably, scared of the threats posed to her at this point.
00:45:19.360 I mean, she gets death
00:45:20.080 threats every single
00:45:20.820 day in her inbox.
00:45:21.900 It's really nasty
00:45:22.640 what's coming at her.
But, you know, this is part of it. It's part of what you do here.
00:45:28.540 And Taylor, if she's
00:45:29.680 going to be out there
00:45:30.380 doing this kind of work,
00:45:31.420 doxing people like her,
00:45:32.620 then she should expect
00:45:33.460 this kind of a response.
00:45:35.000 She shouldn't be crying
00:45:35.820 about it on TV and
00:45:37.060 acting like, you know,
00:45:37.980 she's she's on the
00:45:39.240 receiving end of some
00:45:40.080 great injustice.
00:45:41.300 Yeah.
00:45:41.540 Like out of nowhere.
00:45:42.800 It's so hard to determine
00:45:43.980 why this happened to me.
As I understand it, Libs of TikTok went from like 700,000 followers on Twitter to over a million now. So Taylor's little campaign didn't work.
And of course, her lie, the "it could be a foreign actor and that's why I needed to unearth this," is exposed, because she clearly found out who it was and knew it wasn't a Russian or a foreign actor and went ahead with her little doxing campaign anyway.
00:46:08.180 And Tim Pool is right.
00:46:09.660 Then they lied about it.
The Washington Post itself lied in writing and said that they didn't reveal any personal information.
00:46:16.720 And they very much did.
00:46:18.020 They linked to her home
00:46:19.080 address and didn't have
00:46:20.920 the balls to just own it.
00:46:22.540 They lied on paper.
00:46:23.820 So this is not the paper
00:46:25.060 of record.
00:46:25.500 This is not democracy
00:46:26.760 dies in darkness.
00:46:27.840 This has become a partisan
00:46:28.960 hack rag run by a guy
00:46:30.860 with a political agenda.
00:46:32.320 And we should be pushing
00:46:33.520 back publicly as much as
00:46:34.540 possible.
00:46:34.700 And we can be mad about it
00:46:35.820 and we should we should be
00:46:36.760 upset about the way that
00:46:37.680 they handled it.
00:46:38.300 But it is true that they
00:46:40.320 amplified the voice that
00:46:41.580 they were trying to silence.
00:46:42.780 And her platform is way
00:46:44.460 bigger now than it was
00:46:45.600 before they started all of
00:46:46.820 this.
00:46:47.040 And we have Taylor Lorenz
00:46:48.040 to thank for that.
00:46:48.720 So thank you, Taylor Lorenz,
00:46:50.040 if you're watching and
00:46:50.740 listening.
00:46:51.680 You're violent.
00:46:53.600 There you go with more of
00:46:54.820 your violence.
00:46:55.900 Seth, thanks for fighting
00:46:57.620 the good fight.
00:46:59.300 Thank you.
00:47:00.240 It's a pleasure.
00:47:01.440 Coming up, a different
00:47:02.340 take on the Elon Musk
00:47:03.340 Twitter news from Peter
00:47:05.020 Schiff, who predicted this
00:47:06.980 was all bluster and that it
00:47:09.360 would not happen back on our
00:47:11.300 show two weeks ago.
00:47:12.320 Well, we said, why don't you
00:47:13.480 come back on so we can ask
00:47:14.780 you all the tough questions?
00:47:15.920 And to his credit, he said,
00:47:17.160 I'll do it.
00:47:18.380 What does he say now?
00:47:19.220 That's next.
00:47:24.360 Peter Schiff, chief economist
00:47:26.420 and global strategist at
00:47:27.700 Euro Pacific Capital, is back
00:47:29.000 with us today to talk about
00:47:30.880 Elon Musk, the predictions
00:47:32.980 that Peter got right and
00:47:34.920 those he got wrong.
00:47:36.640 Two weeks ago, Peter was on
00:47:38.660 this show and said the
00:47:40.420 following.
00:47:41.300 I don't think Elon Musk has
00:47:42.780 any intention of buying
00:47:43.980 Twitter.
00:47:44.660 I don't think he has the
00:47:46.120 ability, the liquidity to
00:47:47.880 finance such a large
00:47:48.940 purchase.
00:47:49.360 So I think he's just having
00:47:51.240 fun.
00:47:52.820 Aha.
00:47:53.180 Well, Peter Schiff is back
00:47:55.360 with us today to talk about
00:47:56.540 that prediction and also
00:47:57.980 whether Musk buying Twitter
00:47:59.220 is going to be a net benefit
00:48:01.900 for all of the investors in
00:48:04.360 his companies like those who
00:48:06.040 have bought Tesla.
00:48:07.200 Welcome back, Peter.
00:48:07.940 Good to have you.
Oh, thanks for having me back, Megyn.
00:48:10.820 OK, so that was not correct.
00:48:12.960 How'd you get it wrong?
00:48:14.100 Well, I guess I overestimated
00:48:15.620 Musk's intelligence.
00:48:17.840 I mean, he's obviously a smart
00:48:18.820 guy, but he's not that smart.
00:48:21.140 And sometimes smart people
00:48:22.400 do foolish things.
00:48:23.440 And I think buying Twitter
00:48:25.840 is going to end up being
00:48:27.060 probably the most foolish
00:48:28.160 thing Musk did if it in
00:48:30.060 fact happens.
00:48:30.720 I mean, if you look at where
00:48:31.620 the price of Twitter is
00:48:32.740 trading, the market is
00:48:34.340 obviously assigning a very
00:48:36.100 large probability to the
00:48:37.780 deal not going through.
00:48:39.280 So it's still not a
00:48:40.960 certainty that it's going to
00:48:42.240 be done.
00:48:43.580 And I think, you know, for
00:48:45.280 Musk and probably for Tesla
00:48:47.000 shareholders, I would be
00:48:48.760 hoping that it doesn't get
00:48:49.760 done.
00:48:50.100 I think if you own Twitter,
00:48:51.300 it's a great deal because
00:48:52.660 Elon Musk is getting you out
00:48:54.460 of jail.
00:48:55.060 He's overpaying for Twitter
00:48:56.920 and Twitter is going to
00:48:58.440 probably lose a lot of money
00:48:59.760 while Musk owns it.
00:49:01.560 But the bigger problem and
00:49:03.080 why I thought that Musk
00:49:04.440 wouldn't do it is I didn't
00:49:06.400 think, A, he would want to
00:49:07.660 sell his Tesla shares in
00:49:10.240 order to buy Twitter.
00:49:12.760 But I think more importantly,
00:49:13.800 I didn't think he would want
00:49:14.960 to pledge his Tesla shares
00:49:17.040 as collateral for a loan to
00:49:19.800 buy Twitter, which is what
00:49:20.840 he's doing, because, you
00:49:23.180 know, who's to say that
00:49:24.520 Tesla can't do what
00:49:26.000 Netflix did?
00:49:26.840 I mean, Netflix is now down
00:49:28.040 70 percent from where it was
00:49:29.680 a few months ago, and there's
00:49:31.240 no bottom in sight for that
00:49:32.700 stock.
00:49:33.500 The same thing could happen
00:49:34.780 to Tesla.
Tesla's stock price is way in the stratosphere compared to its actual earnings, its valuation.
00:49:43.480 And there's nothing that
00:49:44.880 would prevent the stock from
00:49:46.100 dropping 70 percent or more.
00:49:48.240 But if that were to happen,
00:49:49.560 he would be forced to sell
00:49:51.100 those shares into a very
00:49:52.360 weak market in order to
00:49:53.940 repay the lender.
And who knows where that transaction would be priced.
00:49:59.380 It would probably have to
00:50:00.260 happen at a significant
00:50:01.340 discount to the market.
00:50:03.020 And I think that's why you
00:50:04.340 saw Twitter stock down
00:50:06.040 11, 12 percent yesterday.
00:50:07.700 And it was down 2 percent
00:50:09.180 the day of the announcement.
00:50:10.360 But people are starting to
00:50:12.260 think about this
00:50:13.260 possibility.
00:50:13.820 And I think it's like
00:50:15.440 waving a flag at a bull or
00:50:17.620 in this case, a bear when it
00:50:19.040 comes to the shorts for
00:50:20.260 Twitter.
00:50:20.760 I think this is a gift.
00:50:22.180 They smell blood.
00:50:23.000 And I would expect the
00:50:24.640 shorts to be piling into
00:50:25.740 that stock because they know
00:50:26.880 if they can push the price
00:50:28.000 down low enough, they're
00:50:29.380 going to flush out Musk and
00:50:30.840 they'll be able to cover their
00:50:31.880 shorts at a much lower
00:50:33.280 price.
00:50:34.580 I think I understood what
00:50:35.440 you're saying.
00:50:36.640 I was hanging on there by a
00:50:38.600 thread, but you're
00:50:39.160 basically saying he's
00:50:39.940 overpaying and that if he
has to actually dip in and sell some of his Tesla shares, that's a very bad thing for Tesla and for him, and that since the value isn't in Twitter, it's smart to short this purchase, to bet that it's not going to work out, that it's not worth what it's trading for right now.
I don't know if the purchase is going to work out.
00:51:00.060 If you think it is going to
00:51:00.880 work out, I think Twitter is
00:51:02.060 trading at about 48 and the
00:51:03.820 deal price is 54 and change.
00:51:06.100 So you make some good money
00:51:07.800 if you buy it now and it
00:51:09.960 goes through.
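For anyone following the numbers here, the gap Schiff describes is the classic merger-arbitrage spread. A minimal sketch, using the approximate prices from the conversation (Twitter near $48, a deal price of "$54 and change"); the $35 break price below is purely a hypothetical assumption:

```python
# Rough merger-arbitrage sketch using the approximate prices from the conversation.
market_price = 48.00   # roughly where Twitter is said to be trading
deal_price = 54.20     # Musk's offer price ("54 and change")

# Gross return if the deal closes at the offer price
spread_return = (deal_price - market_price) / market_price
print(f"Return if the deal closes: {spread_return:.1%}")

# The wide spread implies the market sees real odds of the deal failing.
# If the stock would fall to, say, $35 on a deal break (hypothetical figure),
# the implied completion probability p solves:
#   market_price = p * deal_price + (1 - p) * break_price
break_price = 35.00
p = (market_price - break_price) / (deal_price - break_price)
print(f"Implied completion probability: {p:.0%}")
```

The wider the gap between the market price and the deal price, the lower the probability the market is assigning to the deal actually closing, which is exactly Schiff's point about where Twitter is trading.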
00:51:10.640 But if it doesn't go
00:51:11.400 through, then the question
00:51:12.520 is, what happens to the
00:51:13.580 price of Twitter stock?
00:51:14.780 It's probably going to end
00:51:15.620 up being a lot lower.
00:51:16.860 But my point is that the
00:51:18.660 real risk is for Tesla.
But, you know, I think if Musk were to sell some
00:51:24.280 Tesla stock now to buy
00:51:26.120 Twitter, that's not that bad
00:51:27.760 because Tesla is overvalued.
00:51:29.360 And so is Twitter.
00:51:30.580 The risk is that he doesn't
00:51:32.080 sell his Tesla.
00:51:33.540 He borrows money and pledges
00:51:35.600 the shares because then he
00:51:37.400 could end up selling the
00:51:38.460 shares at a much lower
00:51:39.660 price than the price he
00:51:41.180 would get if he sold it
00:51:42.100 right now, because he would
00:51:43.140 be selling under a
00:51:44.480 distressed circumstance
00:51:45.460 where the price was
00:51:46.920 significantly lower than the
00:51:48.360 price he could get if he
00:51:49.220 sold stock now.
00:51:50.860 All right.
00:51:51.480 The word that's coming to
00:51:52.700 mind is buzzkill.
00:51:54.620 You're a buzzkill.
00:51:56.200 Come on.
00:51:56.700 We're excited about this
00:51:57.700 because we want him to
00:51:59.320 take over Twitter to
00:52:00.160 restore some semblance of
00:52:01.720 balance in the big tech
00:52:03.120 information wars.
Yeah, Megyn, I'm not
00:52:05.600 against him owning Twitter.
00:52:07.140 I mean, I use Twitter
00:52:08.120 myself.
00:52:08.740 I've got almost 700,000
00:52:11.060 followers.
00:52:11.640 And, you know, I agree with
00:52:13.460 a lot of what Musk is
00:52:14.960 saying.
00:52:15.440 I mean, I would like the
00:52:16.400 forum to be more open.
I don't like it being policed by, you know, the socially correct employees, the woke people who are working at Twitter.
00:52:28.160 I would much rather have a
00:52:29.420 guy like Musk own it
00:52:31.080 than current management.
00:52:32.880 So I'm not objecting to
00:52:34.320 this as somebody who
00:52:35.820 uses Twitter.
00:52:36.880 I think it's good for
00:52:38.120 the user experience.
00:52:39.940 I'm just talking about it
00:52:41.180 as a financial analyst.
00:52:43.080 Is it a wise transaction
00:52:44.820 for Musk?
00:52:45.960 And what are the
00:52:46.840 implications for Tesla
00:52:48.940 shareholders?
00:52:49.620 I mean, I think it's great
00:52:50.460 for Twitter shareholders.
00:52:51.580 They get a bunch of cash
00:52:52.740 and they can invest it
00:52:53.580 someplace else.
00:52:54.520 I think it's a risk for
00:52:56.120 Tesla shareholders.
00:52:57.020 And I think it's a risk
00:52:58.280 for Elon Musk himself
00:52:59.600 as the largest Tesla
00:53:01.460 shareholder.
00:53:01.980 I think he's putting
00:53:02.960 a good portion of his
00:53:04.540 fortune in jeopardy.
00:53:05.680 Now, I don't think,
00:53:06.780 you know, he's not going
00:53:07.420 to go bankrupt.
00:53:08.220 He's not going to be,
00:53:09.040 you know, poor.
00:53:10.100 I mean, he's, you know,
00:53:10.840 he could lose 100 billion
00:53:12.920 dollars or more
00:53:14.220 and still be very rich.
00:53:15.460 So I'm not worried
00:53:16.860 about Elon Musk.
00:53:17.980 He has more money
00:53:18.760 than he could ever spend.
00:53:20.140 But a lot of the people
00:53:21.280 who own his stock,
00:53:22.920 they may have very large
00:53:25.080 positions relative
00:53:26.460 to their net worths.
00:53:28.140 Those people should be selling.
I mean, they are exposed to significant risk.
00:53:32.280 I mean, Tesla was overpriced
00:53:33.540 to begin with.
00:53:34.480 But now this is another factor
00:53:35.960 that could significantly
00:53:37.320 weigh on the share price.
So we'll get into the connection between what happens with the Tesla stock and the Twitter stock in one second.
00:53:47.040 But don't you think
00:53:48.600 that Elon Musk
00:53:49.840 actually can find efficiencies
00:53:51.700 that were not being pursued
00:53:53.340 by the current management
00:53:54.500 and that he will
00:53:55.980 make Twitter better?
00:53:56.800 I mean, he has a reputation
00:53:57.560 of doing that at companies.
I realize now he's got three massive companies to run, so he's not going to be all that available.
00:54:04.700 But he's got a pretty
00:54:05.800 good track record of choosing
00:54:08.100 the right companies
00:54:08.780 and making them work.
00:54:10.240 Oh, you know, I'm not saying
00:54:11.620 he can't make Twitter better.
00:54:14.140 I mean, maybe he could make Twitter
00:54:15.620 a more profitable business
00:54:18.160 than it is now.
00:54:19.100 Now, will he make it
00:54:20.120 profitable enough
00:54:21.100 so it'll be worth
00:54:22.180 the forty four billion
00:54:23.260 that he's paying for it?
00:54:24.880 I have no idea.
00:54:25.640 But my only point
00:54:27.640 is that that's irrelevant
00:54:29.460 to Tesla.
00:54:31.700 Tesla stock.
So you're saying, OK, wait, before we get to that, because we're going to dumb it down.
00:54:35.760 So you're not saying
00:54:36.880 that he's not going to improve
00:54:38.040 it and find efficiencies
00:54:38.960 and make the company
00:54:39.720 more valuable.
00:54:40.420 You're not saying
00:54:41.220 he can't do that.
00:54:42.020 You're saying
00:54:42.480 your concerns are about
00:54:44.700 the financing of this deal
and, I mean, I think how it could drive down the Twitter value right from the start.
00:54:51.280 OK, is that correct?
00:54:52.400 Well, the Twitter value.
00:54:53.220 No, the Twitter value
00:54:54.000 is irrelevant to anybody
00:54:55.100 but Elon Musk
00:54:55.940 and his partners
00:54:56.540 because the current shareholders
00:54:57.780 are going to be gone, right?
00:54:59.120 They're going to get
00:54:59.720 cashed out of the deal
00:55:01.040 and they and they walk away.
00:55:03.520 Now, can Elon Musk
00:55:04.900 make Twitter
00:55:05.840 a more valuable company
00:55:07.320 than it is today?
00:55:08.580 Probably.
00:55:09.540 But can he make it
00:55:10.340 valuable enough
00:55:11.140 to be worth
00:55:11.680 the forty four billion
00:55:12.560 he's paying?
00:55:13.360 That's a much
00:55:14.220 more difficult task
00:55:15.960 because I think
00:55:16.420 he's overpaying
00:55:17.500 for the company anyway
00:55:18.820 based on the assumption
00:55:20.240 that he could make
00:55:21.060 it far more valuable.
00:55:21.980 So there's a lot of,
00:55:23.120 you know, ifs there.
00:55:24.240 But again, that's
00:55:25.060 it's Elon Musk.
00:55:25.860 He can take a chance.
00:55:26.880 That's his money
00:55:27.480 and his partners.
00:55:28.720 What I'm talking about,
00:55:29.960 again, is the vulnerable position
00:55:31.700 that this puts Tesla
00:55:33.700 in as a company
00:55:35.140 when you've got somebody
00:55:36.900 with a twelve
00:55:37.480 and a half billion dollar
00:55:38.440 margin loan effectively
00:55:39.680 against a huge position
00:55:41.780 in Tesla.
00:55:43.160 And Tesla is a very
00:55:44.660 expensive stock.
00:55:46.180 The momentum could break.
00:55:47.480 The price could collapse.
00:55:48.920 Look what's happening
00:55:49.720 to so many of these
00:55:50.520 other momentum stocks.
00:55:51.940 You know, I think Tesla
00:55:53.080 was the original meme stock
00:55:54.500 before GameStop
00:55:55.940 and AMC.
00:55:57.260 Tesla really was
00:55:58.240 a meme stock.
00:55:59.180 It was all about Elon Musk
00:56:00.460 and his personality
00:56:01.720 more than the actual
00:56:02.940 profitability of the company.
00:56:04.360 But anything can happen
00:56:05.760 to change that momentum.
00:56:07.640 Look what just happened
00:56:08.360 to Robinhood.
00:56:09.080 You know, Robinhood
00:56:09.640 is down 90 percent.
You know, that app was really popular on social media during the COVID lockdowns.
00:56:16.120 This thing has imploded.
00:56:17.540 Now they're laying off
00:56:18.420 a good percentage
00:56:19.300 of their workforce.
00:56:20.780 But one stumble,
00:56:22.500 one missed earnings.
00:56:23.800 You know, Tesla's
00:56:24.680 most recent earnings
00:56:25.580 were better
00:56:26.160 and the stock went up.
00:56:27.200 But they're one
00:56:28.400 earnings announcement
00:56:29.520 away from a potential crash.
00:56:31.980 And having a guy
00:56:33.380 short
00:56:34.180 or basically leveraged
00:56:35.680 long
00:56:36.220 in this big position
00:56:37.860 is very vulnerable
00:56:39.040 because if the price
00:56:40.560 falls enough,
00:56:41.900 the covenants
00:56:42.600 in that loan
00:56:43.440 are going to require
00:56:44.600 the stock to be sold
00:56:45.980 regardless of the market price.
He's going to be forced
00:56:49.560 to sell that stock.
00:56:51.180 And since the price
00:56:52.260 has gone down,
00:56:53.260 he's going to have to sell
00:56:54.080 a lot more shares
00:56:55.060 to cover what he owes.
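The forced-selling mechanics Schiff is describing can be made concrete with some rough arithmetic. The $12.5 billion loan figure is from the conversation; the share prices below are purely hypothetical:

```python
# Hypothetical illustration of Schiff's margin-loan point: if shares are
# pledged against a fixed loan balance and the price falls, repaying the
# loan by selling stock requires selling far more shares.
loan = 12.5e9  # the roughly $12.5 billion margin loan mentioned above

price_now = 900.0         # hypothetical Tesla price today
price_after_drop = 300.0  # hypothetical price after a Netflix-style collapse

shares_to_sell_now = loan / price_now
shares_to_sell_later = loan / price_after_drop

print(f"Shares sold at ${price_now:.0f}: {shares_to_sell_now / 1e6:.1f} million")
print(f"Shares sold at ${price_after_drop:.0f}: {shares_to_sell_later / 1e6:.1f} million")
# At one-third the price, covering the same loan takes three times the shares,
# and dumping that much stock into a weak market pressures the price further.
```

That feedback loop, a lower price forcing larger sales which push the price lower still, is why a margin loan against a volatile stock is riskier than an outright sale at today's price.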
00:56:57.380 And why would that be bad
00:56:58.500 for Tesla?
00:56:59.820 Well, it's just bad
00:57:00.760 for the stock price,
00:57:01.620 you know,
00:57:01.780 whether it's bad
00:57:02.380 for the underlying company.
00:57:03.660 I mean,
00:57:04.120 you know,
00:57:04.960 that is a different story.
00:57:06.780 I'm talking about
00:57:07.280 the stock price
00:57:08.240 of Tesla
00:57:09.220 and people
00:57:10.260 who may own Tesla
00:57:11.420 right now
00:57:12.120 who think the price
00:57:13.320 is going to go up.
00:57:14.180 there is a tremendous risk
00:57:15.840 that it goes way down
00:57:17.180 and the fact
00:57:18.000 that Elon Musk
00:57:18.920 has just borrowed
00:57:19.880 so much money
00:57:20.640 or is thinking about it.
00:57:22.180 I mean,
00:57:22.400 it may not happen
00:57:23.360 because the deal
00:57:24.180 may not happen.
00:57:25.460 And if the price
00:57:26.680 of Tesla stock
00:57:27.920 drops enough
00:57:28.780 between now
00:57:29.940 and the deal,
00:57:30.720 maybe that alone
00:57:31.660 will call off the deal
00:57:32.600 because,
00:57:33.320 you know,
00:57:33.960 Musk might be concerned
00:57:35.440 about the loss
00:57:37.180 in value of Tesla.
00:57:38.660 So we'll see.
00:57:39.600 Again,
00:57:39.900 you know,
00:57:40.100 the market is pricing
00:57:41.340 in a significant risk
00:57:42.560 that this deal
00:57:43.260 does not happen.
00:57:45.060 You had predicted
00:57:46.620 that there would be
00:57:47.820 a drop
00:57:48.220 in the Tesla stock
00:57:49.760 and indeed
00:57:50.460 we have seen that
00:57:51.360 this week
00:57:52.220 someplace between
00:57:53.260 7% and 10%
00:57:54.480 it fell
00:57:55.100 after the announcement
00:57:56.560 of this deal
00:57:57.560 but how confident
00:57:59.660 are we
00:58:00.320 that that is
00:58:01.260 attributable
00:58:02.060 to his announcement
00:58:02.900 about Twitter?
00:58:03.580 Here's what the papers
00:58:05.240 tell me.
00:58:05.680 Bloomberg says,
00:58:06.560 you know,
00:58:07.640 it happened amid
00:58:08.480 a broader sell-off
00:58:09.520 in equity markets
00:58:10.380 around the world
00:58:11.140 due to slower
00:58:11.800 economic expansion,
00:58:13.200 persistent inflation,
00:58:14.160 and then
00:58:16.380 Barron's says
00:58:17.460 the broader stock
00:58:18.200 market's plunge
00:58:19.040 can be blamed
00:58:19.760 for at least part
00:58:20.680 of the decline
00:58:22.100 in the Tesla stock.
00:58:23.680 So how much
00:58:25.740 of a factor
00:58:26.340 was his announcement
00:58:27.460 on the Twitter deal
00:58:28.680 in what we've seen
00:58:29.780 happen to Tesla
00:58:30.440 this week?
00:58:31.600 The stock.
00:58:31.880 Well,
00:58:32.100 there's certainly
00:58:32.600 something to be said
00:58:33.540 for the fact
00:58:34.040 that it might have
00:58:34.620 dropped anyway
00:58:35.300 but it was down
00:58:36.780 11-12% yesterday
00:58:38.780 the day following
00:58:39.640 the announcement.
00:58:40.760 That was one of
00:58:41.200 the biggest declines
00:58:42.240 in any one stock.
00:58:44.120 I think you had
00:58:44.580 a 10% drop
00:58:45.600 in GE
00:58:46.180 but GE actually
00:58:47.800 had earnings
00:58:48.420 that they missed on.
00:58:49.900 I mean,
00:58:50.020 there was no news,
00:58:50.980 no bad news
00:58:51.900 was released on Tesla.
00:58:53.520 So the only news
00:58:54.620 was this deal
00:58:55.520 and yes,
00:58:56.500 the overall market
00:58:57.320 was weak
00:58:57.960 but not nearly
00:58:58.700 as weak as Tesla.
00:59:00.160 So I don't think
00:59:00.980 you can just dismiss
00:59:01.940 this transaction
00:59:02.940 and say,
00:59:03.440 well,
00:59:03.800 you know,
00:59:03.960 this huge drop
00:59:05.320 has nothing to do
00:59:06.360 with this Twitter deal.
00:59:07.860 I think it has
00:59:08.500 a lot to do
00:59:09.700 with this Twitter deal
00:59:10.480 and I think the stock
00:59:11.720 is going to continue
00:59:12.420 to be under pressure
00:59:13.380 as long as this deal
00:59:14.740 is out there.
00:59:15.340 Now,
00:59:15.440 if the deal falls through
00:59:16.480 and it doesn't happen
00:59:17.600 then you could see
00:59:18.740 a big rise
00:59:19.520 in Tesla
00:59:20.700 because I think
00:59:21.220 that would be good news
00:59:22.260 that it's not going
00:59:22.880 to happen
00:59:23.260 but,
00:59:23.920 you know,
00:59:24.140 if it's going to happen
00:59:25.840 and it does happen
00:59:27.520 it is going to be
00:59:28.620 a significant negative
00:59:29.700 that is going to weigh
00:59:30.500 on that share price
00:59:31.440 and is reason enough
00:59:32.500 not to buy the stock.
00:59:33.980 I mean,
00:59:34.080 I don't own it anyway.
00:59:35.240 I mean,
00:59:35.400 so even if there was
00:59:36.320 no deal,
00:59:37.300 you know,
00:59:37.560 this is much too expensive
00:59:38.760 a stock for me to buy.
00:59:40.480 I don't buy stocks that way.
00:59:42.580 I'm looking for value
00:59:43.780 and,
00:59:44.500 you know,
00:59:44.960 I don't see it in Tesla.
00:59:46.660 I see a lot of competition coming.
00:59:48.800 In fact,
00:59:49.040 I mentioned Netflix.
00:59:50.560 I mean,
00:59:50.720 Netflix and Tesla
00:59:51.780 are very similar.
00:59:53.140 If you look at
00:59:54.260 why Netflix
00:59:55.360 became so expensive,
00:59:56.940 it innovated,
00:59:57.840 it had a new idea,
00:59:58.940 the streaming service
00:59:59.800 and it grew very quickly.
01:00:01.680 The share price
01:00:02.500 went up very quickly
01:00:03.420 but all of a sudden
01:00:04.560 a lot of people
01:00:05.480 emulated it.
01:00:06.400 Now it's got
01:00:06.880 all sorts of competition
01:00:07.940 from major studios
01:00:09.600 and television networks
01:00:11.100 and internet companies
01:00:13.180 that it didn't have before.
01:00:14.340 So it's a crowded field.
01:00:15.820 Meanwhile,
01:00:16.200 their customer base
01:00:17.120 is,
01:00:17.720 you know,
01:00:17.980 bombarded
01:00:18.880 with competition
01:00:20.500 but they also have
01:00:22.060 higher food prices,
01:00:23.520 higher energy prices,
01:00:24.700 higher rent.
01:00:25.560 How much money do they have
01:00:26.800 for all these streaming services?
01:00:28.420 They don't
01:00:28.940 and so they're having to cut back
01:00:30.260 and I think the same thing
01:00:31.700 could happen with Tesla.
01:00:33.020 Tesla,
01:00:33.480 you know,
01:00:33.700 has had the market
01:00:34.400 to itself.
01:00:35.780 It shows that there's
01:00:36.560 a lot of demand
01:00:37.240 for electric cars.
01:00:38.780 Well,
01:00:39.040 what do you know?
01:00:39.760 A lot of the major
01:00:40.700 automobile companies
01:00:41.720 all around the world
01:00:42.460 are now coming up
01:00:43.380 with electric cars
01:00:44.140 of their own
01:00:44.720 to compete with Tesla.
01:00:46.620 Meanwhile,
01:00:46.980 the cost of building
01:00:47.960 these cars
01:00:48.580 is skyrocketing
01:00:49.560 because of the raw material costs.
01:00:51.220 So there's more competition.
01:00:53.500 It's more expensive
01:00:54.160 to make them.
01:00:55.120 I mean,
01:00:55.400 their market share
01:00:56.120 is going to be
01:00:56.940 competed away
01:00:58.200 and,
01:00:59.120 you know,
01:00:59.720 the valuation
01:01:00.520 could suffer.
01:01:01.540 Investors
01:01:01.940 are going to start
01:01:02.980 to look at this
01:01:03.700 from a more realistic
01:01:05.080 perspective
01:01:05.660 and the valuation
01:01:07.340 is going to come down.
01:01:08.420 The problem is now
01:01:09.240 when the valuation
01:01:09.860 comes down,
01:01:10.900 now Musk could be forced
01:01:11.960 to sell his stock.
01:01:13.380 Who's going to buy that stock
01:01:14.600 and at what price
01:01:15.620 under those circumstances?
01:01:16.940 That is the risk.
01:01:18.020 I mean,
01:01:18.180 both for Elon Musk
01:01:19.360 and for any shareholders
01:01:21.000 in Tesla.
01:01:23.100 All right.
01:01:23.780 So to dumb it way down,
01:01:26.100 if we want to know
01:01:27.280 whether the Twitter deal
01:01:28.300 is actually going to go through,
01:01:29.900 a smart thing to watch
01:01:31.460 might be
01:01:32.180 the price of Tesla shares.
01:01:35.140 Or the price of Twitter.
01:01:36.680 I mean,
01:01:36.960 the deal price for Twitter,
01:01:38.440 I think,
01:01:38.760 is $54.20.
01:01:40.500 And Twitter is trading at $48.
01:01:42.800 So obviously,
01:01:43.780 if the deal goes through,
01:01:44.980 you have to buy,
01:01:45.560 just buy,
01:01:46.020 you can buy it now at $48
01:01:47.220 and you make $7.
01:01:48.420 I mean,
01:01:49.020 it's a good return
01:01:50.120 in potentially
01:01:51.100 a short period of time.
01:01:52.640 But the question is,
01:01:53.760 why is the stock trading
01:01:55.100 at $48
01:01:55.740 when Musk is going to buy it
01:01:58.380 at $54?
01:01:59.860 It's because
01:02:00.560 it's not a certainty.
01:02:01.860 Because the risk is,
01:02:03.020 you buy it at $48
01:02:04.080 hoping to make $7
01:02:05.320 and the deal
01:02:06.540 is called off
01:02:07.800 and it goes from $48
01:02:08.960 to $28
01:02:09.500 and you lose $20.
01:02:10.600 Right?
01:02:10.780 So there's risk there.
01:02:12.460 So if you look at that price,
01:02:14.460 if the price of Twitter
01:02:15.760 in the market
01:02:16.540 starts to trade
01:02:17.720 a lot closer
01:02:18.480 to that deal price,
01:02:19.760 then that's an indication
01:02:20.940 that at least the market
01:02:21.980 has more confidence
01:02:23.360 that the deal
01:02:23.980 is going to go through.
01:02:25.160 That is negative for Tesla.
01:02:27.040 And I would imagine
01:02:27.980 that the closer
01:02:28.880 the price of Twitter
01:02:30.580 gets to the deal price,
01:02:32.000 the lower the price
01:02:33.240 of Tesla is going to be.
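The spread arithmetic in this exchange can be made explicit. Using the numbers from the conversation ($54.20 deal price, roughly $48 market price) together with Schiff's hypothetical $28 no-deal price, a simple risk-neutral sketch — ignoring time value and risk premia, which a real merger-arb desk would not — backs out the completion probability the market is implying:

```python
def implied_deal_probability(market: float, deal: float,
                             fallback: float) -> float:
    """Solve market = p * deal + (1 - p) * fallback for p.

    A deliberate simplification: discounting and risk premium are
    ignored, so this is only a rough gauge of market confidence.
    """
    return (market - fallback) / (deal - fallback)

p = implied_deal_probability(48.0, 54.20, 28.0)
upside = 54.20 - 48.0    # roughly $6 per share if the deal closes
downside = 48.0 - 28.0   # $20 per share in the hypothetical no-deal scenario
```

On these numbers the market is pricing in roughly a three-in-four chance of completion; as the market price creeps toward $54.20, the implied probability approaches 1, which is the signal described above.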
01:02:34.720 Okay.
01:02:35.400 All right.
01:02:35.840 Now,
01:02:36.060 one more subject
01:02:36.580 before I let you go.
01:02:37.940 Joe Biden
01:02:38.440 is getting ready.
01:02:39.420 It looks like,
01:02:40.840 I mean,
01:02:41.060 it's stunning.
01:02:41.860 I can hardly believe
01:02:43.680 that he's actually
01:02:44.700 going to do it.
01:02:46.020 But to forgive,
01:02:48.360 quote,
01:02:48.720 forgive,
01:02:49.960 I don't know how much
01:02:51.680 in student loans.
01:02:53.540 And it's incredibly immoral.
01:02:56.660 I mean,
01:02:56.900 it's incredibly immoral
01:02:57.720 to say those people
01:02:59.000 who paid off their loans,
01:03:00.640 they're suckers.
01:03:01.800 They're a bunch of losers
01:03:02.700 because the federal government
01:03:04.120 is going to swoop in now
01:03:05.140 and pay the loans
01:03:05.800 of those people
01:03:06.260 who refuse to pay,
01:03:07.500 who didn't pay their debt.
01:03:08.900 Not to mention,
01:03:09.860 it's a huge wealth transfer,
01:03:11.140 right?
01:03:11.360 And it's a wealth transfer
01:03:12.200 from the working class
01:03:13.200 and the middle class
01:03:13.920 to the upper class,
01:03:15.320 the people who went to college,
01:03:16.540 people who,
01:03:17.120 all the other people
01:03:17.740 who said,
01:03:18.100 you know what,
01:03:18.320 I can't afford college.
01:03:19.420 I can't afford it
01:03:20.160 and I don't want to put myself
01:03:20.880 in debt,
01:03:21.200 so I'm not doing it.
01:03:21.740 Those people,
01:03:22.420 guess what?
01:03:22.900 You didn't get to go to college
01:03:23.840 and now you're going to have
01:03:24.640 to pay for the college
01:03:25.340 of the people who chose to go.
01:03:27.100 Yeah.
01:03:27.500 Look,
01:03:27.660 this is another example
01:03:28.920 of two wrongs
01:03:30.340 not making a right.
01:03:32.020 The government
01:03:32.560 never should have got
01:03:33.700 into the student loan business
01:03:35.420 in the first place.
01:03:36.380 The only reason
01:03:37.320 that college is so expensive
01:03:39.280 and so many people
01:03:40.360 now have so much debt
01:03:41.440 is because of government.
01:03:43.040 Before government
01:03:43.660 got involved
01:03:44.380 in guaranteed student loans
01:03:46.100 and then direct student loans,
01:03:47.620 college was not expensive.
01:03:48.960 Yeah.
01:03:49.920 You know,
01:03:50.620 if your parents were poor
01:03:52.120 or middle class
01:03:53.580 and they couldn't afford
01:03:54.660 to send you to college,
01:03:55.620 you worked your way
01:03:56.640 through college.
01:03:57.180 You got a summer job,
01:03:58.160 you waited tables,
01:03:58.960 no problem.
01:03:59.960 You graduated debt-free
01:04:01.600 and if your parents
01:04:02.760 were even upper middle class
01:04:04.240 or upper class,
01:04:05.440 no problem.
01:04:06.100 They covered your cost of college.
01:04:07.960 College was inexpensive.
01:04:09.320 But once the government
01:04:10.260 started providing students
01:04:11.780 with all this money,
01:04:13.000 that's when the universities
01:04:14.060 really took advantage
01:04:15.040 of the students
01:04:15.680 and the government
01:04:16.240 to jack up prices
01:04:17.540 because you no longer
01:04:18.360 had any competition.
01:04:20.360 Students no longer cared
01:04:21.540 how much college cost
01:04:22.820 because they were getting
01:04:23.820 the money from the government.
01:04:24.880 They were getting these loans
01:04:25.800 and without the government guarantees,
01:04:28.100 the banks wouldn't
01:04:28.660 have loaned the money.
01:04:29.360 And so if the students
01:04:31.360 didn't have the money,
01:04:32.240 then the colleges
01:04:33.000 couldn't have charged so much.
01:04:34.400 So now you have a situation
01:04:35.820 where because of government,
01:04:37.160 everybody has all this debt
01:04:38.320 and their solution is,
01:04:40.260 well, let's just forgive the debt.
01:04:41.760 That is an even bigger moral hazard
01:04:44.040 because what you're now telling people
01:04:45.660 is nobody should pay for college.
01:04:47.920 You're an idiot now.
01:04:49.260 Even if you can afford it,
01:04:50.560 just borrow the money
01:04:51.620 because it's going to get forgiven.
01:04:53.540 I mean, you'd have to be
01:04:54.140 a complete idiot
01:04:54.860 to pay for college.
01:04:56.260 And now if you thought
01:04:57.800 college was expensive before,
01:04:59.360 wait till you see
01:05:00.360 how much more expensive
01:05:01.320 it's going to get
01:05:02.080 once they start forgiving the loans
01:05:03.920 because now colleges can say,
01:05:06.140 look, we're going to charge you
01:05:07.640 $100,000 for this tuition.
01:05:10.020 But who cares?
01:05:11.400 Borrow the money.
01:05:12.080 You're never going to have to pay it back.
01:05:13.420 It's all going to get forgiven anyway.
01:05:15.120 In fact, if you come to our university,
01:05:17.900 we'll just throw in a free car.
01:05:19.800 You know, we'll just wrap that up
01:05:21.100 and throw it in with your student loans
01:05:22.440 because, you know,
01:05:23.080 you're not going to have to pay for it.
01:05:24.620 You know, I mean,
01:05:25.360 this is all a grab bag.
01:05:27.580 Government pays for it.
01:05:28.740 It's going to be a disaster
01:05:30.340 if they actually forgive it.
01:05:32.300 I mean, what they should do,
01:05:33.280 I feel badly for a lot of these students
01:05:35.660 that have all this debt
01:05:36.740 because of the government.
01:05:37.900 What I would do personally,
01:05:40.220 I would like to forgive the debt,
01:05:42.240 but only if you get the government
01:05:44.080 out of the student loan business forever.
01:05:46.720 No more government student loans,
01:05:48.460 no more guaranteed student loans.
01:05:51.000 That way you'd avoid the moral hazard.
01:05:53.200 And you force these universities
01:05:55.680 to cut costs to lower their tuition
01:05:58.440 because otherwise they'll have no customers
01:06:00.300 because the only reason
01:06:01.320 they can charge so much now
01:06:02.680 is because of these government loans.
01:06:04.340 So the government gets out of the way.
01:06:05.960 The free market is going to lower tuition.
01:06:08.760 It's going to cost a lot less.
01:06:10.040 Now, fewer people will go to college
01:06:11.680 because right now,
01:06:12.460 because the government makes it so easy,
01:06:14.280 a lot of people
01:06:15.080 who barely graduated high school
01:06:16.760 go to college
01:06:18.020 and they take remedial math,
01:06:19.580 remedial English.
01:06:20.260 They're wasting their time.
01:06:21.800 The degrees are worthless.
01:06:23.180 They study, you know,
01:06:24.000 underwater basket weaving.
01:06:25.700 You know, we have a lot of kids
01:06:27.040 that don't belong in college.
01:06:28.480 They're only there
01:06:29.040 because the government
01:06:29.740 makes it available and pays for it.
01:06:32.640 They need to be learning trades.
01:06:34.180 They need to be learning skills.
01:06:35.580 The last thing they need to do
01:06:36.720 is waste five or six years,
01:06:38.680 you know, in college.
01:06:41.400 But that's not, unfortunately,
01:06:43.780 where we're headed.
01:06:45.220 You know, they're likely
01:06:46.680 to forgive these loans.
01:06:48.680 And, you know,
01:06:50.040 they're just going to do
01:06:50.660 even more damage
01:06:51.740 than what they've already done.
01:06:54.920 And the other factor
01:06:56.040 that even if you forgive these loans,
01:06:58.740 you have to recognize
01:06:59.820 that there's a cost to it.
01:07:01.800 How are you going to pay for it?
01:07:03.360 Because forgiving these loans,
01:07:05.080 if you tell all these college students,
01:07:07.060 you don't have to pay your loans back.
01:07:09.400 I mean, what that means
01:07:11.240 is like the government
01:07:12.080 is giving up all this income
01:07:13.800 that they were getting.
01:07:15.520 Up to $1.7 trillion.
01:07:16.200 How are they going to make up the difference?
01:07:17.500 Are they going to raise taxes
01:07:18.880 on the middle class
01:07:20.000 to cover the cost?
01:07:21.620 Right?
01:07:21.960 Of course.
01:07:22.560 How else would they?
01:07:23.380 That's their go-to.
01:07:24.720 No.
01:07:25.380 No.
01:07:25.760 They're just going to print more money.
01:07:27.400 Because all this,
01:07:28.000 because can you imagine
01:07:28.920 if you tell these students
01:07:30.500 that have student debt right now,
01:07:32.380 you don't have to pay the debt back.
01:07:35.140 Now the students have more money
01:07:36.520 to go out and buy more stuff.
01:07:38.340 They can buy more cars.
01:07:39.800 They can travel more.
01:07:41.240 They have more spending money.
01:07:42.560 Well, that's more inflation.
01:07:43.860 They're going to push up prices
01:07:44.960 because the government
01:07:45.980 is not going to reduce its spending
01:07:49.080 even though it's reduced its revenue.
01:07:51.640 So it's going to require,
01:07:52.740 it's going to mean
01:07:53.240 bigger budget deficits,
01:07:54.640 more money printing,
01:07:55.880 more inflation.
01:07:57.320 That's how the public
01:07:58.360 is going to pay
01:07:59.160 for student loan forgiveness.
01:08:00.840 They're not going to see
01:08:01.700 their taxes go up.
01:08:03.000 They're just going to see
01:08:03.840 prices go up more.
01:08:05.200 And now, of course,
01:08:05.740 the government's going to blame it
01:08:06.660 on Putin
01:08:07.180 or they're going to blame it
01:08:08.120 on the pandemic
01:08:09.240 or everybody,
01:08:09.960 anything else.
01:08:10.480 But it's government.
01:08:12.100 It's this deficit spending.
01:08:13.440 It's money printing.
01:08:14.760 And if the government says,
01:08:16.680 we're not going to take that revenue,
01:08:18.240 we're going to forgive
01:08:18.980 the student loans,
01:08:19.860 then they're going to have
01:08:21.260 even bigger deficits.
01:08:22.520 And again, as I said,
01:08:24.020 more people are going to borrow
01:08:25.660 more money to go to college.
01:08:27.220 The colleges are going to raise
01:08:28.500 tuitions even higher.
01:08:30.020 So nobody's going to pay.
01:08:31.940 Everybody's going to take on debt
01:08:33.260 knowing that the government
01:08:34.300 is going to wipe out the debt.
01:08:35.700 And that's, you know,
01:08:36.460 even bigger budget deficits,
01:08:38.560 even more inflation.
01:08:39.440 You know, this is insanity.
01:08:41.340 It's insanity.
01:08:42.160 It's, of course,
01:08:43.280 politically driven.
01:08:44.300 His numbers are in the tank
01:08:45.640 with young people right now.
01:08:47.260 He's lost only 20 points
01:08:48.580 off of his approval rating.
01:08:49.720 And so even while
01:08:50.880 President Biden himself
01:08:52.540 has expressed skepticism
01:08:54.280 about doing this
01:08:55.660 for some of the reasons
01:08:56.360 you outlined,
01:08:57.240 he's now reportedly
01:08:58.580 getting ready to do it
01:08:59.920 because of politics.
01:09:01.820 It's disgusting.
01:09:03.060 It's irresponsible.
01:09:04.420 Yeah.
01:09:04.520 Tax and print is no way
01:09:06.400 to run a federal
01:09:07.360 monetary policy.
01:09:08.320 Megyn, this is the problem
01:09:10.000 with democracy.
01:09:11.040 This is why we have student loans
01:09:12.520 in the first place
01:09:13.240 because the government,
01:09:15.000 see, when they lowered
01:09:15.900 the voting age down to 18,
01:09:18.240 the cost of college
01:09:19.440 became a big issue
01:09:20.380 because now you have
01:09:21.060 all these 18-year-olds
01:09:22.040 that are voting
01:09:22.580 and they're going to college.
01:09:23.840 And so how did
01:09:24.840 the Democratic Party
01:09:25.940 want to lock up their votes?
01:09:27.740 Hey, we'll promise
01:09:28.580 to pay for your college.
01:09:29.760 So, hey, you don't have
01:09:30.540 to have a summer job.
01:09:31.600 You don't have to wait tables
01:09:32.520 over the summer.
01:09:33.420 Go to the beach.
01:09:34.180 Have fun.
01:09:35.000 We'll make it easy
01:09:35.760 for you to borrow money.
01:09:36.980 And so that's what they did.
01:09:38.540 And, you know,
01:09:39.140 you get into bed
01:09:39.820 with the government.
01:09:40.580 Eventually, you know,
01:09:41.460 you know what's going
01:09:41.920 to happen to you.
01:09:42.540 And that's exactly
01:09:43.240 what happened.
01:09:44.180 And that's what
01:09:44.760 they're doing again.
01:09:45.640 Obviously, if Biden is saying,
01:09:48.180 hey, vote for me
01:09:49.000 and I'm going to forgive
01:09:50.320 your $50,000,
01:09:51.400 $100,000 of student loans,
01:09:53.300 yeah, that guy
01:09:54.120 is going to vote for you.
01:09:54.880 That gal is going
01:09:55.380 to vote for you
01:09:55.840 because you're giving them
01:09:57.160 this huge prize.
01:09:58.600 It's like a bribe.
01:09:59.860 It's like you're bribing
01:10:00.700 people for their votes.
01:10:02.280 But the people
01:10:03.220 who are paying for it
01:10:04.480 don't understand
01:10:05.280 that they're paying for it.
01:10:06.240 That's the problem
01:10:07.040 because the beneficiaries
01:10:08.500 know what they're going to get.
01:10:10.320 That's the seen, right?
01:10:11.940 The seen benefit of this
01:10:13.320 is, oh, I'm a student.
01:10:14.840 I don't have to repay the money.
01:10:16.320 All the unseen consequences
01:10:17.920 are diffused.
01:10:19.400 All the voters
01:10:19.940 who are going
01:10:20.400 to end up paying for it
01:10:21.520 don't realize
01:10:22.360 how much it's going to cost.
01:10:23.560 And so Biden
01:10:25.000 doesn't lose their votes.
01:10:26.640 He gains the votes
01:10:27.580 of the people who benefit,
01:10:29.040 but he doesn't necessarily
01:10:30.080 lose the votes of the people
01:10:31.500 who have to suffer
01:10:32.600 and bear the cost
01:10:33.620 because they don't even
01:10:34.480 understand that they're
01:10:35.400 paying for it.
01:10:37.180 As you say,
01:10:37.520 you get in bed
01:10:38.100 with the government.
01:10:38.760 This is the scene.
01:10:39.560 This is the scene
01:10:40.180 in that movie
01:10:40.680 where you wake up
01:10:41.560 with a horse head
01:10:42.240 in the bottom of your bed.
01:10:44.020 Don't do it.
01:10:45.320 Peter Schiff,
01:10:45.880 always interesting.
01:10:46.640 Thank you so much.
01:10:47.700 Up next,
01:10:48.460 tech entrepreneur
01:10:49.180 and author of Woke Inc.
01:10:51.300 Vivek Ramaswamy is here.
01:10:52.860 He's the one who coined
01:10:54.200 that they're smuggling
01:10:55.680 in their content
01:10:57.060 or their viewpoint discrimination
01:10:58.380 through their hate speech policies.
01:11:01.940 Very interesting theory
01:11:02.800 and he's got a lot of thoughts
01:11:03.980 on what this means for America.
01:11:06.340 That's next.
01:11:10.560 Tech entrepreneur
01:11:11.520 and author of the book
01:11:12.760 Woke Inc.,
01:11:14.020 Vivek Ramaswamy
01:11:14.820 joins me now
01:11:15.460 to explain
01:11:16.080 how Elon Musk
01:11:18.120 can liberate Twitter successfully.
01:11:21.600 Welcome back to the show, Vivek.
01:11:22.780 Great to see you.
01:11:23.600 Good to see you, Megyn.
01:11:24.140 How are you?
01:11:24.920 Good.
01:11:25.680 So all show long,
01:11:27.320 we've been talking about
01:11:28.020 something you wrote
01:11:28.900 in a piece for
01:11:29.560 The Wall Street Journal
01:11:30.240 on April 26th
01:11:31.660 and it was a piece called
01:11:33.120 How Elon Musk
01:11:34.060 Can Liberate Twitter,
01:11:35.560 you and our mutual friend
01:11:36.960 Jed Rubenfeld
01:11:37.580 of Yale Law School.
01:11:39.300 He's married to Tiger Mom,
01:11:40.360 Amy Chua,
01:11:40.860 who's amazing too.
01:11:41.740 Long story.
01:11:42.360 Okay, anyway,
01:11:43.700 I love the way you put it
01:11:45.780 so I stole it
01:11:46.680 for purposes of this show
01:11:47.820 with attribution,
01:11:48.820 which means it's not stealing.
01:11:50.960 You write Twitter.
01:11:51.820 Well, thank you.
01:11:52.420 You don't have to do
01:11:53.140 the attribution even next time.
01:11:54.320 Just spread the ideas
01:11:55.420 and I'm happy.
01:11:56.240 I'll take it.
01:11:57.140 You write Twitter
01:11:58.140 and others smuggle
01:12:00.220 viewpoint discrimination
01:12:02.000 into supposedly neutral
01:12:04.460 content moderation categories,
01:12:06.740 primarily misinformation,
01:12:07.980 incitement,
01:12:08.380 and hate speech.
01:12:09.420 Right.
01:12:10.180 So your point is that
01:12:11.380 all they do
01:12:12.300 is viewpoint discrimination
01:12:13.320 and there's a reason
01:12:14.520 that it always goes
01:12:15.740 against one side
01:12:16.680 and they don't want
01:12:18.420 to own up to it.
01:12:19.480 So they just label it
01:12:21.340 as, oh,
01:12:22.000 that's pursuant
01:12:22.660 to our policy
01:12:23.400 against hate speech,
01:12:24.840 against misinformation,
01:12:26.420 against incitement
01:12:27.760 so they can still
01:12:28.680 look like the good guys.
01:12:30.220 Yeah, I think
01:12:30.560 you've got that right, Megyn.
01:12:31.920 It runs just one layer
01:12:33.280 a little bit deeper
01:12:34.160 that allows them
01:12:35.600 to get away with it,
01:12:36.480 which is the classic
01:12:37.700 modern problem
01:12:38.560 of taking a political decision
01:12:40.260 but wrapping it
01:12:41.480 in the veneer
01:12:42.260 of technocracy.
01:12:43.500 Okay, so here's the thing.
01:12:45.220 Much of what they do
01:12:46.200 is viewpoint-based discrimination,
01:12:47.680 but some of it
01:12:49.180 is actually
01:12:49.800 the kind of thing
01:12:50.920 that you have to do
01:12:51.980 in order to just operate
01:12:53.220 a functional social media site.
01:12:54.740 So what I wanted to do
01:12:55.420 in the piece,
01:12:56.100 along with Jed,
01:12:56.660 was to clarify
01:12:57.820 what these different categories are
01:12:59.300 so we could smoke
01:13:00.140 the difference out.
01:13:01.380 What do I mean?
01:13:01.920 Even Elon Musk
01:13:02.760 has said that
01:13:03.400 he wants to remove
01:13:04.380 things like spam
01:13:05.960 that litters his feed.
01:13:07.180 I think that's a good thing.
01:13:08.300 It'll improve the user experience
01:13:09.720 for Twitter users.
01:13:10.840 If you didn't have
01:13:11.720 some level of content moderation,
01:13:13.120 all of the feeds
01:13:13.940 would be filled by porn,
01:13:15.440 by just unpleasant
01:13:17.520 content,
01:13:18.040 commercial spam,
01:13:18.780 the kinds of bot-driven content.
01:13:20.960 So you need to be able
01:13:21.920 to clean that up
01:13:22.520 even though that's
01:13:23.200 constitutionally protected
01:13:24.460 while still operating
01:13:26.400 as a free speech platform.
01:13:27.460 So that's the conundrum
01:13:28.180 that Elon Musk faces.
01:13:29.540 So what Twitter's been doing
01:13:30.380 for a long time
01:13:31.140 is that they've been
01:13:32.300 wrapping in
01:13:33.440 viewpoint-based discrimination
01:13:35.720 by smuggling it
01:13:36.860 into the categories
01:13:37.680 of the supposedly
01:13:38.820 neutral content moderation.
01:13:40.520 And so my first suggestion
01:13:41.580 to Elon Musk
01:13:42.180 is to make sure
01:13:43.040 that he smokes out
01:13:43.900 the different categories
01:13:44.860 and becomes really clear
01:13:46.400 about what's what
01:13:47.240 because there's a very clear answer
01:13:48.380 about what to do
01:13:48.980 with each of them.
01:13:49.760 Okay, first is genuinely
01:13:51.500 false speech
01:13:52.820 like commercially
01:13:53.500 fraudulent advertising.
01:13:55.260 Well, there's an answer
01:13:56.500 for false speech.
01:13:57.600 You have to prove
01:13:58.260 that it's false,
01:13:58.940 but if you do,
01:13:59.440 you can take it down,
01:14:00.360 but you shouldn't be able
01:14:00.960 to take it down
01:14:01.560 without actual proof of falsity.
01:14:03.420 With respect to incitement,
01:14:05.080 it's a private company,
01:14:06.280 but actually the legal precedents
01:14:07.320 give us a good standard
01:14:08.160 that Elon Musk can use.
01:14:09.440 If there's really going
01:14:10.460 to be an imminent risk
01:14:11.780 of, say, bodily harm
01:14:13.300 or unlawful activity
01:14:14.340 and it meets a Brandenburg
01:14:15.620 versus Ohio test,
01:14:16.900 this is time-tested stuff.
01:14:17.980 It's been around
01:14:18.340 for a long time.
01:14:19.020 We don't have to reinvent the wheel.
01:14:20.300 If it meets that test
01:14:21.400 of imminent harm,
01:14:22.700 take it down,
01:14:23.240 but it can't just meet
01:14:24.380 the test of what
01:14:25.040 a given Twitter employee
01:14:26.160 thinks on a given day
01:14:27.220 could theoretically lead
01:14:28.520 to violence in the future
01:14:29.320 because that includes everything.
01:14:30.920 And then there's
01:14:31.260 that third category
01:14:32.040 of hate speech.
01:14:33.320 And I think this is the category
01:14:34.400 that Elon Musk has
01:14:35.220 to get rid of altogether,
01:14:36.240 where any society
01:14:37.880 that permits free speech
01:14:38.920 has to acknowledge
01:14:39.760 that there's no such thing
01:14:41.460 as a false opinion
01:14:42.600 and there's no such thing
01:14:43.840 as an opinion
01:14:44.600 that isn't allowed
01:14:45.500 in the marketplace of ideas,
01:14:46.920 even if that's
01:14:47.640 an odious opinion,
01:14:49.020 even if that's an opinion
01:14:49.820 that hurts some people,
01:14:51.220 that's part of what it means
01:14:52.200 to live in a society
01:14:53.240 that protects free speech.
01:14:54.760 That's the principle
01:14:55.380 that Elon Musk
01:14:56.000 should bring to running Twitter
01:14:56.980 in recognizing that hate speech
01:14:58.600 is in a different category.
01:15:00.320 And then Megyn, like,
01:15:01.020 look, I mean,
01:15:01.480 do a lot of people
01:15:02.100 want to see racial epithets online?
01:15:03.800 No.
01:15:04.600 Are racial epithets
01:15:05.340 a form of hate speech
01:15:06.440 that express an opinion?
01:15:07.900 Absolutely.
01:15:08.480 So how do you resolve
01:15:09.100 that conundrum?
01:15:10.020 Are you just going to drive
01:15:10.820 all of your users away?
01:15:12.340 And this is where I come up
01:15:13.080 with a really simple solution
01:15:14.300 along with Jed
01:15:14.880 that we printed
01:15:15.680 in the pages of the Journal today,
01:15:17.020 which was that actually
01:15:18.500 the thing that you got
01:15:19.420 to be able to do
01:15:20.160 is give users the choice.
01:15:22.780 Twitter has this censorship regime.
01:15:24.540 If you really like it that much,
01:15:25.960 let the user opt into it
01:15:27.620 or let the user opt into an algorithm
01:15:30.200 that can show them content
01:15:31.580 that they do
01:15:32.140 and don't want to see,
01:15:33.440 but don't make those decisions centrally
01:15:35.460 because that's what turns this
01:15:36.540 from a free speech,
01:15:37.720 what could have been
01:15:38.180 a free speech platform
01:15:39.080 into a platform
01:15:40.280 for really a centralized form
01:15:41.800 of political censorship
01:15:43.060 and related indoctrination.
01:15:44.820 So that's the view at a high level.
01:15:47.220 I like the plan.
01:15:48.100 However, I see some potential pitfalls
01:15:50.120 with the hate speech policy, right?
01:15:51.600 Because it's,
01:15:52.660 if you were to allow,
01:15:54.180 like what could be said?
01:15:55.540 So if pursuant to those guidelines,
01:15:57.180 you could say,
01:15:58.620 let's make it somebody,
01:16:00.180 a fictional character
01:16:00.960 because I don't,
01:16:02.000 you know, whatever,
01:16:02.880 you know,
01:16:03.340 Joe, Jane Schmo,
01:16:04.980 Jane Doe.
01:16:06.060 Jane Doe is an effing C word
01:16:09.480 whose face should be bashed in
01:16:11.600 and relatives should be harassed.
01:16:14.940 And then they retweet it
01:16:16.340 and they retweet it
01:16:17.080 and they like it,
01:16:17.560 they like it,
01:16:17.920 they like it,
01:16:18.340 and they retweet it.
01:16:19.260 And now it's trending.
01:16:21.240 Jane Doe is an effing C word
01:16:23.420 and all that stuff.
01:16:24.740 It's not incitement.
01:16:26.040 It doesn't call for imminent violence
01:16:27.360 that's likely to happen.
01:16:28.540 It's definitely hateful.
01:16:30.580 It's got what anybody
01:16:32.260 would consider slurs,
01:16:33.720 but if we're not going to ban them,
01:16:35.060 then it should be okay.
01:16:36.940 But it's going to create a shitstorm
01:16:38.560 in Jane Doe's life,
01:16:39.760 probably, right?
01:16:40.560 Because it's going to circulate.
01:16:41.640 It's going to circulate.
01:16:42.640 So what about something like that?
01:16:44.940 Yeah.
01:16:45.100 So just for the purpose of the example,
01:16:46.660 Megyn,
01:16:46.900 at least for first discussing,
01:16:47.980 are you okay with dropping
01:16:49.280 the should have her face punched in part,
01:16:51.340 but all of the other odious stuff
01:16:52.600 still included effing C word,
01:16:53.980 all that.
01:16:54.760 You okay to start with that one first?
01:16:56.600 Like, I don't know.
01:16:57.400 I'm still working it out.
01:16:58.520 You want to leave that in there.
01:16:59.440 You want to leave that in there.
01:17:00.200 Okay.
01:17:01.040 Because you and I agree
01:17:02.200 that the government
01:17:03.220 could not ban that.
01:17:04.160 That would be a constitutionally
01:17:05.260 protected statement.
01:17:05.400 The government could not ban that.
01:17:06.260 Yeah.
01:17:06.460 The government could,
01:17:07.860 well, let's agree with that.
01:17:09.040 So for the sake of the example,
01:17:10.560 let's just agree
01:17:11.420 that that does not count
01:17:12.500 as an incitement
01:17:14.160 to imminent bodily harm.
01:17:15.560 I agree.
01:17:15.840 Okay.
01:17:16.000 Let's just put that in that category.
01:17:17.260 Then let's talk about it.
01:17:17.940 Fine.
01:17:18.700 Because when that goes viral,
01:17:19.720 some people may be
01:17:20.320 on the other side of that.
01:17:21.040 And I don't want to distract
01:17:21.760 from that from the main issue,
01:17:22.720 which is just,
01:17:23.140 there's a really odious thing
01:17:24.060 you're saying about Jane
01:17:24.640 that's going to make her life worse.
01:17:26.020 That's not going to result
01:17:27.080 in her suffering
01:17:28.280 from physical violence,
01:17:29.160 but is going to make
01:17:29.820 her life worse off.
01:17:31.280 And that's going to trend virally.
01:17:32.560 What do we say about that?
01:17:33.840 Yeah.
01:17:33.940 So here's what I say about it.
01:17:35.860 Most users don't want to see
01:17:37.580 that kind of content on Twitter.
01:17:39.220 They will have the opportunity
01:17:40.340 to opt out of it.
01:17:41.680 Many of them will be able
01:17:42.640 to opt out of it
01:17:43.400 by saying that they want to opt out
01:17:44.600 by opting back in
01:17:46.020 to Twitter's pre-existing regime,
01:17:47.620 which would have that kind
01:17:48.320 of content taken down.
01:17:49.840 And others,
01:17:50.620 through their own user behavior,
01:17:52.240 will actually be able
01:17:52.940 to train simple
01:17:53.820 artificial intelligence (AI) algorithms
01:17:56.220 that Twitter uses today
01:17:57.700 to effectively show people
01:17:59.380 the kind of content
01:17:59.960 they want to see.
01:18:01.120 That's the other proposal
01:18:02.060 that we make.
01:18:02.920 But at the end of the day,
01:18:03.820 if there are people
01:18:04.580 who really want to opt in
01:18:06.300 to see the kind of content
01:18:07.720 that you just described,
01:18:09.260 odious content,
01:18:10.380 hateful content,
01:18:11.440 content that doesn't make me
01:18:12.880 feel better in my course
01:18:14.280 of living my day
01:18:15.000 when I see that.
01:18:16.060 So I'm not going to go
01:18:16.680 search it out.
01:18:17.220 But let's suppose
01:18:17.960 there are the kinds of people
01:18:19.180 out there who want
01:18:19.860 to search that out
01:18:20.500 and say, you know what?
01:18:21.500 That's what people say
01:18:22.200 in the real world
01:18:22.720 and I want to see it unfiltered.
01:18:24.380 Then I say that is part
01:18:25.600 of what living
01:18:26.240 in a free speech culture
01:18:27.480 actually means.
01:18:28.420 We support free speech
01:18:29.720 in this country
01:18:30.340 not because it is not harmful.
01:18:32.660 Free speech
01:18:33.220 has harm.
01:18:34.400 This was the same debate
01:18:35.440 in 1787
01:18:36.200 as it is today.
01:18:37.900 But we accept
01:18:38.780 free speech
01:18:39.640 and we embrace free speech
01:18:40.840 not because we believe
01:18:41.760 it does not cause harm
01:18:42.680 but because we believe
01:18:44.080 the right way
01:18:44.900 for working out
01:18:45.580 our disagreements
01:18:46.320 is through more speech
01:18:47.840 rather than less speech
01:18:49.080 and through ultimately
01:18:50.360 a free exchange of ideas
01:18:51.680 that democracy
01:18:52.380 and the advancement
01:18:53.040 of truth
01:18:53.640 ultimately depends on.
01:18:55.500 So push comes to shove.
01:18:57.860 There's a lot of reasons
01:18:59.120 why I think the problem
01:18:59.780 isn't as big as
01:19:00.480 you might be concerned
01:19:01.440 about there
01:19:01.860 but when the rubber
01:19:02.940 hits the road
01:19:03.440 yes, I come down
01:19:04.360 on the side
01:19:04.820 that somebody
01:19:05.300 should be allowed
01:19:05.880 to say that
01:19:06.460 and that users
01:19:07.940 shouldn't be forced
01:19:08.980 to listen to it.
01:19:10.180 They should be able
01:19:10.780 to turn that off
01:19:11.740 should be able
01:19:12.480 to turn their settings
01:19:13.280 such that they don't see that
01:19:14.300 should be able
01:19:14.900 to opt into Twitter's
01:19:16.020 existing paradigm
01:19:17.000 which wouldn't allow
01:19:17.660 them to see that
01:19:18.300 but for those
01:19:19.140 who do want to see it
01:19:19.980 they should be free
01:19:20.480 to see it
01:19:20.900 and for those
01:19:21.460 who do want to say it
01:19:22.300 they should be free
01:19:22.880 to say it.
01:19:23.300 I feel like
01:19:24.700 it's just
01:19:25.580 what bothers me
01:19:26.940 about the way
01:19:27.460 Twitter operates
01:19:28.260 right now
01:19:28.740 is such a statement
01:19:30.160 like that
01:19:30.540 would probably be censored
01:19:31.620 under the present day rules
01:19:33.320 though 100%
01:19:35.380 I've been subjected
01:19:36.180 to that on Twitter
01:19:36.800 for years
01:19:37.920 and all you need to do
01:19:39.060 is go back
01:19:39.480 and look at the tweets
01:19:40.480 at me after the Trump debate
01:19:41.540 to see that.
01:19:42.180 I mean it was
01:19:42.700 absolutely vicious
01:19:43.640 and one of the top
01:19:46.960 executives of Twitter
01:19:47.880 actually told me
01:19:49.140 personally
01:19:49.540 that what happened
01:19:50.460 to me
01:19:50.760 was one of the reasons
01:19:51.560 they started to think about
01:19:52.660 should we be moderating
01:19:54.060 some of this content more
01:19:55.240 however
01:19:57.740 however
01:19:58.680 it doesn't make me
01:20:01.260 in favor of these
01:20:02.020 moderation rules
01:20:03.060 it doesn't make me
01:20:04.040 say we should
01:20:04.960 we should crack down
01:20:05.920 on speech more on Twitter
01:20:06.900 because what's happened
01:20:08.500 is they've just chosen
01:20:10.200 their favorite
01:20:10.800 left-wing causes
01:20:11.640 and they crack down on
01:20:13.300 you know
01:20:13.500 back to the viewpoint thing
01:20:14.380 they crack down on the people
01:20:15.440 who touch those
01:20:17.120 sacred cows
01:20:18.140 but
01:20:19.160 so Twitter didn't like
01:20:20.760 what the Trump supporters
01:20:21.700 were saying about me
01:20:22.580 but they were 100%
01:20:24.160 with all of the liberals
01:20:25.840 on TikTok
01:20:26.680 calling me a racist
01:20:27.880 after I left NBC
01:20:29.540 for trying to get
01:20:30.740 into a discussion
01:20:31.340 about how
01:20:32.000 society's acceptance
01:20:33.220 of blackface
01:20:34.020 has changed
01:20:34.620 over the past 40 years
01:20:35.620 they loved it
01:20:36.800 Twitter amplified it
01:20:37.940 they let everybody
01:20:38.660 go for it
01:20:39.660 right
01:20:39.860 no problem
01:20:40.520 they had no problem
01:20:41.420 with that
01:20:41.600 so I don't
01:20:43.040 I have very mixed feelings
01:20:44.020 about this content
01:20:44.620 moderation thing
01:20:45.500 because I think
01:20:45.960 it's just going to be
01:20:46.620 employed against one side
01:20:48.240 if we try to say
01:20:49.040 oh the balance of decency
01:20:50.480 would prevent this
01:20:51.980 well
01:20:52.820 yeah
01:20:53.240 it hasn't worked out
01:20:54.860 so well
01:20:55.200 let's go to the harder question
01:20:57.420 the easier question is
01:20:58.320 is Twitter hypocritical
01:20:59.460 in the way they apply
01:21:00.300 those standards
01:21:00.820 in a partisan way
01:21:01.860 absolutely they are
01:21:03.200 the examples are countless
01:21:04.960 Jed and I pointed out
01:21:06.200 several in our article
01:21:07.280 of the way that
01:21:08.020 certain of President Trump
01:21:09.020 Trump's tweets
01:21:09.800 were taken down
01:21:10.540 when certain other
01:21:11.500 more clear
01:21:12.120 can I read that
01:21:12.360 that's a good paragraph
01:21:13.280 let me just read that
01:21:14.100 Vivek
01:21:14.400 and I'll let you finish
01:21:15.800 your point
01:21:16.020 because that's
01:21:16.420 I wanted the audience
01:21:17.320 to hear this
01:21:17.760 you write
01:21:18.500 conservative opinions
01:21:19.380 about transgenderism
01:21:20.380 are censored as attacks
01:21:21.560 on a protected group
01:21:23.040 conservative views on COVID
01:21:24.460 are flagged as misinformation
01:21:25.420 in May 2020
01:21:26.840 Twitter censored
01:21:27.980 as a quote
01:21:28.580 glorification of violence
01:21:29.860 President Trump's
01:21:31.100 quote
01:21:31.600 when the looting starts
01:21:32.560 the shooting starts
01:21:33.480 tweet
01:21:33.860 while leaving untouched
01:21:35.560 Ayatollah Ali Khamenei's
01:21:37.360 tweets
01:21:37.700 calling for the destruction
01:21:39.060 of Israel
01:21:39.580 and Colin Kaepernick's
01:21:41.080 tweets supporting
01:21:41.800 the burning
01:21:42.380 of police precinct houses
01:21:43.880 claims
01:21:45.080 that the Democrats
01:21:46.240 stole the presidency
01:21:47.300 in 2020
01:21:48.180 are censored
01:21:48.940 while claims
01:21:49.900 that Russia
01:21:50.700 did the same
01:21:51.500 in 2016
01:21:52.320 go untouched
01:21:54.100 and of course
01:21:55.000 the truthful
01:21:55.440 Hunter Biden
01:21:56.040 laptop story
01:21:56.760 was suppressed
01:21:57.520 as misinformation
01:21:58.180 so such a good point
01:21:59.640 about the Russians
01:22:00.340 versus
01:22:01.140 the Democrats
01:22:02.740 right
01:22:03.120 such a good point
01:22:03.760 and this is an op-ed
01:22:05.500 so I can't
01:22:06.180 we could just go
01:22:06.700 we could write
01:22:07.200 an entire essay
01:22:08.420 on just the countless
01:22:09.600 list of hypocritical
01:22:10.720 examples that reveal
01:22:11.640 what's actually
01:22:12.300 happening here
01:22:12.960 is the enforcement
01:22:13.960 of a partisan agenda
01:22:15.060 rather than
01:22:16.280 a complicated
01:22:17.340 adjudication
01:22:18.240 of what content
01:22:19.200 moderation means
01:22:20.120 but that's the easy part
01:22:21.440 I still think
01:22:22.280 the question
01:22:22.680 you were getting at
01:22:23.240 before is
01:22:23.820 the harder question
01:22:24.780 which is interesting
01:22:25.440 to adjudicate
01:22:26.340 and talk about
01:22:26.940 let's just set
01:22:28.280 the rules of the road
01:22:28.980 and assume that
01:22:29.580 right wing people
01:22:30.240 run Twitter in the future
01:22:31.200 and then left wing people
01:22:32.020 run it in the future
01:22:32.580 kind of like our
01:22:33.120 federal government works
01:22:33.880 I would like to see
01:22:35.220 a platform in which
01:22:36.420 the rules of the road
01:22:37.700 and the kind of content
01:22:38.920 that makes it through
01:22:39.880 to everyday users
01:22:40.860 shouldn't vary
01:22:41.960 as a function
01:22:42.940 of who's actually
01:22:43.840 in charge
01:22:44.300 or the politics
01:22:44.880 of the person
01:22:45.540 who's in charge
01:22:46.480 and I guess
01:22:48.080 to pick up
01:22:48.980 the strand
01:22:49.360 of our last conversation
01:22:50.020 I'd love to hear
01:22:50.660 your thoughts
01:22:51.160 on this, Megyn:
01:22:51.500 what objection
01:22:53.160 would you have
01:22:54.200 you know
01:22:54.920 in having the position
01:22:55.700 that at least
01:22:56.040 you were taking
01:22:56.500 what objection
01:22:57.040 would you have
01:22:57.320 to the idea
01:22:57.680 that the user
01:22:58.380 can decide
01:22:59.080 for himself
01:22:59.500 or herself
01:23:00.060 whether or not
01:23:01.040 they want to see it
01:23:01.860 right
01:23:02.400 in a way
01:23:04.220 they already can
01:23:04.760 because you get
01:23:05.240 to choose
01:23:05.560 who you follow
01:23:06.140 and unfollow
01:23:07.040 so if you
01:23:08.200 you know
01:23:08.600 I actually have
01:23:09.500 a very funny story
01:23:10.200 about this
01:23:10.600 Vivek
01:23:10.960 you have a very funny
01:23:11.840 story about this
01:23:12.260 so when I was at Fox News
01:23:13.600 there was some account
01:23:14.780 and it tweeted out
01:23:16.120 some like thought of the day
01:23:17.260 it was like some
01:23:17.900 introspective
01:23:18.720 self-help type thing
01:23:19.760 I'm like oh
01:23:20.140 this sounds like
01:23:20.660 a good account
01:23:21.120 I'll follow them
01:23:21.820 and then
01:23:23.840 I was getting
01:23:25.560 into the elevator
01:23:26.360 at Fox News
01:23:27.060 and they tweeted
01:23:29.400 the same site
01:23:30.240 with which I was
01:23:30.960 becoming familiar
01:23:31.800 tweeted
01:23:32.640 and I quote
01:23:33.680 forgive me
01:23:34.200 for going there
01:23:34.840 the shocking truth
01:23:36.580 about anal sex
01:23:38.060 I was like
01:23:39.720 okay I need to
01:23:40.980 unfollow
01:23:41.440 I'm gonna need
01:23:43.300 to unfollow
01:23:43.820 and I literally
01:23:44.620 went to hit
01:23:45.600 unfollow
01:23:46.400 but instead
01:23:47.720 I accidentally
01:23:48.880 hit like
01:23:50.100 and I was getting
01:23:52.920 into the elevator
01:23:53.780 and so I couldn't
01:23:54.700 like unlike
01:23:55.340 it was too late
01:23:56.160 I was in the elevator
01:23:56.840 I was like oh my god
01:23:57.780 you know I was like
01:23:58.380 I just liked this tweet
01:24:01.020 and I didn't mean
01:24:01.520 to like this tweet
01:24:02.120 and thank god
01:24:03.880 by the grace of god
01:24:04.740 the you know
01:24:05.420 the black box
01:24:06.160 that is the elevator
01:24:06.880 didn't allow
01:24:08.120 the like
01:24:08.660 the mistake
01:24:09.660 oh is that right
01:24:10.380 okay that's good
01:24:10.880 to go through
01:24:11.300 so I dodged
01:24:12.040 my point is
01:24:13.700 you might have
01:24:14.020 had some unfollows too
01:24:14.980 and people would be
01:24:15.900 a little confused
01:24:16.440 about your new tastes
01:24:17.760 in content
01:24:18.640 could have been a story
01:24:19.780 about that
01:24:20.220 and my point is
01:24:21.240 we do have some control
01:24:22.080 we have control now
01:24:23.180 yeah we have some control now
01:24:24.680 I mean I think
01:24:25.320 that in a certain sense
01:24:26.400 you didn't have the control
01:24:27.240 to say that
01:24:28.120 there's a person
01:24:29.040 who you decided to follow
01:24:29.900 but within the people
01:24:30.700 you follow
01:24:31.100 are people who follow
01:24:31.800 President Trump
01:24:32.460 or people who follow
01:24:33.160 Ayanna Pressley
01:24:33.820 who says a lot of things
01:24:35.060 that may be closer
01:24:35.980 to incitement to violence
01:24:36.780 than what President Trump
01:24:37.520 has said in the past
01:24:37.940 it doesn't matter
01:24:38.700 you can even have it
01:24:39.460 on a content specific basis
01:24:41.440 you don't control
01:24:42.380 what someone else retweets
01:24:43.520 people don't necessarily
01:24:44.720 agree with everything
01:24:46.000 they retweet
01:24:46.540 but might put it out there
01:24:47.400 for people to see
01:24:48.060 if you don't want that
01:24:49.280 to disrupt
01:24:49.880 the flow of your day
01:24:51.200 or your experience
01:24:52.520 or your user experience
01:24:53.900 of the platform
01:24:54.540 you can
01:24:55.380 A
01:24:55.840 set your current settings
01:24:57.280 to be exactly
01:24:57.960 what it is with Twitter today
01:24:59.060 this would be
01:24:59.380 on my proposal to Elon
01:25:00.420 at least
01:25:00.720 leave that intact
01:25:01.500 don't tear that down
01:25:02.740 just create
01:25:03.820 a parallel experience
01:25:04.780 that allows people
01:25:05.420 to see what's
01:25:05.840 on the other side
01:25:06.340 of the veil
01:25:06.800 and even when you
01:25:08.000 create that parallel experience
01:25:09.120 you can use
01:25:09.960 your own behavior
01:25:10.900 to further train
01:25:11.980 what's exposed to you
01:25:13.360 and not
01:25:13.860 but that is different
01:25:15.140 from saying
01:25:15.720 that certain
01:25:16.340 a certain person
01:25:17.060 who's expressing
01:25:17.700 a viewpoint
01:25:18.280 however odious
01:25:19.520 that viewpoint was
01:25:20.780 about Jane Doe
01:25:22.220 that is still a viewpoint
01:25:23.680 that in our society
01:25:25.380 there's no such thing
01:25:26.660 as a false opinion
01:25:28.620 there are false facts
01:25:29.400 there are no false opinions
01:25:30.340 but when there are no false opinions
01:25:32.020 even if it's an odious opinion
01:25:33.360 it's one that a society
01:25:34.820 that protects free speech
01:25:35.860 has to be committed
01:25:36.740 to protecting
01:25:37.760 and I thought Elon
01:25:38.240 did a good job of this
01:25:39.140 you know
01:25:39.600 it's easy for him to say
01:25:40.620 but I thought he did
01:25:41.520 a good job of it nonetheless
01:25:42.240 to say that he believes
01:25:43.600 that the people
01:25:44.020 who are every bit as critical
01:25:45.220 harshly critical of him
01:25:46.240 should have a platform
01:25:47.200 on Twitter
01:25:47.620 just as he does
01:25:48.920 that's the culture
01:25:50.260 of free speech
01:25:50.960 on which this country
01:25:51.700 was built
01:25:52.120 it is a culture
01:25:52.920 that involves trade-offs
01:25:54.320 people have to remember
01:25:55.960 the free speech advocates
01:25:56.780 included
01:25:57.200 the world of free speech
01:25:59.420 is not a rose-colored
01:26:00.780 hunky-dory world
01:26:02.420 it is a challenging world
01:26:03.960 that exposes us
01:26:04.920 to many trade-offs
01:26:06.380 of the experiences
01:26:07.420 we face
01:26:08.100 the risks we take
01:26:09.040 by allowing people
01:26:09.780 to say what they want
01:26:10.800 and express the opinions
01:26:12.160 that they want to express
01:26:13.200 that has costs
01:26:15.060 but starting in 1787
01:26:16.880 and by my book
01:26:18.020 every bit
01:26:18.420 holding true in 2022
01:26:19.680 those are the trade-offs
01:26:21.260 we accept
01:26:22.040 in return
01:26:22.980 for having an open
01:26:24.400 marketplace of ideas
01:26:25.360 that allows the best ideas
01:26:26.680 to win
01:26:27.140 that allows a democracy
01:26:28.360 to thrive
01:26:28.960 that allows the pursuit
01:26:30.120 of truth
01:26:30.720 to actually achieve
01:26:32.000 its goal
01:26:32.540 of adjudicating truth
01:26:33.720 because the thing is
01:26:34.480 if we have a policy
01:26:35.580 that says
01:26:35.960 you can't say the thing
01:26:36.820 about Jane Doe
01:26:37.500 like and you know
01:26:38.520 we should be bashing
01:26:39.820 in the faces of her
01:26:40.680 friends and relatives
01:26:41.780 or of her
01:26:42.380 if we ban that
01:26:44.360 then you're going
01:26:45.860 to open the door
01:26:46.780 to you know
01:26:47.720 the trans people saying
01:26:49.040 I you put me at risk
01:26:51.160 and my family at risk
01:26:52.320 when you say
01:26:52.960 I'm actually not a woman
01:26:54.340 right like that's
01:26:55.420 what they claim now
01:26:56.180 this is harm
01:26:56.920 you do violence to me
01:26:57.960 you threaten me
01:26:59.340 you you know
01:27:00.400 you're you're challenging
01:27:01.380 my very right to exist
01:27:02.720 and that's how they get
01:27:04.200 these sweeping bans
01:27:04.920 they take something
01:27:05.560 that seems objectively
01:27:06.360 reasonable
01:27:06.900 like you shouldn't be
01:27:07.980 calling for the bashing
01:27:08.860 in of somebody's face
01:27:09.780 and say
01:27:10.900 that's exactly the same
01:27:12.260 as you saying
01:27:14.020 I don't accept
01:27:15.160 your change
01:27:16.140 your claim
01:27:16.680 that you can change
01:27:17.500 biological sexes
01:27:18.420 or you can change genders
01:27:19.520 so it does become
01:27:21.100 a slippery slope
01:27:22.040 you almost want to
01:27:22.800 kind of just go with
01:27:23.640 what's
01:27:24.240 what it wouldn't be okay
01:27:25.620 for the federal government
01:27:26.420 to ban
01:27:26.920 Twitter can't ban
01:27:28.240 you know
01:27:28.600 I liked your original piece
01:27:29.940 with Jed
01:27:30.360 you know
01:27:30.920 a year plus ago
01:27:31.780 that we talked about
01:27:32.340 on our show
01:27:32.740 which is just treat them
01:27:33.500 like they're
01:27:34.240 like they're
01:27:34.680 essentially government
01:27:35.820 agencies
01:27:36.620 and they have to
01:27:37.540 follow the same rules
01:27:38.280 and then we have
01:27:38.860 jurisprudence
01:27:39.420 that we can just follow
01:27:40.100 to figure out
01:27:40.600 what's okay
01:27:40.980 and what's not
01:27:41.620 yeah
01:27:42.300 and that was an assumption
01:27:43.100 baked into this piece
01:27:44.020 that Elon would not be
01:27:45.320 a steward for the government
01:27:46.780 because historically
01:27:47.740 what's been happening
01:27:48.500 with these platforms
01:27:49.240 is that they're
01:27:49.900 absolutely deputized
01:27:51.120 by the party in power
01:27:52.240 in the United States
01:27:53.100 now the Democratic Party
01:27:53.900 to do through the back door
01:27:55.180 what government could not
01:27:56.120 do through the front door
01:27:57.220 and since nowadays
01:27:58.420 we have to describe
01:27:59.020 everything through a rhyme
01:27:59.980 the way I describe
01:28:00.780 this one is
01:28:01.340 you know
01:28:01.960 if the
01:28:03.100 what do you say
01:28:04.540 if it's state action
01:28:05.440 in disguise
01:28:06.100 the constitution
01:28:07.080 still applies
01:28:07.960 that's the principle there
01:28:09.180 actually
01:28:09.480 if you're really acting
01:28:10.520 on behalf of the state
01:28:11.280 then you're bound
01:28:12.020 by the constitution
01:28:12.660 that was what we published
01:28:13.580 in January 2021
01:28:14.320 now
01:28:15.320 this piece
01:28:16.620 was based on the assumption
01:28:17.440 that Elon Musk
01:28:18.260 is not going to be acting
01:28:19.320 as a pawn
01:28:20.060 for the ruling Democratic Party
01:28:21.600 in the United States
01:28:22.200 but he still has
01:28:23.120 some irreducible challenges
01:28:24.720 for how you make it
01:28:25.700 a user-friendly platform
01:28:26.900 that people want to use
01:28:27.700 while still protecting
01:28:28.660 free speech
01:28:29.180 this was sort of
01:28:29.860 a how-to guide
01:28:30.620 to accomplish that goal
01:28:32.420 but exactly
01:28:33.500 the more fundamental
01:28:34.540 first principle
01:28:35.320 it should go without saying
01:28:36.780 but maybe it's worth saying
01:28:37.760 nonetheless
01:28:38.040 is that the first principle
01:28:39.320 is that Elon Musk
01:28:40.360 should be crystal clear
01:28:41.680 that he will not be
01:28:42.840 censoring content
01:28:43.620 on behalf of the state
01:28:44.860 and Megyn
01:28:45.640 it's worth remembering
01:28:46.400 I mean just think about
01:28:46.960 the context of what's going on
01:28:47.820 in the world right now
01:28:48.520 look at what's happening
01:28:49.240 in Russia
01:28:49.560 okay
01:28:49.980 this is in the news
01:28:51.180 there's a major war
01:28:51.780 playing out in Europe
01:28:53.040 a whole separate topic
01:28:54.120 we could talk about
01:28:55.100 another day
01:28:55.680 but let's think about
01:28:57.360 what Vladimir Putin
01:28:58.040 is doing in Russia
01:28:58.700 he has banned the BBC
01:29:00.120 he has banned
01:29:01.340 all kinds of misinformation
01:29:02.880 and as the state
01:29:04.600 the party in power
01:29:05.540 transparently saying
01:29:06.600 that there's misinformation
01:29:07.780 about the war in Russia
01:29:09.480 coming from the
01:29:10.420 toxic western media
01:29:11.900 his people aren't allowed
01:29:13.240 to see it
01:29:13.640 that is what an autocrat does
01:29:14.980 that is what gives us
01:29:16.240 the moral standing
01:29:17.080 to be on the other side
01:29:18.580 of that geopolitical struggle
01:29:20.340 but I worry about
01:29:21.880 living in a society
01:29:22.780 where actually
01:29:23.440 what we're doing here
01:29:24.700 in the name of
01:29:25.480 taking down hate speech
01:29:26.300 and misinformation
01:29:26.860 as decided
01:29:28.140 or at least influenced
01:29:29.260 by the party in power
01:29:30.200 isn't that much better
01:29:31.740 in some ways
01:29:33.060 it might even be worse
01:29:34.480 because at least
01:29:34.960 Vladimir Putin
01:29:35.640 bans it directly
01:29:36.700 whereas what we have here
01:29:38.140 in the United States
01:29:39.100 is a government
01:29:40.040 that pretends not to do that
01:29:41.260 but actually influences
01:29:42.620 private parties
01:29:43.340 through a combination
01:29:43.880 of threats
01:29:44.580 and inducements
01:29:45.720 and joint coordination
01:29:46.960 behind closed doors
01:29:47.920 to effectuate
01:29:48.920 a really similar outcome
01:29:50.420 and I hope
01:29:51.040 if there's one outcome
01:29:51.920 of this war in Ukraine
01:29:53.560 on the other side
01:29:54.560 of the world
01:29:55.860 it is that we still
01:29:57.320 look ourselves
01:29:58.040 in the mirror
01:29:58.540 to ask ourselves
01:29:59.380 what it is
01:29:59.920 that gives us
01:30:00.680 the moral authority
01:30:01.860 to stand up
01:30:02.760 to autocratic regimes
01:30:03.680 like Russia and China
01:30:04.940 and if we go through
01:30:05.920 this war ending
01:30:07.160 without us having gone
01:30:08.400 through that cycle
01:30:09.080 of introspection
01:30:10.140 I think it will be
01:30:11.140 a lost opportunity
01:30:11.980 for the rediscovery
01:30:12.980 of the few strands
01:30:14.520 of national identity
01:30:15.380 that still define
01:30:16.400 who it is that we are
01:30:17.700 as a people
01:30:18.300 it's not the color
01:30:19.460 of our skin
01:30:20.060 or even our shared heritage
01:30:21.380 because guess what
01:30:21.920 in this country
01:30:22.380 we don't have one
01:30:23.300 what we do have
01:30:24.260 is a set of shared values
01:30:25.460 and one of those values
01:30:26.760 was a commitment
01:30:27.880 to one side of the trade-off
01:30:29.360 that we set in motion
01:30:30.420 in that constitutional convention
01:30:31.520 of 1787
01:30:32.400 when we said
01:30:33.460 that we protect free speech
01:30:35.220 not because it is easy
01:30:36.580 but because it is worth preserving
01:30:38.840 that reminds me
01:30:41.020 of course we've had
01:30:41.700 the Biden administration
01:30:42.580 already say
01:30:43.020 they're concerned
01:30:43.960 they're concerned
01:30:44.800 about Elon's purchase
01:30:45.980 of Twitter
01:30:46.400 and this is
01:30:48.200 we have both Barack Obama
01:30:49.320 and Hillary Clinton
01:30:50.420 pushing for the online crackdown
01:30:52.520 on disinformation
01:30:53.580 Vivek
01:30:54.400 Hillary Clinton
01:30:56.180 she's
01:30:57.580 she's the
01:30:58.720 the biggest
01:30:59.480 and best purveyor
01:31:00.360 of disinformation
01:31:00.900 she's the one
01:31:01.560 who came up
01:31:01.940 with a whole
01:31:02.300 Russiagate lie
01:31:03.360 we know that now
01:31:04.940 it was repeated
01:31:06.500 by John Brennan
01:31:07.360 then head of the CIA
01:31:08.240 to Barack Obama
01:31:09.800 while he was
01:31:10.300 the sitting president
01:31:10.900 hey Hillary's
01:31:11.840 come up with this plan
01:31:13.140 it's a distraction
01:31:14.160 from her email scandal
01:31:15.380 she's going to say
01:31:16.900 that he has some ties
01:31:17.840 to Russia
01:31:18.380 to try to
01:31:19.180 undermine
01:31:19.980 the guy's candidacy
01:31:21.000 she
01:31:22.440 and Barack Obama
01:31:23.400 separately
01:31:23.800 now speaking out
01:31:24.480 about disinformation
01:31:25.340 online
01:31:25.780 pushing for the
01:31:26.440 for the Europeans
01:31:27.400 to crack down
01:31:28.680 on it
01:31:28.900 with their big tech
01:31:29.580 which they are
01:31:30.260 and Barack Obama
01:31:31.680 now making this
01:31:32.380 like the biggest platform
01:31:33.460 of his post-presidency push
01:31:34.860 that we need to crack down
01:31:35.840 on disinformation
01:31:36.460 what do you make of it
01:31:37.280 look this is a surprising
01:31:39.340 reversal of what used
01:31:40.560 to be a liberal concern
01:31:42.120 the idea that you might
01:31:43.520 have an Orwellian state
01:31:45.000 that suppressed
01:31:45.980 the free flow
01:31:46.700 of information
01:31:47.340 the idea of fighting
01:31:48.940 against that
01:31:49.480 was a liberal idea
01:31:50.880 whether this is a liberal idea
01:31:52.660 or a conservative idea
01:31:53.820 today I don't really
01:31:54.520 care about the label
01:31:55.300 but it is interesting
01:31:56.600 just to see how far
01:31:57.960 these so-called
01:31:58.680 progressive movements
01:31:59.680 in this country
01:32:00.580 have come
01:32:01.280 and I also love
01:32:02.400 the shift
01:32:03.280 and the easy
01:32:04.280 the easy sort of
01:32:05.500 let's just say
01:32:06.300 conflation
01:32:06.920 of disinformation
01:32:08.100 and misinformation
01:32:08.920 which is also something
01:32:10.400 that you say the words
01:32:11.200 enough times
01:32:11.780 people start to view
01:32:12.660 them the exact same way
01:32:13.440 kind of like equality
01:32:14.260 and equity
01:32:14.720 but that's a
01:32:15.680 that's a game of language
01:32:16.700 jujitsu
01:32:17.260 but you know
01:32:18.120 what I would say here
01:32:18.760 is just imagine
01:32:19.560 the narrative
01:32:20.040 without saying
01:32:20.640 that this is
01:32:21.020 in the United States
01:32:21.800 that you have
01:32:22.520 a party in power
01:32:23.860 in the United States
01:32:25.020 that adopts
01:32:26.600 a false report
01:32:28.340 from one political
01:32:29.160 candidate
01:32:29.600 to attempt
01:32:30.820 to go so far
01:32:32.000 as to plant
01:32:32.900 a mole
01:32:33.420 in the opposition party
01:32:35.260 and then when
01:32:36.100 that opposition party
01:32:37.020 takes over
01:32:37.680 and then when you
01:32:38.580 regain power
01:32:39.360 four years later
01:32:40.240 you're also working
01:32:41.420 behind the scenes
01:32:42.160 to threaten
01:32:42.660 private companies
01:32:43.500 to take down
01:32:44.720 information
01:32:45.340 that otherwise
01:32:46.380 could prevent
01:32:47.000 the opposition party
01:32:47.800 from getting elected
01:32:48.460 and to take down
01:32:49.500 misinformation
01:32:49.920 that's defined
01:32:51.320 by the party in power
01:32:52.260 and to take down
01:32:52.980 hate speech
01:32:53.500 that the party in power
01:32:54.340 doesn't like
01:32:55.080 this would be something
01:32:56.200 out of a George Orwell novel
01:32:57.420 maybe it will be something
01:32:58.500 out of modern day Russia
01:32:59.360 or China
01:32:59.900 and yet pull back the curtain
01:33:02.120 and ask which country
01:33:02.840 is actually doing that
01:33:03.400 that's a description
01:33:04.480 of modern day reality
01:33:05.620 here in the United States
01:33:07.060 of America
01:33:07.820 Megan
01:33:08.240 and I think that
01:33:08.880 that is chilling
01:33:09.840 I think it is chilling
01:33:10.940 and it is
01:33:11.700 am I disappointed
01:33:13.100 that it has to come
01:33:13.760 to an actor
01:33:14.260 in the private sector
01:33:15.320 or a set of actors
01:33:15.980 in the private sector
01:33:16.700 to be able to stand up
01:33:17.840 to this
01:33:18.080 which is fundamentally
01:33:18.660 a problem
01:33:19.120 with our civic culture
01:33:20.100 and even the state itself
01:33:21.300 yeah it's disappointing
01:33:22.400 that that's where we are
01:33:23.240 but at the end of the day
01:33:24.080 I think we're getting
01:33:24.760 pretty close
01:33:25.200 to the last line
01:33:26.020 of defense
01:33:26.500 on basic principles
01:33:28.300 like free speech
01:33:29.460 and open debate
01:33:30.040 is something that we preserve
01:33:31.040 and protect
01:33:31.860 from state intervention
01:33:33.020 or from state
01:33:34.040 corporate intervention
01:33:34.920 and there's by the way
01:33:35.620 a word for the merger
01:33:37.220 of state power
01:33:37.860 and corporate power
01:33:38.920 that is the definition
01:33:40.140 textbook definition
01:33:40.880 of fascism
01:33:41.480 I think that that's something
01:33:42.640 that we have to worry about
01:33:43.680 here on American soil
01:33:44.820 and as I hear these words
01:33:46.100 come out of my mouth
01:33:46.700 you know five years ago
01:33:48.020 I would have listened
01:33:49.400 to myself
01:33:49.780 and thought this sounds
01:33:50.360 like some kind
01:33:50.820 of conspiracy theorist
01:33:51.880 I wish this were just a conspiracy theory
01:33:53.320 but it's a reality
01:33:54.380 that I'm just describing
01:33:55.320 in plain language
01:33:56.080 it's not even hiding
01:33:57.100 in plain sight
01:33:57.660 it's lying
01:33:58.140 in plain sight
01:33:58.980 and I think that
01:33:59.680 that's part of what
01:34:00.740 makes a moment
01:34:01.380 like the takeover
01:34:02.580 of Twitter
01:34:02.980 and otherwise
01:34:03.660 you know
01:34:04.320 let's just say
01:34:04.900 mundane topic
01:34:06.000 for M&A takeover news
01:34:07.800 to have such
01:34:08.760 political importance
01:34:09.800 in where we are
01:34:10.600 as a culture today
01:34:11.420 and I do think
01:34:12.360 that's real
01:34:13.020 I think it's real
01:34:13.800 I need one word answer
01:34:15.080 because we're out of time
01:34:15.760 do you think the sale
01:34:16.680 will be completed
01:34:17.220 and Elon will buy it
01:34:18.260 yes
01:34:19.220 okay
01:34:20.380 all right now listen
01:34:21.420 Vivek has another book
01:34:22.500 coming out
01:34:22.840 it's called
01:34:23.120 Nation of Victims
01:34:23.920 it comes out later this year
01:34:24.960 you can pre-order it
01:34:25.920 right now
01:34:26.300 and you should
01:34:27.020 the subtitle is
01:34:27.680 Identity Politics
01:34:28.480 the Death of Merit
01:34:29.860 and the Path
01:34:31.000 Back to Excellence
01:34:32.380 hope you enjoyed
01:34:33.380 our in-depth discussion
01:34:34.720 of the Elon Musk
01:34:35.620 purchase of Twitter
01:34:36.640 which I agree
01:34:37.420 will happen
01:34:38.560 I want to tell you
01:34:39.700 that tomorrow
01:34:40.600 we've got independent
01:34:41.420 journalist Matt Taibbi
01:34:42.620 friend of the show
01:34:43.360 back with us
01:34:43.940 I'm really looking forward
01:34:44.960 to that
01:34:45.300 he speaks sense
01:34:46.900 I mean pretty much
01:34:48.540 reliably right
01:34:49.540 it's like you can't say
01:34:50.340 that about everything
01:34:50.960 in media
01:34:51.500 but Matt you can
01:34:52.880 and also tell you
01:34:54.280 that now is a great time
01:34:55.080 to download the show
01:34:55.800 if you haven't done so already
01:34:56.780 because tomorrow
01:34:57.860 we're going to drop
01:34:59.360 a special episode
01:35:01.320 just on the podcast feed
01:35:03.700 you know we do the show
01:35:04.380 live on Sirius
01:35:05.220 and we post on YouTube
01:35:06.400 and so on
01:35:06.880 we're doing something
01:35:07.700 special tomorrow
01:35:08.760 just on the podcast feed
01:35:10.900 you'll find out
01:35:11.840 what it is
01:35:12.380 if you subscribe
01:35:13.560 and then download
01:35:15.120 The Megyn Kelly Show
01:35:15.960 on Apple
01:35:16.500 Pandora
01:35:16.960 Spotify
01:35:17.400 Stitcher
01:35:18.180 wherever you get
01:35:18.740 your podcasts
01:35:19.320 for free
01:35:20.380 and we are looking
01:35:21.980 forward to bringing you
01:35:22.740 that content
01:35:23.520 you'll find out tomorrow
01:35:24.320 what it is
01:35:25.000 thanks for listening
01:35:27.740 to The Megyn Kelly Show
01:35:28.860 no BS
01:35:29.740 no agenda
01:35:30.500 and no fear