Verdict with Ted Cruz - May 29, 2020


Social Media Censorship


Episode Stats

Length

31 minutes

Words per Minute

172.9

Word Count

5,446

Sentence Count

365

Misogynist Sentences

5

Hate Speech Sentences

12


Summary

Ted Cruz has been fighting censorship of conservative voices on the internet for a long time. Now, he's joined by his good friend, Michael Knowles, to argue that tech companies should be held accountable for their censorship. Ted Cruz is a conservative firebrand who served as a Supreme Court clerk for Chief Justice William Rehnquist.


Transcript

00:00:00.000 This is an iHeart Podcast.
00:00:02.500 Guaranteed human.
00:00:04.480 President Trump signs an executive order
00:00:06.680 to stop big tech's censorship of conservatives.
00:00:11.240 Luckily, we can sit down with a guy
00:00:12.880 who has been fighting that very censorship for many years.
00:00:16.640 This is Verdict with Ted Cruz.
00:00:24.240 Welcome back to Verdict with Ted Cruz.
00:00:26.480 I'm Michael Knowles.
00:00:27.580 And the guy who's been fighting this for many years,
00:00:29.620 coincidentally, is Ted Cruz himself.
00:00:32.140 Senator, nice to see you,
00:00:33.680 even though we still have to do this virtually,
00:00:36.480 digitally, and hopefully without any censorship.
00:00:39.040 Well, let's hope so,
00:00:40.360 but we'll see if big tech pulls this podcast down.
00:00:42.840 That's right.
00:00:43.620 Senator, I think everybody knows
00:00:45.760 that big tech has been going after conservatives.
00:00:49.040 And probably for a lot of people listening to this right now,
00:00:51.740 they themselves have felt this kind of thing.
00:00:54.120 If their account has been suspended
00:00:55.480 for sharing a conservative view.
00:00:56.980 I think, frankly, it's probably happened
00:00:58.780 to all of us at this point.
00:00:59.960 So we know that there's a problem,
00:01:01.700 but there's been some debate
00:01:02.760 over how you solve the problem.
00:01:05.160 Do you pass some new regulation?
00:01:06.960 Do you break them up like they're a monopoly
00:01:09.140 with antitrust laws?
00:01:10.280 Or do you enforce laws that are already on the books?
00:01:13.340 You focused in on something called Section 230
00:01:16.400 of the Communications Decency Act of 1996.
00:01:19.880 The president focused in on that as well
00:01:22.040 in his executive order.
00:01:23.300 Can you tell us what that is
00:01:25.540 and how the argument applies to big tech censorship?
00:01:28.620 Well, sure.
00:01:29.320 And in answer to your question about what you do,
00:01:31.300 you listed several options.
00:01:32.580 My answers are yes, yes, and yes.
00:01:36.180 Let's start with 230.
00:01:37.400 What is 230?
00:01:38.380 So 1996, Congress passes a law
00:01:41.220 called the Communications Decency Act.
00:01:42.840 It was actually mainly focused
00:01:44.360 at trying to regulate internet porn,
00:01:47.560 but it included a portion in it, Section 230,
00:01:51.460 that gave a special immunity from liability
00:01:56.020 for big tech companies.
00:01:58.700 And here was the reasoning at the time.
00:02:00.640 Remember, this is early.
00:02:01.560 This was right when the internet is starting.
00:02:04.940 In fact, one thing, Michael,
00:02:06.420 you and I have talked about,
00:02:07.560 I was clerking at the U.S. Supreme Court
00:02:09.920 in 1996, 1997,
00:02:12.340 when one of the first challenges
00:02:15.380 to congressional regulation came up.
00:02:18.360 And I told a story in the book
00:02:20.600 I wrote a few years ago
00:02:21.840 about sitting with a Supreme Court librarian,
00:02:24.740 with my boss, Chief Justice William Rehnquist,
00:02:27.700 and with Sandra Day O'Connor.
00:02:29.340 And the librarian pulled up hardcore porn
00:02:32.940 to show the justices.
00:02:35.680 And all the law clerks are there.
00:02:37.480 And look, Justice O'Connor was in her 70s.
00:02:40.100 And I still remember when they pulled up porn
00:02:42.200 on the screen, Justice O'Connor just went,
00:02:44.560 oh my.
00:02:46.340 And we're all sitting there
00:02:47.840 like the law clerks are in our 20s.
00:02:49.480 And we're looking at our shoes
00:02:50.600 feeling really, really awkward.
00:02:52.820 And even more awkward is the fact
00:02:55.120 that when they were law students at Stanford,
00:02:57.120 Rehnquist and O'Connor were in the same class.
00:02:59.160 They actually dated for a while.
00:03:02.000 He was number one in the class.
00:03:03.300 She was number three.
00:03:04.660 So picture this, Michael,
00:03:07.260 40 years from now,
00:03:08.940 you're 70, 80 years old
00:03:11.040 and you're standing in a room
00:03:13.100 with an ex-girlfriend
00:03:14.340 from 40, 50 years ago
00:03:17.080 watching porn.
00:03:18.800 Imagine how awkward it was
00:03:20.400 for the two of them.
00:03:21.420 Senator, I know you-
00:03:21.920 And the librarians were-
00:03:23.160 I know you've had a lot of strange experiences
00:03:24.860 in politics.
00:03:25.540 I just think I have to stop here
00:03:26.820 for a moment to say
00:03:28.060 that the strangest one
00:03:29.240 I've ever heard from you
00:03:30.260 is watching hardcore porn
00:03:32.480 with Justice Sandra Day O'Connor
00:03:34.520 and her ex-boyfriend,
00:03:36.440 William Rehnquist.
00:03:37.180 I think that one is absolutely
00:03:39.060 the top of the list.
00:03:41.100 It was surreal
00:03:42.580 and I can still hear the oh my
00:03:45.420 like echoing 20 plus years later.
00:03:49.100 But this was right at the beginning
00:03:51.280 of the internet.
00:03:51.920 So you gotta understand,
00:03:52.580 justices didn't know
00:03:53.280 what the internet was.
00:03:54.300 This was librarians saying,
00:03:55.620 okay, this is this internet thing.
00:03:57.560 You type in things.
00:03:58.640 I still remember, in fact,
00:03:59.980 they turned off the filters
00:04:01.360 and to get to porn,
00:04:03.520 they typed in the word cantaloupe,
00:04:05.940 misspelled.
00:04:07.140 And cantaloupe,
00:04:08.140 I'll let you use your imagination,
00:04:10.600 but I guess it pulled up porn.
00:04:13.060 And the librarians were trying to show them,
00:04:15.600 okay, look,
00:04:16.180 people can stumble into this
00:04:17.820 accidentally all the time.
00:04:20.860 Part of that bill,
00:04:22.040 it was section 230,
00:04:24.780 was designed,
00:04:25.620 you had these little internet startups
00:04:26.780 and the idea was,
00:04:28.740 listen,
00:04:29.740 it's not fair to sue a tech company
00:04:32.200 for something somebody posts on there
00:04:34.860 because this is a forum,
00:04:35.880 this is a public square.
00:04:36.940 And so if somebody posts something
00:04:39.020 and you sue the internet company,
00:04:41.320 you could drive them out of business
00:04:42.640 and it's not the tech company
00:04:44.500 that's speaking,
00:04:45.300 it's whoever the users are.
00:04:46.460 So if you post something,
00:04:48.580 the users should be liable,
00:04:50.320 but not the forums.
00:04:51.640 And the predicate for this,
00:04:53.740 this is the policy predicate.
00:04:55.240 But everyone understood
00:04:57.360 big tech were going to be
00:05:00.120 what are called neutral public fora.
00:05:02.280 In other words,
00:05:03.380 they were going to allow everyone to speak.
00:05:05.720 It was going to be
00:05:06.140 the new marketplace of ideas.
00:05:07.800 And so section 230 passed into law.
00:05:10.680 But what it means,
00:05:12.420 you know,
00:05:12.740 Google and Facebook and Twitter,
00:05:14.220 they've got an immunity from liability.
00:05:16.300 Nobody else does.
00:05:17.880 You, Michael Knowles,
00:05:18.940 if you go on the radio
00:05:20.700 and you defame someone,
00:05:22.100 you can be sued.
00:05:23.220 The New York Times,
00:05:24.380 do you know that Google and Facebook
00:05:26.180 have an immunity from liability
00:05:27.340 the New York Times doesn't have?
00:05:28.660 Fox News doesn't have?
00:05:30.340 Everyone else,
00:05:31.420 every American citizen,
00:05:32.720 every American company,
00:05:33.980 it is only big tech
00:05:35.600 that gets this special immunity
00:05:38.120 from liability.
00:05:39.400 So Senator,
00:05:40.040 this is the distinction
00:05:40.880 we've heard about a little bit,
00:05:42.220 which is the platform
00:05:43.680 versus the publisher, right?
00:05:45.160 If you're a neutral platform,
00:05:46.480 you get these protections.
00:05:47.660 But if you're a publisher
00:05:48.520 like the New York Times
00:05:49.620 or the Daily Wire
00:05:50.560 or Fox News or whatever,
00:05:52.260 then you don't get those protections.
00:05:54.280 And the reason for it,
00:05:55.180 of course,
00:05:55.440 is if you could sue Twitter
00:05:57.700 for every defamatory tweet
00:06:00.000 that has ever been tweeted,
00:06:01.420 Twitter would go out of business
00:06:02.420 in approximately five nanoseconds.
00:06:05.660 That could well be true.
00:06:07.360 But, you know,
00:06:07.700 the interesting thing,
00:06:08.640 let's suppose I wrote an op-ed
00:06:10.480 that said with apologies
00:06:12.440 that Michael Knowles
00:06:14.680 has carnal relations
00:06:15.960 with barnyard animals.
00:06:18.020 That's a very graphic show today.
00:06:19.180 And the New York Times published it.
00:06:26.220 And let's assume,
00:06:27.580 and I'm willing to assume
00:06:28.540 for the sake of the podcast,
00:06:29.860 that that is a totally false statement.
00:06:31.960 Thank you.
00:06:33.000 And that I have no basis,
00:06:34.400 that I am willfully being reckless
00:06:36.540 making it up.
00:06:38.580 You as a citizen could sue,
00:06:40.780 you could sue me
00:06:41.500 for defamation,
00:06:43.220 but you could sue the New York Times
00:06:44.680 because by their choice
00:06:45.920 to publish that,
00:06:47.260 if they publish something
00:06:48.460 that's defamatory about you,
00:06:49.740 you can sue the hell out of them.
00:06:51.700 I do the same thing
00:06:52.900 on social media.
00:06:53.920 You can't sue the hell out of them.
00:06:55.200 And the reason is
00:06:56.420 that Congress made a determination
00:06:58.520 20 years ago,
00:07:00.440 these are special public forums.
00:07:02.320 Now, here's what's changed.
00:07:04.220 Big tech has decided,
00:07:05.680 and they've decided it only
00:07:06.580 in the last couple of years,
00:07:08.300 they don't want to be neutral anymore.
00:07:09.940 Yeah.
00:07:10.520 They want to be political players.
00:07:12.600 They want to editorialize.
00:07:14.220 They want to silence voices
00:07:16.360 they don't like.
00:07:17.240 And so they're deliberately
00:07:18.380 amplifying lefty voices
00:07:20.200 and silencing conservative voices.
00:07:23.160 And it's, okay, fine.
00:07:25.040 If they want to do that, fine.
00:07:26.500 But one of the obvious steps is
00:07:28.820 you don't get a special immunity
00:07:31.120 from liability
00:07:31.880 that nobody else gets.
00:07:33.200 We're not going to treat you differently
00:07:34.780 because if you're going to behave
00:07:36.180 like the New York Times,
00:07:37.300 you should face the same legal risks
00:07:41.120 the New York Times faces.
00:07:42.660 That makes sense.
00:07:43.460 We've heard from some legal scholars
00:07:47.360 on the left or jurists on the left
00:07:50.000 that really Section 230
00:07:51.820 shouldn't be applied this way.
00:07:53.680 It's not about political stances
00:07:55.500 and conservatives are abusing this.
00:07:57.880 But the argument you've just made
00:07:59.160 and that you've actually been making
00:08:00.240 for quite some time now
00:08:01.400 is the argument that was made today
00:08:03.280 by Attorney General William Barr.
00:08:05.140 It's the argument that was made
00:08:06.220 by President Trump.
00:08:07.100 So it seems that Section 230
00:08:09.500 is the key here
00:08:11.120 to the conservatives' argument
00:08:12.780 to stopping this big tech censorship.
00:08:15.760 Can you just go into a little bit
00:08:17.200 of the political debate
00:08:18.420 or maybe how the Trump administration
00:08:20.200 came to adopt this idea?
00:08:21.960 Look, the simplest thing I have to say
00:08:24.580 about this executive order
00:08:25.700 is it's about damn time.
00:08:27.100 I have literally been urging
00:08:29.600 this administration to do this
00:08:31.060 for three and a half years.
00:08:32.860 In the last three and a half years,
00:08:34.700 you know, look, I'm in the Senate.
00:08:36.520 I'm a legislator.
00:09:37.220 I can chair hearings.
00:09:38.140 I've chaired hearings highlighting
00:08:39.840 the rampant censorship
00:08:41.440 and political bias.
00:08:42.740 I can introduce legislation.
00:08:44.320 I've advocated for legislation.
00:08:46.320 But I'm not the executive.
00:08:47.860 It's the executive
00:08:48.900 that actually has enforcement ability,
00:08:51.060 that actually has prosecutors
00:08:53.020 and grand juries
00:08:53.880 and subpoenas
00:08:54.900 and can enforce the law.
00:08:57.280 So I have in the last three years,
00:08:59.280 I have spent hours
00:09:01.760 meeting with Bill Barr,
00:09:03.120 the attorney general,
00:09:03.920 on this topic.
00:09:04.840 I have spent hours
00:09:05.900 with Jeff Rosen,
00:09:07.060 the deputy attorney general,
00:09:08.200 on this topic.
00:09:09.000 I've spent hours
00:09:10.000 with Makan Delrahim,
00:09:11.440 the head of the antitrust division
00:09:12.680 at the Department of Justice,
00:09:13.900 on this topic.
00:09:14.740 I've spent hours
00:09:15.480 with Joe Simons,
00:09:16.520 the head of the Federal Trade Commission,
00:09:18.140 who I used to work with.
00:09:19.040 I know Joe very well,
00:09:20.240 on this topic.
00:09:21.180 I've spent hours
00:09:21.940 with the president,
00:09:22.860 with the vice president,
00:09:24.160 with the White House chief of staff,
00:09:25.660 with the White House counsel
00:09:27.340 urging them.
00:09:28.960 And here's been the problem.
00:09:31.340 Everybody,
00:09:32.220 it's not quite
00:09:33.000 in their jurisdiction.
00:09:34.940 It doesn't quite fit
00:09:36.960 in neatly.
00:09:36.960 So everyone says,
00:09:37.780 yeah, yeah,
00:09:38.080 it's a problem.
00:09:38.580 They've agreed it's a problem.
00:09:39.640 And by the way,
00:09:40.220 Barr and Rosen
00:09:40.720 especially agreed
00:09:41.500 it's a real problem.
00:09:43.020 But it doesn't fit neatly
00:09:45.120 in anyone's
00:09:46.200 sort of traditional
00:09:47.160 job description.
00:09:48.980 And so the president,
00:09:50.060 listen,
00:09:50.260 the president's been frustrated
00:09:51.220 and pissed off
00:09:51.840 about this
00:09:52.320 for a long time.
00:09:53.660 But nobody on his team
00:09:54.900 has been willing
00:09:55.320 to do anything about it.
00:09:56.840 And so I do find it
00:09:58.360 kind of ironic
00:09:59.080 that Twitter
00:10:01.100 decided to be
00:10:03.100 such jackasses.
00:10:05.940 You know,
00:10:06.560 look,
00:10:06.800 my view has always been
00:10:08.340 in the kind of hierarchy
00:10:09.360 of big tech.
00:10:10.960 The worst is Google
00:10:12.280 and YouTube,
00:10:13.200 which they own.
00:10:14.000 You know,
00:10:14.160 Google,
00:10:14.620 their motto used to be
00:10:15.520 don't be evil.
00:10:16.240 Now their motto is just evil.
00:10:17.780 Yeah,
00:10:18.060 be evil.
00:10:18.720 Under Google
00:10:21.360 has been Twitter
00:10:22.460 and then under that
00:10:23.960 is Facebook.
00:10:24.640 Look,
00:10:24.800 Facebook's pretty bad too,
00:10:25.980 but there's tiny moments
00:10:27.180 where they're trying.
00:10:28.440 I think Jack Dorsey
00:10:29.620 at Twitter
00:10:30.120 decided,
00:10:31.220 you know what,
00:10:32.000 I'm tired of Google
00:10:33.040 being the worst player
00:10:34.000 on earth.
00:10:34.840 And so this whole thing
00:10:35.740 was prompted
00:10:36.420 because the idiots
00:10:37.720 decided to fact check
00:10:39.900 the president's tweet
00:10:41.580 on voter fraud.
00:10:43.140 And by the way,
00:10:44.440 number one,
00:10:45.060 they linked to CNN,
00:10:46.120 which is so profoundly wrong
00:10:47.320 on so many issues,
00:10:48.040 it's ridiculous.
00:10:49.680 But Twitter,
00:10:50.960 I think,
00:10:51.260 just pissed off
00:10:52.140 the president
00:10:52.740 and thank God
00:10:54.240 they did
00:10:54.980 because what I assume
00:10:56.240 happened,
00:10:56.780 and I don't know this,
00:10:58.200 but I assume
00:10:59.140 he blew his top
00:11:00.260 and told everyone
00:11:01.200 somebody get off your ass
00:11:02.880 and do something about it
00:11:04.160 and it motivated them
00:11:06.340 to finally do this.
00:11:07.200 I'm glad they did.
00:11:08.340 I think this is
00:11:09.240 an incredible threat.
00:11:11.440 Big tech censorship
00:11:12.400 is the biggest threat
00:11:13.820 to free speech
00:11:14.800 and to fair elections
00:17:17.580 and democracy
00:17:18.260 we got in the whole country.
00:11:19.040 And to Facebook's credit,
00:11:20.500 actually,
00:11:20.740 Mark Zuckerberg came out today
00:11:22.160 and he said,
00:11:23.200 I don't think
00:11:24.060 that big tech
00:11:24.700 should be the arbiters
00:11:25.780 of truth,
00:11:26.620 basically directly
00:11:27.500 contradicting
00:11:28.240 Jack Dorsey at Twitter.
00:11:29.520 So, I mean,
00:11:29.860 that's a good move.
00:11:31.160 You know,
00:11:31.640 just what do we think
00:11:33.200 the conclusion
00:11:33.800 of this will be?
00:11:34.960 It seems that the threat
00:11:36.180 would be
00:11:36.600 to get Twitter
00:11:37.440 to back off,
00:11:38.100 to not interfere
00:11:38.720 in the presidential election,
00:11:39.800 to not put their thumb
00:11:40.760 on the scales
00:11:41.420 of how information
00:11:42.420 moves around the internet.
00:11:43.400 If they don't back off,
00:11:45.560 are we looking
00:11:46.160 at a world
00:11:46.620 in which Twitter
00:11:47.460 really does lose
00:11:48.220 its protections
00:11:48.800 and Twitter goes down?
00:11:50.440 Look,
00:11:50.940 I hope so.
00:11:52.640 Although,
00:11:53.020 listen,
00:11:53.340 I say I hope so,
00:11:55.120 but I love Twitter.
00:11:57.060 Twitter is a vehicle
00:11:58.160 to engage
00:11:59.040 in public debates,
00:12:00.180 to go back and forth.
00:12:01.740 Twitter,
00:12:02.260 as a neutral public forum,
00:12:03.500 actually works quite well.
00:12:04.740 It's only recently
00:12:05.720 that they've decided
00:12:06.980 to let their crazy lefty go.
00:12:09.980 And actually,
00:12:10.420 Michael,
00:12:10.700 I'll tell you
00:12:11.120 a Zuckerberg story.
00:12:12.360 Yeah.
00:12:13.220 So,
00:12:13.980 as you know,
00:12:14.720 Zuckerberg testified
00:12:15.620 in front of the Senate.
00:12:16.840 A lot of us pounded it.
00:12:17.960 He and I went back and forth
00:12:19.280 in a very public exchange
00:12:20.700 when he was testifying.
00:12:22.360 Well,
00:12:23.000 last year,
00:12:24.560 Zuckerberg came to D.C.
00:12:26.720 and reached out
00:12:28.400 to my office
00:12:29.060 and asked if I'd be willing
00:12:30.260 to sit down
00:12:30.760 and get together.
00:12:32.540 He and I had dinner together.
00:12:34.360 And it was,
00:12:35.580 you know,
00:12:36.980 it was kind of interesting.
00:12:37.760 We thought about
00:12:38.380 actually going to a restaurant,
00:12:40.060 but to be honest,
00:12:41.060 if Mark Zuckerberg and I
00:12:42.160 sat down at a restaurant in D.C.,
00:12:44.520 people would lose their minds.
00:12:46.040 I mean,
00:12:46.160 I think they'd start
00:12:47.360 running around screaming
00:12:48.360 and lighting their hair on fire.
00:12:50.120 So,
00:12:50.680 so we didn't do it
00:12:51.520 in a restaurant.
00:12:52.360 We did it at somebody's house
00:12:54.240 and it was just,
00:12:56.020 it was Zuckerberg and me
00:12:57.300 and it was a couple of people
00:12:58.820 on my staff,
00:12:59.440 a couple of people
00:12:59.840 on his staff.
00:13:00.380 So it was a very small dinner
00:13:01.380 and it was like three hours long.
00:13:03.800 And it was actually,
00:13:04.380 look,
00:13:04.480 it was a lot of fun.
00:13:05.680 Listen,
00:13:05.980 Zuckerberg comes across
00:13:08.000 as he's a smart,
00:13:10.400 geeky techie.
00:13:12.960 And I'll give him some credit.
00:13:14.640 Look,
00:13:14.780 he's actually trying to wrestle
00:13:16.380 through these issues.
00:13:17.460 Yeah.
00:13:17.760 But look,
00:13:19.640 he,
00:13:20.660 and to be honest,
00:13:21.480 we went round and round and round
00:13:23.880 on just,
00:13:26.020 what I was advocating was,
00:13:27.400 look,
00:13:27.640 how about some basic free speech?
00:13:30.200 How about just
00:13:31.100 let the marketplace of ideas,
00:13:33.360 if you disagree with someone,
00:13:34.940 let something like,
00:13:35.620 let people argue.
00:13:37.900 Trump tweets all sorts of things
00:13:39.340 I disagree with.
00:13:40.220 I don't think the answer
00:13:41.160 is to let some Silicon Valley
00:13:42.700 billionaires silence him
00:13:44.000 if you disagree with him.
00:13:46.060 Say you disagree with him,
00:13:48.180 but Zuckerberg
00:13:49.080 is trying.
00:13:52.600 He actually,
00:13:53.540 shortly after the dinner
00:13:54.460 I had with him,
00:13:55.220 he gave a speech,
00:13:56.960 I think it was at Georgetown,
00:13:58.060 maybe somewhere in DC,
00:14:00.080 advocating principles
00:14:01.460 of free speech.
00:14:02.220 Now look,
00:14:02.680 Facebook has been bumpy on this.
00:14:04.780 Yeah.
00:14:05.380 But compared to Twitter and Google,
00:14:07.300 they've been much,
00:14:08.540 much better.
00:14:09.100 And you can see them,
00:14:12.540 Twitter doesn't care anymore.
00:14:14.160 And,
00:14:14.260 and,
00:14:14.360 and by the way,
00:14:17.600 YouTube?
00:14:18.620 Yeah.
00:14:20.160 CEO of YouTube
00:14:21.000 came by my office
00:14:21.920 to talk about this.
00:14:23.620 And her attitude
00:14:24.900 was essentially,
00:14:26.620 well,
00:14:26.780 I can't even say it,
00:14:27.820 but it was,
00:14:28.500 it was screw it,
00:14:30.000 although,
00:14:30.480 although said more graphically
00:14:31.780 than that.
00:14:32.660 Yikes.
00:14:33.240 Um,
00:14:34.740 it was simply,
00:14:35.940 we have power
00:14:36.740 and we'll use power.
00:14:37.920 And you know what?
00:14:38.760 She actually wanted credit
00:14:39.920 because she said,
00:14:41.960 well,
00:14:42.140 you know,
00:14:42.540 people on the left
00:14:43.540 want us to completely
00:14:45.180 silence people.
00:14:46.040 So we're talking about
00:14:46.660 Steven Crowder,
00:14:47.400 a friend of ours.
00:14:48.480 Yep.
00:14:48.940 Um,
00:14:49.300 comedian and,
00:14:50.220 and,
00:14:50.420 and,
00:14:50.620 and YouTube
00:14:51.380 demonetized him.
00:14:53.800 Boy,
00:14:53.960 talk about an Orwellian word.
00:14:55.340 We will take away
00:14:56.420 all your money.
00:14:57.680 We're not censoring you.
00:14:58.820 We're just taking away
00:14:59.340 your money. The CEO wanted,
00:15:01.940 she wanted props
00:15:03.620 because she said,
00:15:04.400 well,
00:15:04.800 we still allow him to post.
00:15:06.420 We didn't,
00:15:06.900 we didn't silence him
00:15:08.040 altogether.
00:15:09.220 I'm like,
00:15:09.460 what the hell are you talking about?
00:15:10.520 Okay.
00:15:11.060 And she's like,
00:15:11.600 well,
00:15:11.700 that's what the people
00:15:12.520 of the left
00:15:12.940 wanted us to do.
00:15:13.620 I said,
00:15:13.820 listen,
00:15:14.120 the calls for censorship
00:15:15.160 are only coming
00:15:16.700 from the crazy leftists.
00:15:17.940 I'm not asking you
00:15:19.160 to silence Bernie Sanders.
00:15:21.060 Right.
00:15:21.820 Or AOC.
00:15:22.780 God knows they can
00:15:23.500 prattle on forever,
00:15:24.420 but let them talk.
00:15:25.600 Their ideas are so bad.
00:15:27.000 We'll engage with them
00:15:27.820 on substance.
00:15:28.840 Yep.
00:15:29.340 But I am glad
00:15:32.580 the administration
00:15:33.460 has jumped in.
00:15:34.160 By the way,
00:15:34.540 there are other things
00:15:35.180 they can do.
00:15:36.200 Antitrust agencies,
00:15:37.200 both DOJ and FTC
00:15:38.320 have been engaged
00:15:38.940 in investigations.
00:15:40.280 These are monopolies
00:15:41.220 and they're abusing
00:15:41.900 monopoly power.
00:15:43.120 Something else they can do
00:15:44.000 that's in the executive order
00:15:45.440 that's important.
00:15:45.980 It's not just Section 230,
00:15:47.440 but it's deceptive trade practices.
00:15:50.560 I'm really glad
00:15:51.540 the order tells
00:15:52.660 the attorney general
00:15:53.480 to work with
00:15:54.160 the state attorneys general.
00:15:55.580 I've also talked at length
00:15:56.760 with the Texas attorney general
00:15:58.200 who's leading
00:15:59.280 state lawsuits
00:16:00.420 about these deceptive practices.
00:16:04.080 And Bill Barr today
00:16:05.660 in the Oval Office,
00:16:06.520 he said it well
00:16:07.280 and actually he reflected
00:16:08.440 a lot of what he and I
00:16:09.500 talked about over breakfast.
00:16:11.360 He said,
00:16:12.120 listen,
00:16:12.460 these tech companies
00:16:13.420 have built their platforms
00:16:15.520 on a lie.
00:16:17.020 They tell people
00:16:18.140 if you come to our platform,
00:16:19.400 you can speak
00:16:20.440 and if you sign up
00:16:22.020 to follow someone,
00:16:23.580 you can watch them.
00:16:24.800 You can see what they post
00:16:26.120 and if they sign up
00:16:26.960 to follow you,
00:16:27.960 they can see what you post.
00:16:29.160 That's the fundamental promise
00:16:30.660 and that is a lie.
00:16:32.580 We know Twitter shadow bans.
00:16:34.040 If they don't like you,
00:16:35.360 people who say,
00:16:36.940 I want to follow Michael Knowles,
00:16:38.460 I care what Michael Knowles
00:16:39.580 has to say,
00:16:40.680 Twitter says,
00:16:41.200 no, no, no, no, no, no, no.
00:16:42.040 We're just going to silently
00:16:43.040 make it go away.
00:16:44.240 That is a lie.
00:16:45.400 That is fraud.
00:16:46.920 And I'm glad
00:16:47.860 this executive order
00:16:48.780 takes a step
00:16:49.640 towards real legal liability
00:16:51.680 for defrauding consumers,
00:16:53.740 which is what big tech is doing.
00:16:54.880 That's a big key
00:16:55.820 to this whole issue
00:16:56.700 is the dishonesty,
00:16:57.860 is the fraud,
00:16:58.820 is the very, very possible
00:17:00.460 and it seems likely
00:17:02.220 violation of these laws.
00:17:03.980 So it's very good.
00:17:05.160 We'll see what happens from it.
00:17:06.580 I mean,
00:17:06.820 sort of the ball now
00:17:07.900 is in the court of big tech.
00:17:09.640 We'll see how they react to it.
00:17:10.900 I want to get your opinion
00:17:11.960 on something,
00:17:12.720 by the way,
00:17:13.880 directly related
00:17:14.940 to this social media marketplace,
00:17:17.420 which is a few incidents
00:17:19.260 that have popped up
00:17:20.020 over the last week or so.
00:17:21.520 And they've really
00:17:22.360 gained national attention
00:17:23.300 because of social media.
00:17:24.800 The most prominent one
00:17:25.820 would be the killing
00:17:27.080 of George Floyd
00:17:28.540 in Minneapolis.
00:17:29.820 There's a police officer
00:17:30.600 who arrested this man.
00:17:32.580 You've got a white officer,
00:17:34.720 black perpetrator,
00:17:36.840 I suppose,
00:17:37.340 or alleged perpetrator.
00:17:38.800 And the suspect ends up dead.
00:17:42.160 The guy's got his knee
00:17:43.280 on his neck.
00:17:44.040 I mean,
00:17:44.160 it looks really bad.
00:17:44.940 And then this spreads
00:17:45.820 on social media.
00:17:47.060 Now there are riots erupting
00:17:49.080 around the country,
00:17:49.960 not even just in Minneapolis,
00:17:51.360 also in Los Angeles.
00:17:52.920 There is looting going on.
00:17:55.080 I'd like to get your perspective
00:17:56.280 on that video
00:17:58.460 from both a social media perspective,
00:18:01.360 a social perspective,
00:18:02.220 and also from the perspective
00:18:03.820 of policing.
00:18:05.220 Well,
00:18:05.460 what happened in Minneapolis
00:18:06.580 was horrific.
00:18:08.120 It was wrong.
00:18:09.140 I've watched that video.
00:18:10.780 And listen,
00:18:11.260 anytime you have an incident
00:18:12.460 with police,
00:18:13.600 sometimes the social media mob
00:18:18.360 is quick to demonize
00:18:19.820 the police officer.
00:18:20.880 And I've long advocated
00:18:22.300 we should wait for the facts
00:18:23.940 to play out.
00:18:24.920 That being said,
00:18:26.000 I watched that video
00:18:27.200 and you had a man
00:18:30.700 in handcuffs
00:18:31.520 on his face
00:18:32.360 on the pavement
00:18:33.000 with an officer's knee
00:18:35.380 in the back of his neck
00:18:36.800 pushing it
00:18:37.540 into the pavement.
00:18:38.540 He's gasping for breath.
00:18:40.140 He's pleading
00:18:40.700 that he can't breathe.
00:18:41.940 And the officer continues
00:18:43.060 for eight minutes.
00:18:44.880 That is,
00:18:46.080 on the face of it,
00:18:47.460 police brutality.
00:18:50.160 And anyone
00:18:51.080 who believes in liberty
00:18:52.200 should not want
00:18:53.620 to see authoritarianism
00:18:55.600 and authority abused.
00:18:57.840 The police officer
00:18:58.880 has been fired
00:18:59.600 and the Department of Justice
00:19:00.600 has opened an investigation,
00:19:02.060 a civil rights investigation.
00:19:03.200 I'm glad they have.
00:19:05.040 Watching that
00:19:06.120 pisses me off.
00:19:08.060 And by the way,
00:19:09.600 you know,
00:19:10.140 the social media mob
00:19:11.620 is quick to paint this.
00:19:14.040 As far as they're concerned,
00:19:15.660 that was Donald Trump
00:19:16.480 with his knee
00:19:16.960 on the back of the neck.
00:19:19.560 Let me be clear.
00:19:20.680 This is Minneapolis, Minnesota.
00:19:22.880 You've got a Democratic mayor.
00:19:24.900 You've got a Democratic governor.
00:19:26.180 You've got Democratic senators.
00:19:28.160 This is bright blue.
00:19:29.640 And we keep seeing
00:19:30.820 these things happen.
00:19:32.000 This abuse of power,
00:19:33.860 often in Democratic strongholds,
00:19:37.380 where people that claim
00:19:39.780 to be interested
00:19:41.940 in defending people's rights,
00:19:43.540 they're not doing
00:19:44.560 a good job of it.
00:19:45.400 Right.
00:19:45.800 I mean,
00:19:46.100 I think a little bit
00:19:47.080 of perspective here is key.
00:19:48.620 And it's why it's so good
00:19:50.620 to get your opinion on this
00:19:51.500 is you're not only somebody
00:19:53.040 with a Twitter account,
00:19:54.000 but you also happen
00:19:55.380 to know quite a bit
00:19:56.380 about the way
00:19:56.980 the criminal justice system works,
00:19:58.800 having worked in it
00:19:59.480 for a long time.
00:20:00.120 I think that perspective
00:20:00.960 really helps.
00:20:02.180 We saw a less tragic,
00:20:04.480 much more frivolous example
00:20:05.920 of this also
00:20:06.660 just a few days earlier,
00:20:08.040 which was this altercation
00:20:09.360 that happened
00:20:09.780 in Central Park.
00:20:10.940 There was a man
00:20:12.160 talking to this woman
00:20:13.620 and basically said,
00:20:15.140 put your dog on a leash.
00:20:16.460 She said no.
00:20:17.620 The man,
00:20:18.180 for some reason,
00:20:18.760 had dog treats
00:20:19.580 in his bag
00:20:20.680 just for this sort of occasion.
00:20:23.260 When people don't have
00:20:24.140 their dogs on leashes,
00:20:24.800 he lures them away
00:20:25.600 with the treats.
00:20:26.600 Then this woman
00:20:27.500 sort of lost her temper
00:20:29.040 and had a little bit
00:20:29.880 of an emotional meltdown.
00:20:31.460 And then he started filming it.
00:20:32.960 And it, you know,
00:20:33.880 what was the end result of this?
00:20:35.620 Nothing happened.
00:20:36.320 I mean,
00:20:36.540 after the videos went public,
00:20:38.200 then this woman
00:20:38.940 lost her job
00:20:39.540 and lost her dog.
00:20:40.360 But in the moment itself,
00:20:42.100 it seemed a little bit
00:20:43.280 like much ado
00:20:43.860 about nothing.
00:20:45.700 The social media mob
00:20:47.440 is what made it
00:20:48.360 so much worse,
00:20:49.920 so much more sensational.
00:20:51.620 Is there a world
00:20:53.480 in which we should perhaps
00:20:54.420 hope that maybe social media
00:20:56.440 takes it down a few notches
00:20:57.960 because of this sort of emotion
00:21:00.380 that it can gin up?
00:21:01.500 Yeah, look,
00:21:02.160 I got to say,
00:21:02.700 I have a little bit
00:21:03.460 different take on it.
00:21:04.420 There's no doubt
00:21:05.520 that that incident
00:21:06.220 showed the power
00:21:06.980 of social media
00:21:07.660 and that an iPhone video
00:21:10.060 can suddenly get millions
00:21:11.160 of views all across the world.
00:21:13.100 I have to admit,
00:21:14.740 the woman who was involved,
00:21:17.300 her behavior was atrocious.
00:21:18.600 Of course, yeah.
00:21:19.180 This was an individual,
00:21:23.020 an African-American individual
00:21:24.160 who was there birdwatching,
00:21:25.760 who at least on the video we saw
00:21:27.540 didn't do anything
00:21:28.220 remotely threatening to her.
00:21:29.840 And to watch her
00:21:30.820 be willing to do that,
00:21:33.900 listen,
00:21:34.280 deliberately making
00:21:36.240 a false accusation
00:21:37.940 is an act of violence.
00:21:39.700 And when this woman,
00:21:42.900 Amy Cooper,
00:21:43.480 picks up her phone
00:21:44.300 and calls 911
00:21:45.280 and she says,
00:21:47.060 an African-American man
00:21:48.280 is physically threatening
00:21:49.420 me and my dog
00:21:50.240 and she calls in
00:21:51.620 what by all appearances
00:21:53.260 is a totally false crime report
00:21:55.280 and it's a dangerous crime report.
00:21:57.840 Listen,
00:21:58.220 if you call the police
00:21:59.420 and say someone
00:22:00.220 is physically threatening me,
00:22:02.380 you are asking
00:22:03.000 for law enforcement
00:22:03.880 to show up with guns
00:22:05.200 and the very real consequence
00:22:07.880 of that could be
00:22:09.540 that the person
00:22:10.600 you're wrongly accusing
00:22:11.800 gets shot and killed.
00:22:14.240 Listen,
00:22:14.520 when officers arrive
00:22:15.760 at an assault in progress,
00:22:19.320 there is a risk
00:22:22.320 of something going really wrong
00:22:23.740 and she was in a very real sense
00:22:25.280 endangering his life.
00:22:26.840 Now,
00:22:26.980 I got to say,
00:22:27.480 the fact that he had dog treats
00:22:28.700 and was trying to feed her dog,
00:22:29.800 if I'm walking my dog,
00:22:30.860 stay the hell away from my dog
00:22:31.920 and don't give him dog treats,
00:22:32.980 that's a little out there.
00:22:35.920 But the attitudes she expressed
00:22:39.140 were clearly racist,
00:22:40.440 they were clearly wrong.
00:22:41.540 And by the way,
00:22:43.600 the same point I made
00:22:44.480 about Minneapolis holds here.
00:22:46.460 You saw the Twitter mob
00:22:49.060 saying,
00:22:49.860 oh,
00:22:50.160 this is the age of Donald Trump.
00:22:52.820 They blame Donald Trump for it.
00:22:54.880 Well,
00:22:55.020 then the story comes out
00:22:56.780 that to the shock of nobody,
00:23:00.200 she's apparently
00:23:01.260 a liberal Democratic donor
00:23:02.900 who's donated to Barack Obama,
00:23:04.720 to Pete Buttigieg
00:23:05.620 and to John Kerry.
00:23:07.680 Right.
00:23:08.400 And of course she is.
00:23:09.920 You know,
00:23:10.900 to be honest,
00:23:12.500 well,
00:23:12.840 look,
00:23:13.140 actually,
00:23:14.040 there is racism
00:23:14.860 on both sides of the aisle,
00:23:17.260 but the left,
00:23:20.880 many on the left
00:23:22.080 love to jump on a soapbox
00:23:23.580 and moralize.
00:23:25.280 That's right.
00:23:25.740 And you bring up
00:23:26.460 such a great point
00:23:27.280 because we shouldn't downplay,
00:23:29.340 and in the instance
00:23:29.880 in Central Park,
00:23:30.780 we shouldn't downplay
00:23:31.520 the woman's atrocious behavior
00:23:33.220 just because you point out
00:23:34.660 that there's more
00:23:35.320 to the story,
00:23:36.080 for instance,
00:23:36.920 than,
00:23:37.360 you know,
00:23:37.780 she's totally wrong
00:23:39.280 and he's totally right.
00:23:41.260 I mean,
00:23:41.460 maybe some of his behavior
00:23:42.380 was a little odd too.
00:23:43.660 Maybe it doesn't justify
00:23:44.400 her behavior.
00:23:45.420 The same thing,
00:23:46.080 obviously,
00:23:46.260 you can look at
00:23:46.900 what is pretty clearly
00:23:48.420 police brutality
00:23:49.360 in Minneapolis
00:23:50.500 and condemn that
00:23:52.900 as being terrible
00:23:53.620 and not,
00:23:55.080 therefore,
00:23:55.480 start defending looting
00:23:56.980 and burning down businesses.
00:23:58.380 You know,
00:23:58.500 it seems that there's
00:23:59.380 this knee-jerk reaction
00:24:00.540 on social media
00:24:01.340 where we immediately
00:24:02.320 have to take a side
00:24:03.320 and say,
00:24:03.800 one person was totally wrong,
00:24:05.260 one person was totally right,
00:24:06.620 when really,
00:24:07.160 situations are much
00:24:08.600 more complex than that
00:24:09.620 and maybe a little bit
00:24:11.220 of perspective
00:24:11.820 on the legal side,
00:24:13.420 on the policing side,
00:24:14.180 on the social side
00:24:14.880 can help us
00:24:15.680 to understand those things.
00:24:17.280 Well,
00:24:17.520 and listen,
00:24:18.120 police brutality,
00:24:19.500 it undermines
00:24:21.820 not just the community,
00:24:23.280 but it undermines
00:24:24.460 law enforcement as well.
00:24:25.800 Right.
00:24:27.100 I'm blessed to know
00:24:28.260 a lot of men and women
00:24:29.540 who are law enforcement officers
00:24:30.880 and they feel
00:24:32.060 that whenever something happens,
00:24:34.820 the mob immediately
00:24:37.580 assumes they're at fault
00:24:38.740 and there are instances,
00:24:41.180 a lot of instances,
00:24:42.140 where an officer
00:24:42.940 is scared for his life
00:24:44.220 and is protecting himself
00:24:47.560 So I watched the video,
00:24:48.720 and sometimes these videos
00:24:50.340 don't capture
00:24:51.040 everything that happened,
00:24:52.180 there may have been
00:24:52.680 something that happened
00:24:53.460 before that's not on it,
00:24:54.860 so you have to
00:24:55.620 view all of this
00:24:57.080 with skepticism
00:24:58.000 and so there are times
00:24:59.740 when an officer
00:25:00.320 has to use physical force
00:25:01.880 and serious force
00:25:02.960 to subdue a dangerous individual.
00:25:06.900 What made this video
00:25:08.200 so damning
00:25:09.000 is that it lasts eight minutes
00:25:10.280 and the guy is in handcuffs
00:25:11.860 and his face is there
00:25:12.840 and he's gasping for breath
00:25:14.080 and he's pleading
00:25:14.820 and this officer
00:25:17.000 doesn't do anything
00:25:18.100 other than keep the knee
00:25:19.300 pressing into his neck.
00:25:20.860 That was grotesque and wrong,
00:25:24.940 particularly when there are
00:25:25.600 multiple other officers around.
00:25:27.240 It's very hard
00:25:28.060 to look at that video
00:25:28.980 and suggest
00:25:29.700 there was any reason
00:25:31.120 that officer
00:25:32.300 was afraid for his safety
00:25:33.780 as compared to just
00:25:35.520 brutalizing someone
00:25:37.580 who was already immobilized
00:25:39.160 and that is not
00:25:42.380 how law enforcement
00:25:43.360 should operate
00:25:44.080 and that's not
00:25:44.960 how they usually do it,
00:25:46.220 and it's why it's good
00:25:48.280 that the Department of Justice
00:25:49.720 is looking into this
00:25:52.900 and I very much hope
00:25:53.760 justice is served here.
00:25:56.140 That's right
00:25:56.640 because the effect
00:25:58.560 of this ultimately
00:25:59.260 is going to be
00:25:59.800 to undermine our faith
00:26:01.040 in these very institutions
00:26:02.880 that would be
00:26:03.760 maintaining law and order,
00:26:05.160 maintaining civil society.
00:26:06.420 It has far wider
00:26:07.980 reaching consequences
00:26:08.920 than many would admit.
00:26:10.540 Before we go,
00:26:11.120 we've got to get to
00:26:11.520 a little bit of mailbag.
00:26:13.520 First question,
00:26:14.720 you know,
00:26:14.980 a really easy one.
00:26:17.100 I'm sure you can answer
00:26:18.020 this in 10 or 20 seconds.
00:26:19.860 From Norman,
00:26:20.780 how should pro-America
00:26:21.960 nationalists
00:26:22.600 think about Hong Kong?
00:26:23.900 You know,
00:26:24.120 a really easy topic
00:26:25.140 like China and Hong Kong.
00:26:26.300 So what China is doing
00:26:28.600 is trying to take over
00:26:30.140 Hong Kong completely,
00:26:31.720 trying to subject it
00:26:33.900 to the communist government's
00:26:35.620 authoritarianism,
00:27:36.720 trying to strip them
00:27:37.900 of their right to democracy,
00:27:38.700 trying to strip them
00:27:39.720 of their right to free speech.
00:26:41.120 It is a power grab
00:26:42.580 and it is wrong.
00:26:43.480 And by the way,
00:27:45.800 Hong Kong used to be part
00:27:47.060 of the British Empire,
00:27:49.140 and when it rejoined China,
00:27:51.060 China agreed to have
00:27:52.380 two separate systems
00:27:54.480 and to protect freedom
00:27:56.340 and democracy in Hong Kong.
00:26:58.020 China's now changed its mind
00:26:59.300 and it's crushing freedom.
00:27:01.360 There are a lot of consequences.
00:27:03.100 We just saw this week
00:27:04.240 the State Department
00:27:05.420 issue a report
00:27:06.380 that Hong Kong
00:27:07.760 is no longer autonomous
00:27:08.880 from China.
00:27:10.040 They did so
00:27:10.980 because of legislation
00:27:11.880 that I wrote.
00:27:12.860 I authored legislation
00:27:14.160 that was included
00:27:14.800 in a bigger bill
00:27:15.680 that directed
00:27:16.600 the State Department
00:27:17.440 to assess whether Hong Kong
00:27:18.860 had real autonomy.
00:27:20.200 We just got the report
00:27:21.160 this week.
00:27:21.540 I talked with the State Department
00:27:22.540 this week
00:27:23.000 right after they issued it.
00:27:24.880 The consequences
00:27:26.160 of that are significant.
00:27:27.220 There are a lot of things
00:27:28.040 that flow from that
00:27:29.000 in terms of,
00:27:30.700 number one,
00:27:31.320 Treasury Department
00:27:32.040 and sanctions
00:27:32.620 that could easily flow.
00:27:34.320 You know,
00:27:34.780 China wants to use Hong Kong
00:27:36.320 as this sort of
00:27:37.040 free market bastion
00:27:38.160 to get around
00:27:38.920 the restrictions on China,
00:27:40.800 but at the same time
00:27:42.200 they want to trample
00:27:42.920 freedom there.
00:27:44.200 Tariffs,
00:27:44.700 the U.S. trade representative,
00:27:46.520 the tariffs we have
00:27:47.560 against China,
00:27:48.340 Hong Kong is exempt from that.
00:27:49.880 I think given
00:27:50.740 this determination,
00:27:52.040 we will see,
00:27:53.380 I believe,
00:27:54.000 a determination
00:27:54.500 from the President
00:27:55.600 and the White House
00:27:56.360 that will result
00:27:57.200 in some very significant
00:27:58.880 legal consequences
00:28:00.840 basically ending
00:28:02.900 Hong Kong's special status
00:28:04.680 because China
00:28:05.920 is no longer honoring
00:28:06.980 the agreement they have.
00:28:08.300 That's a very good point
00:28:10.380 because I guess
00:28:11.500 this gets back
00:28:12.140 to what we were talking about
00:28:13.100 with big tech.
00:28:14.840 Some players in the world
00:28:16.200 are trying to have it both ways.
00:28:17.600 They're trying to get
00:28:18.060 certain special protections
00:28:19.600 when actually they're violating
00:28:21.300 the very basis
00:28:22.220 of those protections.
00:28:23.360 Something to look at in China.
00:28:24.440 Then finally,
00:28:25.640 maybe the most important question
00:28:27.380 that keeps
00:28:29.400 recurring here
00:28:30.240 when it comes
00:28:30.860 to criminal justice.
00:28:32.600 What is Ted Cruz's stance
00:28:34.200 on the legalization
00:28:35.520 of marijuana?
00:28:37.820 Leave it to the states.
00:28:39.620 Listen,
00:28:40.160 I have
00:28:40.880 very libertarian instincts
00:28:43.920 and I have to admit
00:28:45.560 on pot legalization
00:28:47.360 over the course
00:28:48.040 of my life,
00:28:48.680 I've had different views
00:28:49.700 at different times.
00:28:51.140 There are times
00:28:51.920 when I was for legalization.
00:28:53.920 Personally,
00:28:54.460 I'm not for legalization now
00:28:56.100 so if there were a referendum
00:28:57.120 in Texas on it,
00:28:58.340 I'd vote against it.
00:28:59.260 I think there are some
00:29:00.020 significant negative consequences
00:29:01.700 that come from it
00:29:02.580 but I believe in federalism.
00:29:05.180 I believe we got 50 states
00:29:06.900 and reasonable people
00:29:08.740 can differ on this
00:29:09.740 and I think it's perfectly fine
00:29:11.880 to let the states operate
00:29:13.820 as laboratories of democracy
00:29:15.380 to see what happens
00:29:16.500 and so,
00:29:20.640 I mean,
00:29:20.860 listen,
00:29:21.220 when I was a teenager,
00:29:22.820 I smoked pot.
00:29:24.200 I wrote about that
00:29:25.160 in my book also.
00:29:25.940 You know,
00:29:26.080 when I was in high school,
00:29:26.860 early on in college,
00:29:28.000 I smoked pot a number of times
00:29:31.380 and I hope my kids don't.
00:29:33.220 Thankfully,
00:29:33.720 they're 9 and 12.
00:29:34.520 I'm pretty sure
00:29:35.020 they haven't gone there yet
00:29:36.040 but,
00:29:36.940 you know,
00:29:38.620 I wasn't much older
00:29:39.480 than they were
00:29:40.060 when I first tried it,
00:29:41.220 and I don't think
00:29:43.120 it's good for kids to do,
00:29:44.100 but I think we can leave it
00:29:46.360 to the states
00:29:46.940 and let the states sort out
00:29:48.300 if and when it should be allowed
00:29:50.380 or not.
00:29:51.240 So what I'm hearing,
00:29:52.140 Senator,
00:29:52.440 is we're not going to get
00:29:53.560 one of these Elon Musk,
00:29:55.320 Joe Rogan moments
00:29:56.260 where you pull a joint
00:29:58.220 from off camera
00:29:59.120 and start puffing on screen.
00:30:00.700 We're not going to get that.
00:30:01.840 Well,
00:30:02.140 I will say this.
00:30:03.820 If you remember
00:30:04.880 when Bill Clinton
00:30:06.040 was running for office
00:30:07.100 and he said he smoked pot
00:30:08.520 but he didn't inhale.
00:30:09.740 Yeah.
00:30:10.420 I have to admit
00:30:13.600 even then I was cracking up
00:30:15.140 laughing,
00:30:16.020 thinking,
00:30:16.900 so you're saying
00:30:17.480 you didn't do it right?
00:30:18.540 Like,
00:30:18.720 if you're going to do it,
00:30:19.480 actually,
00:30:20.120 like,
00:30:20.820 I mean,
00:30:21.460 don't screw it up.
00:30:22.580 Like,
00:30:22.800 listen,
00:30:23.100 I still smoke cigars.
00:30:24.560 Now,
00:30:24.680 you don't inhale cigars.
00:30:26.200 That's actually not how you do it,
00:30:27.100 and you and I
00:30:27.640 have smoked cigars together.
00:30:28.740 That's right.
00:30:29.900 And there, one does not inhale.
00:30:32.340 But no,
00:30:33.440 I will not
00:30:35.100 be lighting up a spliff
00:30:36.580 on this particular podcast.
00:30:38.480 Fair enough.
00:30:39.240 You know,
00:30:40.840 you've reminded me,
00:30:40.840 when you mentioned Bill Clinton,
00:30:42.440 those Democrats
00:30:44.320 wasting things,
00:30:45.960 so fiscally irresponsible,
00:30:45.960 even when it comes down
00:30:47.600 to something such as that.
00:30:49.320 Much more to get to,
00:30:50.360 but alas,
00:30:50.880 we're out of time.
00:30:51.760 Senator,
00:30:52.160 we will have to pick it up again
00:30:53.580 next time.
00:30:54.560 I'm Michael Knowles.
00:30:55.440 This is Verdict with Ted Cruz.
00:31:04.040 This episode of Verdict
00:31:05.680 with Ted Cruz
00:31:06.520 is being brought to you
00:31:07.620 by Jobs, Freedom,
00:31:08.620 and Security PAC,
00:31:09.980 a political action committee
00:31:11.360 dedicated to supporting
00:31:12.540 conservative causes,
00:31:13.960 organizations,
00:31:14.820 and candidates
00:31:15.500 across the country.
00:31:16.840 In 2022,
00:31:18.080 Jobs, Freedom,
00:31:18.700 and Security PAC
00:31:19.720 plans to donate
00:31:20.580 to conservative candidates
00:31:21.900 running for Congress
00:31:22.940 and help the Republican Party
00:31:24.580 across the nation.
00:31:25.760 This is an iHeart Podcast.
00:31:28.600 Guaranteed Human.