Louder with Crowder - March 18, 2024


LIVE FROM SUPREME COURT: Can Biden Crush Free Speech?! + Jim Jordan Joins!


Episode Stats

Length

1 hour and 11 minutes

Words per Minute

194.5

Word Count

13,863

Sentence Count

1,098

Misogynist Sentences

20

Hate Speech Sentences

16


Summary

Today, the Supreme Court will hear oral arguments in "Murthy v. Missouri," otherwise known as "Missouri v. Biden." The case centers on whether the Biden administration's pressure on social media platforms to suppress or remove content, including through practices like shadow banning, violated the First Amendment.


Transcript

00:00:00.000 you.
00:00:06.000 Two.
00:00:07.000 Three.
00:00:08.000 Jump!
00:00:09.000 With me!
00:00:09.000 Sit!
00:00:10.000 Ass!
00:00:10.000 With me!
00:00:12.000 Oh!
00:00:12.000 With me!
00:00:13.000 Such fun!
00:00:13.000 King!
00:00:14.000 Jump!
00:00:15.000 Ass!
00:00:15.000 Sit!
00:00:15.000 With me!
00:00:16.000 With me!
00:00:17.000 Jump!
00:00:18.000 With me!
00:00:18.000 Oh!
00:00:19.000 Jump with me, flip us with me, fall with me, keep such fun.
00:00:28.000 Jump with me, flip us with me, jump, fall with me.
00:00:35.000 Great big hit, play what you're on.
00:00:41.000 Great big hit.
00:00:42.000 I'm playin' music.
00:00:43.000 And I'm good at it.
00:00:45.000 Great big hit.
00:00:46.000 I'm playin' like you're on.
00:00:48.000 Mm-hmm-hmm-hmm.
00:00:49.000 Great big hit.
00:00:51.000 I'm playin' music each day.
00:00:53.000 Each day.
00:00:56.000 Jump with me.
00:00:58.000 Flip us with me.
00:01:00.000 Fall with me.
00:01:02.000 Keepin' such fun in here with me.
00:01:06.000 One flip us with me.
00:01:08.000 Jump off with me.
00:01:10.000 It is the very early morning hours of March 18th, 2024.
00:01:33.000 And later on this morning, in this building behind me, the Supreme Court of the United States is going to hear oral arguments on the case of Murthy v. Missouri, otherwise known as Missouri v. Biden.
00:01:43.000 It's time to change 230, get rid of 230.
00:01:45.000 We're for getting this in front of the Supreme Court.
00:01:50.000 The idea of a shadow ban is that you ban someone but they don't know they've been banned.
00:01:54.000 I am anti-Trump overall.
00:01:57.000 I ban him so how?
00:02:00.000 Our government, they can advise big tech right now as it relates to online censorship.
00:02:03.000 All major tech platforms ensure the American people have access to accurate information.
00:02:07.000 YouTube dragged its feet before taking any action against Steven Crowder.
00:02:11.000 You have Susan Wojcicki talking with senators to demonetize this very channel.
00:02:14.000 We did announce the monetization change that Steven Crowder...
00:02:19.000 The DHS flagged 4,800 pieces of misinformation.
00:02:23.000 The strike was you quoting the CDC.
00:02:25.000 We want every platform to continue doing more to call out misinformation and mis- and disinformation.
00:02:30.000 Feel that tension?
00:02:31.000 Feel that f***ing tension that shouldn't exist?
00:02:34.000 When we quote the CDC, we survived the Vox Adpocalypse.
00:02:37.000 YouTube, Google said, hey, you're not going to make another dime off ads.
00:02:40.000 We broadcast the single largest individual election live stream in history, doing it through Rumble.
00:02:47.000 Current members of the government and entities who are backed by the government deciding to put you in stocks in the town square.
00:02:53.000 The message is really loud.
00:02:56.000 What Google does when they're supporting a candidate or a cause, they suppress the negative search suggestion.
00:03:04.000 Just by manipulating those search suggestions, they're flashing at you.
00:03:07.000 We can turn a 50-50 split among undecided voters into a 90-10 split with no one having the slightest idea.
00:03:15.000 We're also training our algorithms like if 2016 happens again, would we, uh, would there?
00:03:21.000 I, I, it couldn't be different.
00:03:23.000 Just the Hunter Biden story is enough.
00:03:25.000 According to the polls, 17% of Biden voters would have changed their vote if they'd known about the Hunter Biden laptop story.
00:03:34.000 You would have had the biggest political landslide victory for Donald Trump in all of modern presidential history according to the people who voted for Biden.
00:03:45.000 Everyone just might want to hit a clean slate policy, but do it so that everyone can start using social media to serve us, and not be serving this algorithmic, non-human brain.
00:03:56.000 There are forces out there that they don't want that information out there, so they suppress it.
00:04:02.000 If you say something that they in power deem to be unacceptable, you can still have your life ruined.
00:04:10.000 Thanks to Mug Club viewers out there, we've been able to file an amicus brief on behalf of Louder with Crowder.
00:04:15.000 And with any luck, I'll be in that courtroom later today, sitting and witnessing these oral arguments and coming back out to give you my analysis on what I found in the courtroom.
00:04:28.000 LouderwithCrowder.com slash Mug Club.
00:04:29.000 You can use the promo code SCOTUS for $10 off.
00:04:31.000 You can also try $9 Mugless.
00:04:33.000 We cannot do this without you.
00:04:35.000 From the bottom I think of everyone's heart here as a thank you.
00:04:48.000 The firearm?
00:04:49.000 Yes, would you have any?
00:04:51.000 Would I or do I?
00:04:53.000 Both, actually.
00:04:55.000 What?
00:04:56.000 What what?
00:04:56.000 I don't know, I'm asking you.
00:04:58.000 But I was asking you.
00:04:59.000 Is this about the firearms again?
00:05:01.000 Yes, would you have some?
00:05:03.000 Well, since you're offering.
00:05:05.000 Yes, actually I would.
00:05:07.000 Thanks for watching!
00:05:12.000 Walther, one of life's finest firearms.
00:05:16.000 Try it, you'll buy it.
00:05:19.000 You're a stranger in love, that's what I know.
00:05:37.000 You're a stranger in love, I got to follow.
00:05:46.000 I'm in the sweetest love.
00:05:49.000 I'm in the sweetest love.
00:05:53.000 Okay, quick setup.
00:05:56.000 We're going to introduce everybody really quickly.
00:05:57.000 Number two, CEO Gerald Morgan.
00:05:59.000 Thank you for being here.
00:06:00.000 Third chair, Josh Feierstein.
00:06:02.000 Excellent.
00:06:04.000 The reason why is because today is a very special stream.
00:06:08.000 We have our own George the Greek down live at the Supreme Court.
00:06:12.000 We are going to have Representative Jim Jordan on in just a few minutes.
00:06:15.000 So if at some point today, at any point today, you see this if you're watching on YouTube.
00:06:22.000 Head on over to Rumble, live show, weekdays, 10 a.m. Eastern.
00:06:24.000 Of course, today it's at 11.
00:06:26.000 So, we actually have filed with the Supreme Court here today our own amicus brief in the Murthy v. Missouri, or Missouri v. Biden, as it's also known, case, as it relates to big tech and government censorship.
00:06:41.000 And there's a lot happening today which, frankly, depending on what takes place, could be, people use the term turning point, the most impactful event in relation to not only this election but future elections and the changing landscape of social media.
00:06:56.000 You know that we've been up against this really going back at least since 2015, dealing with official bans on YouTube, Facebook, Twitter.
00:07:04.000 But this is happening, this has affected the outcome of elections, period.
00:07:07.000 You know that, we've covered that on this show, and today we're going to get into the specific arguments that are being made before the Supreme Court And depending how this goes, depending on how the Supreme Court rules, that may change everything that you see as it relates to social media.
00:07:20.000 So question of the day, and please do, this is going to be a topic of today, the algorithm.
00:07:24.000 So to help the algorithm, if you comment, it helps.
00:07:27.000 Maybe not if you're on YouTube because they get to determine it anyway.
00:07:30.000 We'll get to that.
00:07:31.000 Do you think the tides are turning when it comes to, well, legacy media propaganda, and are you concerned about those tides not turning yet as it relates to big tech, social media propaganda?
00:07:43.000 That's kind of the problem that is taking place, right?
00:07:45.000 We thought, hey, no more gatekeepers.
00:07:46.000 It's not ABC, NBC, CNN.
00:07:48.000 Well, we've really gone to just a few key platforms, and they control over 90% of the information that you see.
00:07:54.000 Gerald, how are you feeling?
00:07:55.000 I'm good.
00:07:56.000 We're good.
00:07:57.000 I'm completely out of— I know.
00:07:58.000 You gotta breathe.
00:08:00.000 I'm not just breathing.
00:08:01.000 I'm completely—because I was going to you for an intro, but I already did it.
00:08:05.000 Josh, you're good?
00:08:06.000 Yeah, I'm doing great, man.
00:08:07.000 I'm just ready for this court case.
00:08:08.000 It's gonna be... Supreme Court case.
00:08:10.000 It's going to be intense.
00:08:11.000 And by the way, as you know, this show is made possible entirely by you.
00:08:14.000 Yes.
00:08:15.000 Mug Club.
00:08:16.000 We are not giving a dime on YouTube, which is a big part of it, right?
00:08:20.000 Just demonetized.
00:08:21.000 Imagine if you walked into your office and said, well, you still have to work here, but you're not gonna be paid anymore.
00:08:24.000 So, we made sure to uncouple from Big Tech because you are the ones who support this show, the investigative journalism, the ability for us to send someone down and file an amicus brief like today.
00:08:33.000 Of course, you also get Nick DiPaolo, Bryan Callen, the Hodge Twins, Mr. Guns and Gear, the Mug Club Undercover.
00:08:37.000 All of that, it's only supported by people like you.
00:08:40.000 Yeah, and by the way, just so you know, not just anybody can print one of these things up and file it with the court.
00:08:45.000 There are a lot of rules and regulations that go into it.
00:08:47.000 It costs many, many, many thousands of dollars and a lot of man hours just to be able to get to the point we are now, much less sending people on the ground and doing everything there.
00:08:56.000 So, thank you again, Mug Club.
00:08:57.000 It does not happen without your support.
00:08:59.000 Doing something like this is a big, big, big lift.
00:09:01.000 It looks like a St. Patrick's Day hymnal book.
00:09:04.000 It does.
00:09:05.000 Well, I don't want to get pinched.
00:09:07.000 I think it is.
00:09:07.000 I was wondering about the typos.
00:09:09.000 You're drunk.
00:09:10.000 Did you just have a filler light?
00:09:10.000 Makes sense.
00:09:12.000 You just got really bright.
00:09:12.000 You know, I think it's just the glow of heaven.
00:09:15.000 What is it actually?
00:09:17.000 I have no idea.
00:09:17.000 I think the light just turned on right up there.
00:09:19.000 I turned his light up a little bit.
00:09:22.000 It was a little dark.
00:09:22.000 Well, turn him back down.
00:09:23.000 I'm seeing him too clearly.
00:09:24.000 Okay.
00:09:26.000 So, February 7th, this show's company filed an amicus brief.
00:09:31.000 In this case, Missouri vs. Biden.
00:09:32.000 Let me set this all up for you.
00:09:33.000 It's also known as Murthy v. Missouri, so you may hear me use those interchangeably.
00:09:36.000 All of these references today are available.
00:09:39.000 Link in the description.
00:09:39.000 LouderwithCrowder.com today.
00:09:40.000 That's particularly important.
00:09:42.000 For those who don't know, an amicus brief is basically a legal argument filed by a person who isn't the direct party in that case, but has a vested interest in the outcome and has relevant either evidence, documents, or an argument.
00:09:54.000 Let me explain to you what this case is, okay?
00:09:58.000 It's a case, not the amicus brief, but the case itself, is about the Biden administration communicating with big tech, colluding, let's use that term, colluding with big tech to remove content that they didn't like, which means that the Biden administration's actions are a direct violation of the First Amendment.
00:10:14.000 I know libertarians will say, oh, private platforms, they can do whatever they want.
00:10:16.000 We are way past that, especially after COVID.
00:10:20.000 We were dealing with this for a very long time.
00:10:22.000 COVID accelerated it.
00:10:23.000 There is no doubt now.
00:10:25.000 That places like YouTube, Facebook, at one point Twitter, now X, I still have to get that right, that they were doing the bidding of the government.
00:10:32.000 Because the government said so.
00:10:32.000 Why?
00:10:33.000 Also, these platforms said so.
00:10:36.000 They were doing it out in the open.
00:10:38.000 So think about that for a second.
00:10:39.000 Their counter-argument, though, the Biden administration's, is that the government, I guess, was just exercising First Amendment rights in trying to persuade platforms to allow content and ban other content.
00:10:54.000 Think about that for a second.
00:10:55.000 That is their argument.
00:10:56.000 I don't know what the hell they're talking about.
00:10:57.000 I don't know what they're talking about.
00:11:19.000 Someone should censor that.
00:11:20.000 Yes. Not us.
00:11:22.000 No, not us. We're about truth.
00:11:24.000 So let me give you a few examples here as to what has been targeted by the government.
00:11:31.000 Just before you say, oh it's just about conservatives and people just whining and claiming to be a victim.
00:11:36.000 Okay.
00:11:37.000 No.
00:11:38.000 The COVID lab leak theory, it's now not referred to as a theory, but at one point in time, right, the Biden administration, this government said, we want you to specifically either throttle it, outright label it as misinformation.
00:11:48.000 There were different steps that they had taken.
00:11:50.000 If you criticize the efficacy of lockdowns, Or of the vaccines, which we did in this program, we were suspended for that.
00:11:57.000 If you at any point questioned the legitimacy, not only the legitimacy, I want to be clear, of the 2020 election, but the changing of laws in states which affected the outcome of the election.
00:12:08.000 If you criticized the unconstitutional changing of the laws, for example, in Pennsylvania, you could be removed.
00:12:15.000 And also, keep in mind, one year, this was the biggest election stream ever in the history of, well, big tech on social media platforms.
00:12:24.000 They removed us.
00:12:25.000 They removed us and that's why we were able to stream to Rumble.
00:12:28.000 That's not a small detail and the big reason for that, it was taking place in real time.
00:12:33.000 Remember when the red wagons came in with ballots in Michigan and they were plastering up the voting precincts?
00:12:42.000 We covered that in real time.
00:12:44.000 It was impossible for them to say it was a conspiracy theory when everyone was watching and saying, what is happening?
00:12:49.000 Now they had made up their excuses the next day or, for example, the pipe bursting in Georgia.
00:12:52.000 That never actually happened?
00:12:53.000 That never actually happened.
00:12:55.000 There's a real value in seeing things in real time, and that's been a problem for this administration.
00:13:00.000 So I want to be clear, it's not about conspiracy theories.
00:13:03.000 It's about proven, at least, facts, or, let's just even give them, debatable information,
00:13:09.000 like the COVID lab leak theory.
00:13:12.000 Now it seems far more likely than unlikely.
00:13:14.000 At the very least, it's a 50-50 wash.
00:13:17.000 Anything that you wanted to...
00:13:18.000 No, no, no, we're good.
00:13:20.000 I know we've got Jim Jordan in about eight minutes here, but we'll detail some of the arguments in this case in just a minute to give you kind of a brief, and then we've got a lot more information after we do the interview with Jim Jordan to go through with everybody.
00:13:30.000 And we have George the Greek down there.
00:13:31.000 We also have George the Greek!
00:13:33.000 Did noodles just have to go to the bathroom?
00:13:34.000 He almost died.
00:13:35.000 I was thrown off for a second.
00:13:36.000 He was choking?
00:13:37.000 He choked on a fly.
00:13:38.000 He did it so quietly.
00:13:39.000 That was the quietest choke.
00:13:41.000 He was eating sunflower seeds.
00:13:42.000 I like a guy who politely chokes.
00:13:44.000 Yeah, you don't do that.
00:13:45.000 Sunflower seeds?
00:13:46.000 Yeah, I don't trust him.
00:13:48.000 I don't trust him.
00:13:50.000 Even for adults.
00:13:51.000 He was choking?
00:13:52.000 He choked so quietly you could hear a rat piss on cotton.
00:13:56.000 That's usually how it happens.
00:13:58.000 Do they typically do that?
00:13:59.000 I've heard they do.
00:14:01.000 I don't know their preference.
00:14:02.000 It's like a urinal cake.
00:14:03.000 Before we get into the arguments, we actually are going to check in again.
00:14:05.000 We have George the Greek down there, but we also have our on-the-ground correspondent outside of the Supreme Court, as opposed to in, like George the Greek.
00:14:13.000 Thomas Finnegan, let's go to him.
00:14:15.000 Finnegan, what are you hearing about down...
00:14:25.000 Why aren't you at the Supreme Court?
00:14:29.000 Hi, Steven.
00:14:30.000 Fine.
00:14:30.000 Hi.
00:14:31.000 You're supposed to be outside of the Supreme Court to cover the free speech case.
00:14:35.000 Where are you?
00:14:36.000 Well, I slept through my alarm, and I missed my flight, and I remember that you want me to go to a court, and the closest I could get to is the nearby pickleball court.
00:14:46.000 And I made good time.
00:14:46.000 Pickleball?
00:14:47.000 Pickleball?
00:14:49.000 The court?
00:14:50.000 I see what you're...
00:14:52.000 Is that, is that actually... Pickleball is growing in popularity, combining elements from badminton, tennis, and ping-pong.
00:14:59.000 Finnegan, you are a waste of space.
00:15:03.000 I don't even, I don't, I don't care anymore.
00:15:05.000 It's a fan-friendly sport.
00:15:06.000 Anyone can play.
00:15:08.000 What does it have to do with big tech or free speech, Finnegan?
00:15:11.000 That's the point that we're getting to.
00:15:12.000 Pickleball is free.
00:15:14.000 You just have to sign up at the Rec Center.
00:15:15.000 Using your services should be free.
00:15:18.000 Tim, cut, I don't want to...
00:15:19.000 I promise you, George the Greek is in the Supreme Court right now.
00:15:30.000 Do we have the audio feed, actually, from George?
00:15:33.000 Yeah, we do.
00:15:33.000 Okay, so you can hear, he is in there right now, just so you can hear what it is that he is hearing.
00:15:37.000 Let's bring that up just really quickly.
00:15:39.000 Why are you showing the street fighter?
00:15:45.000 Is that a fun thing to show?
00:15:50.000 So get their reactions and also hear maybe novel arguments that are being made or a poor defense by the Biden administration.
00:16:05.000 So he'll be able to give us kind of an insight into what's actually going on and maybe which way they'll lean.
00:16:10.000 They're arguing today; the judges will kind of rule on this in June, I believe, and announce the decision.
00:16:15.000 We'll be covering that as well, but you can kind of understand with a 6-3 court where people are probably going to head.
00:16:19.000 We're looking at those votes that might sway like Kavanaugh, right, or Roberts.
00:16:23.000 Maybe what their reactions, what their questions are to kind of make some indication of what they think.
00:16:27.000 Yes.
00:16:28.000 So let's go through these arguments really quickly after Representative Jim Jordan.
00:16:31.000 We'll go through these in detail.
00:16:33.000 Do you have these numbered there, Toolman?
00:16:35.000 Okay.
00:16:35.000 Yes.
00:16:36.000 So argument number one that's being made.
00:16:39.000 Um, that Big Tech is, is, uh, not only is it large enough, but the way they operate, they need to be considered a public square, a digital town square.
00:16:48.000 Meaning they are not just a publisher, right, they are not just the New York Times, for example, or NBC News, because they don't want to be.
00:16:54.000 They benefit from Section 230, they are so ubiquitous that they are required for people to express their opinions right now, and because they benefit from those protections from legal liability, they need to be subject to the constitutional constraints on limiting speech just as a real town square would be.
00:17:10.000 And by the way, this is being made right here today, but we have been making this here at this show for years.
00:17:17.000 Louder with Crowder has been censored by Facebook, or been targeted for selective oppression by Facebook.
00:17:22.000 Louder with Crowder was fingered by Facebook as a page to throttle.
00:17:27.000 But this week, for the first time, we ran into this problem with Google here on YouTube.
00:17:33.000 And there has been just definitely an uptick in election time.
00:17:39.000 And the idea too, this kind of goes to what we've talked about with big tech: that it's the digital town square, right?
00:17:43.000 So it's kind of predicated on the same idea that you can't ban someone from a town hall just because they're critical of you.
00:17:47.000 All right. And Josh, you can jump in, but I know we have to get to Jim Jordan here pretty quickly.
00:17:52.000 Here's another argument that you'll see that's being made today. I want to be clear.
00:17:57.000 Are these our arguments, or are these the arguments, period?
00:18:00.000 These are arguments kind of period, but they're based on arguments that are being made as well, right?
00:18:05.000 Some of them are actually ours.
00:18:06.000 And go to argument number two right now.
00:18:08.000 Argument number?
00:18:09.000 We scrolled down just a little bit to four.
00:18:11.000 Go back up to number two.
00:18:12.000 Oh, I apologize.
00:18:12.000 Oh, what happened?
00:18:14.000 Your finger moved the thing.
00:18:15.000 Well, it's because I'm looking at the amicus brief and it says don't write on this because it's a crime or something like that.
00:18:20.000 It's not like a mattress tag.
00:18:23.000 So, by the way, I've broken the law with taking off the tags.
00:18:23.000 Okay.
00:18:26.000 Oh yeah, all the time.
00:18:27.000 I particularly take the tags off of the face gloves that I use on my little ones because, you know, that can scratch their eye.
00:18:33.000 I took it off my mattress.
00:18:34.000 I buck the tag sign.
00:18:35.000 I don't care.
00:18:36.000 I'm a tag ripper, dude.
00:18:37.000 I don't care.
00:18:38.000 I'll tell anybody.
00:18:39.000 Josh the Ripper.
00:18:39.000 I don't pay my taxes either.
00:18:41.000 Careful.
00:18:42.000 So.
00:18:44.000 Argument, I guess, number two here that you're going to see is that the Biden administration, it violated Section 230 by specifically coercing platforms to censor.
00:18:52.000 Right, which they said they're persuading.
00:18:54.000 Their word is persuading, we say coercing.
00:18:56.000 Because it's the federal government.
00:18:56.000 Why?
00:18:58.000 You know what happens if you don't obey, right?
00:18:59.000 It's like the mob.
00:19:00.000 There's consequences.
00:19:01.000 Exactly.
00:19:02.000 Another argument that is being made here today, that the Biden administration, number three I guess, that they turned social media, effectively these platforms, into an arm of the government.
00:19:12.000 Yes.
00:19:13.000 When they actually say you have to, at this point, permit this viewpoint and not this viewpoint.
00:19:13.000 Right?
00:19:21.000 Or if they say, you have to go by these CDC guidelines, you have to go by this establishment's guidelines.
00:19:26.000 Institutional, right?
00:19:26.000 That's why he's very big.
00:19:27.000 Trust your institutions.
00:19:29.000 Well, if you don't trust your institutions, you're not going to be able to operate.
00:19:32.000 So effectively, this administration outsourced constitutional violations to private companies, which is really kind of an odd thing to outsource.
00:19:39.000 Yeah, you figure you're just going to do the job yourself.
00:19:41.000 But it's a commodity.
00:19:43.000 Argument number four that you'll see, and we'll get to all these again after Representative Jordan, that algorithms, and this is something you will probably see Supreme Court justices discuss after this, I will tell you this, a lot of conservatives, a lot of people, they're not as involved as you are, so they don't fully understand what it means.
00:19:57.000 The algorithms have enabled censorship to take place without the public knowing at all.
00:20:02.000 This is something that we specifically brought to the attention of the Supreme Court here today.
00:20:07.000 Non-public algorithms, they allow for shadow banning to take place.
00:20:10.000 So you may know about, for example, demonetization.
00:20:13.000 Sure.
00:20:14.000 Our problem personally has not been with demonetization, it's been with not even being able to reach the people who are subscribing, people who hit the notification bell.
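To make the shadow-ban mechanic concrete for readers of this transcript, here is a minimal sketch, assuming a toy visibility check: the author still sees their own post, so nothing looks wrong on their end, while everyone else silently sees nothing. The function, field names, and banned set are hypothetical, not any platform's actual code.

```python
# Hypothetical sketch of a "shadow ban": you ban someone, but they
# don't know they've been banned (as defined at 00:01:50 above).

def visible_to(post: dict, viewer: str, shadow_banned: set) -> bool:
    # The author always sees their own content, which is exactly
    # what makes the ban invisible to the person it targets.
    if post["author"] == viewer:
        return True
    return post["author"] not in shadow_banned

post = {"author": "creator123", "text": "New episode is up!"}
banned = {"creator123"}

print(visible_to(post, "creator123", banned))    # True:  looks normal to the author
print(visible_to(post, "a_subscriber", banned))  # False: silently hidden from everyone else
```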
00:20:20.000 And let's define what algorithms are.
00:20:23.000 This is artificial intelligence coded by humans with a mission in mind.
00:20:28.000 In other words, someone codes an algorithm, like you see with Google, to affect the election.
00:20:32.000 Someone codes an algorithm to flag the Hunter Biden story as misinformation.
00:20:35.000 Someone codes an algorithm to show you penis or dildo, which is on YouTube and non-age-restricted.
00:20:42.000 But questioning, for example, lockdowns or even pointing to Sweden and them not having lockdowns is not something that you will see, even if you're looking for it.
00:20:50.000 That's an algorithm and it's not public.
00:20:53.000 You see your screen.
00:20:55.000 You see your feed because of something that is happening beneath the surface that is being obfuscated by design.
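And for the demotion side described here, a minimal, hypothetical sketch of how a private ranking function can bury content without removing it: the post technically stays up, but a hidden multiplier keeps it from surfacing. The channel names, weights, and suppression list are invented for illustration; no platform publishes its real ranking code, which is the point being made.

```python
# Hypothetical feed-ranking sketch: a hidden demotion multiplier means
# flagged channels still "exist" but rarely, if ever, surface.

def rank_feed(posts: list, suppressed: set) -> list:
    def score(post: dict) -> float:
        base = post["likes"] + 2 * post["shares"]  # toy engagement score
        if post["channel"] in suppressed:
            base *= 0.01  # never disclosed to users or creators
        return base
    return sorted(posts, key=score, reverse=True)

posts = [
    {"channel": "ChannelA", "likes": 900, "shares": 50},
    {"channel": "ChannelB", "likes": 40, "shares": 2},
]

# ChannelB now outranks ChannelA despite a fraction of the engagement.
for post in rank_feed(posts, suppressed={"ChannelA"}):
    print(post["channel"])
```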
00:21:00.000 And think about this for a second.
00:21:02.000 Algorithms determining the content.
00:21:04.000 That's a human being then putting it in the hands of an artificial machine.
00:21:07.000 But it starts with a human.
00:21:08.000 What do you think that does to investigative journalism?
00:21:12.000 So imagine they're being coerced by the government, if you will.
00:21:14.000 The government says you're not allowed to actually Make sure that people see this.
00:21:19.000 Let's go back to Watergate, which, keep in mind, it wasn't that Nixon knew about Watergate.
00:21:24.000 It wasn't that Nixon cleared it.
00:21:26.000 The scandal was him saying, well, how do we contain this so the public doesn't know and they don't assume that I was the one involved with this?
00:21:31.000 Now, picture if Nixon was in charge of Facebook, YouTube, Google, Twitter, Spotify, as Jen Psaki refers to it.
00:21:40.000 It's a whole new company.
00:21:41.000 You would have never known about Watergate, or at the very least, people would say, ah, it's a nothing burger, to use a term that I hate and the people who use it.
00:21:49.000 Really quickly, we have an update.
00:21:50.000 The Supreme Court arguments are going a little long so Jordan's going to be probably quite a bit late, maybe 20 minutes.
00:21:56.000 So let's, we can keep going in this and then we can actually get into the details of each one of these points because we actually have some very good detail to every one of these arguments and why they're important to you guys and how we play a part in this as well as some of the other people that have filed here.
00:22:09.000 So we can get into that stuff and then when Jim Jordan is able to come on.
00:22:12.000 I can do that.
00:22:13.000 We can also, when we want, we can have George come on and give us a little bit of an update from inside as well if we want to do that.
00:22:18.000 Well, you guys let me know when the best time is to do that.
00:22:20.000 And by the way, if you're watching right now, the best thing you can do to cut through the algorithms, if you are not a Mug Club member, download the Rumble app.
00:22:26.000 Okay, you can watch on YouTube, that's the worst scenario.
00:22:29.000 You can hit the subscription, it doesn't really mean anything.
00:22:31.000 You can hit the notification bell, you can go over and watch on Rumble, download the app so that you know, you choose the notifications that you get.
00:22:38.000 Here's an argument, I guess number five: that the algorithmic censorship taking place at the hands of the Biden administration constitutes a prior restraint on free speech.
00:22:47.000 What do I mean?
00:22:48.000 Big tech, meaning a few companies, they censor themselves, they self-censor in advance because they know that the Biden administration will come down on them.
00:22:57.000 I'm sorry, persuade them.
00:23:01.000 Well, so when they did it in the first place, right, and so this is that censoring speech in advance, they basically did it once.
00:23:07.000 And made an example of people.
00:23:08.000 Right.
00:23:09.000 So they're like, ah, we had to come down on you guys here.
00:23:09.000 Right?
00:23:12.000 Oh, a little bit of extra scrutiny for your company here.
00:23:14.000 It'd really be a shame if we had to come and do the same thing to you.
00:23:17.000 So once you've done it once, these other companies then act because they know how the government is going to then act if they don't do what they want them to do.
00:23:25.000 Right.
00:23:25.000 And so that's a really big deal.
00:23:27.000 It's like the mafia, right?
00:23:28.000 To go back to that example, the mafia just has to go and burn one guy's business to the ground for everybody else on the block to fall in line and go, you know what?
00:23:35.000 They don't even have to threaten me.
00:23:36.000 I'm going to do what I know they want me to do, which is pay them weekly to stay in business.
00:23:41.000 That's exactly what the government is doing here, and that's the argument that's being made.
00:23:44.000 Oh, by the way, I forgot to tell you guys.
00:23:46.000 There is the promo code SCOTUS, right?
00:23:48.000 $10 off if you join Mug Club, because we would not have been able to do this without you.
00:23:51.000 So we'll have Jim join us.
00:23:52.000 And so there's a link in the description.
00:23:53.000 Make sure you go and click that.
00:23:54.000 It automatically puts the promo code in for you.
00:23:56.000 We make it as easy as possible.
00:23:58.000 And look, you don't want to be gay, so subscribe to your channel.
00:24:00.000 I mean, some people want to be.
00:24:00.000 Hey, whoa, whoa, whoa, whoa.
00:24:01.000 I don't think.
00:24:02.000 No, actually no, I don't think anyone wants.
00:24:02.000 Yeah.
00:24:04.000 1920's gay.
00:24:05.000 Like happy.
00:24:06.000 You don't want to be happy.
00:24:07.000 Yeah, like the Flintstones?
00:24:08.000 Go subscribe.
00:24:09.000 It was a time.
00:24:10.000 You unhappy?
00:24:11.000 It wasn't gay, the time was gay.
00:24:13.000 It's $10 off, come on.
00:24:14.000 The English language, it's not like the romance language, it's not as descriptive.
00:24:14.000 I'm gay.
00:24:17.000 It isn't?
00:24:18.000 Let's go through the specifics on this, since Representative Jim Jordan is going to be later.
00:24:22.000 And guys, just let me know when.
00:24:24.000 We will.
00:24:25.000 He's in there right now, so that's why he's going to be there.
00:24:27.000 I notice they're not covering it on the mainstream.
00:24:29.000 Of course not!
00:24:29.000 Shocker!
00:24:30.000 So the first argument that I presented to you, okay, that big tech is so large it needs to be considered the town square.
00:24:36.000 And we can just hit number one again, Tim, so they know.
00:24:38.000 Do it.
00:24:38.000 I mean, you got it there.
00:24:40.000 So let me ask you this.
00:24:44.000 Should your politics allow private companies to deny you, for example, a bank account, a cell phone, to get a loan, to buy an airplane ticket?
00:24:53.000 Think of those things.
00:24:54.000 No, of course not.
00:24:55.000 Because you vote Republican?
00:24:57.000 Even, let's say, because you believe that all vaccines cause autism, which we do not, I'm not saying that here, I'm saying let's take a radical leap.
00:25:03.000 Let's say that you believe that Ted Cruz's dad shot JFK.
00:25:07.000 Should you not be able to take a puddle jumper with spirit?
00:25:07.000 He did!
00:25:13.000 You should be subjected to hell just like everyone else.
00:25:16.000 And remember the Chinese social credit score?
00:25:17.000 It's like an episode of Black Mirror.
00:25:19.000 And more importantly than that, with Section 230, it's very specific.
00:25:24.000 This is outlined as you deal with, for example, cell phone companies.
00:25:27.000 They cannot remove you due to your politics.
00:25:29.000 Of course, they can remove you from the comments section at the newyorktimes.com, but they are not a utility.
00:25:35.000 They are a publisher.
00:25:37.000 And these platforms, these social media platforms, are treated as utilities.
00:25:42.000 So let's, let's disabuse ourselves of the notion that private companies can just, they can do whatever they want.
00:25:47.000 Well, this is not one of those cases, and even then there is precedent to say that that is not always the case.
00:25:52.000 Right, and we don't even have to go to the Chinese credit score to have examples of this.
00:25:56.000 We can use Alex Jones.
00:25:57.000 We can use Kanye West.
00:25:58.000 No matter what you think of people, they had beliefs and political opinions that they expressed that caused them to be debanked, right?
00:26:06.000 So you no longer can do business with Chase, I believe, in Kanye's case, right?
00:26:10.000 Then you have people saying, you know what, Alex Jones, we're no longer going to let you process credit cards.
00:26:16.000 I'm not talking about just like a provider.
00:26:16.000 With anyone.
00:26:19.000 I'm talking about the credit card company that the provider then is going to.
00:26:22.000 So Visa, MasterCard, American Express, Diners Club, if that's still a thing.
00:26:26.000 You basically can't do anything with a credit card.
00:26:30.000 Okay, how do people pay to see your content?
00:26:31.000 How do people support you?
00:26:33.000 It's just like Canada when they're freezing the accounts of these guys, truckers, doing protests, right?
00:26:36.000 So these things happen.
00:26:38.000 And then it's AWS, which almost everybody except for Rumble, thank God, uses as the backbone and the cloud for their services to say, you know what?
00:26:45.000 We have a Terms of Service as well.
00:26:47.000 You can't use our site.
00:26:48.000 They came after us because of the BlackRock story, right?
00:26:51.000 And said, hey, you can't use our services to put this article up, guys.
00:26:55.000 You should go after them.
00:26:55.000 You should go after them.
00:26:57.000 That's the kind of stuff that's happening today.
00:27:00.000 Just on your viewpoint, much less all the stuff that we've done journalistically that is absolutely airtight, 100%.
00:27:06.000 Still don't have a lawsuit, by the way.
00:27:07.000 Not one lawsuit.
00:27:08.000 I'll keep you updated.
00:27:09.000 I still don't have a lawsuit.
00:27:10.000 Or a mayor contacting us for more information from, I don't know, Nashville.
00:27:14.000 Right.
00:27:15.000 Who's also gay.
00:27:16.000 I don't think so.
00:27:16.000 Allegedly.
00:27:17.000 Allegedly.
00:27:18.000 He's not a subscriber, though.
00:27:19.000 That's what I heard.
00:27:20.000 That's what I heard.
00:27:21.000 Maybe he wants to be.
00:27:23.000 So, again, this brings us to the second argument that we made, a little more detail, a little more granular here, that they violated 230, again, by coercing platforms.
00:27:34.000 That's Chris Pratt's new social.
00:27:37.000 Coercing platforms to censor certain speech, right?
00:27:41.000 And this, let me give you some details here with the Biden administration.
00:27:43.000 So Section 230, if you don't know, allows platforms, um, kind of, to censor a narrow type of speech without being considered the publisher.
00:27:51.000 Let me read this for you.
00:27:52.000 It says, obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.
00:27:56.000 Otherwise objectionable?
00:27:57.000 That's broad.
00:27:58.000 Yes.
00:27:59.000 Well, that's why you couldn't publish Ashley Biden's diary.
00:28:02.000 Oh.
00:28:03.000 Why?
00:28:04.000 Joe Biden showered with her.
00:28:06.000 Oh.
00:28:06.000 When she was old enough to know it was creepy.
00:28:09.000 You know that, right?
00:28:10.000 Also, Ilhan Omar married her brother.
00:28:12.000 It's true.
00:28:12.000 Bang.
00:28:16.000 This is where the administration, though, also violated this, because demands go outside of this narrow window.
00:28:21.000 For example, questioning the CDC, questioning the WHO, questioning the idea of massive lockdowns, and even bringing up, for example, sounding the alarm bells that this could have catastrophic results for children and the educational system, which we now know to be true, that's not lascivious.
00:28:38.000 That's not outrageous, that's not violent.
00:28:40.000 Obscene.
00:28:41.000 That's not obscene.
00:28:42.000 Questioning, for example, what happened in Pennsylvania, what happened in Georgia, which of course Stacey Abrams has done, everyone from Stacey Abrams to Amy Klobuchar to Hillary Clinton as far as questioning election results, that's not obscene.
00:28:53.000 And the demands from the administration are far outside of that scope.
00:28:57.000 So that's the actual Trojan horse when people refer to that.
00:29:00.000 It's, okay, well look, this administration actually has the right in these companies if something is obscene, lewd, lascivious, filthy, excessively violent, harassing, otherwise objectionable.
00:29:07.000 That's the term.
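For readers following the statutory argument, here is a hypothetical sketch of the "narrow window" point: Section 230(c)(2) lists specific categories of content a platform may restrict in good faith, and the claim made on the show is that the removals at issue don't fit any of them. This illustrates the argument as stated above; it is not legal analysis or real moderation code.

```python
# The categories quoted above from Section 230(c)(2).
SECTION_230_CATEGORIES = {
    "obscene", "lewd", "lascivious", "filthy",
    "excessively violent", "harassing", "otherwise objectionable",
}

def within_230_window(flag_reason: str) -> bool:
    """Does the stated reason match a category the statute actually lists?"""
    return flag_reason in SECTION_230_CATEGORIES

print(within_230_window("obscene"))          # True
print(within_230_window("quotes CDC data"))  # False: the show's argument is that
                                             # removals like this fall outside the list
```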
00:29:08.000 Let me ask you this.
00:29:09.000 When Gerald Morgan... Oops.
00:29:11.000 I mean, Boy Scout personified, honestly, when he said that actually more children die from the standard flu annually than all years combined of COVID, as far as children, infants and toddlers.
00:29:26.000 When he says the standard flu is more dangerous to them, according to the CDC, and quotes the CDC, For which we were suspended and the content was removed.
00:29:36.000 Does that meet the threshold for obscene, lewd, lascivious, filthy, excessively violent, harassing?
00:29:43.000 Yeah, you know what my crime was there?
00:29:45.000 I was looking at the CDC report from the state of California and referencing the last ten flu seasons and the deaths of those age groups.
00:29:51.000 That's all I did.
00:29:52.000 We pulled the overlay up and highlighted four of the last ten years had higher death rates at that point in time.
00:29:58.000 And that's even with counting every single accidental death as COVID at the time because somebody died with COVID but not because of COVID.
00:30:05.000 Right.
00:30:05.000 Right?
00:30:06.000 Does it meet that threshold?
00:30:07.000 How about Biden-Burisma?
00:30:10.000 Why was that removed, which would have affected the election?
00:30:12.000 We're not just talking about Hunter Biden.
00:30:14.000 Former Vice President Joe Biden, when you're talking about the relations, the 10% for the big guy, the type of relationships that he bragged about with China until he tried to say that he was tough on China, does that meet that threshold?
00:30:25.000 Because this administration has worked hand-in-hand with big tech to ensure that those stories don't get out.
00:30:31.000 People talk about Donald Trump not relinquishing the reins of power.
00:30:35.000 Okay, they want you to be afraid of the idea of Donald Trump saying, I'm never leaving, I'm staying forever and putting his feet... What would you consider an administration, while in power, abusing their authority in using social media to censor any negative stories to ensure that they remain in power?
00:30:53.000 What would you consider... that's refusing to relinquish the reins of power.
00:30:56.000 Like Russia.
00:30:58.000 What about going after your political opponent with over a hundred different counts, I believe, or charges?
00:31:03.000 Right.
00:31:04.000 I think that's probably something that falls into that as well.
00:31:07.000 The administration is, again, this is the Saul Alinsky tactics, they're doing exactly what they're accusing us of doing.
00:31:13.000 Exactly.
00:31:14.000 Which brings us to argument, I guess, number three.
00:31:16.000 And you guys can jump in wherever, because I know there's a lot of legalese here, but this amicus brief, and I believe all the information today from the Supreme Court, is going to be made publicly available.
00:31:25.000 Yeah, we'll tell people where they can go and get it.
00:31:26.000 And for you in Mug Club, we're actually working on, there's a way for us to give it to you in Mug Club via PDF.
00:31:32.000 So I think there's a way we're going to make it available to everybody in Mug Club as one of the benefits of joining, if you guys want to read through it.
00:31:32.000 Okay.
00:31:37.000 Otherwise, you'd have to go through the Supreme Court and kind of scroll through everything to try to find it in there.
00:31:41.000 It's so cold in here, I hurt my neck.
00:31:43.000 You hurt your neck because it's cold?
00:31:44.000 Well, there's only two ways to hurt your neck.
00:31:45.000 Man.
00:31:46.000 No.
00:31:47.000 Uh-oh.
00:31:47.000 I've got a few ways.
00:31:49.000 Number three, the Biden administration, you're totally right, they turned these social media platforms into an arm of the government.
00:31:55.000 Okay, so there were two ways that this was accomplished, or possible ways.
00:32:01.000 Coercion of platforms.
00:32:02.000 They said persuasion.
00:32:05.000 And then they also had threats of reviewing Section 230 and the protections if the platforms don't cooperate, right?
00:32:10.000 So it's, oh, you know what?
00:32:12.000 We might review this and remove the liability protections.
00:32:15.000 No, no, no, no, wait, we'll do what you say.
00:32:17.000 Yeah, yeah, exactly.
00:32:17.000 We'll do what you say.
00:32:18.000 We'll do what you say.
00:32:19.000 Even though they're already violating Section 230.
00:32:22.000 They also, by the way, directly participated with these platforms to be controlling speech.
00:32:27.000 To what point?
00:32:28.000 Okay.
00:32:29.000 The Ministry of Misinformation.
00:32:31.000 That's a joke, obviously, but it happened.
00:32:33.000 It's not a joke.
00:32:34.000 Let's back that off a little bit.
00:32:36.000 Jen Psaki saying, we really would like to see more of this from Spotify with Joe Rogan.
00:32:41.000 That's a private company.
00:32:43.000 Well, actually, sorry, it's a publicly traded company, Spotify, if I'm not mistaken.
00:32:48.000 I know many of these companies are.
00:32:49.000 What do you think happens when they go, oh wait, hold on a second, we have a duty to our shareholders and the White House press secretary just told us what we need to do?
00:32:58.000 Right.
00:32:59.000 Yeah, exactly.
00:32:59.000 Publicly.
00:33:00.000 This was a public press briefing.
00:33:03.000 This was not an email that we happened to find out about later from the Twitter files.
00:33:07.000 It was something that she said publicly and felt comfortable saying it.
00:33:11.000 Fearless.
00:33:12.000 Completely fearless.
00:33:12.000 Yes.
00:33:13.000 Because they have no fear of accountability.
00:33:15.000 So this administration has been intertwined with these platforms directly in controlling them and censoring your speech, to be clear.
00:33:22.000 Which, by the way, has already been dramatized on screen.
00:33:25.000 I want you to be nice until it's time to not be nice.
00:33:33.000 Well, uh, how are we supposed to know when that is?
00:33:37.000 You won't.
00:33:38.000 I'll let you know.
00:33:40.000 You are the bouncers, I am the cooler.
00:33:42.000 All you have to do is watch my back and each other's.
00:33:47.000 Take out the trash.
00:33:49.000 Are we the trash?
00:33:50.000 I think so.
00:33:52.000 Compared to... God has never made a better looking man.
00:33:54.000 Oh my god.
00:33:55.000 The way his hair just kind of like, just lazily just falls perfectly right there.
00:33:59.000 And then the reverse shot is to that mongoloid.
00:34:01.000 How do we do that?
00:34:04.000 How do we do that?
00:34:06.000 Look at my jawline.
00:34:06.000 Shut up.
00:34:08.000 I'll tell you what to do.
00:34:08.000 Such a weird movie.
00:34:09.000 It's a bizarre movie.
00:34:10.000 We need 25 bouncers and this bar is still in business.
00:34:13.000 One of them needs...
00:34:15.000 What the...
00:34:17.000 🎵 Why don't you be a bouncer?
00:34:36.000 I don't know, man.
00:34:37.000 Alright.
00:34:38.000 It's a career choice, man.
00:34:39.000 It's a career choice.
00:34:39.000 Once you make your decision, you gotta stay there.
00:34:42.000 You had a 401k to uphold.
00:34:43.000 It's roadhouse time.
00:34:46.000 Okay.
00:34:47.000 So arguments...
00:34:49.000 Number four.
00:34:52.000 We use this term so much.
00:34:53.000 Algorithms, algorithms, algorithms.
00:34:54.000 Think about what that is for a second and how dangerous it is.
00:34:58.000 And these algorithms enable censorship to take place without the public knowing at all, without you knowing at all.
00:35:05.000 And this is something that we brought in our amicus brief exclusively.
00:35:08.000 You know about the Clean Slate campaign.
00:35:09.000 We do not want to be slaves to the masters of algorithms, because it's an artificial brain.
00:35:15.000 It's not even a real person at that point, but it was put into that position by someone who likely hates everything you stand for.
00:35:23.000 When you see an algorithm, or you hear the term algorithm, it determines everything that you see, everything that you don't see in your social media platforms, in your YouTube feed, in your meta, sorry, Facebook feed, TikTok, whatever it is.
00:35:33.000 It was created by someone in Palo Alto, 99% of whom, we've run these numbers, have not only voted but donated directly to the Democratic Party, and they want to determine what it is not only that you see from one side as a consumer, but what you can say.
00:35:48.000 As either a content provider or simply someone using the Digital Town Square.
00:35:52.000 And then it goes to an algorithm.
00:35:55.000 So let me kind of give you some examples here.
00:35:56.000 These completely private algorithms.
00:35:59.000 What do they facilitate?
00:36:00.000 Shadow banning?
00:36:02.000 Um, there's also algorithmic demotion, meaning, uh, we're gonna make sure that you simply don't see this content.
00:36:08.000 For example, for a long period of time you could search, Louder with Crowder Change My Mind abortion, and you would find a video from PBS with 400 plays.
00:36:16.000 Exactly.
00:36:17.000 Supporting abortion.
00:36:19.000 It leaves content creators completely unclear as it relates to what content is allowed, which is by design, right?
00:36:25.000 It's confusing you.
00:36:26.000 And it completely, there's no context when these censorship decisions are being made.
00:36:30.000 So what happens is people just start self-censoring or you have other companies, for example, conservative companies saying, you know what, just play ball and make sure you don't even go close to the lines with YouTube or Meta or TikTok.
00:36:42.000 We just, we don't want to risk it.
00:36:43.000 Because we're terrified!
00:36:43.000 Why?
00:36:45.000 They'll demonetize us.
00:36:46.000 They might, they might demote us.
00:36:48.000 Because the algorithm, we don't know what it is.
00:36:48.000 Why are you?
00:36:50.000 Let's just all play ball.
00:36:51.000 We already know we can't question the efficacy of vaccines.
00:36:54.000 We already know that we can't talk about elections or interference.
00:36:56.000 We already know that we cannot talk about lockdowns.
00:36:59.000 We know all of these things.
00:37:00.000 We know that we can't talk about puberty blockers with kids.
00:37:03.000 We can't talk about sexual assault in female prisons from biological... We know that we can't... So don't talk about those things, and play it safe.
00:37:08.000 Don't talk about anything that may even broach those subjects.
00:37:12.000 Think about that.
00:37:13.000 Now, transparency, which is what's being pushed for here, could help with a lot of that.
00:37:17.000 In other words, it's not about taking control over these companies, it's about ensuring that they're playing by the rules that everybody else is.
00:37:23.000 If you know what determines the algorithms, that at least helps, and they'll at least have to straighten up and fly right.
00:37:29.000 These are steps that are easy to take, there's precedent, and, left and right, why would anyone be against transparency for the algorithms created by these platforms, who are treated as utilities, public utilities, and who benefit from Section 230?
00:37:45.000 You are asking them to play by the rules that everyone else does.
00:37:49.000 Right, and look, I think we need to make sure that people know, like, it's possible that the justices in this case would have never even heard of, like, some of the terms that we just talked about there.
00:37:58.000 Not because they're not smart people, because that's not their world, right?
00:38:01.000 They wouldn't have heard about like algorithmic suppression or using it as really a weapon against conservative viewpoints or viewpoints you just don't want out there.
00:38:08.000 But because we filed this brief, the clerks in a lot of cases will read this and highlight sections or summarize.
00:38:15.000 Depending on the justice.
00:38:16.000 Some justices want to make sure they do their own reading.
00:38:18.000 Some justices, give me the high points, and this will actually make it into the record now.
00:38:22.000 And so they'll see, wait, they're doing what?
00:38:25.000 Right.
00:38:25.000 Right?
00:38:26.000 So some, one of those justices, their clerks, a group of them, a lot of them may even reference this in their opinions.
00:38:31.000 Can I pause you for one second?
00:38:32.000 Bring up CNN right now because we have a segment on this.
00:38:34.000 Oh, they were just talking about bloodbath from Donald Trump.
00:38:36.000 This is a perfect example happening in real time.
00:38:38.000 Donald Trump talked about the automotive industry and China, what they're doing in Mexico and how some of these auto manufacturing plants have moved over. He was talking about China taking advantage of our trade laws, that there would be a tariff on China, and, we have a whole segment on this, that if he was not elected there would be a bloodbath, right, a bloodbath for the automotive industry.
00:38:56.000 Now this was one of those quotes where, when it happened, I didn't even think it would go anywhere, but it's a slow news day, and so what happens is, his original speech is demoted.
00:39:07.000 The algorithm says you're not going to see Donald Trump's speech, because it is clear as day what happened, but the algorithm shows you CNN, CNN, well, you can just bring it up right now, it doesn't say bloodbath, but it did just 12 seconds ago, CNN, the talking heads, telling you what Donald Trump meant by bloodbath. That is what you see.
00:39:26.000 Remember, January 6th, Donald Trump said, peacefully and patriotically, walk over, march over, make your voices heard.
00:39:33.000 That, you couldn't find it.
00:39:35.000 Couldn't find it anywhere.
00:39:36.000 What you could find was a talking head saying that Donald Trump called for violence.
00:39:40.000 This is an algorithm.
00:39:42.000 Think about what happened in Charlottesville.
00:39:44.000 Donald Trump said very clearly, I'm not talking about neo-Nazis or skinheads who should be
00:39:50.000 condemned totally.
00:39:53.000 Right.
00:39:54.000 But there, you had good people on both sides, meaning, keeping up a historical statue, taking it down.
00:40:01.000 Completely condemned neo-Nazi skinheads, in no uncertain terms.
00:40:05.000 Remember?
00:40:06.000 You were told, fine people on both sides, including Nazis and skinheads, because you didn't see what Donald Trump said.
00:40:11.000 It's really hard to search for it, even today!
00:40:15.000 But you can have the other people telling you what Donald Trump meant.
00:40:19.000 That's the algorithm.
00:40:20.000 That's the shadow banning.
00:40:22.000 That's the demoting.
00:40:23.000 Now add that up with every single story, every single day.
00:40:28.000 You think that affects elections?
00:40:32.000 We just throw it under the umbrella.
00:40:34.000 Algorithms!
00:40:36.000 Think of how corrosive that is.
00:40:40.000 Honestly, you go back to Pravda, you go to a lot of these other countries, a handful of oligarchs determining what the media can cover isn't all that different from five to ten people in Palo Alto who have direct meetings with the government.
00:40:55.000 At this point, comment below.
00:40:56.000 What's the difference?
00:40:58.000 What's the difference?
00:40:59.000 Comment.
00:41:00.000 Because I don't see one.
00:41:02.000 This is not about private industry at this point.
00:41:04.000 And this brings us to the argument number five here.
00:41:07.000 That the algorithmic censorship constitutes a prior restraint on free speech.
00:41:12.000 What do I mean by this?
00:41:13.000 They decided, this administration, what was verboten on free speech before you even spoke it.
00:41:18.000 Okay, so let's say a COVID expert Who's skeptical about the mRNA vaccine?
00:41:25.000 Let's say someone who actually created, developed the mRNA technology himself!
00:41:30.000 Skeptical?
00:41:31.000 Censored.
00:41:32.000 Gone.
00:41:33.000 Question some facts about the 2020 election?
00:41:36.000 Gone.
00:41:36.000 Censored.
00:41:38.000 Censored.
00:41:38.000 Question lockdowns?
00:41:39.000 Gone.
00:41:41.000 Stream the Oscars and are critical of propaganda?
00:41:44.000 Censored.
00:41:45.000 Gone.
00:41:46.000 Gone!
00:41:47.000 This has already happened.
00:41:48.000 So some key points here to remember.
00:41:49.000 You, right now, all of you, watching.
00:41:52.000 If you're watching, listening, you are all publishers.
00:41:54.000 The comments you leave, the posts you make, those are considered published content.
00:41:58.000 You can be held liable for that.
00:42:02.000 You are responsible for what it is that you post.
00:42:04.000 Not with these platforms.
00:42:05.000 Here's another key point.
00:42:06.000 Section 230 only permits platforms to censor obscene content without becoming, like you, the publisher.
00:42:14.000 That's the important distinction.
00:42:17.000 But that is not how this is acting, and this is not... I can't explain to you enough.
00:42:21.000 These algorithms, this administration, they're not demanding that obscene speech, that snuff videos be removed.
00:42:28.000 It's just the Ashley Biden diary because of investigative journalism.
00:42:32.000 It's just live streaming an election where there's a voting precinct that says a pipe burst in Georgia, and you find out in real time that it's not true, and the government says, hold on a second, that could be misinformation, because they were covering it in real time before we got to the lie that we told.
00:42:48.000 This has nothing to do with the actual law.
00:42:51.000 That's the issue.
00:42:52.000 All right, so we've got Jim Jordan in about 10 to 15 minutes.
00:42:55.000 We know he's coming, so what we'd like to do now is if we can is go to George, who is
00:42:59.000 live outside of the Supreme Court for just a few minutes.
00:43:01.000 We're working on it now.
00:43:02.000 Oh yeah, we just got his image now.
00:43:04.000 Got his image?
00:43:05.000 All right, good.
00:43:06.000 Perfect.
00:43:07.000 So we have, do we have a stinger or no?
00:43:08.000 Yeah.
00:43:09.000 Yeah.
00:43:10.000 All right, it's time for George the Greek, live from the Supreme Court.
00:43:13.000 All right, George the Greek.
00:43:16.000 I appreciate you being there, sir.
00:43:18.000 You look sharp.
00:43:19.000 It looks like it warmed up because you're not wearing your overcoat.
00:43:25.000 Oh, the audio doesn't work.
00:43:27.000 We can't hear George.
00:43:29.000 Oh, he looks pissed.
00:43:34.000 Can you hear me now?
00:43:35.000 Yes, it was just the microphone.
00:43:38.000 Yeah, small detail, but we got it now.
00:43:41.000 What was that?
00:43:42.000 Was that in your cameraman's pockets for God's sake?
00:43:46.000 We're trying to cut out wind noise in creative ways.
00:43:49.000 No, we got it.
00:43:50.000 Can you hear me alright?
00:43:50.000 I can hear you alright.
00:43:51.000 So you were just there in the Supreme Court.
00:43:53.000 Is there anything that you noticed specifically, a major update for us?
00:43:57.000 Yes.
00:43:58.000 So I was right up front.
00:44:00.000 It was a great vantage point.
00:44:01.000 And I got to see things that you really couldn't understand in audio.
00:44:03.000 So broadly speaking, you know, the conservative judges were kind of questioning the government a lot and vice versa with the liberal judges.
00:44:11.000 But, you know, I could see like Clarence Thomas visibly frustrated.
00:44:16.000 Can you pause one second?
00:44:18.000 I think we're still getting the feed from inside the Supreme Court.
00:44:21.000 It's audio from outside protests.
00:44:23.000 Oh, is that someone with a megaphone?
00:44:27.000 Yeah, a really large one.
00:44:28.000 We're trying to get away from it.
00:44:29.000 Sorry, sorry.
00:44:31.000 I thought it was like we were on the wrong radio frequency and we were getting... Oh, well, alright.
00:44:36.000 What an asshole.
00:44:37.000 Unless they're friendlies, I don't know.
00:44:39.000 Sorry, continue.
00:44:40.000 Well, it depends.
00:44:41.000 Continue.
00:44:42.000 You're saying Clarence Thomas, who I love.
00:44:44.000 Yes, well, we love him.
00:44:45.000 You know, he's kind of stoic, you know, he doesn't really say much, but in the court I could see him visibly frustrated at some of the answers the government was giving as to how they justify their actions.
00:44:55.000 You know, not really saying much, but, you know, sort of rolling his eyes, tucking himself back in the chair.
00:45:00.000 Alito, definitely the most hostile of all the judges, questioning not only the government's motives, but their methods.
00:45:09.000 Which I thought was helpful.
00:45:10.000 On the flip side, Missouri got a lot of heat from the liberal judges.
00:45:14.000 And, you know, unfortunately, you know, I think the Missouri lawyers kind of dropped the ball when the comparisons between newspapers and social media platforms came up.
00:45:24.000 I think they had a good opportunity there to delineate the difference between the two.
00:45:28.000 You know, newspapers being publishers and having the right, you know, to sort of pick and choose versus social media platforms.
00:45:33.000 I think the discussion there got a little muddled.
00:45:37.000 Overall, I feel like it's probably going to be some kind of a split decision.
00:45:41.000 And the real question is not really if Biden, you know, violated.
00:45:46.000 There is a violation there.
00:45:47.000 I think it's going to be how broad their ruling is.
00:45:50.000 Do they limit it specifically to the injunction at hand or do they take a very broad ruling here and outline more terms?
00:45:57.000 I think that's very TBD at this point.
00:45:59.000 Yeah, it seems like it could be.
00:46:01.000 Did you notice anything as far as, you know, when we were talking, for example, about the algorithmic censorship?
00:46:06.000 You know, you helped us introduce terms here that they may not even be familiar with.
00:46:11.000 Was that something that they asked about or they followed up on?
00:46:15.000 Uh, so I left a little bit early so they may still do that.
00:46:18.000 Nothing directly on that point.
00:46:20.000 They did generally speak about some of the methods used by social media platforms, but really the conversation focused on the intent, you know?
00:46:31.000 What was the subject matter at hand and did it rise to a level of violating the First Amendment and even provisions of Section 230?
00:46:40.000 Again, those social media platforms were not a party to this case.
00:46:42.000 It was really the federal government at issue.
00:46:45.000 But algorithmic censorship, not really a topic of conversation as much as I would have liked it to be.
00:46:51.000 It was more about the nature of the content and whether there was good faith involved.
00:46:58.000 Yeah, and that's important to note because a lot of people don't know.
00:47:00.000 We are already at that point.
00:47:01.000 The federal government is involved.
00:47:03.000 We're not just discussing YouTube at this point or Facebook.
00:47:06.000 This is where it is.
00:47:07.000 There's no argument as to the fact that this has taken place.
00:47:07.000 There's no doubt.
00:47:10.000 It's just, okay, to what degree.
00:47:12.000 It's the severity of the outcome.
00:47:14.000 I believe that Gerald Morgan had a question.
00:47:16.000 Yeah, just a quick question, George.
00:47:17.000 So on a couple of the justices that we said we were going to keep an eye on, you said Roberts a lot of times likes to go the opposite direction maybe of the majority just to make it look like more of an even split of the court.
00:47:28.000 For him and maybe Justice Kavanaugh, did you see anything that gives you any indication on what they're thinking?
00:47:33.000 Yes, actually, I would say that that's an apt characterization of Justice Roberts in this case, sort of middle of the road.
00:47:40.000 And if anything, I think a little bit deferential to the government, especially in cases of what we would call an emergency and their ability to communicate quickly with, say, social media platforms.
00:47:50.000 And this is part of the discussion that I wish the respondents had brought it back to, which is, yeah, you know, you have those hypotheticals.
00:47:58.000 But in this case, we were talking about memes.
00:48:00.000 You know, we were talking about... you know what I mean?
00:48:06.000 It really was an opportunity there to bring those things back and I think more of that discussion should have come in because we're really not talking about a national emergency exclusively here.
00:48:15.000 We're talking about the government overreaching.
00:48:17.000 On things they have no business overreaching on, and way outside the bounds of Section 230.
00:48:22.000 Right.
00:48:23.000 Well, we have to broom you right now, George the Greek, because we have Representative Jim Jordan on.
00:48:27.000 Not that he's more important, but, you know, we see you every day, so we'll check back in with you in a little bit.
00:48:32.000 This has been George the Greek.
00:48:33.000 Good work from him!
00:48:38.000 Alright, it's going to be a stinger upon a stinger upon a stinger because now we do have, do we have him on?
00:48:42.000 We're ready?
00:48:43.000 We are good to go, gentlemen?
00:48:44.000 Give us a minute.
00:48:45.000 Is he up there?
00:48:46.000 I see him, we're just getting it all set up right now.
00:48:48.000 You guys told me you were ready.
00:48:50.000 I wish that anybody else had the passion for how much I hate you right now.
00:48:56.000 By the way, I thought George did a pretty good job there.
00:48:58.000 He gave us some indications on what these justices were thinking.
00:49:01.000 I love the fact that Clarence Thomas can't hide his disappointment.
00:49:03.000 He's like, ugh.
00:49:08.000 He seems like somebody, we've said this before, he seems like somebody that we would enjoy hanging out with.
00:49:12.000 I would love to clerk for Clarence Thomas.
00:49:14.000 I know.
00:49:14.000 He probably walks back into his chambers like, can you believe this shit?
00:49:17.000 Let's have a beer and talk some shit.
00:49:20.000 And look, one of the reasons that we're bringing Jim Jordan on, he was one of the people that signed on, he had his own amicus brief as well, and he's been leading the charge, been very vocal on Section 230, and this plays right into that, even though this isn't specifically 230, it does play a role in this, so that's one of the reasons we've got him, and I think we've got him ready to go for you now.
00:49:36.000 Also, he was one hell of a wrestler.
00:49:38.000 He was!
00:49:38.000 And he did wrestle, we have actually people here who attended his wrestling camps.
00:49:41.000 Really?
00:49:42.000 Yeah, I believe it was the Jordan Brothers if I'm not mistaken, and none of the weird stuff like that DuPont bastard.
00:49:47.000 No.
00:49:48.000 Just does a good job coaching.
00:49:49.000 Let's go to Representative Jim Jordan.
00:49:52.000 Alright, Representative Jim Jordan, can you hear me, sir?
00:50:01.000 How are you?
00:50:01.000 I can.
00:50:02.000 I'm doing very well.
00:50:03.000 Thank you for, thank you for being here.
00:50:05.000 And we actually have some people out here who did attend your, I don't know if you call them like wrestling clinics?
00:50:08.000 Yeah.
00:50:09.000 It's my brother's, my brother's camp.
00:50:09.000 Yeah.
00:50:11.000 He's, he's, uh, he does a great job and a great business he has.
00:50:14.000 And so, yeah, we were, we were just talking about that.
00:50:17.000 It would be great.
00:50:17.000 I, what I wouldn't give to see you, I don't know what your go-to, if it was a single or an ankle pick, you know, uh, someone, someone there like Zuckerberg or the like, you know what I mean?
00:50:26.000 In a consensual way.
00:50:28.000 Yeah, a man who knows a little something about wrestling.
00:50:30.000 Yeah, well, that's a long time ago for me, but I wanted to play middle linebacker for the Pittsburgh Steelers, even though I'm from Ohio.
00:50:36.000 I grew up in the 70s, and I love Jack Lambert, but when you're my size, you gotta wrestle.
00:50:41.000 So it was a great sport for our family.
00:50:42.000 Like I said, my brother's done well with his business there, so all good.
00:50:46.000 Well, it never really leaves you.
00:50:47.000 I mean, you also look like you stay relatively trim.
00:50:49.000 My point is, I'm pretty sure you could beat the hell out of most of the people in our House of Representatives.
00:50:55.000 That's something, if nothing else.
00:50:57.000 Which shows restraint.
00:50:59.000 Let me ask, and you're doing this live from the Rumble Studios, which we appreciate because we know that's a safe space, to use the word, where we're not going to be censored on Rumble.
00:51:07.000 We had an amicus brief.
00:51:08.000 I know you've been leading this charge here.
00:51:10.000 What was the main argument that you laid out in your own amicus brief here today?
00:51:15.000 When the government does something that they, you know, through some private company that they can't do by themselves, when they're coercing, when they're censoring through someone else, that is still censorship.
00:51:26.000 And that's the fundamental line.
00:51:27.000 And I just came from the argument.
00:51:29.000 I will tell you that there was one line that I still can't believe Judge Jackson said this.
00:51:35.000 But she actually said to the Solicitor General from Louisiana, she said, you've got the First Amendment hamstringing the government.
00:51:44.000 What?
00:51:44.000 Now, think about that.
00:51:45.000 That's the whole thing's purpose.
00:51:47.000 Like, it's supposed to hamstring the government.
00:51:48.000 I mean, the fact that you have a person from the United States Supreme Court make that statement in the arguments on a case about censorship and the First Amendment, it's just like, I'm like, at this point, I don't even know what to say.
00:52:02.000 Like, how can you even say such a thing?
00:52:04.000 That's like saying, the problem with the Second Amendment is it's going to make the government afraid of coming to your house to take your guns away.
00:52:12.000 It's exactly right.
00:52:14.000 It's so scary what we heard there.
00:52:16.000 But I thought he did a good job, the Solicitor General from Louisiana, laying out this is a fundamental, you know, First Amendment case where you've got the government, I think, coercing, significantly encouraging is another one of the standards and the tests in some of the cases that have been in front of the court before.
00:52:33.000 And I think that's clearly the case.
00:52:34.000 Remember this too, Stephen.
00:52:36.000 On the third day of the Biden administration, the White House sends an email to Twitter saying this, take down this tweet, ASAP.
00:52:45.000 And the tweet was from RFK Jr.
00:52:48.000 Everything in the tweet was accurate.
00:52:51.000 But on the third day, they're starting the censorship operation.
00:52:54.000 And the irony is, they're going after the very guy who's going to run against them in the primary, for goodness sake.
00:52:59.000 If that's not the definition of why government is not supposed to do this, I don't know what is.
00:53:06.000 No, that's a very good point.
00:53:07.000 We were just talking about today, for example, the bloodbath controversy with Donald Trump, where what they do is make it impossible to find his original comments, where it couldn't be more clear.
00:53:15.000 This is not even remotely controversial, him referring to a bloodbath regarding the automotive industry.
00:53:20.000 You don't find that.
00:53:21.000 You find what all the talking heads say, and you add that up day after day after day after day, let alone one story.
00:53:29.000 Hunter Biden laptop.
00:53:31.000 One story changes the outcome of the election.
00:53:32.000 I don't know if you know this, Representative Jordan.
00:53:34.000 The first time the public saw the Hunter Biden laptop, Mayor Rudy Giuliani was on this show, and it was incidental.
00:53:39.000 And I said, wait, wait, what?
00:53:40.000 He goes, yeah, that's the Hunter Biden laptop.
00:53:42.000 I go, wait, that's the actual?
00:53:44.000 He goes, yeah, he left it over at a computer shop.
00:53:45.000 I have it right here in front of me.
00:53:46.000 I'm like, this is happening?
00:53:48.000 And of course that got removed.
00:53:49.000 That got suspended.
00:53:52.000 And we've been in the middle of this exact thing.
00:53:52.000 That week.
00:53:54.000 Let me ask you this, though.
00:53:56.000 Because I know you've subpoenaed a lot of records, and a lot of this information wouldn't have been public if not for, for example, Elon Musk taking over Twitter.
00:54:04.000 Very true.
00:54:04.000 Very good.
00:54:05.000 Which we're very grateful for.
00:54:06.000 What are the most shocking examples that you've seen through the records that you've subpoenaed between the Biden administration and big tech platforms?
00:54:15.000 Do some spring to mind?
00:54:16.000 Yeah, I'll do that question first.
00:54:18.000 So we had this one communication from Nick Clegg, like the head of global affairs or global something for META.
00:54:27.000 And Nick Clegg is talking with, this is actually an internal communication we got through our subpoenas.
00:54:32.000 But Nick Clegg, this is when the government's pressuring Facebook to take down certain things and certain posts.
00:54:38.000 And Nick Clegg says, this looks like it encroaches on free expression.
00:54:44.000 And of course, the irony is Nick Clegg, former Deputy Prime Minister in Great Britain, is lecturing and explaining to Americans how the First Amendment works.
00:54:53.000 I mean, I thought the irony there was unbelievable.
00:54:55.000 But I want to back up a second.
00:54:56.000 I was there Saturday when President Trump used the term bloodbath.
00:55:00.000 And as soon as he said it, I go, they're going to go after that term.
00:55:04.000 I know exactly what the press is going to, we've seen it, you've seen it time and time again.
00:55:08.000 And of course they did it.
00:55:10.000 And anyone who was there, the context was trade issues with China relative to the automotive industry.
00:55:17.000 This is how crazy the left is.
00:55:19.000 But yeah, that one email from, internal email from Nick Clegg.
00:55:24.000 Where he talked about, I think this encroaches on free expression, I thought the irony of the former Deputy Prime Minister telling Americans how the First Amendment, how our Constitution works.
00:55:34.000 Yeah, it is, and some of these things are incredible.
00:55:37.000 Now, they're not as shocking to us because we've lived Through this.
00:55:39.000 For example, we've had, you know, content executives at YouTube ask us to send them our videos privately so they can let us know what changes to make if we don't want to run afoul of borderline guidelines before doing it publicly.
00:55:52.000 Not only being demonetized for not violating the rules, but, you know, the same thing happened with Facebook.
00:55:57.000 This was something that they admitted had happened leading up to an election.
00:55:59.000 We had an election stream, biggest that had ever taken place, then it's removed by the next election.
00:56:04.000 And the issue is not so much that the left, we all know the left, they're crazy, right?
00:56:08.000 What happened is you had the mainstream media, legacy media, who had a stranglehold.
00:56:12.000 And everyone thought, there are no gatekeepers anymore.
00:56:13.000 This is great.
00:56:14.000 This is what social media allows.
00:56:15.000 And for a while it did.
00:56:17.000 And now, of course, they're working hand-in-hand with the government.
00:56:20.000 We're almost back to three networks.
00:56:21.000 It's basically meta, you know, Google, Alphabet, sorry, YouTube.
00:56:26.000 And thank God, Twitter was purchased.
00:56:27.000 But outside of that, for the longest time, this was a trifecta.
00:56:30.000 You can toss in TikTok.
00:56:31.000 You can toss in maybe Apple, Spotify, Microsoft.
00:56:34.000 It's five companies.
00:56:36.000 And they have all made decisions regarding content the same day.
00:56:40.000 For example, Hunter Biden.
00:56:41.000 For example, Alex Jones removing him.
00:56:43.000 That can't be a coincidence.
00:56:44.000 That's 10 people on a conference call, right?
00:56:47.000 Yeah, yeah.
00:56:48.000 And there's a... No, you're exactly right.
00:56:50.000 And there's this... I always call it the template, trying to tie in the bloodbath comment and what the press did there with how this whole thing operates.
00:56:58.000 It's, I think, pretty basic, what you see time and time again from the left: the left will tell a lie, the big media will report the lie, big tech will amplify the lie (the laptop, Russian disinformation, a Russian information operation), and then when we try to tell the truth, they call us racist, they call us names, and they say you're crazy.
00:57:21.000 And that's exactly how it plays out time and time again, because they have this overwhelming support in big media, legacy media, and in big tech, minus Twitter.
00:57:35.000 Why would President Trump use that term?
00:57:37.000 Completely out of... but that's the template.
00:57:39.000 Now, the good news is, I think more and more people are waking up to
00:57:43.000 the template that's used by the left and how big media and big tech weigh in with that.
00:57:47.000 Which is the good news, and it's because we got folks like you out there telling the truth
00:57:51.000 and getting... cutting through all the garbage and baloney we get from today's left.
00:57:55.000 Well, I appreciate you saying that. And by the way, it's in spite of what has happened, right?
00:57:59.000 And people like Rumble, and I will say this, you're one of the people who's been spearheading this.
00:58:02.000 It's in spite of a lot of our representatives, because I can tell you we've had calls and meetings with representatives saying we want to do something, but guess what?
00:58:08.000 They all have podcasts.
00:58:10.000 So they don't want to push that hard.
00:58:11.000 They all have Facebook pages.
00:58:12.000 So even Republicans.
00:58:13.000 Not going to name names, but they're not out there with the zeal that you have, and it consistently surprises us.
00:58:18.000 It's in spite of the fact that not a lot has been done because we've been out here taking the hits, and I know that you have actually been there taking the hits.
00:58:24.000 Not all Republicans are created equal.
00:58:26.000 Let me ask you this.
00:58:27.000 How do you expect, or how would you expect, the judges to decide in this case, and then how would you like them to?
00:58:34.000 For example, if they rule against Biden, do you think they'll keep it narrow or go after 230, you know, more broadly in its application?
00:58:43.000 Yeah, I don't know.
00:58:44.000 I mean, I will tell you, based on what I heard today, I think it's kind of hard to gauge where they're going to land.
00:58:49.000 You could definitely sense, at least I could, at least I felt this way, that the left judges,
00:58:54.000 the judges on the left, they're not gonna be on the side of stopping the censorship, on the side
00:59:01.000 of the Solicitor General from Louisiana who was arguing the case.
00:59:04.000 But how it all shakes out, I don't know.
00:59:06.000 Someone did raise, could we keep it kind of narrow?
00:59:09.000 I think that that's a possibility, but I just, you never know, and sometimes what you think happened
00:59:14.000 in the arguments is not how the decision actually, how the decision actually plays out.
00:59:20.000 Right, yeah, it seems like it could go either way.
00:59:21.000 And we do have George in there, and he did say he was a little bit disappointed
00:59:25.000 that they didn't really push, for example, the algorithmic censorship, that that wasn't
00:59:30.000 really brought up, and that they kind of dropped the ball a little bit when comparing,
00:59:34.000 meaning people on our side, the solicitor from Louisiana, when it came to comparing
00:59:39.000 newspapers to these platforms. It's very important to delineate between publishers and platforms. And we've
00:59:44.000 been making that argument for a long time.
00:59:47.000 It seems like maybe it's unfamiliar to some people, and they're making the argument on our side with
00:59:50.000 good intentions. I know Gerald has a question. Yeah, Representative Jordan, just really quickly,
00:59:54.000 I know you've been very vocal about 230, and obviously the only reason that this show has
00:59:59.000 survived is because of our Mug Club subscribers.
01:00:01.000 The only reason we're able to file an amicus brief today was because of the Mug Club subscribers, but not everybody has that.
01:00:06.000 Do you see any relief coming soon on 230 and finally cleaning up this problem once and for all?
01:00:13.000 Because it's just been a long, long, arduous fight.
01:00:17.000 Yeah, I don't know.
01:00:20.000 I don't see any quick remedy, frankly, with divided government, given where the White House is and who controls the Senate.
01:00:26.000 I don't see that.
01:00:31.000 One of the things I think did get raised, when the government was arguing and in questions from some of the more conservative judges, is: was there pressure put on these social media companies, saying, oh, there's antitrust concerns if you don't censor?
01:00:48.000 There's other issues that, you know, we can influence other things that you care about if you don't censor the speech.
01:00:54.000 We're encouraging you to censor.
01:00:56.000 And I think that's really important because that's one of the things from our committee work that we sense was going on.
01:01:01.000 There's actually an email, I think, an internal email with Facebook, where they said, well, maybe we could paraphrase it, but it basically says something like, maybe we should go along with the suggestions from the government because we've got bigger fish to fry, other issues to deal with.
01:01:16.000 And so was the government, wink wink, hinting, oh, we're going to have some antitrust concerns, other issues relative to 230.
01:01:16.000 Right.
01:01:21.000 If you don't censor.
01:01:24.000 The speech that they were trying to get them to censor, which was almost universally conservative speech.
01:01:29.000 Of course, yes.
01:01:30.000 I would love, if somebody in your team has a chance to read this brief, please get in touch with us.
01:01:34.000 We have a lot of emails, phone calls, videos of big tech doing this to us over the years that we would love to just get in your hands, give you ammunition to be able to use.
01:01:45.000 Trust me, we have them dead to rights on a couple of things.
01:01:47.000 We just need somebody to take the phone call!
01:01:49.000 By the way, including exchanging of actual funds, of actual money, just so you know, from Big Tech, and then doing things where they actually would issue a refund, which was effectively an admission of fault.
01:01:59.000 I won't say which platform, but we have that taking place, and I have a lawyer on full-time retainer who is...
01:02:05.000 Yeah.
01:02:06.000 Sorry, go ahead.
01:02:07.000 We'll have one of our lawyers on the committee staff get in touch with you guys.
01:02:10.000 That's important.
01:02:11.000 No, we're incredibly grateful.
01:02:11.000 Yeah.
01:02:12.000 And I know that you're a busy man, so I don't want to keep you too long, Representative Jim Jordan, but let me ask you this.
01:02:17.000 Greatest American wrestler of all time and greatest international wrestler of all time?
01:02:23.000 Well, there's three.
01:02:24.000 Gable, Smith, and Sanderson.
01:02:26.000 So, you know, Gable was the guy, when I was growing up, that everybody looked to, undefeated in college until his last match, and then wins the '72 Olympics.
01:02:34.000 And I was, what, I was eight years old watching that, and I thought, you know, that got me fired up.
01:02:38.000 John Smith came along.
01:02:39.000 I happened to actually compete against Smith.
01:02:41.000 Smith was maybe the greatest six-time world champ, two-time Olympic champ.
01:02:45.000 Uh, and the international style was just phenomenal.
01:02:47.000 Great guy.
01:02:47.000 In fact, our youngest son was, uh, assistant coach for John, uh, one year when he first got out of college.
01:02:52.000 But then Sanderson, undefeated in college, and Olympic and world champion, and maybe more importantly now, won, like, twelve or thirteen titles for Penn State.
01:03:01.000 They're just, they're gonna win again this week.
01:03:03.000 It starts Thursday.
01:03:04.000 You gotta get on ESPNU and ESPN because they cover the whole darn tournament, which I'm looking forward to.
01:03:08.000 Yes, as a coach.
01:03:09.000 And also, the embarrassing thing is, uh, I think he did the same ankle pick on every one of his top 10 ranked opponents.
01:03:16.000 Like, with Cael Sanderson, it didn't matter what... You knew it was coming, and you couldn't stop it because of his setup.
01:03:22.000 We had Daniel Cormier, you know, UFC champion on the show.
01:03:25.000 And he goes, oh, you know Cael?
01:03:27.000 I was like, yeah, I watched your matches.
01:03:28.000 He ankle picked me, bro.
01:03:30.000 I saw it coming.
01:03:30.000 I couldn't stop it.
01:03:34.000 You must like UFC too.
01:03:36.000 So, Bo Nickal was a three-time national champ for Penn State, is killing it in UFC.
01:03:41.000 He's going to fight in UFC 300.
01:03:42.000 We just went to the fight down in Miami a weekend ago, but Nickal's fighting in the next one out in Vegas, and he's just killing everyone.
01:03:50.000 So, it's not a championship fight yet, or title fight yet, but he's on the main card.
01:03:55.000 So, I really think this guy is going to do well.
01:03:59.000 He's already doing well, but he just has that... He's doing very well, but he did say that he believed he would beat a wild chimpanzee in a fight, which is the stupidest thing I've ever heard in my life.
01:04:11.000 Outside of that, I'm a fan.
01:04:14.000 But when he said it, I said, do you understand that they go for the face and groin to tear it off immediately?
01:04:20.000 This is not the same thing.
01:04:23.000 I will be tuning into that, especially UFC 300.
01:04:25.000 We appreciate it, Representative Jim Jordan.
01:04:27.000 You're welcome anytime, and we will be in touch with our brief, getting it to your folks.
01:04:30.000 Thank you guys so much.
01:04:31.000 Keep up the great work.
01:04:32.000 God bless.
01:04:32.000 Thank you, sir.
01:04:33.000 This has been Representative Jim Jordan.
01:04:39.000 He said he would beat a chimpanzee.
01:04:41.000 That's not going to happen.
01:04:42.000 Maybe a baby chimpanzee.
01:04:43.000 That's not understanding your opponent.
01:04:46.000 You'll die from sudden disease afterwards.
01:04:48.000 It's going to bite you.
01:04:49.000 If we want, we have George still in front of the Supreme Court.
01:04:52.000 If we want to go back and see if he had anything else for us, up to you.
01:04:54.000 We'll find out if he has something for us.
01:04:56.000 I don't want to go in, wear his headphones in the cameraman's pocket with a bunch of gummy bears.
01:05:00.000 In the bathroom.
01:05:03.000 He's drinking a Zima.
01:05:05.000 By the way, Josh Feierstein is going to be at Bricktown Comedy Club in Tulsa, Oklahoma.
01:05:10.000 Friday, March 22nd.
01:05:11.000 So this Friday.
01:05:12.000 This Friday!
01:05:13.000 Two shows this Friday.
01:05:13.000 I've never been to that club, Tulsa.
01:05:15.000 It's nice.
01:05:16.000 I was just there with Brian Callen this weekend.
01:05:17.000 That's right, it was a surprise spot.
01:05:18.000 A little surprise opening for him.
01:05:21.000 Really?
01:05:22.000 He was lying.
01:05:23.000 We were not high.
01:05:24.000 He was lying.
01:05:24.000 He was joking.
01:05:25.000 Yeah.
01:05:26.000 Alright.
01:05:28.000 A boy doesn't kiss and tell.
01:05:30.000 No, he doesn't.
01:05:30.000 What are you doing with those shifty nipples, dude?
01:05:36.000 Really quickly, look, and I'm sorry, I'm shameless here.
01:05:40.000 SCOTUS promo code $10 off if you join Mug Club.
01:05:42.000 When I was talking to Jim Jordan, the only reason the show survived to get to this day was because of Mug Club, because you forwent millions and millions and millions of dollars in revenue to be able to do it.
01:05:50.000 So that's why I was telling him, because I'm pretty sure, even though he may know who you are, may have watched the show, probably doesn't know the whole backstory to that.
01:05:57.000 The only reason we survived was Mug Club.
01:06:01.000 Join Mug Club, $10 off, click the link in the promo. Or are you gay?
01:06:08.000 Let's not use it as a pejorative, even though it's an effective one.
01:06:11.000 I don't know.
01:06:12.000 Don't be gay.
01:06:13.000 Let's go to this in real time right now.
01:06:16.000 Look, look.
01:06:16.000 Trump attacks migrants, praises January 6thers, and rallies.
01:06:18.000 And it's Dana Bash talking.
01:06:20.000 And this guy.
01:06:25.000 Thank you so much for all being here.
01:06:27.000 So let's see how they characterize this, and then I'm going to show you why they're lying.
01:06:31.000 Here's the big problem, not that anyone watches CNN, no one does aside from maybe the Charlotte Airport, and they're a captive audience, it's against their will.
01:06:38.000 It's the fact that you can't find the original clip, the context, but social media algorithms will guarantee that you see the commentary.
01:06:45.000 Crappy commentary like...
01:06:47.000 He doesn't just need primary voters now, he needs enough voters to win re-election.
01:06:52.000 He has not changed his speech.
01:06:53.000 He has not changed his speech.
01:06:54.000 He actually doubled down on it the day after he gave that speech.
01:06:59.000 He went on his social media platform talking about Liz Cheney.
01:07:02.000 I can't do it.
01:07:05.000 The rest of the unselect committee.
01:07:06.000 I was trying to blend them together.
01:07:07.000 I want to hear someone say that he attacked migrants.
01:07:10.000 Please tell me what he did.
01:07:11.000 ...drill, baby, drill, and free the January 6 hostages being wrongfully imprisoned.
01:07:16.000 I just want to say as you come in.
01:07:19.000 You were there on January 6th.
01:07:20.000 The people who were in jail after the protest.
01:07:23.000 He survived.
01:07:24.000 Survivor.
01:07:25.000 Are they gonna cry?
01:07:26.000 Yeah, they're not.
01:07:26.000 And it was not a normal tourist visit either.
01:07:28.000 It was a violent attack.
01:07:30.000 He's the voice!
01:07:30.000 Horrific attack on the Capitol.
01:07:32.000 Together they work!
01:07:35.000 He sounds like the Indian in Peter Pan!
01:07:38.000 January 6th is protest!
01:07:40.000 Void him!
01:07:43.000 This is actually central to his campaign.
01:07:47.000 He began his rally talking about January 6th.
01:07:51.000 What a forehead.
01:07:52.000 This really just speaks to, you know, Trump's got one strategy, one note that he's been doing in 2016, 2020, and now 2024.
01:07:59.000 It's red meat to the base.
01:08:00.000 It is not a pivot to the general election.
01:08:02.000 I love how that one says, it's red meat to the base while you lie to yours.
01:08:05.000 So let me, before we go to Mug Club, let me show you what the media has been doing because I have no doubt they'll still do it right now.
01:08:12.000 This has been taking place Saturday, Sunday, we will show you the montage of them deliberately misrepresenting what President Trump said.
01:08:20.000 And by the way, it's not even close.
01:08:23.000 It's not, well maybe I could see how it could be interpreted.
01:08:25.000 Nope.
01:08:26.000 Absolutely not.
01:08:29.000 And they did it anyway.
01:08:31.000 And the reason that you've seen all of this is because the algorithms made sure that all of this was pumped into your feed regardless of how untrue, and of course in relation to journalism, you know, quality control.
01:08:45.000 Shitty!
01:08:47.000 The race for the White House and former President Trump's campaign now on the defensive after his fiery rhetoric at a rally in Dayton, Ohio on Saturday night.
01:08:55.000 Trump warning while discussing the economy that there would be a, quote, bloodbath if he is not re-elected in November.
01:09:01.000 The former president said some migrants aren't people, are not people.
01:09:06.000 He cast doubt on the future of American democracy if he loses in November.
01:09:11.000 The presumptive Republican nominee warned there would be a, quote, bloodbath, unquote, if he loses the election.
01:09:20.000 He uses these high-impact words that have either the direct or implicit tone of violence.
01:09:28.000 What I heard was a continuation of the same rhetoric, the same endorsement of political violence that we've seen from Donald Trump for years.
01:09:35.000 He's even predicting a bloodbath.
01:09:37.000 What does that mean?
01:09:38.000 He's going to exact a bloodbath?
01:09:40.000 There's something wrong here.
01:09:42.000 Yes.
01:09:43.000 With you.
01:09:43.000 It's a repeat of Fight Like Hell.
01:09:45.000 It's a repeat of... Yes.
01:09:46.000 He said bloodbath, only in this case... Look, all of you have used the term Fight Like Hell.
01:09:50.000 All of you know that Fight Like Hell could mean a litany of things.
01:09:53.000 When they say, is he going to exact a bloodbath?
01:09:55.000 Please, please, please, out there, if you're going to be on their side and tell me how they've taken this out of context... You ever taken part in a literal bloodbath?
01:10:05.000 I don't even know what it is!
01:10:07.000 What is a bloodbath?
01:10:09.000 I don't know, like, what is it, they think he's gonna, they think the White House is going to be the new bordello of blood?
01:10:14.000 Yes.
01:10:15.000 I don't, I have no idea how dumb they think you are.
01:10:20.000 As a matter of fact, I'm pretty sure that bloodbath has only been used as a colloquialism.
01:10:25.000 I don't think it stems from, we can get the research done, I don't think it stems from at one point an actual bath of blood.
01:10:31.000 Right.
01:10:36.000 Let's go through this anyway because this is relevant to Section 230, what's happening at the Supreme Court.
01:10:42.000 The lies are amplified and the truth is silenced.
01:10:46.000 Do it in Mug Club! Do it in Mug Club!
01:10:47.000 All right, okay.
01:10:48.000 Do it in Mug Club!
01:10:49.000 We will do it in Mug- Okay.
01:10:50.000 I got it!
01:10:50.000 I got it!
01:10:51.000 LouderWithCrowder.com slash Mug Club.
01:10:53.000 Use the promo code SCOTUS.
01:10:54.000 We're going to show you the actual bloodbath clip.
01:10:56.000 And then of course we'll show you the media repeatedly using the term bloodbath,
01:11:00.000 which Nancy Pelosi probably hasn't seen in three decades.
01:11:02.000 And all of this, of course, is impermissible here on YouTube.
01:11:06.000 So if you're on Rumble, click that button.
01:11:07.000 Join Mug Club.
01:11:09.000 I couldn't find a more perfect example, and by that I mean I'm angry, I have work on myself to do.
01:11:16.000 YouTube, piss off.