Timcast IRL - Tim Pool - June 16, 2022


Timcast IRL - Elon Musk Twitter Meeting LEAKED By Veritas w/ Viva, Barnes & Rumble CEO


Episode Stats

Length

2 hours and 4 minutes

Words per Minute

202.9

Word Count

25,228

Sentence Count

1,950

Misogynist Sentences

20

Hate Speech Sentences

18


Summary

In this episode of Timcast IRL, we're joined by the CEO of Rumble, Chris Pavlovski, to talk about the company's new terms of use for free speech and why it's so important to have a free and open internet.


Transcript

00:00:53.000 Elon Musk had an all-hands meeting with Twitter where he answered a bunch of questions, and it's great.
00:01:09.000 He said he's gonna allow some pretty wacky stuff on Twitter.
00:01:13.000 Now, we know this for a couple reasons.
00:01:15.000 One, there were reporters talking about it, but Project Veritas has leaked the entire conversation.
00:01:20.000 It's really fascinating to see a whole bunch of woke Silicon Valley staffers looking at Elon Musk and having to ask these questions, and you know they're probably just fuming.
00:01:28.000 So, we'll talk about that.
00:01:29.000 I think that's significant, because it looks like Elon Musk is gonna be buying out Twitter, and that means a lot.
00:01:34.000 But we also have the CEO of Rumble joining us, because we're going to talk about Rumble's rules and the changes that are coming there, and a bunch of other really interesting stories around the big tech censorship stuff.
00:01:44.000 Gavin Newsom has joined Truth Social.
00:01:46.000 He wants to hang out with Donald Trump, I guess.
00:01:47.000 He says he's going to call out their lies.
00:01:49.000 So that's particularly fascinating.
00:01:51.000 And then, why it's so important to have free speech?
00:01:54.000 USA Today was caught fabricating sources and secretly purged 23 stories.
00:01:59.000 So here's what we're going to do.
00:02:01.000 We've got a lot to talk about in politics, for sure.
00:02:03.000 Over in New Mexico, this one county is refusing to certify the election.
00:02:06.000 We'll see what happens there.
00:02:07.000 What does that mean moving forward for the midterms?
00:02:09.000 Polls about how Democrats want Trump indicted.
00:02:11.000 But we're going to talk about Big Tech, your right to free speech, why all this is so important.
00:02:15.000 Gavin Newsom certainly thinks free speech is important, I guess.
00:02:18.000 Not really.
00:02:18.000 I don't think that.
00:02:19.000 And we'll be talking about all of that stuff.
00:02:21.000 Joining us, we've got a bunch of people.
00:02:23.000 We've got Viva Frei.
00:02:25.000 How's it going, Tim?
00:02:26.000 Who are you?
00:02:27.000 Where do I look?
00:02:27.000 At this camera right there?
00:02:28.000 Yes, that's you.
00:02:28.000 Viva Frei, Montreal litigator, content creator.
00:02:32.000 Robert and I have an awesome locals community, vivabarneslaw.locals.com.
00:02:37.000 Been working with Rumble to actually tinker with some terms of use for free speech that is going to be clear, transparent, and actually change the way people look at what it means to actually have a platform that respects free speech.
00:02:51.000 Right on.
00:02:52.000 We got Robert Barnes.
00:02:53.000 Absolutely.
00:02:54.000 Glad to be here.
00:02:55.000 Here to discuss the way that the new Rumble rules will not only revolutionize the way free speech can work on the Internet, but be an open source that people can utilize, create a participatory process so both content creators and consumers can be involved in the process.
00:03:08.000 It's a way that big tech can move forward in a way that promotes and protects the original goal of a free and open Internet.
00:03:15.000 Right on.
00:03:15.000 And of course, to really help us understand a lot of this is the CEO of Rumble himself, Chris Pavlovski.
00:03:20.000 Thanks for having me on.
00:03:21.000 Looking forward to being on here tonight.
00:03:24.000 This whole change that we've done is actually inspired from the community here at TimCast.
00:03:30.000 If I wasn't here six months ago, I don't know if this would have happened, but seeing all the feedback and what the community is looking for, I think we're doing the right thing here and I'm excited to be here to propose what we want to do.
00:03:44.000 Cool.
00:03:45.000 We got Luke.
00:03:46.000 Super excited about today's conversation.
00:03:47.000 My name is Luke Rudkowski of wearechange.org.
00:03:50.000 Today I'm wearing one of my tamer t-shirts that I think the Canucks should definitely understand well here.
00:03:56.000 And it says, you cannot comply your way out of tyranny.
00:03:59.000 If you like the shirt and you want to get it and support me, you can on thebestpoliticalshirts.com.
00:04:04.000 Because you do, I'm here.
00:04:05.000 It's going to be a great conversation.
00:04:07.000 No machetes this time, Chris, I promise.
00:04:10.000 And thank you so much for having me.
00:04:12.000 And I'm also here in the corner, not expecting to talk a lot tonight, but very excited to hear what everyone has to say.
00:04:17.000 Before we get started, my friends, head over to eatrightandfeelwell.com and pick up your Keto Elevate from BioTrust C8 MCT oil powder.
00:04:28.000 That means medium chain triglycerides.
00:04:30.000 It is no secret, my friends, if you go back to, I think, like September, October and watch these shows, I was much fatter.
00:04:35.000 I've actually lost almost 30 pounds.
00:04:37.000 Isn't that crazy?
00:04:38.000 I've been doing Keto.
00:04:39.000 Heck yeah!
00:04:39.000 I started doing Keto and then I started putting in a little bit more carbs, so now I'm mostly just very low carb.
00:04:45.000 I'm basically doing Keto.
00:04:46.000 This stuff's fantastic.
00:04:47.000 I love putting it in my coffee, put some heavy cream, put some BioTrust in there.
00:04:51.000 EatRightAndFeelWell.com.
00:04:53.000 You can get a 60-day money-back guarantee.
00:04:56.000 Keto Elevate provides your body only C8, the most ketogenic MCT.
00:05:00.000 That means it provides support for energy levels, healthy appetite management, mental clarity and focus, and athletic performance. Keto Elevate is personally my favorite MCT.
00:05:08.000 And yes, I actually have tried some other ones and I wasn't a big fan.
00:05:11.000 It's probably not a coincidence that we like them so much.
00:05:14.000 You'll get free shipping on every order.
00:05:16.000 And for every order today, BioTrust donates a nutritious meal to a hungry child in your honor through their partnership with NoKidHungry.org.
00:05:23.000 To date, BioTrust has provided over 5 million meals to hungry kids.
00:05:26.000 Please help BioTrust hit their goals of 6 million meals this year.
00:05:30.000 You'll get free VIP live health and fitness coaching from BioTrust's team of expert nutrition and health coaches for life with every order.
00:05:37.000 And their free e-report, the top 14 ketogenic foods with every order.
00:05:40.000 I'll tell you this, I cut out sugars, started eating a bunch of this stuff.
00:05:44.000 I lost a lot of weight.
00:05:45.000 It mostly was just cutting out the sugar.
00:05:47.000 But if you buy at eatrightandfeelwell.com, they're going to give you the adequate advice and information you need.
00:05:52.000 So check them out.
00:05:53.000 We're big fans and we really do appreciate the sponsorship, BioTrust.
00:05:56.000 But don't forget to also head over to timcast.com and become a member!
00:06:00.000 Support our work directly, and you'll be supporting our journalists.
00:06:04.000 I also just realized this, too.
00:06:05.000 I never say my name on this show.
00:06:07.000 Not once.
00:06:07.000 Really?
00:06:08.000 Yeah, whenever everyone does introductions, Luke's like, I'm Luke Rudkowski, and Ian's like, I'm Ian Crossland.
00:06:12.000 I never say my name.
00:06:14.000 So I'm like, you don't need to know my name, I guess.
00:06:17.000 You know, if you watch the show, people are like, what's this guy's name?
00:06:19.000 They just know my name is Tim, I suppose.
00:06:21.000 So go to timcast.com, support our work.
00:06:23.000 Not only will you get access to exclusive segments from this show Monday through Thursday at 11 p.m., which we'll have up for you tonight.
00:06:28.000 You will also be supporting our journalists who work hard to get you true and fact-checked information every day.
00:06:34.000 You're also supporting our infrastructure because we use Rumble for our cloud infrastructure and our video hosting.
00:06:40.000 Why?
00:06:40.000 Because we want to help build a space outside of big tech's grip on everything.
00:06:47.000 Be more resilient to censorship.
00:06:48.000 And so we have a lot of other stuff we're working on in the background I often mention, but support companies that are trying their best to help build something differently, build something that's more resilient to the censorship, which is why we have this conversation set up as we do today.
00:07:04.000 But let's get started with this news about Elon Musk.
00:07:06.000 And don't forget, smash that like button, subscribe to this channel, share the show with your friends.
00:07:10.000 Here's the big news from Project Veritas.
00:07:13.000 Exclusive Twitter all-hands meeting from June 16th.
00:07:16.000 The amazing thing is, this meeting just happened.
00:07:19.000 And then Project Veritas is like, we've got the meeting, literally the moment it ended.
00:07:23.000 Leaked video of Elon Musk's address to Twitter employees about essential nature of free speech, voting Republican, and evolving Twitter.
00:07:30.000 Quote, I think it's essential to have free speech, Musk said on the call after describing his affinity for Twitter.
00:07:36.000 He added multiple opinions should exist on Twitter to make sure that we are not sort of driving a narrative.
00:07:41.000 On the call, Musk was asked about his political leanings, his plans for layoffs, and the direction of the company.
00:07:47.000 He described himself as moderate, noting that he traditionally has voted Democrat but voted Republican this week for the first time in his life.
00:07:53.000 He also discussed his vision for Twitter, saying that traditional news media is negative, and they almost never get it right.
00:07:59.000 He added that bots, spam, and multi-account users must be contained.
00:08:03.000 I think an important goal for Twitter would be to try to include as much of the country, as much of the world, as possible.
00:08:09.000 He notes that he's not hung up on titles.
00:08:11.000 He reacted to the news of Project Veritas publishing the recorded meeting on Twitter by posting... and in fact, he was responding to Lydia.
00:08:17.000 That's right, he was.
00:08:18.000 I was so excited.
00:08:19.000 The guy from Project Veritas actually called me.
00:08:21.000 I was in Target picking out a birthday card.
00:08:23.000 He's like, what are you doing?
00:08:24.000 I was like, I'm picking out a birthday card.
00:08:25.000 He's like, oh, so you didn't see that Elon Musk tweeted at you?
00:08:28.000 And I was like, that's so cool!
00:08:29.000 I was freaking out in Target.
00:08:30.000 It was really cool.
00:08:31.000 So I can only imagine the woke Twitter employees who were lamenting Libs of TikTok must be particularly irked having to sit there and listen to Elon Musk be like, I think we should have free speech and put wacky stuff on this platform.
00:08:43.000 Did Elon Musk not know that it was being recorded?
00:08:46.000 Because if there's one meeting and given the content of what was leaked, you would say certain things knowing that it might be leaked afterwards.
00:08:52.000 Although this like is a great thing to always just take for granted.
00:08:55.000 Everyone's recording everything you say at all times so you don't say anything dumb.
00:08:58.000 Yeah.
00:08:59.000 Shocking.
00:09:00.000 So shocking that you have Gavin Newsom now looking for the other free speech platform because they support free speech somewhere, just not where they don't want to hear it.
00:09:07.000 That's so weird, though.
00:09:07.000 It's like, yeah, so that's another story we'll get into.
00:09:10.000 But what's the logic behind opposing free speech or being, you know, California has got a lot of skeletons in its closet pertaining to big tech.
00:09:19.000 It's California.
00:09:20.000 And now it's like, I'm going to go on truth social.
00:09:24.000 I'm just wondering, you know, it's amazing that they leaked the meeting right after it's done, but those are some good talking points.
00:09:30.000 The odds are Elon Musk is the source.
00:09:35.000 I think a very big one, especially from the corporate media's response to this. According to their sources, and according to the Twitter Slack, the employees are very angry and they're very pissed off, and they don't know how to deal with this larger acquisition, which they have voiced, you know, that they were disappointed with. Elon Musk, just a few days ago, even talked about how Twitter is biased against half the country, how they're inactive against death threats against conservatives. He just voted for a conservative for the first time in his life. He also hinted that he's going to be voting for Ron DeSantis in 2024. So obviously.
00:10:09.000 What he said wasn't controversial, but for the people working at Twitter, for the people in San Francisco, for the Starbucks-drinking, flip-flop-wearing yuppies there, holy cow, their minds are probably going crazy and they're freaking out because this is the reckoning of a big social media platform that's going to change the game.
00:10:25.000 But this is funny.
00:10:26.000 The Twitter employees are outraged over this.
00:10:29.000 They've been complaining about Elon Musk's takeover.
00:10:31.000 We have the leaked chats where they're talking about banning Libs of TikTok.
00:10:35.000 And it's like they don't seem to realize they are the snowflakes in the avalanche of people like Elon Musk voting Republican.
00:10:42.000 It is the actions they have taken with censorship, with their hostility and intolerance, that's resulting in Elon Musk being like, I don't think you're fun, so I'm gonna vote for this other guy.
00:10:51.000 I'm gonna go read the Babylon Bee.
00:10:53.000 They must be thinking, if only we banned the Babylon Bee, Elon Musk would still vote Democrat.
00:10:58.000 What's encouraging about it is that clearly not everyone within Twitter is so much on board with the nonsense, and it's a vocal minority who purport to represent the majority.
00:11:07.000 And they're going to find out at one point sooner than later it's not going to be cool to do this, because people are going to want an actual platform that actually just allows people to talk, not in hurtful, hate, you know, violent ways, openly, to share ideas.
00:11:19.000 I gotta say, Truth Social's not bad.
00:11:22.000 So I ragged on it because when they first launched, I couldn't even get into it.
00:11:27.000 And I was just like... but what I mean by that is, you know, I don't think I'm deserving of anything special, but I was reached out to.
00:11:33.000 Someone reached out to me and they were like, hey, can we get you on?
00:11:35.000 I was like, sure.
00:11:36.000 And then I couldn't get on.
00:11:36.000 I was like, this is dumb.
00:11:37.000 I don't know what's going on.
00:11:38.000 Like it's just so disorganized.
00:11:40.000 And then someone had the @Timcast handle and it was like a parody account.
00:11:43.000 And I was like, I didn't care.
00:11:45.000 But then I heard it was like number one in the app store.
00:11:49.000 And so I checked, and the engagement is crazy.
00:11:51.000 And I was like, whoa!
00:11:53.000 People are on Truth Social, and they're having conversations.
00:11:56.000 Like, more so than Twitter.
00:11:58.000 So, I'm thinking to myself, if people use Twitter because the conversation's happening there, but now it's not, and they're just trying to destroy your life because you tweeted a joke, or retweeted a joke, like Dave Weigel at the Washington Post, yo, you might as well just be on Truth Social.
00:12:12.000 Or somewhere else.
00:12:13.000 I think Elon sees this, is why he's desperately trying to salvage this.
00:12:16.000 Well, Elon made public statements about this.
00:12:18.000 He said that it's the failures of Twitter that led Donald Trump to create Truth Social, that it's bringing people there, and he also makes the argument that censorship is radicalizing individuals, as of course it's pushing people off to further points of the internet.
00:12:32.000 It's not allowing a real honest discussion; it's making people's views be doubled down on instead of questioned.
00:12:41.000 And that conversation used to happen on Twitter.
00:12:43.000 It's not anymore.
00:12:45.000 And we're seeing again, just, the politicization, this kind of radicalization of people from both political parties that are going further and further away from each other.
00:12:55.000 And I think that's because of big tech social media.
00:12:58.000 I think Twitter, specifically.
00:12:59.000 I think Twitter created this rage cycle where... I knew this guy.
00:13:04.000 I'm not gonna say his name.
00:13:06.000 He's a reporter.
00:13:07.000 Normal guy.
00:13:07.000 Used to hang out.
00:13:09.000 One day he replied to Donald Trump.
00:13:11.000 Something like, oh shut up.
00:13:13.000 And he got like a hundred retweets.
00:13:15.000 All of a sudden he went from having a couple hundred followers to having a thousand followers.
00:13:19.000 Oh, the dude destroyed his life and career.
00:13:21.000 He became a Trump reply guy.
00:13:24.000 And then he ended up getting, you know, tens of thousands of followers.
00:13:26.000 And I said, yeah, but who's going to hire you now?
00:13:30.000 What are those followers good for?
00:13:31.000 But it felt good.
00:13:32.000 It was an addiction.
00:13:33.000 No, that's it.
00:13:34.000 That's what people did.
00:13:35.000 They drove themselves off a cliff.
00:13:37.000 No, I don't have a truth handle yet.
00:13:39.000 Maybe I'll have to look if the engagement in the discussion is there.
00:13:42.000 But actually, as far as Twitter goes, I enjoy the discussion and the engagement with people with whom I disagree ideologically.
00:13:49.000 That's the amazing thing about sharing ideas and fighting in the political sense.
00:13:55.000 Well, Gavin's there now, so.
00:13:58.000 I still enjoy picking on Gavin on Twitter, but he has yet to recognize me.
00:14:01.000 You know what I really like doing is I like responding to far leftists, but in agreement.
00:14:05.000 So it's like they'll tweet something that I agree with and I'll make sure I'm going to respond with an agreement or adding to their point.
00:14:11.000 Cause that's my point.
00:14:12.000 I'm like, I don't hate them.
00:14:14.000 You're like, I'm not just going to respond to someone for the sake of being like, you're wrong about that one.
00:14:17.000 I'm going to respond specifically only when I think they're right about something.
00:14:20.000 And then it's funny how their friends react and they're like, You know, no, Tim Pool should not be agreeing with us.
00:14:25.000 And I'm like, well, you know, I do, so do something about it, I guess.
00:14:29.000 It's the super double reverse cancellation.
00:14:31.000 If you agree with the people, then their friends have to cancel them.
00:14:33.000 Oh, yeah.
00:14:34.000 It's a circle of life.
00:14:35.000 And there's something I call Bugs Bunny-ing.
00:14:37.000 You know how Bugs Bunny did the duck season, rabbit season, back and forth?
00:14:41.000 You know that bit?
00:14:42.000 You know this one?
00:14:44.000 Alright, so you got Daffy Duck and you got Bugs Bunny.
00:14:47.000 And Daffy is saying it's rabbit season and pulling the sign off the tree.
00:14:50.000 And then Bugs says it's duck season and pulls the sign off the tree.
00:14:53.000 And Elmer Fudd is standing there like, who am I gonna shoot?
00:14:55.000 Because he's a hunter.
00:14:57.000 And then Bugs, at the last minute, flips it and he goes, it's rabbit season.
00:15:00.000 And then Daffy goes, no, no, no, it's duck season.
00:15:03.000 And then Bugs goes, if you say so.
00:15:05.000 And then Elmer Fudd shoots Daffy instead.
00:15:07.000 So when you agree with them, they have to disagree with you if they're tribalists.
00:15:11.000 So then all of a sudden you'll find these progressives on the side of the fascists or whatever.
00:15:16.000 Robert, do we know, is Elon going to get the company for a lesser price?
00:15:21.000 Have they resolved the bot issue yet?
00:15:23.000 That has not been resolved.
00:15:24.000 And so we'll see.
00:15:26.000 I mean, the question is, what happens to truth if Elon Musk takes over Twitter and really returns it to its free speech roots?
00:15:33.000 What do you think, Chris?
00:15:34.000 I think that you have a completely different audience on Truth than you do on Twitter.
00:15:38.000 I think these are completely different, two different audiences.
00:15:41.000 And I don't think there's much overlap in it.
00:15:44.000 So if Elon ever does get a hold of Twitter, I think that audience is already gone, and they were never there.
00:15:52.000 They're never coming back.
00:15:54.000 I agree.
00:15:55.000 I think it might be too late.
00:15:56.000 I think the segmentation of these platforms is upon us.
00:15:59.000 Yeah.
00:15:59.000 I mean, Donald Trump even said if he gets invited back on Twitter, he's not going to come back because he has Truth Social.
00:16:05.000 But there also have been accusations against Truth Social with censorship against people posting January 6th information.
00:16:13.000 That's some of the alleged information coming out within the last couple of days.
00:16:16.000 And there is something, Barnes, as you brought up, to the larger point: if Elon does take over Twitter and allow free speech on the platform, people are saying it's predominantly going to affect Donald Trump and Truth Social the most out of all the other platforms.
00:16:31.000 Well, I think that's where what Rumble is doing is the right way to go, which is to create clear, transparent, open rules.
00:16:36.000 Honestly, Getter kind of didn't, well I won't be critical of anyone else, just say no one else has created the rules that Rumble is going about creating.
00:16:43.000 Not only creating those rules, but also creating a participatory process whereby consumers and creators can be part of that process.
00:16:50.000 Create rules that can be open source, that can be mimicked and mirrored and copied by Getter, by Truth, by Gab, by anyone else, by Twitter if Elon Musk purchased it.
00:17:01.000 And these rules are designed to reach a balance: not making it a troll-heavy platform, but at the same time making it as free for speech and expression as possible.
00:17:10.000 So that heresy and dissident speech is allowed.
00:17:13.000 And that's what people wanted in the original free and open internet.
00:17:16.000 And Rumble's been taking the lead at creating open, transparent rules to make that a reality.
00:17:23.000 Should we talk about the initiatives now?
00:17:25.000 I mean, I'm excited about what you guys got rolling because last time we had some very interesting conversations with a lot of other social media platforms here on this podcast.
00:17:35.000 We had a very interesting conversation with Chris six months ago.
00:17:38.000 So, you know, should we go there or should we wait for later to talk about these developments?
00:17:43.000 Let's do this.
00:17:44.000 Let me pull up something a bit more political and then we'll get into why this stuff is so important.
00:17:50.000 This is a story from the New York Times.
00:17:52.000 USA Today to remove 23 articles after investigation into fabricated sources.
00:17:58.000 The articles were removed after investigation identified stories and sources that appeared to be fabricated, USA Today said.
00:18:03.000 The internal investigation, which took place over a period of several weeks, began after USA Today received an inquiry related to the veracity of details in an article by Gabriela Miranda, who was a breaking news reporter at USA Today.
00:18:15.000 All right.
00:18:16.000 I'm going to pull up USA Today real quick.
00:18:20.000 So one of the leading fact-checking news organizations in the country has been spreading false news and fake news for a little while.
00:18:25.000 And here's the best part.
00:18:26.000 NewsGuard gives them a 100 out of 100.
00:18:28.000 Wow.
00:18:30.000 And they've actually been fabricating stories.
00:18:34.000 So my question to NewsGuard is, why did you certify an organization that was fabricating stories?
00:18:39.000 Outright, Tim.
00:18:41.000 They got it.
00:18:42.000 They took them down now.
00:18:43.000 So they fact-checked their own fact-checks and determined they were factually incorrect.
00:18:47.000 So they should get 101.
00:18:47.000 There was no fact-checking.
00:18:51.000 There was just gulagging.
00:18:52.000 23 articles were deleted.
00:18:53.000 Here's my point.
00:18:57.000 USA Today publishes fake articles, NewsGuard looks at them and says, those are real.
00:19:02.000 Why?
00:19:02.000 Because USA Today said so.
00:19:04.000 USA Today is a fact-checker.
00:19:05.000 They fact-check a bunch of stuff that we say; it's bogus fact-checking, but you get called a conspiracy theorist for calling out the bogus fact-checkers.
00:19:12.000 Okay, I get fact-checked for the most ridiculous, stupidest things, especially when it comes to memes by these institutions.
00:19:18.000 And again, another thing to kind of understand here, go on their website.
00:19:22.000 It should be front-page news on their website right now.
00:19:25.000 Hey guys, we lied to you.
00:19:26.000 Hey guys, we fabricated stories.
00:19:27.000 Hey guys, if this was me... It is!
00:19:29.000 It is the top headline.
00:19:33.000 There it is.
00:19:35.000 Thank goodness.
00:19:37.000 But it should have been like, so sorry, we goofed up, we messed up, we lied to you, and this is full transparency and accountability of what actually happened.
00:19:45.000 But hold on, if you back it up just one second, I still saw January 6th first and foremost.
00:19:49.000 I saw that too.
00:19:50.000 That's not front page.
00:19:51.000 I mean, they should stop running news, they should stop putting out articles, they should apologize, get on their knees to the general public and say, I'm so sorry, I lied to you.
00:20:00.000 But they don't.
00:20:00.000 That's what anyone with any kind of reputation should be doing right now because this is so embarrassing and this is not something that's uncommon.
00:20:07.000 This happens a lot.
00:20:08.000 There's a lot of reporters doing this.
00:20:10.000 Brian Williams, another person who did this very publicly.
00:20:14.000 This is why free speech is so important.
00:20:17.000 It's obvious.
00:20:17.000 I mean, obviously, free speech is important for a million and one reasons that we talk about in terms of your right to go outside and speak your mind or your right to practice the religion you want.
00:20:29.000 I'm speaking more broadly than just the First Amendment.
00:20:30.000 I mean, quite literally, your right to say these things about what you believe and your politics.
00:20:35.000 But it's also your ability to share the news on these social media platforms.
00:20:40.000 And when the news broke of censorship against conservatives, it was May of 2016, and it was Gizmodo, I believe, who said that Facebook moderators for the trending tab were deleting conservative sources from their trending news section because they believed conservative sources were fake news.
00:20:57.000 That was it.
00:20:59.000 Now we have NewsGuard, and one of their big clients, I believe, is Microsoft.
00:21:04.000 Bill Gates.
00:21:05.000 And they like to certify these news outlets.
00:21:08.000 Now, I like it.
00:21:10.000 I like the idea.
00:21:11.000 And so I make sure all of our sources are always going to have that nice green checkmark.
00:21:15.000 But I will not hesitate to point out that their system is completely broken and makes no sense.
00:21:20.000 Particularly because how does USA Today have a 100 out of 100 if they are publishing fake news?
00:21:26.000 How do fact-checkers know whether or not a story from USA Today is real or fake?
00:21:33.000 If the New York Times, CNN, or USA Today come out with a story and say, John Smith told us he sold a boat for $100, they would say it's true because USA Today said so.
00:21:45.000 See, that's not fact-checking.
00:21:47.000 They just say... You know, NewsGuard, for instance, totally biased in that we just assumed USA Today was right.
00:21:54.000 No, they should be downranked completely.
00:21:56.000 And they should be forced to jump through hoops, legal audits, if they want to get their green checkmark back.
00:22:00.000 They should be fired!
00:22:00.000 They shouldn't have a job!
00:22:02.000 Well, the reporter, I think, is getting fired.
00:22:04.000 The whole institution, for allowing it, in my opinion.
00:22:08.000 The problem is, there shouldn't be these gatekeepers in the first place.
00:22:11.000 Platforms shouldn't be gatekeepers, they shouldn't be labelers, and that's where Rumble is taking the right direction.
00:22:15.000 Think about how irresponsible that is.
00:22:18.000 Imagine putting a CEO of a tech company to assign other companies or other entities or other people to say what's right or what's wrong.
00:22:27.000 What's true or what's fiction?
00:22:29.000 Who am I to do that?
00:22:31.000 That's ridiculous.
00:22:32.000 It's the circular nature of it.
00:22:35.000 Bill Gates having any hand in this whatsoever is already super sus, as the children say.
00:22:39.000 But they use these to discredit the other independent voices to then raise their own voices to prominence.
00:22:46.000 So they can remove- who's the journalist that we talked about?
00:22:47.000 Well, it's like a version of Russiagate, where you use your own laundered information over and over again as sources and citations.
00:22:53.000 So you have a bogus source come in, and then they leak to the press that the FBI is investigating it, and then they use the press to get a FISA warrant, and then they use that fact to justify it to Congress.
00:23:03.000 There's a list that came out today of prominent people who have sowed doubt about the election.
00:23:10.000 Now, what does that mean, to sow doubt about the election?
00:23:12.000 It means nothing.
00:23:13.000 Literally nothing.
00:23:14.000 You could say something like, wow, look at this story about the election.
00:23:17.000 That's dumb.
00:23:18.000 And they'll say, oh, but by sharing it, you sow doubt.
00:23:21.000 So I made the list.
00:23:22.000 Yeah, I'm very excited.
00:23:24.000 But here's the reason they do it.
00:23:25.000 There are organizations that fundraise off of this.
00:23:27.000 There are news outlets that publish the lies.
00:23:30.000 They need some kind of legal justification so that I can't sue them.
00:23:33.000 So someone will make an opinion statement.
00:23:35.000 Tim Poole sowed doubt about the election.
00:23:37.000 How did he do it?
00:23:37.000 He reported on a story in New Jersey about an election that had to be redone due to them finding a bundle of ballots in a mailbox.
00:23:45.000 True story.
00:23:46.000 Oh, but that means Tim Pool is sowing doubt on the election?
00:23:49.000 Mmm.
00:23:49.000 So we're gonna put you on that list.
00:23:51.000 Now, news outlets can all report the more extreme version.
00:23:53.000 They can launder that information and say, Tim Pool published or pushed lies.
00:23:59.000 Then, when I come back and say, that's not true, they'll say, I was just referencing this study.
00:24:04.000 And that was my interpretation of it.
00:24:06.000 Tim, good luck suing.
00:24:07.000 It's happening on a smaller scale to me in Canada, but CTV's W5 runs a story, a hit piece on Rumble, referring to it as the darling of the right-wingers, where you go to post COVID misinformation.
00:24:19.000 They then demonize me on that show to make me look like I'm responsible for one cherry-picked, let's call it a bad comment.
00:24:26.000 The story goes up, gets published.
00:24:29.000 Then I noticed some people editing my Wikipedia page to say, it's time we get real with Viva's right-wing extremism.
00:24:36.000 W5 showcased him.
00:24:37.000 And then I noticed another article, from the Simon Fraser Institute, talking about disinformation on the internet and what Google doesn't autocomplete: when you look up Alex Jones, it says author and not conspiracy theorist.
00:24:48.000 And they think that's a problem.
00:24:50.000 And I was in the appendix: when you look up Viva Frei, it doesn't say right-wing COVID conspiracy theorist.
00:24:55.000 It just says nothing.
00:24:56.000 It's the wrap-up smear on a fake news level.
00:24:59.000 I want to add to that real quick, so I'll put a pin in that, Luke.
00:25:02.000 When Google Glass came out, it was very prominent because you could say, you know, okay, Google, tell me this, and it would talk to you.
00:25:08.000 You'd say, what is the capital of New York?
00:25:11.000 The capital of New York is Albany, and then it would give you some facts.
00:25:15.000 So we get this thing, and so, you know, I'm hanging out with Luke, and I go, Okay Google, who is Tim Pool?
00:25:21.000 And it goes, Tim Pool is an award-winning journalist from Chicago, Illinois.
00:25:25.000 And then I go, okay Google, who is Luke Rudkowski?
00:25:28.000 And it literally goes, conspiracy theorists.
00:25:33.000 I was going to make that point.
00:25:37.000 I think everyone in this room has been slandered, has been attacked, had the media just make up stories about them.
00:25:44.000 I mean, I had my name run through the mud.
00:25:47.000 They were just making stuff up out of just thin air.
00:25:50.000 But let's think about this story from USA Today.
00:25:53.000 What would happen if one of us did what USA Today did?
00:25:57.000 What would happen if an independent media organization ran 23 fake stories that they just totally made up?
00:26:03.000 We would be hearing about this for years to come.
00:26:09.000 Can I pause you real quick?
00:26:10.000 Do you really think it was just 23 stories?
00:26:13.000 This is what we know from their own independent audit.
00:26:16.000 This is like the police investigating the police, which is questionable.
00:26:20.000 There's a reason Bill Gates puts hundreds of millions of dollars into media companies.
00:26:24.000 There's a reason that social media algorithms promote these trusted news sources that lie about almost every single thing.
00:26:33.000 And I think even Elon Musk made this point today, saying that almost everything the corporate media tries to sell is a lie.
00:26:41.000 Not all of it, but most of it.
00:26:43.000 And I think it's fair to say that, especially with the way that our society has been abused and used by the special interests, that corporate media is just PR for the ruling establishment and it's nothing else.
00:26:54.000 And this is a perfect example of how they play by a whole different set of rules when we get criticized, we get slandered, we get lied about, and we get attacked and vilified for trying to even speak up against this bullcrap.
00:27:06.000 Politico is a really interesting organization.
00:27:08.000 I don't understand how their NewsGuard is certified, which makes me question NewsGuard itself.
00:27:15.000 Politico has, we've showed these stories before, they have numerous stories contradicting their own reporting.
00:27:21.000 One story is from January of 2017 that says something like, you know, Ukraine panics, assisting Democrats in the 2016 election has backfired or something to that effect.
00:27:32.000 The story was basically that Ukraine tried helping Clinton to stop Trump, and when Trump won, it was bad news for them.
00:27:36.000 They then wrote, a few years later, that it was actually Russian disinformation that Ukraine helped the Democrats, and both stories are still live on Politico.
00:27:45.000 How could you say, look, if that was the only thing?
00:27:49.000 Let's say out of the other 500,000 stories or whatever on that website, they're all true and correct, but those two exist.
00:27:54.000 You'd immediately have to say, as a rating agency, you are being stripped of being a credible agency, because you have two articles where each one says the other story is false.
00:28:04.000 I'm gonna ask a stupid question.
00:28:05.000 Is New York Times News Guard certified?
00:28:07.000 Oh, 100%.
00:28:07.000 Okay.
00:28:09.000 There's still an article up there that says he dreamed of being a Capitol Police officer, then a group of pro-Trump mobsters killed him.
00:28:16.000 Brian Sicknick.
00:28:17.000 That article is still up.
00:28:18.000 Wow.
00:28:19.000 And one of the books that Robert has recommended on our locals community, The Grey Lady Winked.
00:28:25.000 You realize it's not a one-off.
00:28:27.000 It's institutionalized.
00:28:28.000 It's been this way for the last 70 years.
00:28:30.000 They got it wrong on the Holodomor.
00:28:36.000 They got it wrong on World War II.
00:28:37.000 They got it wrong on Hiroshima.
00:28:38.000 They got it wrong on the Palestinian boy that was killed in the first Intifada.
00:28:44.000 It's a history.
00:28:45.000 It's a pattern.
00:28:45.000 It's not about consistency.
00:28:47.000 It's about hierarchy.
00:28:48.000 It's not a mistake.
00:28:48.000 It's just the way.
00:28:50.000 That's it.
00:28:51.000 The only thing is, we're now able to fact-check it and call it out in real time.
00:28:54.000 And then if we make one mistake, we get... Not just blacklisted.
00:28:58.000 You'll get de-platformed.
00:28:59.000 If anyone had done one of these stories, they would be de-platformed.
00:29:02.000 Oh, you'd be banned from Google News instantly.
00:29:04.000 They would never show your website again.
00:29:06.000 For the longest time, this show didn't appear on Google Search.
00:29:09.000 It's the craziest thing.
00:29:10.000 It's like, we're on YouTube!
00:29:12.000 Then, one day, it was funny, I called them out on the show, and then people were like, yo, you're back on Google!
00:29:17.000 Like, someone watching at Google was like, let's put him back in there.
00:29:20.000 Loaded us back up.
00:29:22.000 But it's remarkable that Google is the way people find information.
00:29:26.000 Over 90%.
00:29:27.000 Yeah, you open the browser bar and you type in a word.
00:29:30.000 I don't even type in web addresses anymore.
00:29:32.000 I'll just type in, you know, like, what did I just do?
00:29:34.000 I typed in USA Today and press enter and then it went to Google.
00:29:37.000 If Google removed USA Today for being fake news, it wouldn't come up.
00:29:40.000 There's a very good suit against the Biden administration, brought by the states of Arizona and Louisiana, that goes through it and, in their motion for preliminary injunction, details the degree of government collusion that's been taking place.
00:29:49.000 Even under Trump's administration, he didn't know that his own Department of Homeland Security, his own cybersecurity folks, were already manipulating information, including suppressing information hostile to Biden and favorable to Trump.
00:30:02.000 And they've only escalated that.
00:30:04.000 That's why Bobby Kennedy's suit against Facebook is so important before the Ninth Circuit Court of Appeals.
00:30:08.000 And that's why they fear Big Tech. There's been government-created, government-curated censorship for the purposes of controlling the information and the narrative, and that's why they fear Big Tech independents; any kind of challenge to the Big Tech monopoly is a threat to their gatekeeping control over the institutional narrative, and they are...
00:30:25.000 You characterized them absolutely correctly.
00:30:27.000 And for those people out there that think monopoly means 100%, it doesn't.
00:30:31.000 It means 75% or more historically in American law.
00:30:33.000 And this is over 80% Google, as you're mentioning, over 90% of all searches controlled by Google.
00:30:37.000 Almost all news information that's sought or that is read or reviewed or heard is dominated by Google.
00:30:43.000 But Robert, what's the response to the argument people are going to make, that it's a merit-based monopoly, that they've gotten it because of a superior product?
00:30:49.000 None of that's true.
00:30:50.000 The antitrust litigation details that.
00:30:51.000 I mean, there is some degree to which this technology naturally inclines itself to monopoly in the way Peter Thiel talks about, but by no means did they actually obtain this monopoly that way.
00:31:00.000 Twitter obtained it because they said, we're going to be the free speech wing of the free speech party.
00:31:03.000 They just lied.
00:31:04.000 That's what they did.
00:31:05.000 That's what Google did.
00:31:06.000 That's what YouTube did; YouTube lied.
00:31:07.000 YouTube said, hey, we welcome all content creators.
00:31:09.000 We're never going to censor anybody.
00:31:10.000 And they became one of the biggest censors in the entire globe.
00:31:13.000 And they keep changing their terms and services as you go along.
00:31:16.000 So you agree to one, you invest all of your time, your energy, your blood, sweat, and tears into a business.
00:31:22.000 And then they say, you know what?
00:31:23.000 We're just going to blacklist you because you challenged the narrative.
00:31:26.000 You questioned the agenda.
00:31:27.000 So we're going to demonetize you, downrank you, and make sure that you can't work on the internet at all.
00:31:32.000 Which is the power that they have, which is absolutely insane and way too much power for one organization to have.
00:31:37.000 Here's where I think it's the most important thing is.
00:31:39.000 I think for all of us, we've got platforms or literally run one.
00:31:43.000 And so the problems we face are particularly unique, don't exist in the general population.
00:31:49.000 But for the average person who does choose to get on Twitter and say, I would like to speak my mind and challenge this, they're the ones who get banned first.
00:31:55.000 When Learn to Code was happening, when they were banning people for saying hashtag Learn to Code, the majority of the people who were getting banned were small accounts, people who were just, you know, posting it.
00:32:06.000 The big channels were less likely to get banned.
00:32:08.000 This is on purpose.
00:32:09.000 They don't want to create a splash.
00:32:11.000 But there are so many people I've met and spoken with who say, I can't, I went on, I was on Twitter for two days and I posted a news story and they banned me.
00:32:18.000 I hear it all the time.
00:32:19.000 Now think about what that means.
00:32:21.000 Twitter will say, if they're on the left, give them some leeway.
00:32:25.000 If they're on the right, don't give them the time of day.
00:32:27.000 So when right-wing people come on, or libertarian moderate right, whatever you want to call it, come on the platform and say, here's the real story, with a link to a source debunking something, they're banned instantly.
00:32:38.000 If you ban 60% of them, but only 40% of the left, if any at all, you create this lopsided system where the majority of information coming out will be narrative-controlled fake news, and the people who know better are unable to counter it.
00:32:52.000 That's why we need Elon Musk.
00:32:53.000 You're right over the target with this one.
00:32:55.000 It's the long tail that got banned, that no one's talking about.
00:32:58.000 It's the people at the top.
00:33:00.000 They were tougher to ban, but the long tail really got banned.
00:33:02.000 It shifted the whole system, tilted it everywhere.
00:33:05.000 This is why I think platforms like Truth have a completely different audience.
00:33:11.000 You go ahead, sorry.
00:33:12.000 No, that's what I'm trying to say.
00:33:13.000 It skews perception and reality, especially if you're able to get rid of that tail, as you perfectly described here, because there has been a full frontal assault on independent thought, independent media, and critical thinking.
00:33:24.000 If you dare to even just go against the establishment and what they want you to believe at that current time, at that current moment, even though it flip-flops by the interest of whoever's involved in it, you are done.
00:33:37.000 You are not going to have a way to succeed or live online, but now there's alternatives.
00:33:41.000 Now there's Truth, there's Rumble, there's Getter, there's Hive, there's Steemit, there's so many other different alternatives, different platforms out there.
00:33:50.000 Steemit?
00:33:51.000 That's an old one.
00:33:52.000 I'm just going off the train of thinking that's an old one that I remembered.
00:33:55.000 There's Hive, there's so many different ones out there.
00:33:57.000 It's also meant to tell you as an independent person that your view doesn't count, that your view is wrong.
00:34:02.000 So that little person, that ordinary person, that everyday person, who doesn't have lots of followers, gets on and has dissident information about COVID, dissident information about the election fortification that took place in 2020, dissident information about Ukraine.
00:34:14.000 They're told your view is not only disapproved, not only unsanctioned, but it's wrong and nobody really agrees with you.
00:34:19.000 And that's why you're being censored.
00:34:20.000 That's why you're being sanctioned.
00:34:21.000 That's why you're being disapproved of.
00:34:22.000 And that's why it's critical that there be tech challenges to this, whether it's Locals, whether it's Rumble, whether it's Truth.
00:34:28.000 And I'll say we're going to get into this with the Rumble terms of use discussion.
00:34:33.000 But the Learn to Code, the rationale at the time, if everyone remembers it, it was deemed something of a call to violence.
00:34:39.000 It was deemed to be harassment.
00:34:41.000 Learn to Code, for the journalists... losing our jobs and they said learn to code, and when the journalists started losing their jobs, they said learn to code. Oh, when we said it to you it was loving, you know, a needle; when you say it to us, it's a call to violence. It's weaponizing and bastardizing the terms for political purposes.
00:34:58.000 Let's talk about Gavin Newsom real quick. Oh yeah.
00:35:01.000 We have this from The Hill: Newsom joins Trump's Truth Social to call out Republican lies.
00:35:07.000 This is actually quite amazing. He says, quote, I just joined Trump's Truth Social, going to be on there calling out Republican lies. This could get interesting. My first post: breaking down America's red state murder problem, he said, adding a link to his Truth Social post.
00:35:20.000 Yeah, I know, like urban centers are all in, you know, blue cities. But here's the funny thing.
00:35:27.000 Twitter bans their way to irrelevance, and now a prominent Democrat's like, I better go over here to engage in this conversation.
00:35:36.000 This is his second cell phone in as many months, I think.
00:35:39.000 He tried to poke fun at DeSantis.
00:35:42.000 But why is it a cell phone?
00:35:43.000 Because he's basically admitting that he doesn't support free speech on the one hand on the Twitter platform, but does support it and wants to flock to it on another platform.
00:35:51.000 This is the same guy who tweeted out a mocking photo of DeSantis.
00:35:55.000 He was reading books that DeSantis was banning, not realizing that the same dudes in California are banning books.
00:36:01.000 He doesn't understand a self-own when it happens, but thank goodness that he gives it to the public.
00:36:06.000 Could it be that the Democrats, the leftists of big tech who have banned these conservatives
00:36:12.000 have created a boring platform and now Democrats are going to want to go to Truth Social?
00:36:17.000 What if it turns out that Twitter ends up dying, Elon buys it, and then everyone's like,
00:36:21.000 well, but Twitter's so last election.
00:36:23.000 Truth Social's funny because Trump's on there and we all want to know about it.
00:36:26.000 These journalists are going to have to report on what Trump truths.
00:36:29.000 Is that what it's called?
00:36:30.000 He's truthing?
00:36:31.000 He's truthing.
00:36:32.000 So when Trump truths, that's an amazing thing to say, by the way, journalists have to have accounts to see what he's saying, forcing them to sign up and be on the platform.
00:36:43.000 Then what's the point of being on Twitter?
00:36:45.000 It's not going to be newsworthy anymore.
00:36:47.000 The people are going to be on truth and Trump's going to control it.
00:36:51.000 Now you're on his platform, baby.
00:36:53.000 Is that where we're going?
00:36:54.000 It's going to be interesting to see if Trump censors Gavin.
00:36:58.000 He's not going to censor me.
00:37:00.000 It would be stupid for him to do, but it wouldn't surprise me.
00:37:03.000 You know what might happen?
00:37:04.000 Gavin might get his butt handed to him on Truth, and he might actually say, holy crap, California is not doing well in a great many respects.
00:37:11.000 He might actually see the truth on Truth.
00:37:14.000 Wouldn't that be ironic?
00:37:15.000 This warrants some kind of sketch where Gavin's like, I'm joining Truth.
00:37:18.000 A week later, he's wearing a MAGA hat.
00:37:21.000 Because I've seen the light.
00:37:22.000 You hear the story about these content moderators and feds, that when they go on Facebook and they join these groups, they end up getting radicalized.
00:37:31.000 They call it radicalization in the media, but I'm like, perhaps the moderator is seeing real news stories that normally get removed, and they're not getting the filtered narrative anymore.
00:37:40.000 And so they're going like, whoa.
00:37:42.000 Well, that's the whole point of strategic empathy.
00:37:43.000 It used to be taught in the State Department and the military that you have strategic empathy for your enemy or adversary.
00:37:48.000 But the question is, why don't we teach it anymore in the U.S.
00:37:50.000 State Department?
00:37:51.000 Well, what happens if your strategic empathy leads you to be more empathetic to that perspective, and all of a sudden you can't hate, say, Russia or Putin or...
00:37:57.000 somebody else around the world, you can't despise them anymore because you've learned to understand them.
00:38:02.000 And so we've stopped teaching it, so that people don't do it.
00:38:05.000 And what big tech is trying to do is to not even let you have access to it,
00:38:09.000 because once you do, people end up opening their mind to different perspectives.
00:38:12.000 Especially over the last... I mean, basically, you look at every Alex Jones conspiracy over the last five years, in some form.
00:38:20.000 And so all of a sudden people re-perceive Alex Jones in a whole different light.
00:38:25.000 They can't afford that to occur.
00:38:27.000 That's why they need people to never listen or hear that information in the first place.
00:38:31.000 This censorship is about controlling the audience, not just the speaker.
00:38:34.000 And I think that's what people forget about the First Amendment.
00:38:36.000 The First Amendment's not only the right to speak, it's the right to listen, it's the right to hear.
00:38:41.000 Testify.
00:38:41.000 That's actually quite beautiful, Robert.
00:38:44.000 It's actually crazy to think that Silicon Valley executives could determine what you could listen to, what you could be able to think about.
00:38:51.000 And that level of power is godlike.
00:38:54.000 Let me ask you a question.
00:38:56.000 Have you ever watched a video that you thought was like the best video ever and you want to show your friends?
00:39:00.000 And then you're like, watch this, watch this video.
00:39:01.000 You play it.
00:39:02.000 And as you're sitting there, they're not reacting to it.
00:39:05.000 And you're like, they don't like it.
00:39:06.000 This is really awkward.
00:39:07.000 They're not laughing at it.
00:39:09.000 Zuckerberg or any one of these tech people could... it's not just about the negative.
00:39:12.000 It's not just about censorship.
00:39:13.000 They could be like, this message should be put out.
00:39:16.000 They could go in and force you to watch these things.
00:39:19.000 No, absolutely.
00:39:19.000 You know, the thing that kills me though, is I've been in this space for like 20 years and I remember these guys, they're all talking about the free and open internet.
00:39:27.000 We love like free speech matters.
00:39:30.000 And then all of a sudden in five years, what happened to these people?
00:39:33.000 Like, was this just bullshit for the last 20 years?
00:39:36.000 Like, how do they just flip like that?
00:39:38.000 Like, I sit in this chair now and I'm like, I can't fucking flip like that.
00:39:43.000 Can I swear?
00:39:44.000 I don't know what it is.
00:39:45.000 We try not to, but look, when the lizard people came down and implanted the brain slugs and took over their minds, I'm kidding by the way, but you and I just won't care.
00:39:54.000 That's gonna be in USA today.
00:39:55.000 Tomorrow it'll be going.
00:39:57.000 Well, I mean, I had this discussion with Twitter's lawyers in 2016, because I was suing on behalf of Charles Johnson, who has, you know, an eclectic history, against Twitter.
00:40:06.000 And Twitter at that point, at least Jack Dorsey, was serious about the very same rules that Rumble's talking about now, putting in those kind of rules.
00:40:12.000 Codify the existing law that will protect... you know, there's no First Amendment protection for stalking or defamation or doxxing and these things anyway, which they were claiming they were worried about.
00:40:22.000 The impression I got is the investors that invested.
00:40:25.000 I don't think Dorsey was fully gung-ho behind all the censorship that took place.
00:40:29.000 You're seeing that in his alignment with Elon Musk now.
00:40:31.000 But I think for the most part, they went along with Trump.
00:40:33.000 I mean, Trump winning wasn't supposed to happen.
00:40:36.000 You have the authority.
00:40:37.000 Like, I can't, there's no excuse.
00:40:40.000 And I look at them and I'm like...
00:40:42.000 They all capitulated to the pressure of investors.
00:40:44.000 George Soros publicly said he was going to go after Facebook.
00:40:48.000 I mean, this is a guy who helped sink the British pound.
00:40:49.000 Yeah, Saudi Arabia.
00:40:50.000 And also, you've got to understand, Elon Musk buying Twitter has started people like with the Clintons, the Obamas, and Bill Gates throwing secret money into shadowy funds, attacking Twitter, trying to get advertisers off of the platform.
00:41:06.000 Attacks, particularly when the takeover is going to be complete. So there's a lot of power, a lot of money behind the scenes that are influencing a lot of things that we don't know about. But again, as you said, I kind of agree with you more. It's on him, but he was facing a lot of pressure.
00:41:21.000 We have that every day.
00:41:22.000 Right now. Like, I'm not gonna capitulate to that. I'm more... I don't think they capitulated.
00:41:29.000 I think they genuinely think they are suppressing freedom of speech to guarantee freedom of speech.
00:41:33.000 I think they've actually convinced themselves they need to do this in order to create a platform that's welcoming for everybody.
00:41:39.000 If you have an open debate about trans sports activities, it's going to make people feel unsafe to talk, and they need to limit the freedom of speech in order to promote the freedom of speech.
00:41:49.000 It's Orwellian lunacy, but I believe they actually believe it.
00:41:53.000 Well, it's...
00:41:56.000 I think one way to put it is they're anti-meritocratic.
00:42:00.000 And the conversations that rise to the top, that dominate, or the information that does, they don't want that.
00:42:06.000 So if you say something like 2 plus 2 equals 4, it's not so much that they're threatened by it because they don't like it.
00:42:16.000 It's not the idea they want.
00:42:18.000 They don't want an individual to rise up through merit and hard work and good arguments.
00:42:23.000 They want to control those arguments.
00:42:24.000 They want to control those narratives.
00:42:26.000 So they need to eliminate that element of it.
00:42:28.000 Yeah, the problem is there's way too many control freaks.
00:42:30.000 There's way too many Bill Gates types.
00:42:32.000 Just, you know, control, control, control.
00:42:33.000 You know, you dig into Gates.
00:42:34.000 Let's go back to Politico.
00:42:36.000 2017, Politico's European edition says Bill Gates is getting way too much power and influence in the public health world.
00:42:43.000 These are public health whistleblowers.
00:42:45.000 Then they disown their own piece by 2020 when Alex Jones and others are saying Bill Gates's agenda is going to be reflected in a lot of public health agendas around the world.
00:42:53.000 And we actually saw it happen.
00:42:54.000 We saw Event 201 become a reality around the globe.
00:42:58.000 It's because ultimately there's too many control freaks in positions of power in big tech and stock market.
00:43:04.000 And now we have something called the Bill Chill, where a lot of scientists are afraid to even criticize anything associated with Bill Gates or his money or his investments, whether it's fake meat or medical procedures or anything that is tied into his money.
00:43:17.000 Bill Gates is known to have a reputation for punishing people, cutting funding, getting rid of money, because his money's in almost all of the medical community, and punishing people for daring to release data or information that goes against his monetary interest.
00:43:31.000 It's called the Bill Chill and it should terrify everyone, especially looking at our modern-day scientific community and how easily it could be manipulated by the millions of dollars that he puts into it.
00:43:42.000 We're living in an age of basically real-life Bond villains and James Bond is Alex Jones.
00:43:49.000 I'm trying to think of a good rhyme with Fauci, because the Bill Gates method is exactly the Fauci method.
00:43:54.000 When you control the purse strings of the funding, you'll get people to say what you want.
00:43:59.000 And then the doctors that speak out, or whomever else speaks out, they'll get the... Yeah, this is why they're pushing the fake meat.
00:44:05.000 The corporate media is saying, this is great.
00:44:07.000 This is awesome for the environment.
00:44:08.000 Well, that's looking like it's not the truth.
00:44:11.000 It's nutritious.
00:44:12.000 Well, there's also new data coming out showing it's not the truth.
00:44:15.000 And it doesn't taste like meat.
00:44:16.000 Exactly.
00:44:18.000 And again, it's about just trying to reprogram people to acquiesce, to go along with eating soy and bugs and all this other stuff that is not good for you.
00:44:27.000 And people need to really understand the influence that these people have because when the information comes out highlighting how these people were lying, manipulating, and marketing their products through science, when people call it out, we get censored, we get attacked, and then we get downranked in the algorithm, demonetized, and shut off from the internet.
00:44:44.000 They're trying to plug everybody into the Matrix, man.
00:44:46.000 They can't have you handing out red pills, Luke.
00:44:49.000 Yeah, pretty much.
00:44:51.000 It's been an uphill battle for a very long time.
00:44:53.000 I mean, you've been there with me.
00:44:55.000 You took a different route.
00:44:55.000 You were the first to get demonetized on YouTube.
00:44:57.000 Yeah, one of the first channels.
00:44:59.000 It wasn't even yellow.
00:45:01.000 They just took off the money sign and I was like, what is this?
00:45:03.000 Does anyone know what's happening here?
00:45:05.000 No one knew.
00:45:05.000 Yellow didn't exist.
00:45:06.000 So before YouTube had demonetization, right?
00:45:09.000 So, for those not familiar, you go on YouTube and in the studio, you upload a video and there will be a green dollar sign that means you've got ads.
00:45:15.000 If it turns yellow, it means you're limited.
00:45:17.000 If it turns red with a line through it, the ads have been removed or can be grayed out and say not eligible.
00:45:21.000 Before any of that, it was just a green dollar sign saying ads have been turned on.
00:45:25.000 I'm sitting talking to Luke and he's like, what happened?
00:45:27.000 He looks at his video and the dollar sign is gone.
00:45:30.000 And they asked to go back in the video and turn it on again.
00:45:31.000 And it comes back, because YouTube didn't actually have a plan for demonetization.
00:45:36.000 So, and then there was a loop, and he manually did it.
00:45:39.000 And there were options to not even turn the money on.
00:45:42.000 It was, and they got rid of that option.
00:45:43.000 I was like, wait, what's going on here?
00:45:45.000 I made public posts about this.
00:45:46.000 I made videos about this.
00:45:47.000 I was like, is this happening to anyone else?
00:45:49.000 No.
00:45:50.000 And this was before the major wave of them just attacking people's livelihoods and trying to also, this is another underhanded thing that they're doing because they're also incentivizing people to talk about particular issues or have stances on particular issues because they know it's going to give them money.
00:46:05.000 They know they're going to be able to monetize content if they say this.
00:46:07.000 They know if they counter it, they're going to lose money.
00:46:10.000 I just want to point out that from 10 years ago when they were trying to destroy your YouTube channel because you were going up to prominent individuals and questioning them and challenging them, Ben Bernanke several times, you know, when he was chair of the Federal Reserve.
00:46:22.000 Yeah, a whole bunch of them.
00:46:24.000 So 10 years later, your face is on a Times Square billboard.
00:46:27.000 I know.
00:46:29.000 So the way I see it is we keep pushing back, we keep challenging these things, and we're going to keep winning.
00:46:34.000 Well, the other thing is, you know by the subjects that they're targeting, whatever they tinker with in the algorithm, what the unwelcome discussion of the day is.
00:46:43.000 You know that it's not what they're telling you.
00:46:45.000 The January 6th committee hearings, I've been live streaming it in as much as it's been tedious and soul crushing.
00:46:52.000 The first one I put up got demonetized.
00:46:54.000 You know, as I'm five minutes into the stream, demonetized.
00:46:57.000 Fine.
00:46:57.000 I asked for re-monetization.
00:46:59.000 It gets manually approved for monetization.
00:47:02.000 Good.
00:47:03.000 I'm all happy.
00:47:04.000 I checked just for the fluke of it a day later.
00:47:06.000 It's been age-restricted and re-demonetized.
00:47:09.000 I was like, guys, you just approved it yesterday, but you know that this is... Were you debunking?
00:47:16.000 I was commentating, but it got manually approved.
00:47:19.000 It got re-monetized after manual approval.
00:47:22.000 And I was like, you can't a day later, after you said it was good, after manual review, say it's not good.
00:47:26.000 And you just know it's a subject they want to control the narrative on.
00:47:29.000 And that's their way of doing it.
00:47:30.000 Soft censorship.
00:47:31.000 They say, don't talk about it because you won't get paid for it.
00:47:34.000 Oh, by the way, you're not getting paid for it.
00:47:35.000 So we're not going to, we're not going to promote it because we're not making money on it.
00:47:38.000 And that's how they just control the narrative through the soft, indirect and hard censorship of demonetization.
00:47:44.000 Yep.
00:47:44.000 And it's what I was mentioning earlier about how, you know, they'll censor 60% of the right and only a small portion of the left so that it creates a lopsided narrative where the left gets to say more than the right does, hoping that it skews politics in that direction.
00:47:58.000 With demonetization, Left-wing channels get demonetized less than the right does.
00:48:03.000 Many people on the right have been banned and booted, sometimes without even a warning.
00:48:07.000 Restricting the access to funding does the same thing.
00:48:11.000 You make sure certain ideas can't survive, but I will tell you this: they are losing. They're losing, man.
00:48:16.000 That's why I mentioned, you know, Luke, they tried destroying his YouTube channel, and now he's on one of the biggest billboards in Times Square.
00:48:22.000 And that's why I wanted to do it, to make that statement, to make that point, because we can.
00:48:27.000 Because they're not going to win this one.
00:48:29.000 Free speech is going to win.
00:48:30.000 Well, when your gatekeeping is so desperate that you have to gatekeep to protect Amber Heard, you're just discrediting yourself on a constant, continuous basis.
00:48:41.000 Uh, though it won't be long before YouTube comes after LawTube.
00:48:44.000 Because they showed independent information, independently streaming trials: people like Nick Rekieta, people like Emily Baker, people like LegalBytes, that whole crowd. So okay, people can come to their own independent conclusions, they don't have to rely on the Washington Post interpretation of events, and it turns out the Washington Post put a liar and a defamer on their front opinion pages for a fake story.
00:49:03.000 And to continue to make Amber Heard the symbol of Me Too,
00:49:06.000 if you wanted to destroy Me Too, that's what you would try to do.
00:49:09.000 And yet they continue to do so.
00:49:11.000 And instead, they're gonna try to probably go after demonetization.
00:49:13.000 That's their theory.
00:49:14.000 Their theory is, oh, these people must have taken Johnny Depp's side not because the facts were
00:49:19.000 overwhelmingly on his side, but because they got superchats.
00:49:21.000 Is this the Taylor Lorenz story?
00:49:22.000 Yes.
00:49:22.000 I mean, infamous libeler.
00:49:24.000 Nobody libeled her.
00:49:25.000 She libeled Ariadna Jacob?
00:49:27.000 Influencers?
00:49:28.000 Oh, a bunch of ways.
00:49:29.000 And she's not political at all.
00:49:31.000 She's just somebody that was an economic competitor to certain key people in Hollywood who were aligned with Taylor Lorenz.
00:49:37.000 And just one thing, David Mamet had a great expression or a great saying, every fear hides a wish.
00:49:41.000 Robert, in that fear might be a wish.
00:49:43.000 Come after LawTube.
00:49:44.000 I don't think they will.
00:49:45.000 Because they might get sued if they try to mess up.
00:49:49.000 We're going after 50 lawyers all at once.
00:49:52.000 Let's see what happens.
00:49:54.000 And there's a community there.
00:49:56.000 Taylor Lorenz, Washington Post.
00:49:58.000 It was Washington Post, right?
00:49:59.000 Went after Alita, Legal Bytes, Emily D. Baker.
00:50:02.000 And their criticism?
00:50:03.000 They're making money.
00:50:05.000 Hey, you know what?
00:50:06.000 They're making money and they're making lots of it, probably more than you, because of their merit.
00:50:09.000 She called them radicalized for commenting on a pop culture civil court case.
00:50:15.000 That is insane.
00:50:16.000 Women.
00:50:16.000 There are two women who took the side of Johnny Depp and they're calling them misogynists.
00:50:21.000 Johnny Depp who threatened the president.
00:50:23.000 It's not like he's associated with the right.
00:50:25.000 I mean, this is a guy that's been part of the strong left.
00:50:28.000 Right.
00:50:29.000 The Amber Heard story was clearly fake from day one. I mean, it was clear the British
00:50:32.000 courts were intimidated by the court of public opinion.
00:50:35.000 That's why they issued that ruling. Anybody who followed that case knew that
00:50:38.000 ruling made no sense.
00:50:39.000 And what is the whole... I mean that was a Fairfax jury.
00:50:42.000 That was a liberal democratic jury.
00:50:43.000 Right down the street. Not that far away.
00:50:45.000 No, it was like the idea that this was had anything to do with politics but it shows
00:50:48.000 their gatekeeping obsession is that you can never dissent from our viewpoint and
00:50:52.000 if you do you have to be crushed.
00:50:54.000 I just want to say what little credibility Taylor Lorenz may have had went out the window when she accused YouTubers of being radicalized for commenting on pop culture.
00:51:02.000 Yeah.
00:51:04.000 I'm sorry, but commenting on pop culture is the most generic, normal American thing.
00:51:10.000 It's like TMZ.
00:51:12.000 It's gossip magazines.
00:51:14.000 People being like, Johnny Depp and Amber Heard, movie stars.
00:51:17.000 Let me give you my opinions.
00:51:18.000 And people being like, well, I think this.
00:51:19.000 That's radicalization.
00:51:22.000 I'm sorry, if you want to talk about white nationalists and stuff, talk about radicalization, I'm listening.
00:51:26.000 But when you claim that commenting on Johnny Depp was radicalizing, that you have been radicalized, I'm just like... a female practicing attorney commenting on a trial is radicalizing?
00:51:38.000 I mean, it's idiocy.
00:51:39.000 She lost credibility a long time ago, but again, USA Today, how are they still certified?
00:51:44.000 Taylor Lorenz, how is she still employed or getting contracts?
00:51:48.000 Can you imagine being one of these veteran reporters at the Washington Post?
00:51:51.000 You know, you're there for 20 years and you're just dreaming of that day that you can be like Woodward and Bernstein and you're going to get that big story.
00:51:59.000 And you know, over the past 10 years, past eight years, it's been really kind of, oh, you know, it's just getting weird.
00:52:04.000 And then they hire this, this high paid, you know, millennial who just writes garbage, conflict of interest news and nonsense.
00:52:12.000 And you're like, that's it.
00:52:13.000 That's the talent.
00:52:13.000 That's the big money.
00:52:15.000 How could you?
00:52:16.000 Oh, you've got to quit.
00:52:18.000 Well, I mean, or you end up subject, like the one reporter did, to being potentially doxed by his own fellow workers for making a joke.
00:52:24.000 That's right.
00:52:25.000 In the sense of the personal attacks that just escalated.
00:52:27.000 I mean, what we're seeing, I mean, you look at the White House.
00:52:29.000 I remember Cernovich when he went there, and he was like, look at all these.
00:52:31.000 These are all kids.
00:52:32.000 These are all 20-year-olds.
00:52:33.000 They have no clue about the real world.
00:52:35.000 Oh, yeah.
00:52:36.000 Yeah, we had someone on the show once who said that they weren't familiar with Joe Biden's administration because they were like a young teenager.
00:52:45.000 Right.
00:52:45.000 And I was like, oh, so you... that's why you voted for... I'm thirty-six, I remember.
00:52:50.000 I remember, it was like ten years ago, I remember what the Obama administration was doing in the Middle
00:52:53.000 East, so... I think, I'm trying to put my hand on that.
00:52:56.000 But you're twelve, so you have no idea what happened, but now you're old enough
00:53:01.000 to vote. So, you know, we're seeing some of the old left that's coming
00:53:04.000 back up. And that's, you know, with Jimmy Dore, Aaron Maté, Glenn Greenwald, that's actually resurfacing, and people like Bill Maher in the post-Trump era, now that they're past TDS, who are just discovering, oh, wow, Joe Biden's really a warmonger.
00:53:15.000 He's actually been a warmonger for 30 years, but they're rediscovering this.
00:53:18.000 Bill Maher.
00:53:18.000 He's been a corrupt corporate hack for 30 years, and finally he's being exposed as such.
00:53:23.000 We had Dennis Prager on the other day and he was saying that he's not gonna, you know, fault Bill Maher for, you know, for being a liberal but trying and calling things out.
00:53:30.000 So, you know, give him the space.
00:53:31.000 I think it's a fair point.
00:53:33.000 Bill Maher has called out a lot of the woke craziness.
00:53:35.000 And so as a media personality, I can respect that.
00:53:37.000 I just wish the guy would read the news.
00:53:39.000 Yeah, yeah, exactly.
00:53:39.000 He comments on it without reading it and that's just crazy.
00:53:41.000 A week after Covington happens, he's still wrong about it and I'm like, and the audience is
00:53:45.000 cheering? Exactly. Let me pull up this story from John Nicosia. Source, Stelter is down to
00:53:50.000 weeks if not days left at CNN. They go on, he is everything that reminds the new owners of the Zucker
00:53:55.000 era they desperately want to get past.
00:53:57.000 They continue, management is confident Stelter is the one sharing the internal pushback to fellow
00:54:03.000 media reporters while simultaneously stirring discontent within the ranks.
00:54:07.000 Looks like we've got some more here.
00:54:09.000 He said at 1:49 p.m., Feb.
00:54:13.000 2, 2022: two sources, former CNN chief Zucker's girlfriend Allison Gollust will not be staying on with the network once the dust settles. And then he publishes... On February 20th, she resigns.
00:54:28.000 So he's basically saying he was right then.
00:54:31.000 Brian Stelter may be on the way out at CNN.
00:54:33.000 I bring him up, along with this Taylor Lorenz story, because, like I mentioned, I'm just gonna say it again, there is a big picture of Luke Rutkowski in the middle of Times Square on one of the biggest billboards.
00:54:43.000 How is that for winning and pushing back on the elites and telling them that we are taking these spaces?
00:54:47.000 To see these people getting the boot, to see their credibility in the gutter, it's a good day.
00:54:53.000 Daily Wire had a story on that article and I think your tweet response to this was in there and I noticed mine was as well because I got a Google alert.
00:55:02.000 There's a part of me, I genuinely feel bad for Stelter.
00:55:06.000 Stelter?
00:55:07.000 Stelter.
00:55:10.000 I genuinely feel bad for him.
00:55:12.000 To some extent.
00:55:13.000 But he has demonstrated actual malice.
00:55:15.000 He demonstrated malice with that kid who asked him the question recently at... Oh yeah, the high school kid?
00:55:20.000 The high school kid.
00:55:22.000 I feel terrible.
00:55:23.000 He came on the show for an interview.
00:55:25.000 I won't remember his name.
00:55:26.000 You know, puts on a smiley face.
00:55:28.000 Oh yeah, we really have to work better on this as the media.
00:55:30.000 And then gives him the cold shoulder when the cameras aren't running.
00:55:33.000 They're liars.
00:55:34.000 But my analogy was that it comes from real life experience.
00:55:38.000 In our country place, my parents' cottage, we had a mouse problem.
00:55:41.000 And we put out a bucket with some birdseed in it to wait for the mice to get up.
00:55:46.000 They got up, they fell in, they didn't get back out.
00:55:48.000 They were all happy until they ran out of food.
00:55:50.000 Then they started eating each other.
00:55:52.000 Literally.
00:55:53.000 So this is like mice in a bucket.
00:55:57.000 They'll play with each other when, you know, there's enough food to eat.
00:56:00.000 And then when the food starts going short, they literally start eating each other.
00:56:03.000 No honor among scoundrels.
00:56:04.000 And it couldn't happen to a better industry.
00:56:06.000 Yeah, man.
00:56:07.000 I gotta say, I agree.
00:56:09.000 We're watching the downward spiral, man.
00:56:12.000 Independent media is going to take over.
00:56:14.000 It is taking over.
00:56:14.000 The stuff I see in the background of what's happening, people from these types of organizations coming to us wanting to go the independent route.
00:56:23.000 The world is changing and a lot of these organizations don't see it quite yet.
00:56:27.000 But there are some individuals in some of these organizations that do, and they're starting to reach out, and this is something that's gonna, I think, accelerate a lot in the next, you know, six months to a year, especially over the next two, three years.
00:56:39.000 Independent media will take over.
00:56:41.000 It's inevitable.
00:56:42.000 I definitely agree with you, because the more they try to suppress the truth, the more they promote the truth tellers.
00:56:48.000 But it's even talent at these organizations that are starting to realize this.
00:56:54.000 And they're starting to realize they're restricted, and they want to have a show like Tim's.
00:57:00.000 It's happening.
00:57:01.000 Well, in three years, I'm going to be, it's going to be, you know, Wednesday, it's going
00:57:07.000 to be Thursday at 9am.
00:57:09.000 And I'm going to be like, hey, Luke, that 4-6 is coming out.
00:57:12.000 Do you want to go catch it over at the local AMC?
00:57:15.000 And Luke's going to be like, oh, OK, we'll catch that Thursday preview.
00:57:17.000 And we're going to show up.
00:57:18.000 We're going to walk up to the counter to get some snow caps.
00:57:21.000 And Brian's going to turn around and be like, would you like anything else with that?
00:57:25.000 He'll be fine.
00:57:26.000 He made a lot of money.
00:57:27.000 The one thing is Chris probably had no idea 10 years ago when you started Rumble.
00:57:32.000 You for the next five years are going to be number one target because you're starting something which is going to be the platform for the independent voices who are being snuffed out, pushed out and censored on what had hitherto been the free speech platforms.
00:57:47.000 So are you ready for the battle?
00:57:50.000 I better be.
00:57:51.000 I better be.
00:57:52.000 Well, that's, uh, I love what I do.
00:57:54.000 I love this space and I really strongly believe in it.
00:57:57.000 So absolutely.
00:57:58.000 Did you guys see that smear piece about Florida?
00:58:01.000 Like the far right is moving.
00:58:02.000 Which one?
00:58:03.000 Oh yeah.
00:58:04.000 I've seen a few of those.
00:58:04.000 They're all similar smear pieces.
00:58:06.000 Like every week there's one, I think.
00:58:07.000 I just typed in Brian Stelter into Brave Search, and one of the articles that comes up is from the New York Post talking about how his Reliable Sources... First of all, why would you name a show like that when you're such a propagandist?
00:58:18.000 That's why!
00:58:19.000 But it's titled, Reliable Sources on CNN Draws Lowest Ratings Since 2019.
00:58:26.000 So they're feeling it.
00:58:27.000 I mean, it's not popular what they're doing.
00:58:29.000 He got 73,000 viewers in the key demographic.
00:58:31.000 That's how many views I get when I post a video of chickens outside.
00:58:34.000 Tim, you had 40,000 people watching an empty studio for two and a half hours last week and generating revenue while it happens.
00:58:42.000 I mean, it's nuts.
00:58:43.000 People want to support quality.
00:58:45.000 The people who can't succeed on a merit-based system want to suppress that quality.
00:58:49.000 Exactly.
00:58:50.000 The people at these big journalist outlets.
00:58:53.000 They have no merit, they have no talent, and so they rely on this gigantic foundation of a hundred-year-old institution, or institutions, so that they can be let in the front door, take the elevator to the top, and then scream their garbage opinions at the world.
00:59:08.000 For the rest of us that have built up our own followings and done the hard work over time, it's because we've said things that have been insightful, we've challenged, we've been brave, or we've just done the hard work.
00:59:18.000 And over a long enough period of time, that results in a following that's merit they could never earn.
00:59:23.000 And they despise us for it.
00:59:25.000 Well, think about it.
00:59:27.000 Collectively, you guys have more members on your own subscription stuff than CNN Plus put together.
00:59:33.000 With $300 million of investment!
00:59:36.000 That was horrifying.
00:59:38.000 Think about that.
00:59:38.000 That's insane.
00:59:40.000 I did a live stream with Nate Brody, another YouTube lawyer of the LawTube, about the Jan 6 hearings.
00:59:45.000 And I'm pulling up articles, publications, I don't want to get anyone in trouble on this, but relating to previous reporting by CNN, NPR, about issues with machines.
00:59:55.000 Leaving it at that.
00:59:57.000 CNN had a video from 10 years ago.
00:59:58.000 It had 1,400 views on it.
01:00:00.000 And this is CNN with, they have millions, 1,400 views.
01:00:04.000 And their stuff now, There's a reason why they turn off comments.
01:00:07.000 There's a reason why they don't let you see the thumbs up.
01:00:09.000 Well, that was YouTube, but... That's the reason why YouTube did that.
01:00:12.000 Don't question this!
01:00:13.000 To protect the weak and punish the strong.
01:00:16.000 Well, the saying goes that any sufficiently unmoderated platform will become right-wing.
01:00:22.000 But the idea is that the right probably has a tendency towards meritocracy, because the people who are strong enough to lead end up doing it.
01:00:28.000 Their view of it is...
01:00:30.000 The people at Twitter have said this.
01:00:31.000 They need to create the health of the conversation, like you were mentioning earlier.
01:00:35.000 So they view themselves as stewards of fairness.
01:00:39.000 It doesn't work.
01:00:40.000 You can't cut off the tall grass, sacrifice those who do the hard work, and then prop up people who don't.
01:00:45.000 And that's what they've been doing, and it doesn't work.
01:00:48.000 It's creating all of these problems.
01:00:50.000 I would flip the expression.
01:00:51.000 I appreciate the expression, but I wouldn't say anything left to its own or, you know, free speech tends to turn right-wing.
01:00:56.000 I would just say that those who tend to fail on their own merits try to restrict the rules.
01:01:01.000 So nothing changes in essence, except that the Overton window shifts to the left.
01:01:05.000 So what was center looks like it was right wing.
01:01:08.000 But no, freedom of speech tends to go right wing not so much.
01:01:12.000 It's that those who don't succeed with their speech try to suppress the speech of those who do.
01:01:16.000 And I would say in the last like three, four months, you know, we've seen a lot of people on the left, perceived left, come to Rumble that you wouldn't typically have thought would have come.
01:01:28.000 Like activists, for example, Susan Sarandon tweeting Rumble.
01:01:31.000 Like, what?
01:01:32.000 That actually happened?
01:01:33.000 She's on the left, I thought, right?
01:01:36.000 So you're starting to see this free speech thing happen on all angles now.
01:01:43.000 It's not just happening with one defined group, but many different groups.
01:01:48.000 Gaming, whatever it may be, any category.
01:01:50.000 It's happening across the spectrum and it's getting really aggressive and worse.
01:01:54.000 Well, it's why the Young Turks are now being critical of independent left creators.
01:01:59.000 They grew up as being the Bernie Sanders, anti-Obama voice of the progressive left that the institutional media wasn't covering.
01:02:06.000 Then they transitioned into Trump hatred, and then they transitioned into being sort of corporate establishment media that they themselves used to be critical of, have to rely on donations from corporate and big billionaire sugar daddies on the left.
01:02:18.000 And consequently, their support is shrinking and shrinking and shrinking because they're no longer organic or authentic or independent.
01:02:25.000 There's people on the left that could build a huge independent market spectrum, like Tulsi Gabbard, like Glenn Greenwald, like Aaron Maté, like Max Blumenthal, son of Sidney Blumenthal, like The Grayzone,
01:02:36.000 if they continue to do independent information that's reliable and trustworthy, even though it comes from a left perspective.
01:02:42.000 The thirst and the hunger is for independent, honest, authentic information.
01:02:46.000 Exactly.
01:02:46.000 Yeah, Kyle Kulinski and Krystal Ball, I think, are fantastic.
01:02:49.000 Well, there's the no true Scotsman thing about this, is that left-wing voices who want to succeed, they move to Rumble.
01:02:56.000 But the second you do that, you become right-wing.
01:02:58.000 So Tulsi Gabbard goes to Rumble, right-wing.
01:03:00.000 Russell Brand goes to Rumble, right-wing.
01:03:02.000 Jimmy Dore goes to Rumble, right-wing.
01:03:04.000 And so Jimmy Dore is a socialist, isn't he?
01:03:07.000 He's the guy who spat on Alex Jones at the 2016 convention.
01:03:09.000 We were there.
01:03:10.000 Me and him were in the room as it was happening.
01:03:13.000 That's assault, brother.
01:03:15.000 It technically was.
01:03:16.000 But I mean, Glenn Greenwald, when you call the gay liberal who's very critical of Bolsonaro in Brazil, and the guy who helped break probably more investigative journalistic stories than any individual reporter in the last decade, you call him right-wing, it shows that they have a fallback position.
01:03:31.000 There was a Fast Company article saying, you know, beware right-wing comedy, from Joe Rogan to the Babylon Bee.
01:03:37.000 And it's like, what?
01:03:39.000 OK, I'll take it, I guess.
01:03:41.000 Look, they want us.
01:03:42.000 It becomes circular definitional.
01:03:45.000 Veer away, you're right-wing.
01:03:46.000 Even if you're as left as Russell Brand and Tulsi Gabbard, you veer, you're right.
01:03:51.000 And that's it.
01:03:51.000 Joe Rogan goes on his show and says universal basic income, which is very, very left.
01:03:55.000 And they're like, yeah, he's right-wing.
01:03:57.000 I mean, his favorite candidate was between Bernie Sanders and Tulsi Gabbard.
01:04:00.000 I mean, he's been consistently on the left.
01:04:02.000 But I think it's just self-discrediting.
01:04:05.000 When you get to the point where you're calling Johnny Depp supporters right-wing, you have lost the narrative and you've lost institutional control.
01:04:12.000 When you're at the point where commenting on a Johnny Depp civil case means you're radicalized, you've lost the plot.
01:04:19.000 And that's the corporate press at this point.
01:04:21.000 There were people on our respective communities, Robert Barnes and I, they didn't care for Johnny Depp because of his anti-Trump statements, you know, and call for violence statements.
01:04:30.000 But when it comes to these types of things, the people who are independent thinkers can see beyond their own biases and just come to the conclusions based on the facts.
01:04:38.000 And yeah, just anyone who agrees with Johnny Depp, right-wing misogynist, even if they happen to be left-wing women.
01:04:45.000 Let's talk about Rumble.
01:04:47.000 We have this from corp.rumble.com.
01:04:49.000 Rumble proposes an open source content moderation policy and process to improve transparency and put creators first.
01:04:57.000 So for those that might not be familiar, maybe you're new to politics.
01:05:02.000 I'm assuming it's probably a small percentage of people.
01:05:04.000 Most will probably know this stuff.
01:05:05.000 Tim, one thing, this is exclusive to you, released on independent media, not through the typical channels, just to add to the previous stuff.
01:05:13.000 Right on.
01:05:14.000 So obviously for those, you know, YouTube, YouTube has been big, but YouTube has banned people without warning.
01:05:20.000 YouTube has censored information they don't like, and they've done very shady things.
01:05:24.000 If we want to have open and honest conversations, we need to be able to be on platforms that have, that are healthy and robust.
01:05:30.000 So with Rumble, which is, well, it's, it's, how would you describe it?
01:05:33.000 How do you, a video hosting service?
01:05:37.000 The way to describe Rumble, we're a platform that's two different things, both video and cloud now.
01:05:43.000 So we're going to be pushing cloud a lot in 2023, but we're a video platform, an open and free video platform that's going to protect the free and open internet as much as possible.
01:05:52.000 Truth Social uses Rumble infrastructure.
01:05:56.000 It does.
01:05:58.000 Actually, when they opened up the floodgates, when you were able to come on, it was because they moved over to the Rumble Cloud.
01:06:03.000 And TimCast.com uses Rumble's infrastructure for our video player on the members-only section of the show, which is Monday through Thursday at 11 p.m.
01:06:09.000 Sign up to become a member.
01:06:11.000 But all of the hosting, the entire website is built on your guys' infrastructure because there's got to be... We have to build something that is an alternative to Silicon Valley's monopoly on the space.
01:06:22.000 And it has to be resilient to censorship.
01:06:23.000 It has to be competitive.
01:06:25.000 I think you guys are.
01:06:26.000 So let's talk about what this move was.
01:06:28.000 Trying to make the rules more fair, better.
01:06:31.000 And this helps you compete, but it's also better for the people.
01:06:34.000 Yeah, so one, this was completely inspired by your show.
01:06:39.000 We took a lot of flack with our terms and conditions.
01:06:41.000 You put it up there six months ago and I was like, shit, this thing hasn't changed for a long time.
01:06:46.000 We went through a time period of eight years when I came on in January where things changed a lot.
01:06:55.000 The way we've kind of built our track record over the last eight years is based on terms and conditions where we didn't move the goalposts and we kept really sturdy.
01:07:03.000 We didn't change the definitions of certain things and our track record proved to be really good.
01:07:08.000 We don't ban for things that don't make sense and we're not doing what YouTube's doing.
01:07:16.000 The terms said we could ban you anytime we want, for anything we want, however we want.
01:07:21.000 We had this conversation, a lot of the rules were very similar to what we see.
01:07:24.000 Yeah, with the exception of all, like, the misinformation stuff that YouTube talks about.
01:07:29.000 So, Viva and Barnes, you guys, what, you came up with a plan or something, or what happened?
01:07:33.000 So it's the same plan that I talked about with Twitter, you know, five years ago.
01:07:36.000 And with Twitter it talked about being amenable to, and then backed out at the last minute.
01:07:39.000 Which is, you can create a space that protects a free and open internet without being bombarded by trolls and haters and harassers and stalkers and doxxers and defamers.
01:07:48.000 You can have something that is a free and open space, free both from censors and free from stalkers.
01:07:52.000 And the rules are right there.
01:07:54.000 The rules are there in American law.
01:07:55.000 The rules are there in jury instructions.
01:07:56.000 The rules are there already laid out.
01:07:58.000 You're not supposed to have discriminatory misuse or abuse of a platform.
01:08:01.000 Because of Section 230, there hasn't been a lot of U.S.
01:08:03.000 litigation, but there are ways in which you can create rules that are a desirable community that maximizes freedom of speech.
01:08:10.000 Like right now, you can go to Rumble, and if you want to get independent, free information, like we did an interview with Dinesh D'Souza on 2,000 Mules, we can only do it on Rumble.
01:08:18.000 But you can create a space that is free for those kind of discussions, that you can have heterodox opinions, that you can have heretical opinions, and be completely free to share those with your community, and at the same time have rules that are not only consistent with that, but are also open and transparent.
01:08:33.000 The other aspect of this was have an appeals process that matters.
01:08:36.000 What frustrates a lot of people is that they get suddenly banned without notice, without knowledge, without means of a meaningful appeal.
01:08:42.000 Happened to Eric Conley, Unstructured Podcast, and all he does is just interview interesting people.
01:08:48.000 So the goal was let's create an appeal process that works and that's manageable.
01:08:52.000 And that's where Viva helped create a lot of those because he's been through that process, knows other people that's been through that process.
01:08:57.000 Also make it participatory.
01:09:00.000 We've had American democracy for about 300 years.
01:09:02.000 And the goal is to, we've learned that an open, transparent, participatory process produces the best result and best outcome.
01:09:09.000 It's not only about the free market of ideas.
01:09:11.000 It's about letting the ordinary person participate.
01:09:13.000 What we were talking about earlier, that's who often gets targeted for suspension and banning on social media.
01:09:17.000 It's the ordinary person.
01:09:18.000 That's why these rules are just proposed rules.
01:09:20.000 People can actually look at these rules and say, we see a problem here.
01:09:24.000 We think this could be improved.
01:09:26.000 There's actually an email set up that they can actually email in their ideas, their suggestions, their comments, make this process as best as possible, and ultimately have a community and content creator jury that will help adjudicate these processes.
01:09:39.000 So the goal is let's create something that will work for the entire big tech universe.
01:09:42.000 Let's create the model.
01:09:44.000 Chris has been willing to open source these rules, so anybody can borrow them, anybody can copy them, anybody can imitate them.
01:09:50.000 This is about making the free and open internet free and open again.
01:09:52.000 The jury system, I think, is interesting.
01:09:54.000 Minds implemented something similar.
01:09:56.000 Minds.com, M-I-N-D-S.
01:09:58.000 It's so hard to say, because it sounds like you're saying mines.
01:09:59.000 It sounds like you're saying mines.
01:10:01.000 Yeah, M-I-N-D-S.
01:10:03.000 They had a system where if someone posts something, That is a violation of the rules.
01:10:08.000 They have moderators.
01:10:10.000 If you post something that's like illegal content, like, you know, child abuse and stuff, it gets nuked.
01:10:15.000 If you post something and it's like, maybe that's violence or whatever, it gets sent to a jury of users and then they're asked to vote on it.
01:10:25.000 And then I'm not exactly sure how it works, but then they can vote.
01:10:27.000 Yeah, this breaks the rules.
01:10:29.000 If it does break the rules, all that happens is they put a filter on it that says not safe for work.
01:10:34.000 Anything that would instantly have to be removed for like law-breaking and stuff is removed no matter what because that's just law-breaking content.
01:10:41.000 But as for the community guidelines, the worst that could happen, I believe, is they just put a filter on it where it blurs the image and then you have to choose to see it.
01:10:49.000 So you don't even get banned for posting, you know, hateful stuff or anything like that.
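A minimal sketch of the community-jury flow described above, assuming a randomly drawn jury and a simple majority vote. The names, jury size, and data shapes here are illustrative guesses, not the actual Minds or Rumble implementation; the one point taken directly from the description is that illegal content is removed outright while a guilty vote only adds a click-through filter, never a ban.

```python
import random
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    is_illegal: bool = False      # content that must come down regardless of any vote
    nsfw_filtered: bool = False
    removed: bool = False

def community_review(post: Post, users: list[str], votes: dict[str, bool],
                     jury_size: int = 12) -> str:
    """Illegal content is removed outright; anything else goes to a randomly
    drawn jury, and a majority 'breaks the rules' vote only adds a filter that
    viewers can click through. It never bans the poster."""
    if post.is_illegal:
        post.removed = True
        return "removed"
    jury = random.sample(users, min(jury_size, len(users)))
    guilty_votes = sum(1 for juror in jury if votes.get(juror, False))
    if guilty_votes > len(jury) / 2:
        post.nsfw_filtered = True
        return "filtered"
    return "kept"
```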
01:10:52.000 Yeah, you run into the issue about people posting their own criminality or things along those lines.
01:10:57.000 But so full disclosure, one thing.
01:11:00.000 Robert and I have been working with Rumble for these terms for a little while.
01:11:04.000 But we don't like Rumble because we're working with them.
01:11:07.000 We're working with them because we like them.
01:11:09.000 And I knew Rumble since 2014 when I was just posting cat videos type things.
01:11:14.000 And they were just a video hosting platform and licensing agency.
01:11:19.000 What they're doing now is amazing and important because people are getting shafted left, right, and center on YouTube.
01:11:24.000 They're getting soft censored into discussing only the things that YouTube will allow them to.
01:11:28.000 We've got to, when we have certain controversial figures on, and those are, by the way, doctors.
01:11:35.000 I had to do one interview specifically on Rumble with Dr. Francis Christian because YouTube, I knew it was going to happen on YouTube.
01:11:43.000 But other people say, look, I want free speech on the internet.
01:11:46.000 That means running around and saying, you know, racial explosive and whatever.
01:11:49.000 And that's not what freedom of speech in the meaningful sense on the internet means.
01:11:54.000 What it means is that you have objective clear-cut rules that are not going to be weaponized for political and narrative driven purposes.
01:12:01.000 So, you know, it's not going to be that hashtag learn to code is tolerable when the left says it, but when the right says it, it's a call to violence and a ban.
01:12:07.000 Right.
01:12:07.000 And so that's what you have to navigate with, with, with the internet.
01:12:10.000 I think we've done it.
01:12:11.000 We've created the same rules for public squares that exist all across America.
01:12:15.000 They have rules, right?
01:12:16.000 You can't go to a local public square and do pornography, do obscenity, try to attack somebody.
01:12:21.000 You can't do any of that.
01:12:22.000 Some cities these days, to be honest, I wouldn't be surprised.
01:12:26.000 Yes, unfortunately, yes.
01:12:27.000 Have you been to San Francisco?
01:12:29.000 Yeah, I know.
01:12:29.000 Well, hey, there you get it at little kids reading, you know, book story time, you know.
01:12:33.000 Surprise, surprise.
01:12:35.000 But the goal is to take what those historic rules have been and apply them to the digital public square and make it as transparent and open as part of the process.
01:12:42.000 But people can continue to partake and participate in this.
01:12:45.000 If they think there's improvements we can make, that Rumble can make, they're invited to do so.
01:12:49.000 This is the beginning of a participatory process to return the Internet to its roots of being open and free.
01:12:55.000 I dig it.
01:12:56.000 It's great.
01:12:57.000 And the sort of, call it board review or community review, it works when the community doesn't get radicalized, when it doesn't get filtered down through its own soft censorship.
01:13:08.000 And so, you know, people who love the community want to preserve it.
01:13:12.000 They're going to preserve it and they're going to preserve it so that they can all speak freely.
01:13:15.000 This sounds good, but how does it work?
01:13:17.000 How are you going to implement it and put it into action?
01:13:20.000 Who's going to be making the calls about what is allowed and what is not?
01:13:23.000 So it's a three-fold process.
01:13:25.000 So one is the actual rules themselves, to make them open, transparent, easily accessible.
01:13:29.000 That's what's being posted, I believe now, already up there.
01:13:31.000 Yeah, if you scroll up on the... Because if you break a YouTube rule, YouTube doesn't really tell you which rule you broke, or why, or what you could do for any possible redemption.
01:13:41.000 And that goes to the second part of the process.
01:13:42.000 So we helped first design the rules.
01:13:44.000 There's also going to be posting guidelines of how we're interpreting and applying these rules just using jury instructions.
01:13:50.000 Using things that ordinary jurors use every day in terms of the rules.
01:13:54.000 But the rules are really simple, straightforward, accessible.
01:13:56.000 They use identified legal terms throughout the United States for which we have 200 years plus of history.
01:14:03.000 Now, the second part of the process was what you're talking about.
01:14:06.000 How does it get adjudicated?
01:14:07.000 How do you get notified of it?
01:14:08.000 Who decides?
01:14:09.000 What role do you have in responding to it?
01:14:12.000 And do you know who the jury pool consists of?
01:14:16.000 Is that a publicly disclosed list?
01:14:18.000 That's what Viva took lead on.
01:14:21.000 Bottom line, you have your clear-cut rules.
01:14:23.000 There's automated stuff for copyright trademark.
01:14:27.000 Other than that, if a community member, a user, flags something, there's going to be a first review by Rumble.
01:14:33.000 And if they determine it's okay, it'll continue.
01:14:36.000 There will be flagging for people to avoid brigading, to avoid what I suspect happens a lot on YouTube.
01:14:42.000 People don't like your stuff, so they just go randomly and with impunity flag it.
01:14:47.000 If people flag too many things that are deemed to be unfair flagging, they'll suffer the consequences.
01:14:52.000 It'll create a sense of responsibility.
01:14:55.000 Will they be downranked in the algorithm?
01:14:57.000 Or how will their account be punished for abusing the system?
01:15:01.000 Ultimately suspended and barred if they continue to do it.
01:15:04.000 If they just continuously flag and wrongly flag content that Rumble and or the community determines is not flag worthy, they'll get strikes to the point where they'll get suspended or permanently banned.
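A rough sketch of the flagger-accountability idea just described, where a rejected flag counts against the flagger rather than the creator. The escalation thresholds below are placeholders; the conversation only says that repeat unfair flaggers accumulate strikes and eventually lose the ability to flag.

```python
from collections import defaultdict

# Hypothetical thresholds: the conversation describes escalation but not exact numbers.
SUSPEND_AFTER = 3        # rejected flags before a temporary flagging suspension
BAR_AFTER = 6            # rejected flags before the account is barred from flagging

rejected_flags: dict[str, int] = defaultdict(int)

def review_flag(flagger: str, upheld: bool) -> str:
    """Rumble reviews each flag first; if the flag is rejected as unfair,
    the strike lands on the flagger, not on the flagged creator."""
    if upheld:
        return "flag upheld, content escalated for review"
    rejected_flags[flagger] += 1
    if rejected_flags[flagger] >= BAR_AFTER:
        return "flagger barred from flagging"
    if rejected_flags[flagger] >= SUSPEND_AFTER:
        return "flagger temporarily suspended from flagging"
    return "flag rejected, strike recorded against flagger"
```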
01:15:14.000 Although even Chris has a big heart.
01:15:16.000 There will be no permanent bans except for egregious stuff.
01:15:19.000 It'll be a year ban if you violate, you know, over and over and over again to the point where you break the rules.
01:15:23.000 Which are going to be clear.
01:15:24.000 If you don't like them, that's going to be the other bottom line.
01:15:27.000 The rules are there.
01:15:28.000 It's not a lawless society.
01:15:30.000 If you don't like the rules, that'll be your decision.
01:15:32.000 But one thing you can rely on is that they're not going to be politically weaponized to go after one side of the ideological spectrum and immunize the other.
01:15:40.000 And you'll be given specific notice on which rule is being alleged to be violated.
01:15:43.000 You'll be given an opportunity to respond, an opportunity to respond to Rumble.
01:15:46.000 After that, if you don't like it, you can appeal it to the community board.
01:15:50.000 The community board is going to be fully publicly disclosed.
01:15:52.000 There will be content creators and members that are invested in the idea of Rumble as a free speech platform.
01:15:58.000 What about sentencing?
01:15:59.000 and you have an opportunity to oppose them and even then if it's a first
01:16:03.000 offense that only leads to the content being taken off.
01:16:07.000 What about sentencing?
01:16:09.000 After you're convicted of breaking the rules in court? And then there's a
01:16:12.000 structured process you have to violate it I think four times within six months
01:16:15.000 before you face any kind of deletion of a channel and even then it's only for a year
01:16:20.000 time period.
01:16:20.000 So four strikes because you were talking about a strike system.
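The numbers in this sketch come straight from the exchange above: a first offense only takes the content down, the fourth confirmed violation inside roughly six months removes the channel, and the removal lasts a year rather than being permanent. The rolling-window bookkeeping around those numbers is an assumption about how one might implement it.

```python
from datetime import datetime, timedelta

VIOLATION_LIMIT = 4                 # fourth confirmed violation in the window triggers removal
WINDOW = timedelta(days=183)        # roughly six months
CHANNEL_BAN = timedelta(days=365)   # channel removal lasts a year, not forever

def record_violation(history: list[datetime], now: datetime) -> str:
    """Append a confirmed violation and decide the outcome: a first offense only
    takes the content down; repeat offenses inside the window remove the channel."""
    history.append(now)
    recent = [t for t in history if now - t <= WINDOW]
    if len(recent) >= VIOLATION_LIMIT:
        return f"channel removed until {(now + CHANNEL_BAN).date()}"
    return "content taken down, channel stays up"
```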
01:16:25.000 There's two different processes.
01:16:26.000 So one that we're really changing here that I really like is the is the copyright thing.
01:16:30.000 We're gonna give everybody an opportunity to take down the video if there's a copyright or challenge it before it's taken down before an actual strike is applied to the channel.
01:16:38.000 Like on YouTube, you can get, like, three or four strikes and the channel's gone.
01:16:42.000 So we're gonna give the creator an opportunity to appeal it right away without applying any strikes, give them a 12 to 24 hour period to figure that out.
01:16:51.000 That's on the copyright side.
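For the copyright side, the 12-to-24-hour response window before any strike is the detail given above; the claim object, field names, and states in this sketch are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

RESPONSE_WINDOW = timedelta(hours=24)    # the "12 to 24 hour period" mentioned above

@dataclass
class CopyrightClaim:
    video_id: str
    filed_at: datetime
    status: str = "pending"              # creator may take the video down or challenge

def advance_claim(claim: CopyrightClaim, creator_action: str | None, now: datetime) -> str:
    """No strike is applied while the response window is open; only a claim the
    creator ignores past the deadline converts into a strike."""
    if creator_action in ("take_down", "challenge"):
        claim.status = creator_action
        return f"claim {creator_action}: no strike applied"
    if now - claim.filed_at > RESPONSE_WINDOW:
        claim.status = "strike"
        return "window expired: video removed and a strike applied"
    return "awaiting creator response"
```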
01:16:52.000 On the policy takedown side, if you look at the policies, they're all related to unlawful conduct.
01:16:59.000 Let's say you do a bunch of things that are wrong.
01:17:02.000 If it's egregious, then obviously there's going to be no forgiveness.
01:17:05.000 But if there's nuance to something, then you're going to be able to come back within a year.
01:17:11.000 What does no forgiveness mean?
01:17:12.000 Permanent ban?
01:17:13.000 Yeah, well, when we first went through it, it was permanent bans for people that break the rules four times within a six month period.
01:17:22.000 But then I said, I was talking to both of you and, you know, I really felt like there has to be a way and a path to forgiveness.
01:17:30.000 Because everyone gets forgiven and like... Well, even murderers.
01:17:33.000 You can get 25 years after killing someone and still get let back out and joined society.
01:17:38.000 That is only based on whether or not you believe in reform for the ones you want to reform.
01:17:41.000 But other ones, it's immediately permanent banning and, you know, being kicked out of society.
01:17:46.000 Right.
01:17:46.000 I suppose there would be a life sentence.
01:17:48.000 You know what I mean?
01:17:49.000 And I think posting child abuse photos should be like a life sentence.
01:17:54.000 And that's what we've come to, right?
01:17:56.000 But like I said, these are just proposed.
01:17:59.000 We don't want to make these changes to the community until we feel the community is fully on board with it.
01:18:04.000 And this is our first step towards going towards that, because it is moving.
01:18:08.000 It's the first time we're going to move the goalposts, but I think we're going to move the goalposts in the opposite direction.
01:18:11.000 And really, they're trying to make the rules comply with what they've been doing for the last eight years.
01:18:15.000 Yeah, it goes very in line with our track record.
01:18:19.000 The big question I hear from a lot of people is, UI update when?
01:18:24.000 Yes, well it's on the iOS app now, so you can see the new UI on the iOS app.
01:18:31.000 And the website, that'll be coming in the coming months.
01:18:35.000 Cool.
01:18:37.000 So it's looking good, I guess.
01:18:38.000 It sounds like something that we used to have in the US legal system called the due process.
01:18:45.000 Exactly.
01:18:45.000 We don't really have that anymore.
01:18:46.000 We have a lot of political prosecution.
01:18:47.000 Freedom of expression, freedom of speech, freedom of press, due process of law, jury trial rights.
01:18:52.000 All these ideas we've borrowed from to incorporate in the big tech space to restore the public square to the internet.
01:18:58.000 So these are the proposed rules.
01:19:00.000 Who's going to decide that these rules and content moderation is going to be moving forward?
01:19:06.000 And are you guys specifically saying, you can't say this, this political ideology is not allowed, this is okay, this is not.
01:19:13.000 Is this the process?
01:19:14.000 So what we have posted up there is pretty much what we're proposing right now.
01:19:19.000 And we're gathering feedback.
01:19:21.000 You can email us.
01:19:22.000 There's an email posted there where you can send us feedback and what you like and what you don't like.
01:19:27.000 So we can take that back.
01:19:28.000 But one of the things I really want to do is, you know, I've founded Rumble, I've run Rumble, but I'm not a creator that focuses on law.
01:19:39.000 I don't have a law degree.
01:19:42.000 Who am I to really put this together?
01:19:44.000 And the best thing I could think of is coming to creators that have law degrees and understand free speech better than anyone else to suggest something.
01:19:55.000 Is there a political ideology that's off-limits?
01:19:57.000 Like, if a group signed up, you'd be like... No.
01:20:02.000 I mean, it's designed... If you're making illicit content, so, like, the Klan and Antifa tend to make illegal content, but there's no ban on the Klan, there's no ban on Antifa, there's no ban on anybody.
01:20:12.000 I mean, like, one of... I still think the best measurement for whether a platform is consistent about freedom of speech is, is Alex Jones on that platform.
01:20:20.000 Alex Jones has had zero problems on Rumble.
01:20:22.000 That's shown there.
01:20:23.000 And even when Senator Scott from Florida came after him because they allowed RT to be on Rumble, Rumble didn't change their position.
01:20:30.000 So even when the United States Senator from the state that Rumble is partially located in came after him, they stayed with it.
01:20:36.000 So I was only willing to invest my time if Chris was sincere, and Chris was clearly sincere.
01:20:40.000 Because I've been passionate about this for more than half a decade.
01:20:43.000 I gotta say, I think sincerity is great, but the reality is, this is the market opportunity.
01:20:50.000 If you're trying to grow a business, this is how you do it.
01:20:52.000 You imagine, take YouTube as a specific example, where they have, as a rule for content, we will deem misinformation, remove, strike, and penalize channels for suggesting anything that runs afoul of what the WHO is saying right now.
01:21:07.000 And bear in mind that the WHO, the World Health Organization, has, depending on the year, said both A and not A, or both A and B. And so you don't even know what the rule is going to be on a going-forward basis.
01:21:21.000 Let me tell you, I had a meeting with Google not that long ago, and I said, I advise all young people to start on Rumble because while the audience size certainly is much smaller, your opportunity for growth is much larger.
01:21:35.000 You're more likely to find an audience faster and you're less likely to have your business destroyed.
01:21:41.000 Due to arbitrary rules.
01:21:43.000 And I said, I gotta be honest with you.
01:21:44.000 I don't know what the rules are.
01:21:45.000 And I read them every day.
01:21:47.000 It's remarkable.
01:21:48.000 Because I have seen people get banned for one thing and not something else.
01:21:51.000 I have seen a prominent left-wing podcaster call for an act of terror.
01:21:56.000 And he got a strike.
01:21:58.000 Is that it?
01:21:59.000 I know another guy who had his entire channel deleted because he did black comedy.
01:22:03.000 That's it.
01:22:04.000 He broke no rules.
01:22:05.000 He was monetized.
01:22:07.000 And they just delete his channel outright with no warning, with no strikes, purge.
01:22:13.000 This was my own, like, I've been on YouTube before I was really focusing on Rumble, but since 20-whatever, 14.
01:22:19.000 This was my initial experience with what I call YouTube chicanery.
01:22:22.000 It was the Alex Jones deposition video.
01:22:24.000 All I did as a lawyer, total cringe, stood on the roof of my house with sunglasses, breaking down an Alex Jones deposition.
01:22:31.000 I didn't know that AJ was persona non grata on YouTube.
01:22:34.000 I do it, the video gets like close to a quarter of a million views, then they pull it, or they
01:22:38.000 demonetize it, then they pull it, and then I notice I got a community guidelines violation
01:22:43.000 for hate speech. So the video's gone, all that's left is when you click on it,
01:22:47.000 community guidelines violation for hate speech, and it was, and then it comes back on two weeks
01:22:53.000 later and it gets remonetized.
01:22:55.000 Senator Rand Paul was speaking on the Senate floor.
01:23:00.000 C-SPAN published the video and YouTube took it down.
01:23:04.000 YouTube, you are psychopaths.
01:23:06.000 And then let's also be honest here.
01:23:08.000 If there's a big brand, a big business, or the corporate media, they get the front door at YouTube.
01:23:13.000 They get walked right in.
01:23:14.000 They get pushed in the algorithm.
01:23:15.000 The trending videos, that's the videos that of course are connected to the biggest businesses that are connected to their Google advertisement businesses and revenue.
01:23:24.000 So there's already a clear bias.
01:23:26.000 If you're an independent media creator, and I wanted to bring this up with you guys, when you're on YouTube, there's no way the algorithm is going to be doing you any favors unless you have big money.
01:23:35.000 That's why Rumble is such a good alternative, as well as the other alternatives out there.
01:23:40.000 But who decides what's going to be in the algorithm?
01:23:42.000 Last time we talked about let people see what they want to see.
01:23:45.000 If they're subscribed to something, let them see it.
01:23:48.000 How is the algorithm going to be shaped to what Rumble is going to be showing people?
01:23:54.000 How will independent creators fare in that kind of ranking system?
01:23:59.000 So the way it is right now, it's just chronological.
01:24:01.000 So there, as far as an algorithm goes, it's chronological by time.
01:24:06.000 I think the important thing to do, and I think Ian mentioned this last time, is have these algorithms open sourced.
01:24:12.000 Yeah.
01:24:13.000 If we do deploy an algorithm right now, because it's chronological, there's really nothing to open source other than time.
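Since the feed Chris describes is purely chronological, a minimal sketch of what that looks like in code may help; the Video record and its field names below are illustrative assumptions, not Rumble's actual data model.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical video record; field names are illustrative, not Rumble's schema.
@dataclass
class Video:
    title: str
    channel: str
    uploaded_at: datetime

def chronological_feed(subscriptions: list[Video]) -> list[Video]:
    """A purely chronological feed: newest upload first, no engagement weighting."""
    return sorted(subscriptions, key=lambda v: v.uploaded_at, reverse=True)

# Example: whoever uploaded most recently appears first, regardless of popularity.
feed = chronological_feed([
    Video("Stream highlights", "CreatorA", datetime(2022, 6, 16, 20, 0)),
    Video("Daily vlog", "CreatorB", datetime(2022, 6, 16, 21, 30)),
])
print([v.title for v in feed])  # ['Daily vlog', 'Stream highlights']
```

The only input is the upload timestamp, which is why there is, as Chris puts it, nothing to open source other than time.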
01:24:18.000 Isn't there a challenge though, that if they know how the algorithm works, they'll game it?
01:24:21.000 There is.
01:24:23.000 So the way you can game a chronological algorithm is just keep putting content out constantly and flooding the feed.
01:24:29.000 So that's one way to do it.
01:24:31.000 But isn't that on the viewer?
01:24:34.000 If you're flooding the feed, don't follow them.
01:24:36.000 So you cut back on who you follow.
01:24:39.000 This is how we were running social media 10-15 years ago.
01:24:43.000 But we all went towards these engagement-based algorithms that amplify content based on engagement.
01:24:48.000 And that changed everything and changed the games.
01:24:51.000 They figured out they can skew things.
01:24:52.000 They figured out they could do things.
01:24:54.000 I think if you keep it chronological, it's helpful.
01:24:57.000 But that doesn't solve the problem for discovery.
01:25:01.000 So in order to have discovery of independent creators, you're going to need to provide some kind of algorithm and some kind of mechanism.
01:25:10.000 And what we want to do is have that discovery in kind of a TikTok format, where you can scroll through content, and open source that algorithm. That algorithm will basically be based on how much you like a video, how much you dislike a video, and whether or not you have a preference for that kind of video.
01:25:28.000 So we're working on something like that right now because discovery is super important to help find creators and we should have something hopefully by the end of the year.
01:25:38.000 It's already launched right now but it doesn't have an algorithm in it.
01:25:40.000 The algorithm is very basic.
01:25:42.000 It's based on likes and the ratio of likes.
01:25:44.000 That's it.
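For the like-based ranking just described, "based on likes and the ratio of likes," here is a hedged sketch of one way a like-ratio score could be computed; the smoothing constants and the function name are assumptions for illustration, not Rumble's implementation.

```python
# Minimal sketch of a like-ratio ranking, assuming only like/dislike counts are available.
# The smoothing prior (alpha/beta) is an assumption, added so a video with one like
# doesn't automatically outrank a video with thousands of votes.
def like_ratio_score(likes: int, dislikes: int, alpha: float = 1.0, beta: float = 1.0) -> float:
    return (likes + alpha) / (likes + dislikes + alpha + beta)

videos = {
    "clip_a": (900, 100),   # 90% liked, many votes
    "clip_b": (3, 0),       # 100% liked, but almost no votes
}

ranked = sorted(videos, key=lambda k: like_ratio_score(*videos[k]), reverse=True)
print(ranked)  # ['clip_a', 'clip_b'] once smoothing is applied
```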
01:25:45.000 And then the other way is through search. We would like to open source that as well, and make sure that when you find something, you understand how you're placed in search.
01:25:55.000 You're placed in search right now based on time, the velocity of views, and the context of the video, so the characters, the titles, the descriptions.
01:26:07.000 It's very simplistic right now, and if it does become something more complicated, then open sourcing that is, I think, critical.
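As a rough illustration only of combining the three search signals mentioned here, time, velocity of views, and the text of the title and description, the sketch below scores a single result; the weights and the scoring formula are made-up assumptions, not the actual ranking.

```python
import math
from datetime import datetime, timedelta

# Hypothetical scoring that combines the three signals mentioned above:
# recency (time), view velocity, and how well the query matches the title/description.
def search_score(query: str, title: str, description: str,
                 views_last_day: int, uploaded_at: datetime,
                 now: datetime) -> float:
    text = f"{title} {description}".lower()
    text_match = sum(1 for term in query.lower().split() if term in text)
    velocity = math.log1p(views_last_day)                      # dampen huge view counts
    age_days = max((now - uploaded_at).total_seconds() / 86400, 0.0)
    recency = 1.0 / (1.0 + age_days)                            # newer videos score higher
    return 2.0 * text_match + 1.0 * velocity + 1.0 * recency    # weights are assumptions

now = datetime(2022, 6, 17)
print(search_score("alex jones deposition", "Alex Jones Deposition Breakdown",
                   "A lawyer reviews the deposition.", 5000, now - timedelta(days=2), now))
```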
01:26:14.000 But the discovery portion is the part that we all want to solve for because once you nail that and you give viewership to small creators, then you really have something special.
01:26:23.000 What about Rumble's API?
01:26:25.000 Is there going to be an option for people to, I don't know, develop on top of the software, embedding it, incorporating it?
01:26:33.000 So we already have that.
01:26:34.000 So RumblePlayer, which is going to be part of this RumbleCloud business that we're building, has open APIs that you can use to search, find, and embed into other platforms.
01:26:48.000 Cool.
01:26:49.000 Sounds like everything's there, huh?
01:26:51.000 Not yet, not yet.
01:26:52.000 Still lots of work.
01:26:53.000 There's a lot of work.
01:26:54.000 Nope, we're done.
01:26:55.000 We won.
01:26:56.000 I never understood what algorithm could be more relevant than likes and retention rate.
01:27:02.000 No, people watching a video and liking it.
01:27:04.000 I can see, from a monetary perspective, why they'd want engagement, even if it's negative.
01:27:08.000 YouTube had the problem where every video started with a guy screaming smash the like button for 30 seconds. They turned it into a game, and then all of a sudden all the top videos were just videos of people saying smash the like button.
01:27:19.000 But that'll self-correct when people start downvoting that crap because it's no longer fun to watch anymore.
01:27:24.000 People wouldn't even click it.
01:27:25.000 So there were literally just videos where a guy, for a
01:27:29.000 minute, is like, smash the like button, the camera's zooming in and out,
01:27:32.000 and it would get 500,000 likes, and people who don't like it just don't click it.
01:27:36.000 The issue is, it dominates.
01:27:38.000 The issue with that as well is, it will self-correct from what we're seeing
01:27:42.000 in the data, but the problem with it is that, you know, you're going to have a
01:27:47.000 platform that has a genre of videos that everyone likes, and then a new person
01:27:51.000 comes on and it's a completely different genre they're looking for.
01:27:54.000 And how do you figure that out?
01:27:57.000 I will say one thing.
01:27:58.000 I've said it before, and Robert will pontificate on this one day: when YouTube takes down videos from doctors on the basis of medical misinformation, I consider that to be YouTube practicing medicine without a license, arguably unlawful, in my humble opinion.
01:28:15.000 I agree, absolutely.
01:28:18.000 I mean, for me, coming from a political space, the Rand Paul video removal was just mind-boggling.
01:28:23.000 And they're giving out wrong medical advice that is leading people to be hurt.
01:28:27.000 Their assault on one particular medicine and labeling it an animal medicine has hurt a tremendous amount of human beings who rely on that medicine for other things.
01:28:37.000 Remember when with the Roe v. Wade leak, Vice put out an article about how you can take
01:28:43.000 animal medication and use it to induce abortion?
01:28:45.000 Yep.
01:28:46.000 And then someone created a meme.
01:28:48.000 Let me see if I posted it.
01:28:49.000 I can check, maybe it's on my Instagram.
01:28:51.000 People might not be appreciating the story that the same agency which said you're not
01:28:56.000 a horse, yada yada,
01:28:58.000 at some point later on is talking about, what's the word?
01:29:02.000 Not even homeopathic, alternative remedies.
01:29:06.000 It's bizarro upside down.
01:29:07.000 To induce abortion through horse medicine.
01:29:12.000 So yeah, that's the level where we're at.
01:29:14.000 And they're promoted in the algorithm.
01:29:16.000 They're shown to everyone.
01:29:17.000 You search a topic, they talked about it.
01:29:19.000 They're going to be shown to everyone in the general public.
01:29:22.000 Yeah, it's called horse pill theory.
01:29:25.000 And so it's from the Political Compass on Instagram.
01:29:28.000 And it's like Horseshoe Theory: on the left, it shows the Motherboard article talking about taking horse veterinary medicine to induce abortion.
01:29:37.000 And then it goes to the right, and it's the horse paste.
01:29:40.000 And in the middle, it's BoJack Horseman.
01:29:42.000 I just thought it was a brilliant meme.
01:29:44.000 It's like, this is what you get in the modern era, I guess.
01:29:46.000 But they didn't ban Motherboard for recommending people eat horse medicine.
01:29:49.000 Please don't eat horse medicine.
01:29:51.000 I wonder if the FDA, did they put out a warning on that?
01:29:54.000 Well, I mean, the FDA is being sued because of their tweet about ivermectin.
01:29:57.000 So, deservedly so.
01:29:59.000 But I mean, when you have people like Dr. Peter McCullough, you're talking about some of the most well-respected medical doctors in the world that are now being censored.
01:30:05.000 Like, we're going to do an interview with him.
01:30:06.000 We're going to have to do it on Rumble.
01:30:07.000 We cannot do it on YouTube.
01:30:08.000 No, we're not.
01:30:09.000 We're going to gleefully do it on Rumble so that I can actually ask the questions I want to ask.
01:30:13.000 Same thing with Dr. Francis Christian.
01:30:15.000 We talked about this on the Twitter story.
01:30:18.000 These Twitter employees are listening to Elon Musk talk, and they're oblivious to the fact that he's there because of them.
01:30:24.000 Because of their political bias, because of their incessant need to silence people they don't like, they have created Republican Elon Musk.
01:30:32.000 YouTube is doing the exact same thing.
01:30:34.000 They are taking people.
01:30:36.000 There may be someone who says, like you, Viva.
01:30:39.000 You're like, I'm gonna do a video on Alex Jones and his deposition.
01:30:42.000 They ban you.
01:30:43.000 Imagine a new creator who's not political, and they say, oh, I'm in law school.
01:30:47.000 I really want to talk about this.
01:30:48.000 They get banned.
01:30:49.000 They go, guess I'll go to Rumble.
01:30:50.000 Now, what are they doing at Rumble?
01:30:52.000 They're seeing nothing but all these big, prominent political creators who have been censored.
01:30:55.000 Everything YouTube was trying to silence, now front and center for those people to see.
01:30:59.000 They are pushing people into the information they claim they want to suppress.
01:31:03.000 It's remarkable how insane they are.
01:31:05.000 Well, they've done that to Alex Jones.
01:31:06.000 All the efforts to de-platform him have just led more people to be curious about what is he saying that everybody's so scared of him, and has led to more people going to InfoWars, more people going to InfoWars store, more people being engaged than almost ever before.
01:31:19.000 And the irony is, I mean, Alex was seriously considering retirement after 2016.
01:31:24.000 He'd achieved an extraordinary number of things over a quarter century.
01:31:27.000 And then they decide to wage lawfare against him, decide to try to take him down,
01:31:32.000 deplatform him.
01:31:33.000 And he's the kind of guy who doesn't go gently into that good night, you know.
01:31:36.000 And so he is here today louder and stronger than ever before because of their efforts
01:31:40.000 to destroy him.
01:31:41.000 And not just that, I will say, having spent some time on the interwebs, a lot of people
01:31:45.000 are now realizing that Alex Jones was right more often than USA Today.
01:31:50.000 And when he says things hyperbolically, he's still right.
01:31:54.000 I don't want to throw my wife under the bus.
01:31:56.000 But one of the stories, one of the conspiracy theories, was that 5G towers are going to mind control you.
01:32:03.000 And it was like, oh, that's... good.
01:32:05.000 When you understand that by that term, it just means interfere with sleep patterns.
01:32:09.000 You know, it could interfere with a form of your cognitive abilities.
01:32:14.000 It's not mind control in that sense, but you realize that it's hyperbolic, yet relatively accurate on some things.
01:32:21.000 He would often take the story and then just take this little dot and stretch it out.
01:32:26.000 He would dramatize it in order to get attention and the rest.
01:32:29.000 But people would pay attention to that and ignore the underlying truth.
01:32:31.000 I mean, it turned out they were trying to turn some of the frogs gay.
01:32:34.000 You know what I mean?
01:32:34.000 I mean, it's that.
01:32:35.000 Well, that was that.
01:32:36.000 So that's that.
01:32:37.000 I love the story, because the frogs gay thing was Alex talking about atrazine.
01:32:42.000 I believe it was a pesticide.
01:32:43.000 Right.
01:32:44.000 Was it a message?
01:32:45.000 And they say that it was interfering with the endocrine systems of frogs.
01:32:48.000 All that meant was the frogs were becoming deformed or malformed.
01:32:51.000 And then Alex, in his rant, says, they're turning the friggin' frogs gay.
01:32:54.000 And people literally believed that he was being literal when he said that.
01:32:59.000 No, he was just doing an entertaining rant.
01:33:02.000 It's much like Trump: his audience doesn't take him literally, they understand the proverbial reference, but they get his deeper truth, that you can't trust institutional people in power, that the people who seek power, like Michael Malice's theory says, are disproportionately going to be dangerous people, and we have to be constantly on the alert for them.
01:33:19.000 But I mean, it turned out everything he warned about, you know, that they're going to use a pandemic to help lock down and strip us of our civil liberties.
01:33:24.000 Well, we've experienced that.
01:33:25.000 That unique things might happen with elections.
01:33:28.000 We've experienced that.
01:33:29.000 That the mass censorship was coming through big tech control.
01:33:31.000 We've experienced that.
01:33:32.000 I think there was also something about medical microchipping.
01:33:36.000 A YouTube moderator who's watching this video is like, oh, I'm going to ban this video right now!
01:33:42.000 Can I do it?
01:33:42.000 Can I do it now?
01:33:44.000 Barnes, keep going!
01:33:44.000 Keep going, Barnes!
01:33:45.000 Well, it's just legal.
01:33:47.000 Again, that's the beauty of reporting information in lawsuits.
01:33:50.000 Going back to Rand Paul, this is public-sourced information.
01:33:52.000 So if you are saying something in Congress, if something is said in court, it cannot be the subject of a libel lawsuit or anything else if you're fairly and accurately reporting what was in there.
01:34:01.000 And so there's all this information, and yet now we can't even talk about things that are happening in Congress or happening in courts on YouTube.
01:34:08.000 That's a level of insanity we've never gotten to before.
01:34:10.000 When the Rand Paul thing happened, I had posted a video.
01:34:14.000 It got taken down, and the weird thing about it is the video was still there.
01:34:18.000 Someone messaged me, and they're like, hey, your video's gone, Tim.
01:34:20.000 And I go into my studio on YouTube, and I look, and I'm like, it's right there.
01:34:23.000 And then I hover the mouse over it, and the mouse doesn't change. You know, like
01:34:27.000 when you hover over a link, it turns into the finger pointing? It didn't change. I
01:34:30.000 couldn't click on anything. It was like an image, and I was like, what? And then I
01:34:34.000 found out somebody tweeted or something that my video had been removed, and I was
01:34:37.000 like, they tried making me think that it was still there. Something like that
01:34:41.000 happened. It was weird. And there's a lot of dirty tricks that we don't even know about.
01:34:44.000 A lot of things happening behind the scenes that we're not even privy to that they're implementing right now that we don't even know about.
01:34:51.000 There's medical doctors, there's medical studies that are being censored and banned on big tech social media platforms.
01:34:57.000 That's when you know they jumped the shark.
01:34:59.000 We're gonna change all that, my friends, and I think we're winning.
01:35:02.000 That's why I keep pointing out that Luke's on a billboard in Times Square.
01:35:04.000 So is Ian, so is Michael Malice, because I was just like, we gotta put people up on this to give a big middle finger to the establishment.
01:35:10.000 But let's go to Super Chats and talk to you guys.
01:35:12.000 If you have not already, would you kindly smash that like button, subscribe to this channel, and become a member at TimCast.com because we're gonna have a members-only exclusive episode coming up at about 11 p.m.
01:35:24.000 on the website.
01:35:25.000 And share the show with your friends if you really like it.
01:35:26.000 Let's see what we got here.
01:35:28.000 John Shaw says, why not genetically engineer dog-sized ants, chip their brains, and use them to build infrastructure like bridges, canals, and underground highways?
01:35:37.000 Maybe I'm crazy.
01:35:38.000 That is a particularly crazy super chat.
01:35:40.000 Thank you very much for that.
01:35:41.000 That was OK.
01:35:44.000 They're turning the ants into construction machines.
01:35:46.000 They're turning the ants into dogs.
01:35:48.000 They're stealing your dogs and building bridges underground.
01:35:50.000 All right.
01:35:52.000 Jason Linholm says, damn, Viva, that hair.
01:35:55.000 Yes, it has gotten very long.
01:35:56.000 The Freedom Fro will continue to grow.
01:35:59.000 The Freedom Fro will continue to grow.
01:36:01.000 My wife said it would stop.
01:36:02.000 She said it would stop growing at one point in time.
01:36:04.000 And I said, that sounds like a bet.
01:36:06.000 Conspiracy theory.
01:36:07.000 JMaxx says, buy coffee brand coffee so the quartering has to shave his beard.
01:36:11.000 He actually did.
01:36:13.000 He did shave his beard this evening.
01:37:15.000 Rekieta, he said, groomed him.
01:36:16.000 That was what they were doing, like a joint stream.
01:36:19.000 So he shaved it live on stream.
01:36:20.000 I was like, I have to show this.
01:36:21.000 He looks so weird.
01:36:22.000 Down to the skin.
01:36:23.000 He looks really effing weird.
01:36:26.000 I love it.
01:36:27.000 I don't want to say what he looks like.
01:36:29.000 Shut up, Luke.
01:36:30.000 No insults.
01:36:31.000 All right, what do we got?
01:36:34.000 Dano says, hey Tim and crew, love the show and all you do.
01:36:36.000 With the CEO of Rumble on, I would like to ask why this show isn't streamed live on Rumble?
01:36:41.000 Also, Rumble experience is better than YouTube.
01:36:44.000 It's an interesting question.
01:36:46.000 I suppose we don't have a real answer as of right now, but stay tuned.
01:36:50.000 That's all I can really say.
01:36:52.000 Dennis Gregerson says, heck yeah, Viva and Barnes, love your shows on Rumble.
01:36:57.000 Viva, I watched every Trucker Rally livestream.
01:36:59.000 You are the best, Viva.
01:37:01.000 Are you going to do that reporter?
01:37:08.000 Am I going to sue that reporter?
01:37:11.000 They wrote do.
01:37:12.000 D-U-E or D-O?
01:37:13.000 D-U-E.
01:37:15.000 Okay, fine, fine.
01:37:16.000 That was definitely a sue.
01:37:16.000 I was gonna ask which reporter.
01:37:18.000 I'm a married man, but I'm getting a quote right now.
01:37:23.000 Nobody should jump into litigation even if they're convinced they are right.
01:37:26.000 But at some point, enough is enough.
01:37:28.000 Even in their correction of the story, they then referred to the video that was allegedly removed from YouTube as a COVID video to persist in their... What's the word I'm looking for?
01:37:37.000 We'll be making that.
01:37:38.000 A statement as well.
01:37:39.000 Shortly.
01:37:41.000 To persist in their smear against Rumble, they have to pretend that the video of mine that was removed from YouTube but wasn't removed from Rumble was a COVID video.
01:37:47.000 It was the Alex Jones deposition which shows you how idiotic things are on YouTube.
01:37:51.000 All right.
01:37:52.000 MM126 says, obviously Elon is a watcher of the show.
01:37:55.000 Nothing is coincidental.
01:37:58.000 I guess.
01:37:58.000 He better get on here.
01:37:59.000 He posted a meme of me.
01:38:00.000 He did?
01:38:01.000 Yeah, you see that one?
01:38:02.000 No.
01:38:02.000 Which was the meme of me talking to Vijaya Gadde at Twitter, going in a circle.
01:38:05.000 Yeah, that was a great meme.
01:38:07.000 He tweeted at Lydia.
01:38:08.000 And then I immediately tweeted at him and I said he should come on IRL and discuss this with us.
01:38:12.000 He's sitting back and he's like, I'm not going on your show.
01:38:14.000 I understand.
01:38:14.000 He probably just likes to watch.
01:38:15.000 I get it.
01:38:16.000 Does he know that there's a Pappy's in the back?
01:38:18.000 I mean, yeah.
01:38:19.000 Elon, he doesn't drink, does he?
01:38:20.000 Oh, no, I don't think he does.
01:38:22.000 Well, it depends on the day.
01:38:23.000 Tell him it's apple juice.
01:38:25.000 If he doesn't know, he drinks.
01:38:27.000 He's drinking White Claw, that one.
01:38:29.000 He drinks White Claw.
01:38:32.000 We can smoke and talk about the synchronicity.
01:38:34.000 I got a feeling, you know, being the richest guy on the planet, he's not too concerned about the fancy whiskey that we have.
01:38:40.000 He's gonna be like, oh, you have Pappy?
01:38:42.000 I have 300 bottles in my backyard.
01:38:44.000 I bowl.
01:38:45.000 I play bowling with him.
01:38:46.000 Then you have to get him something that he's never had before or can't get because of where he is.
01:38:50.000 My brother-in-law makes a nice gin.
01:38:52.000 Well, he doesn't seem like a big money guy, right?
01:38:55.000 I will have my mom bake her secret recipe chocolate chip cookies.
01:38:59.000 Oh, yeah.
01:39:00.000 You make that berry juice we have.
01:39:02.000 Yeah.
01:39:02.000 Wineberry wine.
01:39:04.000 So, you know, we got wineberries out here.
01:39:07.000 They're Chinese raspberries and they grow all over Appalachia.
01:39:13.000 Pawpaw is the October fruit.
01:39:15.000 Hillbilly banana.
01:39:17.000 So we'll make some hillbilly food for Elon when he comes.
01:39:20.000 It's a plan.
01:39:22.000 All right.
01:39:24.000 What do we got?
01:39:26.000 Ultramaga, Marty Smith fan, says I'm a Viva Barnes Locals member and TimCast member.
01:39:30.000 Thank you very much.
01:39:31.000 Question for Chris.
01:39:32.000 Can we have a rewind feature for Rumble Lives?
01:39:35.000 It's my biggest complaint.
01:37:36.000 And for Robert, can you pitch to TimCast to have Rich Baris on, please?
01:39:40.000 He's the best.
01:39:42.000 Yes.
01:39:43.000 We actually launched our live streaming with that feature.
01:39:45.000 It was just a little buggy.
01:39:47.000 We're going back to fixing that and should have that shortly.
01:39:51.000 And you know what else you should do?
01:39:52.000 Speak.
01:39:52.000 Give Rewind.
01:39:53.000 What YouTube used to do at the end of the year to celebrate creators was the YouTube Rewind.
01:39:57.000 Have the Rumble Rewind!
01:40:00.000 It was beautiful.
01:40:02.000 And when YouTube stopped it for one year, I forget what the reason was.
01:40:05.000 We said, nah, I'll tell you the reason.
01:40:07.000 It was because if they actually featured what was popular on the platform, they would have been promoting what they call hate speech or offending the corporate press.
01:40:15.000 So they started, it was really funny.
01:40:17.000 Like one of the last YouTube rewinds they did had a bunch of creators who hadn't made videos in like years.
01:40:21.000 And people were pointing out like that person didn't make a video all year and you put them in your thing.
01:40:25.000 And they're like, but they're a YouTube celebrity.
01:40:26.000 It's like, dude, this they didn't like.
01:40:28.000 They had Will Smith on the platform going like, y'all!
01:40:31.000 Like, that is the most embarrassing, craziest thing.
01:40:34.000 They should do a Rewind with Will Smith this year.
01:40:38.000 Yeah, they should.
01:40:40.000 Just open it with it and nobody will look at that.
01:40:42.000 He's laughing the clip back and forth.
01:40:44.000 Oh, that's a good idea.
01:40:45.000 Everyone on YouTube.
01:40:46.000 Let's do a Rumble satirical funny version of a YouTube corporate rewind and make fun of them.
01:40:53.000 Let's put it on Rumble.
01:40:54.000 Let's do it.
01:40:55.000 I'm all in.
01:40:56.000 Will Smith is going to be in there.
01:40:58.000 Of course.
01:41:00.000 All right, let's try and grab some Super Chats.
01:41:04.000 What do we got?
01:41:07.000 Camel of the Mojave says, if it's a mainstream platform, it's probably already shoulder-deep in being puppeted around by alphabet people or having their finances threatened.
01:41:18.000 But who does that refer to?
01:41:19.000 I think that's the general concern about any platform, and I think it's the concern about Rumble, as they say.
01:41:25.000 It's so big it's already controlled opposition or whatever.
01:41:29.000 Look, if I ever felt that way, I would not be here now.
01:41:33.000 Rumble is walking the walk, and Chris is talking the talk, and taking the flak for it.
01:41:38.000 People were saying that about Donald Trump before he got elected, that he was controlled opposition, that he was friends with the Clintons.
01:41:43.000 And then look what happened when he actually got in.
01:41:45.000 Well, I was not friends with anybody.
01:41:46.000 I didn't know a single person two years ago.
01:41:49.000 I come from a little town outside of Toronto.
01:41:51.000 That sounds sketchy.
01:41:53.000 Which town?
01:41:54.000 Brampton.
01:41:54.000 Brampton?
01:41:55.000 Yeah.
01:41:56.000 So outside of Toronto, I didn't know a single political person my whole life.
01:42:01.000 And I think it's people who believe that the system has so much control that even if they see something successful, they assume that too must be part of some secret control.
01:42:08.000 And that's not the case.
01:42:09.000 You can fight back and win.
01:42:11.000 I like to say that the greatest trick the devil ever pulled was convincing people he did not exist.
01:42:15.000 The greatest trick the system ever pulled is convincing people they cannot resist.
01:42:19.000 The reality is that's the key.
01:42:21.000 That's a good one.
01:42:21.000 I like that.
01:42:23.000 Brennan the American says, thanks for all you do.
01:42:25.000 I'm a 27-year-old who has a garden, food storage, and now chickens.
01:42:28.000 My wife and I feel slightly more ready for the storm to come.
01:42:31.000 You know, the former CEO of Home Depot came out today and said with the Fed hiking the rates, you better stock up on some cash.
01:42:38.000 You better get some cash reserves and you better get some non-perishables because it is going to get bad.
01:42:42.000 And seeing that, and seeing the constant reports, the stories about food shortages due to Ukraine, fertilizer, and all that stuff, plus the supply chain disruption, I mean, my assumption is, I hope you're ready for August and September.
01:42:55.000 It's going to get crazy, nasty.
01:42:56.000 But then you see Biden putting out these tweets, it's the strongest economy, like lie after lie after lie.
01:43:01.000 And it's one thing for the lie to be there.
01:43:03.000 The response is in the comments section.
01:43:05.000 You talk about an ideological silo of absolute political ignorance.
01:43:09.000 Everyone's like, who are you?
01:43:09.000 Greatest economic recovery ever.
01:43:13.000 And then you get this distraction of the January 6th hearing where it's derangement.
01:43:17.000 It's derangement.
01:43:18.000 All this was also predictable.
01:43:20.000 I mean, Jacob Dreizin, who's actually nearby, put out the report six months ago that this was going to happen if we went into Ukraine, that there was going to be a fertilizer and food crisis.
01:43:28.000 Credit to him.
01:43:29.000 Credit to the people at the Duran who cover him.
01:43:31.000 You can find him on YouTube.
01:43:33.000 And Richard Baris, going back to the chat, People's Pundit, was talking about this three months ago.
01:43:37.000 So yeah, Baris is great.
01:43:39.000 All these guys, independent information, you would have known this ahead of time.
01:43:42.000 It's only the Biden administration that appears surprised. Ukraine is known as the breadbasket of Europe, and I've been talking about that for years.
01:43:49.000 You know the history of the flag?
01:43:50.000 The reason for the flag colors, yellow and blue?
01:43:52.000 It's the fields of wheat and the sky.
01:43:55.000 It's in their flag.
01:43:56.000 I thought it was because they were big fans of West Virginia.
01:44:01.000 I bought, uh, I've got some vans for skating and they're blue and yellow.
01:44:06.000 And I've had people be like, Oh, is that like Ukraine?
01:44:09.000 And other people would be like, Oh, West Virginia.
01:44:11.000 Yeah.
01:44:11.000 The funny thing is I have an avatar on the channel, which is a tie dye multicolored avatar.
01:44:16.000 And then people who are new to the channel think it's for, for pride month.
01:44:19.000 Oh, are you guys celebrating MAGA month?
01:44:24.000 What is MAGA Month?
01:44:24.000 MAGA Month, it's July.
01:44:26.000 Every corporation has to change their icon to an American flag and then we grill burgers on the weekends.
01:44:31.000 I guess the Trump supporters told me that I was being a cuck and that we have to grill every day.
01:44:36.000 I can tell you one thing, we're not celebrating MAGA Month in Canada.
01:44:40.000 We're hopefully at the very least just trying to celebrate like... Going outside.
01:44:46.000 Having some fresh air.
01:44:47.000 The absence of curfew is freedom.
01:44:49.000 Yeah, you Canadians, my goodness.
01:44:51.000 Can I show everybody how I'm celebrating?
01:44:53.000 Yeah, with Russian candy.
01:44:55.000 High fructose corn syrup.
01:44:57.000 And vegetable oils.
01:45:00.000 Shut up, Luke.
01:45:00.000 I'm Sour Patch Lids, so I brought in an industrial-sized bag of patriotic Sour Patch Kids for MAGA Month.
01:45:06.000 That's how I'm doing it.
01:45:08.000 That's going to fit into a lot of jokes in Canada as to, like, nothing can be more American than a bag of red, white, and blue gummy bears.
01:45:14.000 That's right, yeah.
01:45:16.000 High fructose corn syrup, vegetable oil, even.
01:45:20.000 Everyone loves them, but my goodness, I would be unconscious in diabetic shock.
01:45:24.000 For the whole office, and not just mine.
01:45:29.000 Tim is reading Super Chats.
01:45:30.000 He looks through his computer.
01:45:32.000 Super Chatses, we call them.
01:45:33.000 What do we got here?
01:45:35.000 Chris P.A.
01:45:35.000 says, I think Canada wants to be like Norway, where it's illegal to defend yourself.
01:45:39.000 If you hurt someone in self-defense, you will be punished the same as if you initiated the attack.
01:45:43.000 I'm not sure about the second part of that, but one thing I can definitively tell you, you cannot own anything that is to be used specifically for self-defense.
01:45:51.000 You're not allowed to own a firearm if the purpose of owning it is self-defense, unless you get a specific license.
01:45:56.000 Walk around with a baseball bat but no glove and no ball, and use it for self-defense, and you'll probably get charged.
01:46:01.000 What if you have a sporting rifle of some sort?
01:46:04.000 Maybe you got a rifle for sport shooting, and someone breaks in your house and you defend yourself with it.
01:46:09.000 So my understanding is that even if it is bona fide self-defense, you will probably face other, unrelated gun charges.
01:46:15.000 It'll be like... In New York City, people are charged like that for defending themselves.
01:46:19.000 Fine, you're off the hook on murder, but it'll be reckless discharge of a firearm or pointing it at a human.
01:46:24.000 It'll be like what happened with the lady in Sweden or Switzerland, I think it was Sweden, who used either pepper spray or a taser to fend off an actual physical assailant.
01:46:34.000 She got fined for unlawful possession of a taser.
01:46:37.000 I think it was a taser.
01:46:38.000 It's crazy.
01:46:39.000 But leave it to Justin Trudeau to revive a debate that had hitherto been relatively quiet when he comes out a week ago and says, in Canada we have a different culture.
01:46:46.000 You can't own a gun for self-defense.
01:46:49.000 I was like, excuse me?
01:46:50.000 Viva, he doesn't talk like that.
01:46:52.000 He talks like a cult leader.
01:46:54.000 In Canada, we have a different culture.
01:46:57.000 You can't have a gun.
01:46:59.000 I hear him talking, I'm like, ah.
01:47:01.000 We have a Charter of Rights that says you have the right to life, liberty, and security of the person, but you cannot guarantee for yourself your right to life, liberty, or the security of the person.
01:47:08.000 People are going to start asking questions.
01:47:10.000 If a man breaks into your home, just get on your knees and beg him not to harm you.
01:47:14.000 He's got a lot more uhs.
01:47:15.000 Uh, uh.
01:47:17.000 But he talks like that.
01:47:18.000 Like, what is, what is he doing?
01:47:20.000 And the creepiest thing is when he starts talking to your kids.
01:47:23.000 Children, you've been very good.
01:47:25.000 It's time for you to go.
01:47:26.000 I won't say I don't want to get you in trouble on YouTube.
01:47:28.000 All right.
01:47:30.000 Let's grab some more.
01:47:34.000 Andrus T. Berzin says, Barnes, please tell the group why you got suspended from Twitter.
01:47:39.000 Oh, I mean, it's still not clear why I got suspended from Twitter.
01:47:42.000 So I didn't get officially suspended from Twitter.
01:47:44.000 My account just disappeared for about like a week.
01:47:46.000 And then they reinstated it. Enough people created enough of a storm that they... well, I got hacked.
01:47:52.000 I got hacked and doxxed once, no reference to that.
01:47:56.000 And then the second one was just, they just removed my account for a period of time and said it was a mistake.
01:48:05.000 That was the official explanation.
01:48:10.000 Yeah, it's a great family-friendly place, and a lot of great people there fighting for freedom, fighting for personal responsibility, and working on a lot of really cool things.
01:48:20.000 If you want to live in a place with community and strong familial bonds and the right to teach your child to use a flamethrower, then New Hampshire is the place to be.
01:48:30.000 Absolutely.
01:48:31.000 I don't know.
01:48:32.000 Florida has some... I've seen some flamethrowing in Florida.
01:48:34.000 Really?
01:48:35.000 And it's a little... They ban a bunch of stuff.
01:48:37.000 They ban binary triggers in Florida.
01:48:39.000 What?
01:48:39.000 You can't have that.
01:48:40.000 There's a lot of strange rules in Florida, especially when it comes to red flag laws.
01:48:44.000 There's some weird jurisdictions that are very troubling.
01:48:47.000 So New Hampshire definitely takes the cake in some instances.
01:48:50.000 Yeah, I like New Hampshire.
01:48:51.000 I think it's still second to Tennessee.
01:48:54.000 Tennessee is pretty freaking great.
01:48:55.000 Yes, it is.
01:48:56.000 There wouldn't be a Texas without Tennessee, by the way.
01:48:59.000 There you go.
01:49:01.000 Buttweasel says, when is Rumble going to do Super Chats?
01:49:04.000 You do, don't you?
01:49:05.000 Yeah, we do.
01:49:06.000 Rants.
01:49:06.000 But I think the question is, why don't we do it in the app?
01:49:10.000 Oh, I see.
01:49:11.000 And that's the interesting answer: if you do it in the app on iOS or Android, they take 30 percent from you.
01:49:19.000 We charge 20.
01:49:21.000 So imagine 30 plus 20.
01:49:22.000 It doesn't look so good.
01:49:23.000 So right now it's web only.
01:49:25.000 You can do it on mobile web, support it on web, desktop.
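For anyone doing the "imagine 30 plus 20" math on an in-app Rant, here is the arithmetic as a rough sketch; whether the platform's 20% applies to the gross amount or to what remains after the store's 30% is an assumption here, since it isn't spelled out above.

```python
# Rough arithmetic for an in-app tip ("Rant"). Assumption: the app store's 30%
# comes off the top and the platform's 20% applies to the remainder; if both
# applied to the gross instead, the creator would keep $5.00 rather than $5.60.
tip = 10.00

app_store_cut = 0.30 * tip                     # $3.00 to Apple/Google in-app billing
platform_cut = 0.20 * (tip - app_store_cut)    # $1.40 platform fee on what's left
in_app_payout = tip - app_store_cut - platform_cut

web_payout = tip - 0.20 * tip                  # on the web, only the 20% platform fee applies

print(f"In-app: creator keeps ${in_app_payout:.2f} of ${tip:.2f}")  # $5.60
print(f"Web:    creator keeps ${web_payout:.2f} of ${tip:.2f}")     # $8.00
```

Either way the stacking works out, roughly half the tip is gone in-app, which is why it's web only for now.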
01:49:28.000 That's another Monopoly problem.
01:49:29.000 This is why Netflix, Tinder, and all these other platforms don't let you buy things on the app stores, because the app stores take a big percentage of money away.
01:49:38.000 And only Android and Apple are allowing the app stores to be there, and they take a huge cut.
01:49:43.000 There have been some good victories in the courts that are changing that game.
01:49:49.000 Yeah, there was a big class action just recently.
01:49:51.000 So I don't think this is gonna last for too long, but we can't wait to put that in when we can.
01:49:56.000 But, like, how can you be competitive if YouTube's charging 30% and they own Android? They don't have to give a shit.
01:50:03.000 Howard says, anyone buying Bitcoin right now?
01:50:06.000 FYI on Tesla, read the 10-Ks.
01:50:09.000 Anyone buying Bitcoin?
01:50:09.000 What do you think, Luke?
01:50:11.000 Um, it's crazy out there.
01:50:13.000 Well, it depends on why you buy it.
01:50:15.000 If you're buying it for short-term speculative purposes, that's high risk.
01:50:17.000 But if you're buying it as an alternative currency to have to fight the Fed and to fight the central banks, then it's a good idea, I still think.
01:50:23.000 I'm not selling, you know, it's funny because someone tweeted at me, they're like, not talking about Bitcoin now, are you?
01:50:27.000 And I'm like, I've talked about Bitcoin like basically every day because the crash happened.
01:50:30.000 I mean, it's just, it's talking about the same as I normally do.
01:50:33.000 I'm not selling any of it.
01:50:34.000 I just... it crashes.
01:50:36.000 I've seen worse.
01:50:38.000 I mean, George Gammon, who's really good in the economic space, has been saying forever it's going to go up and down, but buy it for long-term value if you're looking at it speculatively, and really buy it because it gives you an alternative form of security.
01:50:48.000 I call it sort of plan B. You know, it's Heat.
01:50:51.000 You know, have something in your life that you can walk out on in 15 seconds flat if you feel the heat around the corner, from the movie Heat.
01:50:57.000 You should be prepared if the system comes knocking on your door that you can exit when and where and how you want, and part of that is going to be Bitcoin.
01:51:03.000 You can't be completely dependent on the U.S.
01:51:05.000 banking system if you want to be secure.
01:51:08.000 David C. Kronk Sr.
01:51:09.000 says a red-pilled Gavin Newsom might actually have a chance in 2024.
01:51:12.000 By the way, that may be why he's doing what he's doing.
01:51:15.000 It's not to be red-pilled, but he's wanting to replace Kamala Harris and Joe Biden.
01:51:19.000 He's trying to seem more of like a moderate.
01:51:21.000 Yeah, and he's doing anything to get attention to himself.
01:51:24.000 He survived it.
01:51:25.000 And it was real smart.
01:51:26.000 Getting on Truth Social, we talked about it.
01:51:28.000 Everybody knows Biden's going to be replaced.
01:51:30.000 Everybody knows Harris is hated.
01:51:31.000 He thinks of himself as the next president.
01:51:33.000 He imagines himself as a Kennedy, which is a disgrace.
01:51:38.000 All right.
01:51:40.000 Mass.
01:51:41.000 Jenna.
01:51:42.000 Mass.
01:51:42.000 Jenna.
01:51:43.000 Okay.
01:51:43.000 Her name is Jenna, but it's Jenna-cide.
01:51:44.000 Very clever.
01:51:45.000 Says there was a weeping and gnashing of teeth as I sent off my self-employed quarterly payment.
01:51:50.000 I couldn't help but think of Luke Rudkowski.
01:51:52.000 Taxes are theft.
01:51:53.000 Yep.
01:51:53.000 Inflation is theft too.
01:51:55.000 Yep.
01:51:55.000 Combine the two and you're really hit.
01:51:57.000 Come to Canada.
01:51:58.000 You get the 50% tax and you still get the same inflation you got here.
01:52:05.000 All right.
01:52:06.000 Where do you think you'll escape?
01:52:09.000 All that I know is I've been paying a lot of tax, and it's a fortunate thing to be able to pay tax, but my goodness, you didn't realize you were working for the government 50% of the time.
01:52:18.000 That's crazy.
01:52:18.000 50%?
01:52:19.000 53.
01:52:20.000 Yeah, well it's 40 some odd percent, then you got your property tax, then you got your sales tax, then you got your license, then you got all these incidentals.
01:52:28.000 You're paying, for every dollar you make, you're paying more than 50 cents to the government if you make over a certain amount.
01:52:33.000 The mafia wants their money.
01:52:35.000 It's legalized, it's legalized mafia, and you gotta pay in advance also.
01:52:39.000 You gotta pay before you even make the money.
01:52:41.000 I can't wait till Rumble... Oh, people, people, serenity, serenity.
01:52:44.000 Sorry.
01:52:44.000 I can't wait till Rumble merges and becomes American.
01:52:47.000 Oh, yeah.
01:52:48.000 When, uh, you know, starting a company and then figuring out how corporate taxes are paid, I was just like, but I don't have that.
01:52:55.000 You know, like you got to pay in advance.
01:52:57.000 You got to pay based on what they think you might make.
01:52:59.000 It's based on what you made last year, quarterly payments in advance, and if you're doing worse, too bad.
01:53:05.000 Yeah.
01:53:05.000 Well, they'll give you a credit at the end of the year.
01:53:07.000 You know, I, I'm still waiting.
01:53:08.000 It doesn't matter.
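As a purely illustrative sketch of the "pay in advance based on last year" point above, with made-up numbers and a simplified rule, since actual estimated-tax rules vary by jurisdiction:

```python
# Simplified sketch: quarterly prepayments pegged to last year's tax bill,
# with made-up numbers; actual estimated-tax rules differ by jurisdiction.
last_year_tax_bill = 40_000.00
quarterly_prepayment = last_year_tax_bill / 4      # $10,000 due each quarter, in advance

this_year_actual_tax = 28_000.00                   # the business did worse this year
prepaid = quarterly_prepayment * 4
credit = prepaid - this_year_actual_tax            # overpayment comes back later as a credit

print(f"Prepaid ${prepaid:,.0f} against an actual bill of ${this_year_actual_tax:,.0f}; "
      f"${credit:,.0f} sits with the government until it's credited back")
```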
01:53:09.000 Well, if you have a really good tax lawyer, you don't have to pay much of it at all, but that's another story.
01:53:12.000 If you have a good accountant.
01:53:16.000 We should talk, Barnes.
01:53:19.000 I know a few people.
01:53:20.000 All right.
01:53:21.000 Sweet Lou says, we say channels on the right, but that includes all of the middle-of-the-road truth seekers that get bundled in as right wing because they don't toe the line of leftists.
01:53:29.000 Love the hair, Viva.
01:53:31.000 Is that exactly it?
01:53:32.000 Like, we had Dennis Prager on, and he's talking about how he's a liberal, but he's a conservative because he talks about facts and reason and logic and things like that, and morality.
01:53:39.000 I mean, Viva was a YouTube video award winner before he came out.
01:53:44.000 I got the Shorty Social Good Award back in the day.
01:53:48.000 It'll never happen again.
01:53:52.000 I'm the Shorty Award winner for the best journalist in social media, I believe, 2012.
01:53:57.000 Not that there's anything less good about the Shorty Social Good Awards, but it was the new version of their Shorty Awards.
01:54:04.000 That's big-time stuff.
01:54:05.000 I think I got kicked out of the Shorty Awards for confronting somebody there, but I forget who.
01:54:09.000 So I have that.
01:54:13.000 Alright.
01:54:14.000 It was in the New York Times building.
01:54:16.000 I remember that.
01:54:17.000 The party was good.
01:54:18.000 Yeah, I got kicked out of that.
01:54:20.000 Dylan Sharps is on the topic of censorship and having two Canadians in the house.
01:54:24.000 Can we get their thoughts on Bill C-11 and how it'll change media and how it could be a template for blue states to follow?
01:54:31.000 It's a template to turn Canada into a China or North Korea.
01:54:34.000 Bill C-11, in the absolute nuttiest of nutshells, is regulating the internet the way the government already regulates television and radio.
01:54:43.000 So they want to subject, they said initially, streaming and big online platforms to being governed by Canada's Broadcasting Act, which imposes Canadian content requirements, and fines if you don't comply with it.
01:54:55.000 They want to impose that on the internet to force YouTube and social media to suppress or promote content based on its Canadian content criteria.
01:55:05.000 It is nothing other than a disguised attempt to re-establish a flailing legacy media on a platform where they are getting crushed by others based on their merit.
01:55:14.000 That's all that it is.
01:55:15.000 It's crazy when you have, like, big tech fighting the Canadian government against this bill.
01:55:21.000 And you have Washington Post fighting Canada on this bill.
01:55:26.000 It just shows you how horrible it really is.
01:55:28.000 You got YouTube is fighting, is complaining about it.
01:55:31.000 Everyone is.
01:55:32.000 But then you get Bell Canada coming and testifying for the Liberals.
01:55:34.000 Oh, we need this.
01:55:35.000 We got to protect Canadian culture from the guy who says we don't have culture and we don't have a Canadian culture.
01:55:40.000 Justin Trudeau said there is no Canadian culture.
01:55:42.000 That's crazy because you got, was it Tim Hortons?
01:55:45.000 Is that what it's called?
01:55:46.000 We got maple syrup, man.
01:55:49.000 70% of the global exports.
01:55:50.000 That's right.
01:55:50.000 We got fishing.
01:55:51.000 We got hunting.
01:55:52.000 You have "sorry."
01:55:53.000 We got "aboot."
01:55:54.000 We got "aboot."
01:55:55.000 We've got a Canadian culture, but they only care about it when they can tax you for it.
01:55:58.000 But I mean, in all seriousness, you know, "sorry" and "aboot" are literally Canadian culture.
01:56:02.000 It's a cultural phenomenon.
01:56:04.000 Poutine.
01:56:05.000 It's called Tim Hortons, right?
01:56:07.000 Tim Horton was the famous hockey player who died in a drunk driving car accident.
01:56:10.000 Most people don't know that, but it became a chain.
01:56:13.000 And Tim Hortons, with no apostrophe in it, is also another part of Canadian heritage, because French language laws in Quebec don't allow, or didn't allow at the time, the apostrophe.
01:56:21.000 And so Tim Hortons didn't want to have to have two brandings, so they just eliminated the apostrophe.
01:56:25.000 Oh wow.
01:56:26.000 You also have language police.
01:56:28.000 We most certainly do!
01:56:30.000 Office de la langue française, the O.L.F., the language police.
01:56:34.000 They come and make sure if you have an apostrophe, you better have a trademark, a registered trademark.
01:56:39.000 Wasn't a parent arrested for misgendering their child?
01:56:44.000 The parent arrested for misgendering the child.
01:56:46.000 It was more complicated than that.
01:56:48.000 British Columbia, but their human rights tribunal.
01:56:50.000 It was like a father who refused to call his daughter's name.
01:56:53.000 He had disclosed information that was gagged in the trial.
01:56:57.000 It's a very absurd case, but it's a bad case that will make for bad law.
01:57:02.000 British Columbia.
01:57:03.000 Quebec was on the map for fining a stand-up comedian for making a joke at the expense of a handicapped child celebrity.
01:57:12.000 Mike Ward made a joke about this kid named Jeremy Gabriel who suffers from Treacher Collins syndrome.
01:57:17.000 He had a stand-up bit about him.
01:57:19.000 The kid sued him in human rights court.
01:57:21.000 Wow.
01:57:22.000 Government takes up the case when they decide it's legitimate and the court fined the comedian $43,000.
01:57:28.000 It went all the way to the Supreme Court.
01:57:31.000 5-4 decision.
01:57:32.000 They said no.
01:57:34.000 It's not a human rights violation.
01:57:35.000 So he ended up winning.
01:57:36.000 He ended up winning years later, after years of stress and all this other stuff.
01:57:40.000 But yeah, Canada.
01:57:41.000 Brad Byrne says, Will Rumble make money though?
01:57:44.000 Google can just let YouTube run at a loss, but hosting is expensive.
01:57:48.000 How will this not just be gone in a few years like any other YouTube alts that came and went?
01:57:52.000 That's a great question.
01:57:55.000 So one of the things that we're really focused on right now is obviously the growth of the users and in the future, revenue.
01:58:02.000 But I can definitely say the audience that's on Rumble converts for advertisers at a pace that I've never seen before.
01:58:12.000 Prior to this conservative audience coming onto Rumble, like pre-2020, our CPMs with advertisers were significantly lower.
01:58:21.000 And now the audience that we're having right now, we have sponsors coming to us that are saying that we're converting at a rate that is so significantly higher than what they're seeing on other platforms that they are renewing and spending at a rate that, you know, we haven't seen before.
01:58:36.000 So I don't believe that to be true.
01:58:39.000 Actually, I know that not to be true. The revenue model on Rumble is actually going to far exceed, I think, what people are anticipating, because the audience there buys.
01:58:48.000 It's a parallel economy.
01:58:50.000 It is.
01:58:51.000 And not only is it just a parallel economy, but the purchase power of the audience on Rumble is... You see it.
01:59:00.000 You see it with creators on Rumble.
01:59:01.000 Saltycracker will generate tons of superchats.
01:59:05.000 These people have wallets and they can spend money.
01:59:08.000 It's happening.
01:59:09.000 And you just need to go on Rumble and take a look and you'll see it for your own self.
01:59:15.000 It's there.
01:59:16.000 The economy is there.
01:59:17.000 And it's mind-blowing.
01:59:20.000 This week alone, the orders that we're seeing on the ad side were just mind-blowing, in terms of how happy the advertisers are and how much it's converting on an ROI basis.
01:59:31.000 This is not brand advertisers.
01:59:32.000 These are companies looking for ROI.
01:59:34.000 And they're getting immediate ROI when buying on Rumble.
01:59:37.000 When we launch Rumble ads, both display, video, and sponsorships on our platform, which is in beta right now, we've actually started letting people in over the last week for the first time.
01:59:48.000 I think we're going to see some... I already can see that we're seeing some incredible results.
01:59:55.000 Well, I can see places like YouTube struggling because, I mean, they put Tyson Foods ads on our stuff.
01:59:59.000 And I have a Tennessee blood oath against Tyson Foods.
02:00:02.000 So there's nobody that's watching us that's buying Tyson Foods.
02:00:06.000 But because I mention it frequently, they're frequently the advertiser on YouTube.
02:00:10.000 And now that you mention it, I've been noticing Coalition Avenir Québec advertising, which is the governing party in Quebec whose leader I have been calling Supreme Leader François Legault for the last two years, running ads on my videos. As if anybody... I tell everyone, let the ad run, make them pay premium, do whatever, and then go vote against them.
02:00:29.000 I was going to say, CPM, just for anyone who doesn't know, is cost per mille, which is the cost per thousand views, and ROI is return on investment.
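To make those two definitions concrete, a quick worked example with made-up numbers:

```python
# CPM = cost per mille, i.e. cost per 1,000 impressions; ROI = return on investment.
# All figures below are made up purely for illustration.
ad_spend = 500.00            # what the advertiser paid
impressions = 100_000        # how many times the ad was shown
revenue_from_ads = 1_200.00  # sales the advertiser attributes to the campaign

cpm = ad_spend / impressions * 1_000             # $5.00 per thousand impressions
roi = (revenue_from_ads - ad_spend) / ad_spend   # 1.4, i.e. a 140% return

print(f"CPM: ${cpm:.2f}, ROI: {roi:.0%}")
```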
02:00:37.000 When Bloomberg was dumping money into YouTube ads, I kept getting comments from people being like, hey, I got a Bloomberg ad.
02:00:43.000 And I was like, that's great.
02:00:44.000 He's paying me to rag on him.
02:00:46.000 That's fantastic.
02:00:47.000 But let's be real.
02:00:48.000 It really doesn't make sense.
02:00:49.000 Bloomberg wants to put ads on videos critical of him so that he can get his message in front of it.
02:00:55.000 I end up getting money knowing my audience would never vote for the guy, so thanks for the money, I guess.
02:01:00.000 Yeah, and going back to your original point, what Rumble is doing by becoming the free space on the internet is ultimately a money winner, and that's what counters all of this, because YouTube's decision is a money loser over time.
02:01:16.000 Suppressing and censoring speech is not a desirable outcome for its audience.
02:01:21.000 I totally agree.
02:01:22.000 And we're seeing it. They've given away their incredibly high-value audience, and it's growing, and the purchase power is there.
02:01:33.000 It's US-based.
02:01:35.000 I don't think they realize what they've lost.
02:01:38.000 I really don't.
02:01:42.000 I can see it from my side.
02:01:42.000 They lost something.
02:01:43.000 They lost something very, very important.
02:01:47.000 Yeah, man.
02:01:48.000 Well, if you haven't already, would you kindly smash that like button, subscribe to this channel, share the show with your friends, and head over to TimCast.com.
02:01:55.000 We're gonna have that after-hours, uncensored, not-so-family-friendly version of the show coming up at about 11 p.m., so you'll definitely want to check that out.
02:02:02.000 You can follow the show at TimCast IRL.
02:02:04.000 Follow us on Instagram, we post clips.
02:02:06.000 Follow me at TimCast.
02:02:08.000 Viva, you want to shout anything out?
02:02:10.000 Viva Fry on YouTube and Rumble, TheVivaFry on Twitter, and... VivaBarnesLaw.Locals.com.
02:02:18.000 Did you want to shout anything else out, Robert?
02:02:20.000 No, other than the Locals thing, just a shout out to the people that they ask questions about.
02:02:25.000 You can follow Jacob Dreizin, The Duran on YouTube, all the great independent sources on Ukraine and world news, Richard Baris, People's Pundit, the only accurate pollster in the last half decade.
02:02:36.000 All those are great guys to follow.
02:02:37.000 Right on.
02:02:38.000 You can find me on Truth at Chris and you can follow Rumble on Truth at Rumble.
02:02:44.000 Barnes, when I'm in jail, I'm calling you.
02:02:47.000 Just a heads up.
02:02:48.000 I think I said this last time to you, but every time you come on, I'm like, I need him.
02:02:52.000 He said, when I'm in jail, not if.
02:02:56.000 It's only a matter of time until we're all in the gulag.
02:02:59.000 So just wait for it.
02:03:00.000 And if you want to find out more about me and what I'm doing, you can check out my platform, LukeUncensored.com.
02:03:06.000 I've been doing it for a number of years now.
02:03:08.000 I got a lot of crazy stuff up there.
02:03:10.000 We also use Rumble now.
02:03:12.000 And thank you, Chris, for coming out.
02:03:14.000 Thank you for listening to the audience.
02:03:16.000 Thank you for taking the tough questions.
02:03:18.000 I think it's really important for people to be transparent and open.
02:03:22.000 And I think you've done that in a good way.
02:03:24.000 So thank you so much for coming on.
02:03:26.000 And thank you for having me a part of the conversation.
02:03:29.000 Yeah, absolutely.
02:03:29.000 Thank you guys all so much for coming.
02:03:31.000 Elon Musk made this tweet go viral.
02:03:33.000 You guys should go watch what he has to say, because it does seem like he kind of wanted this to be like this.
02:03:37.000 He's saying a lot of really good things about free speech.
02:03:39.000 A lot of really encouraging stuff.
02:03:41.000 Anyway, I will not be shamed for loving Sour Patch Kids.
02:03:44.000 I don't care what's in them.
02:03:46.000 It's candy.
02:03:46.000 You eat it because it's fun, not because it's good for you.
02:03:49.000 And you guys can follow me on Twitter and Mines.com at Sour Patch Lids, as well as Sour Patch Lids.me.
02:03:55.000 It's poison.
02:03:55.000 Shut up.
02:03:56.000 A couple things you can check out.
02:03:58.000 You can check out the song Will of the People that I made.
02:04:01.000 We put it out just before the election in 2020, and we got a big billboard for it in Times Square.
02:04:06.000 We're gonna be putting out an album probably in the next couple of months, so stay tuned for that.
02:04:09.000 You can check out youtube.com slash castcastle.
02:04:12.000 We've brought on Jamie Kilstein to help take the vlog to the next level, and we're doing comedy bits, and the goal is to make it very much like