Timcast IRL - Tim Pool - January 09, 2021


Timcast IRL - Trump Has Been PERMANENTLY BANNED, Democrats WILL Impeach w- Minds CEO


Episode Stats

Length

2 hours and 48 minutes

Words per Minute

200.65

Word Count

33,763

Sentence Count

2,895

Misogynist Sentences

21

Hate Speech Sentences

23


Summary

In this episode, we talk to Bill Ottman, CEO of Minds, a social media platform which has a unique approach to rule breaking and censorship. We talk about the removal of Trump's account, Apple and Google removing Parler from the Play Store, and much, much more.


Transcript

00:00:16.000 Ladies and gentlemen, I'm not sure what the biggest news of
00:00:40.000 the day is, but apparently it is that Donald Trump has been permanently
00:00:45.000 banned from Twitter.
00:00:46.000 They've outright removed his account.
00:00:48.000 All of his tweets just gone.
00:00:51.000 Memory-holed, censored.
00:00:52.000 Out of there.
00:00:53.000 The reason I say I don't know the biggest news story of the day is because the Democrats also announced they are going to be impeaching the president on Monday.
00:01:02.000 Things are getting just very, very weird.
00:01:05.000 So we heard that Apple and Google were threatening Parler.
00:01:09.000 that if they didn't overhaul their moderation or introduce heavy moderation, they would remove
00:01:15.000 Parler from the app stores. I don't know exactly what's going on, but I can tell you I pulled up
00:01:19.000 the Google Play store and Parler ain't there. So maybe it's already happened because we actually
00:01:24.000 have one of the experts, not to mention Ian is an expert as well, on social media censorship and
00:01:31.000 moderation. We've got Bill Ottman, CEO of Minds, MINDS.com, one of these other social media platforms
00:01:39.000 which has a unique approach to rule breaking and censorship and stuff, the jury system, right?
00:01:44.000 Yep.
00:01:45.000 Juries, I mean, why not empower the community to help moderate?
00:01:49.000 And imagine if Facebook, Twitter, they took their tens of thousands of moderators and actually proactively engaged with people who have mental issues or are extreme.
00:02:02.000 We have to realize that extreme psychology is just something that exists, and we have to deal with it in a way that's not banning them off the platforms.
00:02:14.000 Well, because then what happens now is they're saying, you're going to get banned, everybody goes to Parler, and so then the big companies attack the infrastructure that allows people to even find Parler.
00:02:24.000 But everybody, as you know, Ian, who is a regular on the show.
00:02:27.000 Hello, yes.
00:02:28.000 You ran moderation.
00:02:29.000 Yeah, I co-founded Minds with Bill.
00:02:30.000 Technically, I came in, what, six months after you guys had it.
00:02:35.000 2010.
00:02:35.000 No, no.
00:02:36.000 20, like a year.
00:02:37.000 It was with the OG developer.
00:02:38.000 And that was really fun.
00:02:41.000 It was before Jon got involved.
00:02:42.000 And it was just me and you in Jon's basement, you know, talking about where the future of tech is going to be in 2010-ish.
00:02:50.000 Then we went to Occupy Wall Street.
00:02:51.000 I think we were all at Occupy Wall Street, actually.
00:02:53.000 But you were the ban hammer.
00:02:55.000 You were removing these awful Trump supporters, just getting rid of them, saying, you can't like Trump, you gotta get out of here.
00:03:00.000 So the way it would work is if an admin on Minds makes a wrong decision, then a user can appeal it, and then it goes to 12 random users who vote on it.
00:03:11.000 And so we launched a jury system to keep ourselves in check.
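For readers curious how an appeal like that could work mechanically, here is a minimal Python sketch, assuming a wrongly actioned post goes to a randomly drawn panel whose simple majority decides. The panel size, vote logic, and function names are illustrative placeholders, not Minds' actual implementation.

```python
import random

def resolve_appeal(appealed_post, all_users, jury_size=12):
    """Draw a random panel of users and let a simple majority decide
    whether the admin's takedown is overturned. Hypothetical sketch:
    each 'user' here is just a function that returns True to overturn."""
    panel = random.sample(all_users, k=min(jury_size, len(all_users)))
    overturn_votes = sum(1 for cast_vote in panel if cast_vote(appealed_post))
    return "overturn" if overturn_votes > len(panel) / 2 else "uphold"

# Toy usage: 100 users who each vote to overturn about half the time.
users = [(lambda post: random.random() < 0.5) for _ in range(100)]
print(resolve_appeal({"id": 42, "action": "admin takedown"}, users))
```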
00:03:14.000 Yeah, I think you got to have more than 12 people though, because like, there's margin of error, I guess.
00:03:19.000 But I understand the 12.
00:03:20.000 No, no, that's a variable.
00:03:22.000 Because, like, you know, first of all, I'll say that's an excellent system.
00:03:27.000 But what I see happening is, look at Twitter, you know, they keep banning Trump supporters.
00:03:31.000 Eventually, the 12 people you're going to get are going to be one political ideology who are going to be like, yeah, ban him.
00:03:35.000 In the early days, it was just me.
00:03:38.000 I would be sitting in a queue, looking at stuff, and be like, okay, our policy is we only ban it if it violates the U.S.
00:03:43.000 Constitution.
00:03:43.000 So I have to define the U.S.
00:03:46.000 Constitution for each post.
00:03:48.000 I have to, like, make these Supreme Court judgment calls.
00:03:51.000 It was insane stress for one human to have to go through.
00:03:54.000 But, like, you probably saw pictures where you're like, that's easy, gone.
00:03:57.000 Yes, there were easy ones and there were really hard ones.
00:04:00.000 And this is what they're doing.
00:04:02.000 But can we just say that someone's saying go kill that guy?
00:04:05.000 Well, that's illegal.
00:04:06.000 No, it's not imminent.
00:04:08.000 So imminent threat is different than a threat.
00:04:10.000 They changed it from imminent to true threat.
00:04:13.000 Who did?
00:04:15.000 One time we got contacted by Pennsylvania and they were telling us that legally true threat is the language.
00:04:23.000 How do you know if someone's being sarcastic half the time?
00:04:25.000 Sarcasm is out the window.
00:04:27.000 Sarcasm, by the way, guys, does not fly on social media and text.
00:04:30.000 Do not be sarcastic in text because they will define it as not sarcastic for the sake of moderation.
00:04:35.000 Absolutely.
00:04:36.000 All right.
00:04:37.000 So we'll get into all this stuff because, well, you know, not to bury everything in the intro, but Luke is hanging out, of course.
00:04:41.000 Yeah.
00:04:41.000 Talking about sarcasm, I just wanted to say I'm so happy that our tech overlords, unaccountable multinational corporations with unlimited power are not keeping us safe.
00:04:52.000 I think we all needed to be safe from, you know, words and sounds and speech, from our small little ears and our feeble minds, and especially words from democratically elected government officials.
00:05:07.000 I feel safe just like if I would be in the Matrix in that little bubble energy blanket.
00:05:12.000 Luke, Luke, Mark Zuckerberg is successful.
00:05:16.000 Okay, it is by merit that he has gained the power to shut down the President of the United States.
00:05:21.000 Yes, and I'm saying, I'm so happy he did this because I'm safe from, you know, the bad words and sounds and, you know, we're so safe that you don't have to go to wearechange.org and in the right top hand corner put in your email.
00:05:33.000 You could definitely follow me on all the mainline channels under We Are Change, so you got nothing to worry about.
00:05:38.000 We're safe and happy and protected now.
00:05:39.000 That's right.
00:05:41.000 What are you talking about?
00:05:42.000 All right.
00:05:42.000 All right.
00:05:43.000 Look, the president, there's a ton of stuff going on.
00:05:45.000 But beyond all of this, I think we are looking at the exponential escalation.
00:05:54.000 Joe Biden went on TV and he likened Hawley and Cruz, two sitting U.S.
00:05:59.000 senators, to Goebbels.
00:06:02.000 This is absolutely off the rails, and it's not a joke.
00:06:07.000 What we saw yesterday, I have some ideas about what I think is going to happen and what I'm really worried about, but it's going to get bad.
00:06:12.000 All right, so we'll start the show.
00:06:13.000 We're going to talk about Trump being banned permanently because there's a lot of questions here.
00:06:16.000 This is a removal of historical record.
00:06:20.000 Before we get to finishing introductions, of course, don't forget Lydia's here.
00:06:23.000 I am here.
00:06:24.000 I'm pushing all the buttons over in the corner.
00:06:25.000 And now that we've had that long introduction, because everybody wanted to, you know, we have so much to talk about, let's just jump to the first story.
00:06:32.000 From CNBC, Twitter permanently suspends Trump's account.
00:06:36.000 They say, the company said in a tweet it made the decision due to the risk of further incitement of violence, saying after a close review of recent tweets from the @realDonaldTrump account and the context around them, we have permanently suspended the account due to the risk of further incitement of violence.
00:06:51.000 The suspension amounts to a ban.
00:06:53.000 Trump can no longer access his account, and his tweets and profile picture have been deleted.
00:06:58.000 Trump had 88.7 million followers prior to his suspension.
00:07:02.000 Institutional accounts like POTUS and White House are still active.
00:07:05.000 So what if he tweeted through @POTUS something ridiculous?
00:07:09.000 Why wouldn't he?
00:07:11.000 You don't think he's going to?
00:07:13.000 I hope he does.
00:07:13.000 He still has access to Twitter.
00:07:15.000 Maybe.
00:07:15.000 And he has the POTUS account.
00:07:16.000 It's at P-O-T-U-S.
00:07:18.000 They would suspend it until Biden got inaugurated and then they'd reinstate it.
00:07:22.000 Probably.
00:07:22.000 Yeah.
00:07:23.000 They say it's a step Twitter has resisted taking for all of Trump's presidency.
00:07:27.000 While President Barack Obama was the first president to use Twitter, he mainly used the institutional POTUS account and did not rely on it as heavily as Trump has to get his message out.
00:07:37.000 Trump used his personal Twitter account to stoke supporters and even make personnel changes before they could even make it into a press release.
00:07:45.000 Yeah, you know what it was?
00:07:46.000 This allowed Trump to bypass the media and they hated him for it.
00:07:49.000 Now I'll tell you, we got a lot of questions here.
00:07:52.000 This was a historical record, the things that the president had been tweeting.
00:07:55.000 But before we bring that up and those issues, let me tell you, I cannot believe the biggest mistake Donald Trump made, and I don't necessarily blame him.
00:08:02.000 He's an old guy, you know.
00:08:04.000 He's not social media savvy.
00:08:06.000 He is pretty savvy for his age, and I'm not trying to disparage, you know, people who are older.
00:08:11.000 He could have, at any moment, tweeted out to 88.7 million people, follow me on Parler, follow me on Minds, follow me anywhere else.
00:08:22.000 And then he could have had those 88.7 million choose to go somewhere else.
00:08:26.000 In fact, just by posting on any of these other platforms, many people would have said, what did he say?
00:08:31.000 Where did he say it?
00:08:32.000 Let's go check it out.
00:08:33.000 The media would be forced to cover it.
00:08:34.000 He didn't do it.
00:08:35.000 Now there are reports that Donald Trump has joined Parler, and this is according to Fox News' Sean Hannity.
00:08:41.000 Parler is currently being hugged to death, is what they say.
00:08:44.000 What it basically means is so many people are flocking to Parler that the traffic is kind of overloading it, making it slow and hard to use.
00:08:50.000 I've had no problem logging... Well, I've had a little trouble logging in, but once I got logged in, it seemed like everything was fine, so they must be dealing with it.
00:08:57.000 But it's something Trump could have done a long time ago.
00:09:00.000 Now, I guess we should just start by talking about the historical precedent of Jack Dorsey and Mark Zuckerberg and these tech CEOs having the ability to tell the president, you cannot speak to the American people.
00:09:09.000 Well, why is it that an avalanche of bans happen every time?
00:09:13.000 It's just like, is there communication happening between them?
00:09:18.000 Is it social pressure?
00:09:19.000 Like, what do you think is really going on here?
00:09:20.000 I think they're all cowards.
00:09:22.000 And then when one network says, we have to remove this, all of the rest of them go, now it's safe for us to do.
00:09:29.000 Another thing we really have to understand here is that this is a transition of power.
00:09:34.000 Speech is power.
00:09:35.000 And if you're not able to talk to your supporters, if you have a whole political party that right now is voiceless, that's a major step in what I believe is an extremely wrong direction.
00:09:48.000 The last time we had censorship, the major censorship story, was of course the Hunter Biden story.
00:09:53.000 It was wrong.
00:09:55.000 They were wrong about that story.
00:09:56.000 They censored it for, I believe, in my own opinion, political needs.
00:10:01.000 And then it came out that, oh yeah, the Hunter Biden story was true all along.
00:10:05.000 After the election.
00:10:06.000 After the election.
00:10:08.000 After, of course, Joe Biden, you know, I don't even want to... We've got to be careful with our language here.
00:10:14.000 After the election was called for Joe Biden, the story popped back up in the media.
00:10:18.000 Yeah.
00:10:19.000 This is, I mean, in my opinion, 1984 just happened today.
00:10:23.000 Wonder Woman 1984?
00:10:24.000 No, not that one.
00:10:25.000 Is that a coincidence?
00:10:26.000 Unless, of course, it was Joe Biden who grabbed the Dreamstone and said, I want to be president!
00:10:31.000 The whole world's on fire.
00:10:32.000 A lot of this is being done, allegedly, for your safety, but let's not disguise it.
00:10:36.000 Let's not lie about it.
00:10:37.000 This is essentially political and cultural dominance.
00:10:40.000 That's exactly what it is.
00:10:42.000 This whole system was abused before, and this is the continuation of that abuse of power.
00:10:46.000 I'll tell you what I said.
00:10:47.000 It's a major political party.
00:10:51.000 Demanding of massive, multinational, billion-dollar corporations the removal of their political opposition.
00:10:58.000 It's a cultural coup.
00:11:00.000 They are removing their opposition from the discourse, period.
00:11:03.000 And it's been happening for some time, and this is the most dramatic escalation.
00:11:07.000 We saw back in, what was it, 2018, when they got rid of, like, Jones and, like, Milo Yiannopoulos.
00:11:12.000 And they targeted certain people.
00:11:14.000 That was, you know, a slow uptick.
00:11:17.000 Now, it is the most dramatic.
00:11:19.000 People, high-profile personalities, not just on the right, are reporting they're losing thousands of followers.
00:11:24.000 We can't actually see the level by which this purge is happening.
00:11:28.000 The censorship is bigger than just the president, and it could be even hundreds of thousands of people.
00:11:34.000 I mean, I think I just lost a thousand followers on Twitter myself, but we can't say that we didn't see this coming.
00:11:40.000 I mean, just even, what was it, a couple weeks ago, we were talking about this very topic, and we talked about, will Donald Trump be banned?
00:11:47.000 And we both agreed and said, yes, Donald Trump's going to be banned.
00:11:50.000 Our timeline was wrong because we said after the inauguration.
00:11:54.000 No, I disagreed.
00:11:55.000 I said they would lose too much money.
00:12:00.000 Early on, I thought they would totally do it because they hate the guy.
00:12:04.000 But then I started talking to people who said, you know, look at their user base.
00:12:08.000 It was in decline before Trump.
00:12:09.000 So I think someone mentioned Twitter stock went down when they banned Trump.
00:12:12.000 Like, it's probably going to plummet.
00:12:13.000 In my opinion, I'm not, you know, I don't own any stock in Twitter, but I imagine it would go down because Trump made that platform.
00:12:22.000 It didn't take a genius to understand that this was going to be their crisis.
00:12:29.000 This was going to be the event that they use, that they're going to exploit, and that they're going to purposefully inflate as a major threat.
00:12:37.000 And that's exactly what they're doing.
00:12:38.000 They're doing it very disingenuously.
00:12:40.000 And they're doing it on a bed of lies.
00:12:43.000 And that's another important aspect here.
00:12:45.000 Nothing good is done based on a huge foundation of lies.
00:12:50.000 So we have to understand that this huge major move, this huge transition of power, all is
00:12:55.000 happening under false pretenses.
00:12:58.000 And it's worrying.
00:13:00.000 They are engineering
00:13:01.000 echo chambers. This is a global echo chamber.
00:13:05.000 And radicalization.
00:13:06.000 And radicalization. And see, the studies show evidence actually on both sides. It is true
00:13:11.000 that social media can cause you to go down a rabbit hole of ideology. But it is also
00:13:16.000 true that communication and social media is the only way you can become de-radicalized.
00:13:21.000 So there's evidence on both sides, but the overwhelming evidence shows that there's massive
00:13:28.000 blowback of censorship.
00:13:30.000 You're shutting down dialogue, which is our only option other than taking it to other levels.
00:13:34.000 Our only option for de-escalation is communication, is being able to hear people out, and not putting people on the fringe.
00:13:40.000 More people are going to be on the fringe, and more people who are on the fringe are only going to become fringier.
00:13:44.000 It's not the fringe anymore.
00:13:46.000 It's when they start banning your run-of-the-mill conservatives who voice their support for the president.
00:13:51.000 You don't like the president? You can disagree with him. These are regular people, and what
00:13:56.000 you're doing is, I think we talked about this before, Bill. Actually, I think I mentioned we talked about it on Joe
00:14:01.000 Rogan. You take a regular person and he gets arrested for pot and
00:14:05.000 you put him in prison with hardened criminals, and guess what happens to that person?
00:14:10.000 You know, you take some young person who's like, you know, first charge, they go to prison, and now they're around all the, you know, burglars, robbers, murderers, and that's what you're doing.
00:14:18.000 You're putting people in these environments.
00:14:21.000 It's 70 million.
00:14:23.000 It's 70 plus million people.
00:14:25.000 75.
00:14:25.000 75 million plus people that voted for Donald Trump.
00:14:28.000 And again, last night, I said on the show, the censorship is about to reach levels that we have never seen before.
00:14:34.000 We are here today.
00:14:35.000 And again, it's only going to get worse.
00:14:37.000 Sorry, Ian, I cut you off.
00:14:38.000 No, it's okay, man.
00:14:39.000 Brandon Straka.
00:14:40.000 I don't know if you're familiar.
00:14:40.000 He does the walk away campaign.
00:14:42.000 Hashtag walk away.
00:14:43.000 His campaign was banned off.
00:14:44.000 Off Twitter.
00:14:44.000 This is like a kind hearted, good dude.
00:14:47.000 And he's the kind of guy that if you bust him for pot and put him in prison with a bunch of criminals, he's going to get twisted.
00:14:54.000 And so they banned him.
00:14:56.000 They banned his entire Facebook page campaign of, I don't know, 50,000 people.
00:15:00.000 All his people.
00:15:01.000 All his volunteers and employees have now been banned off Twitter or off Facebook.
00:15:10.000 This is another thing.
00:15:11.000 Books are being unpublished.
00:15:13.000 There's even Democratic committees calling for no-fly lists for individuals who are a part of the right.
00:15:18.000 I mean, this is a new level of authoritarianism.
00:15:23.000 It's here.
00:15:25.000 It's real.
00:15:26.000 I remember I had a conversation with Joe Rogan, Vijaya Gadde, and Jack Dorsey.
00:15:31.000 On The Joe Rogan Experience.
00:15:34.000 I think I remember that one.
00:15:35.000 At the end of the episode, I said, if you keep doing this, it is going to get really bad in this country.
00:15:42.000 What you are doing, and then what did we see all throughout this year, and what did we just see in the Capitol?
00:15:47.000 And then I said, I'm gonna build a van!
00:15:50.000 And they all laughed, and everybody laughed.
00:15:51.000 I got emails, they're like, oh you crazy dick, you're gonna go build a van, nothing's gonna happen!
00:15:56.000 Where are we now?
00:15:58.000 So here's what I think.
00:16:00.000 I think we're not looking at a linear path of escalation.
00:16:03.000 I think we're looking at an exponential path of escalation.
00:16:05.000 We went from a bunch of people in D.C., many of them breaking into the Capitol, which I think was ridiculous and stupid.
00:16:12.000 People are now dead.
00:16:13.000 A cop was bashed over the head.
00:16:14.000 That's insane.
00:16:15.000 But many of these people walked right in.
00:16:16.000 There's videos where the cops open the door.
00:16:19.000 And one cop says that he agrees with their right to protest.
00:16:21.000 And as they waltz on in, some of these people are bewildered.
00:16:25.000 I saw that and I thought to myself that the initial reaction will be overwhelmingly negative.
00:16:31.000 Now I think because of the mass purge, you're going to start seeing an overwhelmingly positive reaction from people who feel like they've been excised from society.
00:16:39.000 So there were a lot of people posting on Twitter that they felt that their opinions were watered down out of fear of being banned.
00:16:45.000 Well, don't worry!
00:16:46.000 Twitter just did the hard work for you.
00:16:48.000 Now these people have been kicked off after they said they followed the rules.
00:16:51.000 Not everybody did.
00:16:52.000 A lot of people being banned are, like, you know, advocating for some crazy stuff.
00:16:55.000 But there are a lot of people who are saying things like, well, I shouldn't say anything, you know, and there's tweets saying, I think we're all holding back.
00:17:03.000 They get banned.
00:17:04.000 Now what?
00:17:05.000 Okay, I guess there's no point anymore.
00:17:07.000 So here's what I think.
00:17:09.000 I think after what we saw, the immediate reaction was insane.
00:17:14.000 You've got politicians like Cori Bush and AOC demanding the expulsion of Republicans that supported Trump.
00:17:20.000 Simon & Schuster, a major book publisher, announced they're breaking their contract with Senator Josh Hawley.
00:17:27.000 He challenged them, saying he's going to sue them.
00:17:29.000 This is, I mean, it's dramatic escalation where the culture is being split.
00:17:34.000 People are being demonized at an ever-increasing rate.
00:17:37.000 And now we've come to the point where you actually have Joe Biden going on TV and likening two sitting U.S.
00:17:43.000 senators to Nazi propagandists.
00:17:45.000 This level of demonization and dehumanization is the precursor to horrifying things.
00:17:52.000 I'll spare some of the more hyperbolic words, but when you look back at history, historical civil wars, the start of major wars, this is the kind of thing that happens just before.
00:18:02.000 And what do you think's gonna happen when the incoming president announces,
00:18:07.000 the Wall Street Journal reported this, sweeping new domestic terror laws, Joe Biden's announced,
00:18:12.000 while calling sitting Republicans, likening them to Nazi propagandists.
00:18:17.000 What do you think that demonization will lead to?
00:18:19.000 The demands for expulsion.
00:18:21.000 Do you think that the Republicans are gonna be like, we're so sorry, Democrats?
00:18:24.000 No, the supporters are being forced into a totally different echo chamber reality
00:18:29.000 where people are angry.
00:18:32.000 We had David Cross tweet that he wanted blood.
00:18:35.000 Whether it was a joke or whatever the point was.
00:18:36.000 Yeah, we don't know.
00:18:37.000 That's the problem.
00:18:37.000 Don't know, don't care.
00:18:39.000 Sarcasm doesn't work on the internet, I guess.
00:18:40.000 Not in text.
00:18:41.000 It's similar to Sacha Baron Cohen being, you know, so pro.
00:18:45.000 Yeah, it's like, don't you understand that comedians need to be protected too?
00:18:49.000 This is coming for you.
00:18:50.000 Yeah, people don't understand.
00:18:52.000 People on the left, you're going to be affected by this sooner or later.
00:18:55.000 As soon as your paths cross the establishment and the talking points and the narrative that they want to push, I don't care who you are.
00:19:02.000 It could even happen to BLM.
00:19:03.000 BLM still hasn't had their meeting with Biden.
00:19:07.000 Last I saw in a couple days, I don't know if I may stand corrected here.
00:19:10.000 But anyone could stand in the way of this big, unaccountable, totalitarian monster that holds one of the biggest sacred powers in the world.
00:19:20.000 One of the things that makes America great, more than any other place in the world, is our free speech.
00:19:27.000 Once you get rid of that, once you limit that, once you stop allowing people the ability to freely communicate with each other, we are in uncharted territories where it is ripe for abuse.
00:19:38.000 The heartbreaking thing, Tim, that you kind of alluded to is that regular Democrats and Republicans cannot even speak to each other.
00:19:49.000 Families are being torn apart. There is a tweet from a young woman ratting out her mom for being at a protest, I think the precursor protest, not the one at the Capitol but the night before, saying, this is my mom, here's her name, here's my dad, and then the mom got fired.
00:20:06.000 That's crazy stuff.
00:20:07.000 They're breaking up families.
00:20:08.000 Do you see the USA Today article where they put out 29 pictures of people that were at the Capitol?
00:20:14.000 If you can help us find their names and phone numbers, give us their information and help us find these criminals.
00:20:19.000 Remember when Andy Ngo just tweeted out, this individual has been arrested?
00:20:24.000 And the left said he is creating kill lists and doxing people.
00:20:28.000 And they started smearing him and attacking him.
00:20:30.000 And now media across the country is putting up people's names saying, find them, find them.
00:20:35.000 Sure, criminals, people accused of committing crimes, by all means, send your tips.
00:20:40.000 But I'm talking about the double standard.
00:20:42.000 Andy Ngo would simply be like, the police arrested this individual, and then they would attack him for it.
00:20:47.000 Then you get the media coming out now saying, we don't know who these people are or what they were doing.
00:20:50.000 In fact, what if some of these people were credentialed journalists?
00:20:53.000 Elijah Schaefer, for instance, of Blaze TV, a reporter, was getting attacked relentlessly by people on social media.
00:21:00.000 I'll admit, some of his tweets were a little bombastic.
00:21:03.000 He had one tweet where he called it a revolution?
00:21:05.000 Okay, dude, chill.
00:21:06.000 But he is a credentialed member of the press who works for the Blaze TV.
00:21:09.000 He got banned by Facebook, probably because of these hit pieces.
00:21:13.000 And then only after I think Glenn Beck came out and said, you know, and complained about it, they reinstated him.
00:21:18.000 That's the danger of publishing a face and then saying, find him!
00:21:21.000 Because we all saw what happened after the Boston bombing on Reddit.
00:21:24.000 When all these good-hearted Redditors said, we must find these criminals, and they ended up doxing random innocent people.
00:21:31.000 That's the problem.
00:21:32.000 So you have Andy Ngo.
00:23:33.000 Here's the person's picture, they were arrested on this charge, and they threaten him, and they attack him, and they accuse him of doxing, and he's a villain.
00:21:39.000 He's a bad guy.
00:21:41.000 Now the media is totally on board.
00:21:42.000 Totally okay with going even beyond that and saying, identify these people.
00:21:46.000 And they're doing it.
00:21:47.000 They're doing it.
00:21:47.000 They're doing, you know, they're publishing names.
00:21:49.000 And here's the problem.
00:21:50.000 There's no actual empirical data or evidence that shows what they're doing is making the world a safer place.
00:21:57.000 In fact, the opposite.
00:21:58.000 And so what we're actually trying to do is, on a 10-year basis, A/B test.
00:22:03.000 A/B test a strategy and see who can have a higher rate of de-radicalization.
00:22:09.000 Here's the thing.
00:22:10.000 We are going to be able to prove that we have a higher rate of de-radicalization on Minds because you cannot de-radicalize someone that you have just banned.
00:22:21.000 Their rate of de-radicalization will be zero. And I say this all the time.
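To make the A/B framing above concrete, here is a purely illustrative Python sketch comparing de-radicalization rates between a ban-first and an engage-first strategy. The cohorts and numbers are made up placeholders, not measurements; the point is simply that banned users drop out of the pool that could ever have been de-radicalized.

```python
def deradicalization_rate(cohort):
    """Share of flagged users later reclassified as no longer extreme.
    Banned users can't be re-engaged, so they can never convert."""
    reachable = [u for u in cohort if not u["banned"]]
    converted = [u for u in reachable if u["deradicalized"]]
    return len(converted) / len(cohort) if cohort else 0.0

# Made-up cohorts for the two strategies (placeholder numbers, not data).
ban_first = ([{"banned": True, "deradicalized": False}] * 80
             + [{"banned": False, "deradicalized": True}] * 5
             + [{"banned": False, "deradicalized": False}] * 15)
engage_first = ([{"banned": False, "deradicalized": True}] * 30
                + [{"banned": False, "deradicalized": False}] * 70)

print("ban-first rate:", deradicalization_rate(ban_first))        # 0.05
print("engage-first rate:", deradicalization_rate(engage_first))  # 0.3
```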
00:22:27.000 If you have somebody who does bad things and then one day they come out and they apologize,
00:22:33.000 accept their apology.
00:22:34.000 I mean, maybe not the second or third time, you know, it's up to you.
00:22:37.000 But if someone's like, you know, I shouldn't have been doing those things, please accept my apology. You say yes.
00:22:41.000 Please come join us doing the right thing. If you don't, they'll say that I have no opportunity
00:22:46.000 but to go to anyone who will be willing to accept me.
00:22:49.000 So if you've got someone who makes hate speech and they're, you know,
00:22:52.000 they're going on Twitter and they're saying rabble rabble and offensive words,
00:22:55.000 and you decide the best way to get rid of them is to ban them.
00:22:58.000 The only place they can go is to where everyone agrees with them, and they're allowed to, you know, say whatever they want.
00:23:04.000 If you truly don't want them to believe these things, the best thing you can do is talk to them.
00:23:08.000 Like that woman from the Westboro Baptist Church.
00:23:10.000 That was on Twitter, right?
00:23:12.000 People started reaching out to her, and then she flipped from being a member of this church to being like, hey, that was wrong, I shouldn't have been saying these things.
00:23:19.000 People really changed my mind and were nice to me.
00:23:21.000 I don't think they want this.
00:23:23.000 This is what I think, you know, in my opinion, I think they want the violence.
00:23:28.000 Ted Cruz came out and said, we all need to put the anger aside and come together.
00:23:32.000 And AOC responded by saying, no, you should be expelled from Congress.
00:23:37.000 OK, well, that's just going to make things worse.
00:23:40.000 He literally said, can we put the anger aside and come together?
00:23:43.000 And you attacked him for it.
00:23:44.000 To score brownie points on the Internet or whatever, I guess.
00:23:47.000 So then Ted Cruz has no opportunity whatsoever to come together with anyone.
00:23:51.000 So they won't.
00:23:52.000 And then his followers won't either.
00:23:53.000 And then the hyperpolarization continues.
00:23:55.000 CNN getting 9 million, you know, viewers.
00:23:57.000 Their biggest ever.
00:23:59.000 And you have this handoff between Cuomo and Don Lemon where they're scoffing and insulting and berating and degrading.
00:24:05.000 Instead of being calm and serious.
00:24:06.000 Even Shep Smith.
00:24:07.000 Get that off the TV!
00:24:08.000 We're not showing this!
00:24:08.000 That's not true!
00:24:09.000 That is not how you handle conflict.
00:24:12.000 These people have no idea what they're doing.
00:24:13.000 Yeah, the only answer to bad speech is good speech.
00:24:16.000 And if you start to limit it, you again, putting people on the fringe and you're starting to expand it.
00:24:21.000 I remember during the kind of infancy of the internet and kind of growing up in that day and age and thinking, this is absolutely incredible.
00:24:28.000 This is an amazing tool that allows people to speak to each other.
00:24:31.000 And then I realized, just like any kind of great technological advancement, it's a tool, it's a weapon, it's a sword, and it has blades that go both ways.
00:24:40.000 And just like it has the potential to help people, it has the potential to hurt a lot of people.
00:24:45.000 And now we're seeing a huge backswing.
00:24:47.000 I remember, I remember warning about this all the way back in 2008.
00:24:51.000 I have We Are Change videos on my channel warning in 2008 saying, hey, they're already starting to turn on the boilers.
00:24:58.000 The water is starting to get warm here.
00:25:00.000 And look what they did here.
00:25:01.000 They did it very methodically.
00:25:02.000 They did it very slowly.
00:25:04.000 They didn't just ban everyone all at once.
00:25:06.000 Or they're starting to actually now.
00:25:08.000 Now, we're seeing the first bubbles in that pot.
00:25:12.000 Now, we're starting to turn a little bit of a different color.
00:25:15.000 It's too late.
00:25:16.000 Here's what I want y'all to understand.
00:25:18.000 What Twitter is doing, what Facebook is doing, and YouTube is doing, is actually diminishing their ability to control anything, and will result in substantial chaos.
00:25:26.000 If there are 100 people in a room, and you have rules, and some of those people break the rules, then you can slowly start to remove some of those people, or you can compromise and say, here's what we'll accept and here's what we won't.
00:25:37.000 And what happens is, if you decide to ban 10 people, now you have 10 people in one room and 90 in the other, and you have no say over what those 10 people are doing.
00:25:47.000 We're going to come to the point where Twitter has half of the people in one room and half in the other, and then it's going to be two equal-sized rooms with no control over the other side.
00:25:54.000 No ability to influence or de-radicalize or communicate at all, and that's when the clash happens.
00:25:59.000 And the chilling effect, you're talking about friends who are just deleting all their posts because they're scared they're going to get banned now.
00:26:05.000 And I mean, the psychological impact that it has just on the people who are allowed to be there.
00:26:12.000 And Snowden has brought this up with, you know, how surveillance impacts your brain and how you want to communicate.
00:26:18.000 He said today is a major turning point in history.
00:26:22.000 Snowden did.
00:26:23.000 Yeah.
00:26:24.000 Go ahead.
00:26:24.000 Sorry.
00:26:25.000 No, yeah, but it's just the chilling effect and the mass sort of social, psychological implications that this has, like, in the herd mentality when everybody's posting, like, either in support of it or against it, but, like, at the end of the day, all that matters is data.
00:26:42.000 What actually works?
00:26:44.000 And they're not publishing.
00:26:45.000 They know this.
00:26:46.000 They know this.
00:26:47.000 They have to be able to see that what they're doing is making things worse.
00:26:51.000 And they have to know this.
00:26:53.000 There's no way.
00:26:55.000 They've seen it before.
00:26:56.000 They've banned people before.
00:26:57.000 They've seen the reaction.
00:26:58.000 And they have to... You know what it is?
00:27:00.000 Maybe they don't have the data from external platforms.
00:27:02.000 But I'll tell you this.
00:27:03.000 Anyone with a brain can take a look at the bannings they did, and then the chaos that ensued yesterday, and be like, hey, remember two years ago when you banned all these people?
00:27:11.000 How did that work for you?
00:27:13.000 Now they're on Parler, and you have no control at all.
00:27:17.000 You've given it up.
00:27:18.000 You've said, go do your thing.
00:27:20.000 They could have compromised.
00:27:22.000 Well, now we have the next level of how insane this gets.
00:27:25.000 Confirmation from Axios.
00:27:27.000 Google suspends Parler from App Store after deadly capital violence.
00:27:32.000 This is not gonna stop it.
00:27:34.000 It's a simple APK download, okay?
00:27:37.000 Google takes it out of the Play Store, and then they put it up on their website.
00:27:40.000 So when you open your phone, and you go to parler.com, eventually at some point, I assume, it'll say, download the APK, you'll click it, boom, you got it.
00:27:46.000 What does removing it from the Play Store do?
00:27:48.000 It slows things down, significantly.
00:27:50.000 I mean, we were suspended from Google Play and App Store for like six months.
00:27:55.000 So you go to minds.com slash mobile and get the mobile app.
00:27:58.000 Now you can get the APK or you can get us in the app stores.
00:28:00.000 But like, you know, that hurt us and it will hurt Parler.
00:28:04.000 And, you know, the thing is that it's become too polarized.
00:28:09.000 It's like the left, you know, Twitter, the left wing social network, Parler, the right wing social network.
00:28:13.000 It's like we need, from a high level at the companies, to be having serious conversations, live streaming and saying, how are we going to bring this community together?
00:28:23.000 How are we going to deal with these people?
00:28:25.000 Or a nationalized social media.
00:28:27.000 Or a globalized one.
00:28:28.000 One that follows the U.S.
00:28:29.000 Constitution that we create.
00:28:31.000 I mean, it will be globalized.
00:28:32.000 It can be anybody in the world can use it.
00:28:34.000 And it will follow the U.S.
00:28:34.000 Constitution.
00:28:35.000 Yes.
00:28:36.000 I don't think they would ever, ever.
00:28:37.000 And we need an app, a website, a social network that does it, and we need an ISP that follows the U.S.
00:28:43.000 Constitution that's not a private company.
00:28:45.000 So let me read a little bit of this.
00:28:46.000 They say, quote, in order to protect user safety on Google Play, our longstanding
00:28:51.000 policies require that apps displaying user generated content have moderation policies
00:28:56.000 and enforcement that removes egregious content like posts that incite violence.
00:29:01.000 Jose Castaneda, a Google spokesperson, said, in light of this ongoing and urgent public
00:29:06.000 safety threat, we are suspending the app's listing from the Play Store until it addresses
00:29:10.000 these issues.
00:29:11.000 What does that mean?
00:29:12.000 You're not allowed to incite violence on Parler.
00:29:16.000 They'll ban you for it.
00:29:17.000 No joke.
00:29:18.000 In fact, Parler's got very strict rules.
00:29:20.000 My understanding is that they operate on what's called a broadcast standard, meaning you actually can't say certain things that are free speech.
00:29:27.000 And that's the way they run their platform.
00:29:29.000 I mean, it's a Gab 2.0 situation, basically.
00:29:32.000 And the problem, you know...
00:29:35.000 Free speech policies every network should have, but, like, the fact is there are privacy issues and transparency issues with Parler.
00:29:43.000 I'm just gonna, you know, be upfront about that.
00:29:44.000 Like, their code is not open source.
00:29:46.000 You can't see their algorithms.
00:29:47.000 If you're listening guys, open source your code.
00:29:49.000 Like...
00:29:50.000 Networks have to be open source, otherwise it's not viable in the long term.
00:29:54.000 I don't know if the Fediverse is the answer, we've talked about it before, but some kind of decentralized, federated social network.
00:30:03.000 The way it would work, it's very simple, this is how the Fediverse works, is that you don't have to use Parler.
00:30:09.000 You would use just Fediverse apps, a regular old app you download, and then you sign into your server or their server or whatever server, and it connects all the different companies into one social media system.
00:30:19.000 Yeah, the Fediverse is great.
00:30:20.000 It's the ActivityPub protocol, which Mastodon uses, which many sites are integrating.
00:30:28.000 There's another one, Pleroma.
00:30:30.000 We're working on ActivityPub integration.
00:30:32.000 It's taken way too long for us to do it.
00:30:34.000 It's a great protocol.
00:30:36.000 The problem with it, though, is that when you subscribe to someone on another node, the comment threads don't work.
00:30:42.000 Right.
00:30:43.000 So it's sort of messy.
00:30:44.000 It's still good, and it's a step in the right direction, but it's not truly decentralized social media because the admins can still ban the whole node.
00:30:52.000 They can cut off, like, if there was a network like Gab connected to Minds, one platform could cut off the other platform.
00:30:59.000 Right.
00:30:59.000 Yeah.
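For a sense of how that federation works at the protocol level, here is a minimal ActivityPub-style Follow activity written out as a Python dict. The server domains and usernames are made up, but the general shape, a JSON-LD activity delivered to the target actor's inbox and answered with an Accept, is what lets an account on one node subscribe to an account on another.

```python
import json

# Hypothetical actors on two independent Fediverse servers ("nodes").
follower = "https://node-a.example/users/alice"
followed = "https://node-b.example/users/bob"

# A minimal ActivityPub Follow activity. Node A would deliver this JSON
# (as JSON-LD) to Bob's inbox on node B, which replies with an Accept.
follow_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "id": f"{follower}/follows/1",
    "type": "Follow",
    "actor": follower,
    "object": followed,
}

print(json.dumps(follow_activity, indent=2))
```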
00:31:00.000 Well, the idea is preliminary, I suppose.
00:31:04.000 But how do we get to the point where if I choose to follow the president for his updates, no one can take that away, ever?
00:31:10.000 I think they call it Internet 2.0.
00:31:12.000 The Tron network was working on something like that.
00:31:13.000 Well, that's what we integrated with the PermaWeb, with the Arweave blockchain.
00:31:17.000 Basically, you have the option when you post to post to the PermaWeb.
00:31:23.000 Obviously, you don't want every post to be permanent.
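As a rough illustration of that opt-in design (not Minds' actual code, and not the real Arweave client API), a post can be saved to ordinary, deletable storage and only mirrored to a permanent store when the author asks. The `mirror_to_permaweb` callable below is a hypothetical stand-in for whatever writes to the blockchain.

```python
def save_post(db, post, make_permanent=False, mirror_to_permaweb=None):
    """Save a post to ordinary (deletable) storage; optionally mirror it
    to permanent storage. mirror_to_permaweb is a hypothetical callable
    standing in for an Arweave/PermaWeb write, not a real client API."""
    db.append(post)
    receipt = None
    if make_permanent and mirror_to_permaweb is not None:
        receipt = mirror_to_permaweb(post)  # e.g. a transaction id
    return receipt

# Toy usage with an in-memory "database" and a fake permanent store.
database, permanent_store = [], []

def fake_mirror(post):
    permanent_store.append(post)
    return f"tx-{len(permanent_store)}"

print(save_post(database, {"text": "hello"}, make_permanent=True,
                mirror_to_permaweb=fake_mirror))  # -> tx-1
```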
00:31:24.000 But the reality is that we're moving into a blockchain decentralized world.
00:31:28.000 And that is a little bit scary.
00:31:31.000 So Ethereum is going to skyrocket.
00:31:33.000 Ethereum.
00:31:34.000 Yeah, I just bought some.
00:31:36.000 32.
00:31:36.000 But you bring up a blockchain.
00:31:39.000 I think what a lot of people need to realize is that a lot of these networks that use crypto, they're built off of the Ethereum cryptocurrency.
00:31:44.000 And there are some other amazing decentralized social networks like Scuttlebutt is fully decentralized.
00:31:49.000 It's sort of like techie and tough to use, but it is fully decentralized.
00:31:55.000 So it's on your machine, everything.
00:31:58.000 So regarding the Arweave network, if you were to post something on the Arweave blockchain and then you owned that post and it was there forever, could it then generate crypto tokens for every view that it accrued?
00:32:09.000 That's not how their system works, but I mean, theoretically, you could build whatever you want.
00:32:14.000 I remember at Burning Man, some people started to do peer-to-peer Bluetooth communications.
00:32:21.000 Yeah.
00:32:23.000 I was on a cruise and they told everyone when you come on to download this app, I forgot what it was called, but you turn on Bluetooth and then it is the craziest thing.
00:32:32.000 You can walk past one person and all the data is being transmitted from like my app to their app.
00:32:36.000 And then when they walk past another person, it bounces to like five more people and then created this mesh internet.
00:32:41.000 where if I was standing ten feet from you, you were ten feet from me and Ian was ten feet from, you know,
00:32:46.000 you know, Bob Smith, and then Bob Smith sent a message, it would relay through everyone to me.
00:32:50.000 Absolutely. Yeah, local networks. That's amazing.
00:32:52.000 And that's really important for countries that don't have serious infrastructure.
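The relaying described there is essentially flooding: every node that hears a message for the first time rebroadcasts it to its in-range neighbors, so it hops through the chain even between people who were never near each other. A small Python simulation of that idea follows; it is a conceptual sketch, not the actual app's protocol.

```python
from collections import deque

def flood(adjacency, origin):
    """Relay a message hop by hop through a mesh: every node that hears
    it for the first time rebroadcasts it to its in-range neighbors."""
    delivered = {origin}
    queue = deque([origin])
    while queue:
        node = queue.popleft()
        for neighbor in adjacency[node]:
            if neighbor not in delivered:
                delivered.add(neighbor)   # neighbor receives the message
                queue.append(neighbor)    # ...and relays it onward
    return delivered

# Toy mesh: Tim <-> Ian <-> Bob; Bob's message still reaches Tim via Ian.
mesh = {"Tim": ["Ian"], "Ian": ["Tim", "Bob"], "Bob": ["Ian"]}
print(flood(mesh, "Bob"))   # {'Bob', 'Ian', 'Tim'}
```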
00:32:55.000 Yeah. Well, also in the United Kingdom, when rock and roll was banned,
00:33:00.000 what they started to do is literal pirate ships.
00:33:03.000 They had ships that would broadcast, uh, rock and roll.
00:33:07.000 Oh, I saw that movie.
00:33:08.000 Yeah.
00:33:09.000 Pirate radio.
00:33:10.000 There's, there's, you know, there's whole movies about this.
00:33:12.000 You know, censorship throughout history is something that is more common than uncommon.
00:33:18.000 So we have to understand, you know, this is going towards a trend that I don't want to be on.
00:33:23.000 Well, so here's what's going to happen.
00:33:25.000 The left controls the cultural institutions.
00:33:28.000 So a political coup, in my opinion, is totally meaningless.
00:33:31.000 It is.
00:33:32.000 What we're seeing right now is... Actually, I'll step back.
00:33:35.000 As they say, politics is downstream from culture.
00:33:39.000 By getting rid of all of the voices that can, you know, propagate certain ideas, those ideas cease to exist.
00:33:46.000 That's it.
00:33:47.000 They're gone.
00:33:47.000 They've been slowly and methodically removing certain ideas and certain people from these platforms.
00:33:52.000 And they've been doing it for a very long time.
00:33:54.000 And I remember they were doing it to individuals that I didn't like.
00:33:58.000 I didn't agree with.
00:33:59.000 I didn't identify with.
00:34:00.000 I was like, yeah, I don't like these guys too, but...
00:34:04.000 But I don't think their voice should be censored.
00:34:06.000 And I've been saying that consistently from the very beginning, but they used our own fears, our own emotions against us to justify it slowly and surely.
00:34:14.000 And now we're here today.
00:34:15.000 I think the argument is it can't be stopped.
00:34:17.000 Like, it is downstream from culture and the technology.
00:34:20.000 So we need a technological revolution that will prevent anyone from being able to do that.
00:34:24.000 It is happening, but that's why it's crazy Bitcoin's hitting all-time highs today.
00:34:28.000 I mean, that is the decentralized infrastructure for the new system.
00:34:33.000 Yeah, it's incredible.
00:34:34.000 What does a Bitcoin get you in terms of that system, though?
00:34:38.000 You can do things on top of Bitcoin as well.
00:34:40.000 There are projects that are trying to build Layer 2 solutions on top of different blockchains.
00:34:45.000 This is the thing.
00:34:46.000 It's an encryption peer-to-peer technology, and it's something new like the printing press.
00:34:52.000 So when the printing press came out, everyone in the beginning was like, oh, this is nothing, this is an old machine, this is not gonna do anything.
00:34:58.000 But it revolutionized
00:35:00.000 the way people gathered information. Bitcoin is
00:35:03.000 revolutionizing the way people transact, and just like the
00:35:06.000 internet, it's a double-edged sword. It could go either way. It could go
00:35:10.000 towards the total Venezuelan-Russian cryptocurrency
00:35:13.000 track, trace, and database system, or it could go towards a
00:35:17.000 decentralization, liberty, sovereignty, freedom
00:35:20.000 technology that helps people be individuals, helps people be
00:35:23.000 secure, and most importantly helps people be
00:35:27.000 independent. And that's the radicalization of the left in
00:35:30.000 this country. In 2016 I did an
00:35:32.000 interview with Oliver Darcy when he was working at...
00:35:34.000 He's now at CNN on Reliable Sources.
00:35:37.000 The interview was because I tweeted out in defense of alt-right white nationalists who were being removed from Twitter.
00:35:44.000 Now, these people weren't saying, you know, inciting violence.
00:35:47.000 These were people who were posting nasty opinions I didn't like.
00:35:51.000 These are people who don't like me at all because of my family's, you know, regional heritage, I'll put it that way.
00:35:57.000 And so Oliver Darcy asked me, like, what are your thoughts on this?
00:35:59.000 I said, it's a slippery slope.
00:36:01.000 You start removing people because you don't like these ideas, then eventually they're banning the left.
00:36:05.000 That was Oliver Darcy.
00:36:06.000 Recently, so the past day or so, he advocated for cable providers to ban Fox News, OAN, and Newsmax because they spread lies.
00:36:15.000 Talk about a dramatic change from where he was only four or so years ago when he was interviewing me because he thought it was newsworthy.
00:36:25.000 I said we must have free speech even for those we don't like.
00:36:28.000 I am still standing here on the same hill saying we must have free speech for those we don't like, otherwise you create conflict. And he now has gone so far that not only is he advocating for the removal of entire cable channels, he's contacting AT&T and Verizon and Comcast and saying, will you remove them?
00:36:47.000 You're responsible, etc.
00:36:49.000 You're allowed to spread lies in the United States.
00:36:51.000 It's protected.
00:36:52.000 Again, that's still absolutely unhinged, dangerous behavior: corporations who want more power, with little fleeing individuals coming to them saying, please take away all my rights, take away everything, take away the most important right of what my ears could listen to. Of course they're going to be like, yes, give it to me, I'll be your overlord daddy. Come on. Here again, we've seen this coming for a long time. One of the major criticisms against Donald Trump within the last few years is that he's not making a stand against free speech, he's not making a...
00:37:22.000 For now.
00:37:22.000 of his critics. What do you mean stand against? You mean not standing up for?
00:37:25.000 Standing up for the First Amendment, standing up for free speech and and it's
00:37:29.000 interesting to see Laura Loomer tweet this out today. She said according to a
00:37:34.000 telegram post, at least Trump can still order a sandwich on Uber Eats though.
00:37:38.000 I can't say the same, sadly. So again this has been happening to a lot of his
00:37:44.000 core supporters.
00:37:45.000 We, you know, again, the interesting thing is, Laura Loomer was beyond the canary in the coal mine.
00:37:52.000 I mean, this was like someone yelling the coal mine is full of carbon monoxide.
00:37:56.000 Get out now.
00:37:57.000 And the Republicans weren't smart enough to push through legislation while they had the chance.
00:38:01.000 So did Loomer get taken off the stripe or the the was it the Swift payment system?
00:38:05.000 She got banned from the global payment system.
00:38:07.000 I don't I don't I've never heard anything about Swift.
00:38:10.000 I think so, yeah.
00:38:11.000 Wrong Spencer.
00:38:11.000 Yes, yes, I get it. I get it. But when did you ever hear anything like that?
00:38:15.000 No, why can't she use money? Who banned her? What do you mean? What are you talking about?
00:38:17.000 Laura Loomer can't use Uber Eats. She's specifically banned.
00:38:19.000 She's not PayPal.
00:38:20.000 You can't get banned from Swift. Not her finance. No, she was banned,
00:38:22.000 I believe, from those platforms as well. But not Swift.
00:38:25.000 That is not the case.
00:38:25.000 Are you sure you can't be put on a blacklist of Swift?
00:38:28.000 Probably. I mean, Mastercard reached out to Patreon over this guy. What was his name?
00:38:32.000 Robert Spencer? I think so, yeah.
00:38:33.000 Wrong Spencer. Yeah, what?
00:38:37.000 It was the wrong Spencer.
00:38:38.000 It wasn't Richard Spencer.
00:38:39.000 Yeah, it was Robert Spencer.
00:38:41.000 Yes, and what do you mean the wrong Spencer?
00:38:42.000 He was the person they were targeting on purpose.
00:38:44.000 He writes about Jihadists.
00:38:45.000 Oh, that's right, that's right.
00:38:46.000 He writes about radical Islam, and Mastercard got angry and said to Patreon, remove him, otherwise we'll cut off services to your system.
00:38:53.000 I think Mastercard was forced to do that by Swift.
00:38:55.000 I could be wrong about that.
00:38:57.000 I mean, this is actually a historic day.
00:38:58.000 I think we actually have to zoom out and let it sink in for a second.
00:39:07.000 This proves corporations have more power than the government.
00:39:10.000 Private corporations have just removed the speech of the sitting U.S.
00:39:14.000 president.
00:39:15.000 Michael, Michael Tracy has a comment here that I think is really timely.
00:39:19.000 He says, quote, Corporate left liberals are desperate for revenge.
00:39:24.000 They will use all powers at their disposal, public and private, to neutralize their purported insurrectionist enemies.
00:39:32.000 And they absolutely do not care one bit what civil liberties are destroyed in the process.
00:39:37.000 That's what Michael Tracy said.
00:39:38.000 And they don't sound like they're being too divisive.
00:39:39.000 I don't want to think of this as left and right.
00:39:41.000 But why is speeding this?
00:39:43.000 Yeah, absolutely.
00:39:44.000 It is.
00:39:45.000 It's not though, it's way more than two sides.
00:39:47.000 It's a bunch of people with a bunch of different ideas.
00:39:51.000 But it is very frustrating that Trump didn't facilitate a cross-spectrum conversation more directly.
00:39:59.000 Like, he had the opportunity, and it seems like he really did not bring the people- Wait, Bill, you and I were in the White House at the social media event.
00:40:06.000 Yeah.
00:40:07.000 You're the CEO of a company he could be using, and someone said, will you just use another platform?
00:40:12.000 He said, which one?
00:40:13.000 Which one?
00:40:14.000 I don't know.
00:40:14.000 Donald Trump and Joe Biden.
00:40:16.000 And all the progressive, both sides.
00:40:18.000 We have to constantly be disclaiming that we need both sides.
00:40:23.000 Yes.
00:40:23.000 There's no network without both sides.
00:40:25.000 Would you consider using MindsTech for a government program that we could use?
00:40:28.000 Yeah, they can use it.
00:40:29.000 Hey, call me up.
00:40:30.000 Government?
00:40:30.000 Yeah.
00:40:31.000 I mean, make your own.
00:40:32.000 A national social network with the Minds software they could integrate with.
00:40:37.000 Absolutely.
00:40:39.000 I mean, the government just okayed banks running Bitcoin and Ethereum nodes, which is a very big deal.
00:40:48.000 What we're seeing, unfortunately, is tribalist divide.
00:40:51.000 There's not just two factions.
00:40:53.000 There's hundreds and probably thousands.
00:40:56.000 But there are two overarching parent factions.
00:40:59.000 Of these overarching parent factions, we just typically refer to them as left and right, but that means very little.
00:41:04.000 For instance, how is it that Tim Pool, who's economically left and even socially left, is considered right-wing?
00:41:12.000 Because I believe in freedom, free speech, liberty, principles, integrity, etc.
00:41:16.000 That is not something that exists, for the most part, among the left.
00:41:21.000 What the left believes in is, for one, what did we see from Cuomo on CNN during the riots?
00:41:25.000 He said, who says protests have to be peaceful, right?
00:41:28.000 And then Ramen Guy comes out and says it's right there in the First Amendment.
00:41:31.000 Mr. Cuomo, he's got to look it up.
00:41:32.000 Then you get Jake Tapper and CNN saying, oh, I can't believe what they're doing!
00:41:36.000 When the riots happened, I said, those riots are bad.
00:41:38.000 When the riots at the Capitals happened, I said, those riots are bad.
00:41:41.000 There is principle.
00:41:43.000 And I'm not saying everybody on the right has it.
00:41:45.000 I'm saying it's not something typically of the left.
00:41:48.000 It is a tendency.
00:41:49.000 I can't take the right and the left stuff, dude.
00:41:51.000 The Chinese used the rightists.
00:41:54.000 Mao's communist takeover was against the rightists.
00:41:57.000 They focused on dividing people into these camps and then targeted, used them against each other.
00:42:01.000 Yes, but what you need to understand is the divide is a real thing, not made up by someone talking.
00:42:06.000 It's a many fractionalized divide of many millions of different concepts at once.
00:42:11.000 And to just think that we're in different camps of types is crazy.
00:42:15.000 Are conservatives right now calling for the removal of their political opponents from major platforms?
00:42:20.000 That's not a conservative move.
00:42:21.000 It's a very liberal move.
00:42:23.000 Exactly.
00:42:23.000 And it's only coming from what we colloquially refer to as the right.
00:42:28.000 There are some on the left, liberals and progressives, people like Glenn Greenwald.
00:42:32.000 But now they call him alt-right.
00:42:34.000 And Matt Taibbi.
00:42:35.000 And Michael Tracy.
00:42:35.000 But who does?
00:42:36.000 I don't.
00:42:37.000 There are two tribes.
00:42:39.000 You can call them tribes.
00:42:40.000 A and B. One or two.
00:42:41.000 It doesn't matter.
00:42:42.000 Where are these two?
00:42:42.000 They are the overarching parents to numerous other tribes.
00:42:45.000 You have to use words, Ian.
00:42:46.000 Yeah, I will.
00:42:47.000 And is it because there's Democrat and Republican?
00:42:49.000 Two power parties?
00:42:50.000 So we say then there's left and there's right?
00:42:52.000 It's a reference to the French Revolution.
00:42:54.000 To those who sat on the left and those who sat on the right.
00:42:57.000 And those on the right were, you know, moderates, and those on the left were radicals who wanted revolution.
00:43:01.000 I think it's so dangerous to get into that mindset.
00:43:03.000 Well, using more specific language is actually very important.
00:43:07.000 I'm not going to sit here and say, the socialists, the anarchists, the communists, the tankies.
00:43:12.000 Just use their names, man.
00:43:14.000 Don't blame their group for what they did.
00:43:16.000 If someone does something, they're responsible for that.
00:43:19.000 It is a tendency among all of the factions on both sides to hold certain ideals and principles.
00:43:26.000 They don't completely agree with each other.
00:43:28.000 The left fights themselves all the time.
00:43:30.000 On the right, you have people who are awful and white nationalists who would defend Donald Trump, and the left doesn't.
00:43:37.000 But that is not overwhelmingly the majority.
00:43:42.000 or even a large portion of what the right represents. In fact, the alt-right and the
00:43:45.000 white nationalists actually agree with the left on a ton of left-wing policy issues.
00:43:49.000 The point is, when it comes to the cultural debate, there are two parent factions. Fine,
00:43:54.000 we won't say left and right, we'll say one and two. There you go.
00:43:55.000 Well, you're not one or two, dude.
00:43:57.000 I know. Right.
00:43:59.000 There's more than one and two.
00:44:01.000 Pragmatically, Ian, you have to use words in order to have a conversation.
00:44:05.000 It's like saying, are you going to vote Republican or Democrat this time?
00:44:08.000 Well, yeah, absolutely.
00:44:10.000 You should be talking about all the options.
00:44:11.000 There are some initial reports.
00:44:12.000 I haven't been able to independently verify this, but we're hearing that President Donald Trump was trying to use the POTUS account.
00:45:19.000 And then he was tweeting from it, but Twitter removed the tweet the instant it happened.
00:44:24.000 Again, that's just some of the reports that are coming in right now from the at POTUS account.
00:44:29.000 Now the POTUS account is still active.
00:44:31.000 The latest allegations and reports that are coming in now that I have not verified is that he did tweet and it was deleted by Twitter.
00:45:37.000 That would be a shadow ban.
00:44:38.000 So that's what I'm hearing right now.
00:44:41.000 It's happening.
00:44:42.000 Who's reporting it though?
00:44:44.000 I mean, I just had it here.
00:44:45.000 I just went up here.
00:44:46.000 I already saw like three people tweet it.
00:44:50.000 Well, we'll search for it and we'll see what we can find.
00:44:52.000 So he's been purportedly shadow banned on his other account.
00:44:56.000 Well, it's not his account.
00:44:57.000 It's our account.
00:44:58.000 Yes, we have images here.
00:44:59.000 Josh Kaplan, verified Twitter user.
00:45:01.000 He is a homepage editor for Breitbart News.
00:45:03.000 Tweets.
00:45:04.000 Twitter deletes series of tweets presumably written by President Trump on the POTUS account following the permanent ban of his account.
00:45:10.000 Wow!
00:45:11.000 He tweeted 8.29 p.m.
00:45:12.000 today.
00:45:13.000 As I've been saying for a long time, Twitter has gone further and further in banning free speech and tonight Twitter employees have coordinated with the Democrats and the radical left in removing my account from their platform to silence me and you, the 75 million great Patriots who voted for me.
00:45:28.000 Twitter may be a private company, but without the government's gift of Section 230, they would not exist for long.
00:45:33.000 I predicted this would happen.
00:45:35.000 We have been negotiating with various other sites, and I will have a big announcement soon.
00:45:39.000 While we also look at the possibilities of building out our own platform in the near future, we will not be silenced.
00:45:45.000 Twitter is not about free speech.
00:45:47.000 They are all about promoting a radical left platform, where some of the most vicious people in the world are allowed to speak freely.
00:45:53.000 Stay tuned.
00:45:54.000 This did come from the president on the POTUS account, and it has been removed.
00:45:59.000 And another thing that we're hearing is that he might give a publicly televised address soon.
00:46:04.000 But the thing is, who's going to hear it?
00:46:05.000 Because when we saw during the election, many news organizations just cut him out.
00:46:11.000 They said, no, we're not going to play the president's address.
00:46:13.000 We're just going to stop it here.
00:46:14.000 What the president is saying is wrong.
00:46:16.000 We don't agree with it.
00:46:17.000 And there was huge editorializing, even on Fox News.
00:46:20.000 I'll tell you what I'm worried about.
00:46:21.000 Before the mass purge they just pulled off, I think many people were willing to accept that Trump had finally been defeated.
00:46:26.000 If he does a national address now, I mean, well, you want to hear it.
00:46:40.000 Mike Cernovich, for instance, one of the most prominent Trump supporters ever.
00:46:46.000 He put up a poll and he said, did Donald Trump concede?
00:46:49.000 Yes, no, unsure, show me the results, something like that.
00:46:54.000 And around 25% said no.
00:46:56.000 Donald Trump made a video where he said there will be a new administration and we will peacefully transition.
00:47:01.000 I made a video, Trump confirms defeat, Mike Pence certifies Joe Biden, Trump said it, I don't know what else you're supposed to take from that.
00:47:08.000 But many Trump supporters, around 25% of those Mike Cernovich polled, said he didn't concede.
00:47:14.000 Mike responded by saying, to those who believe this, I love you, please unfollow people who have made you believe this, go home to your loved ones, they miss you, and, you know, it's time to stop.
00:47:26.000 Essentially, I'm paraphrasing.
00:47:28.000 He's right.
00:47:29.000 There are people who are ardent and prominent Trump supporters who are saying, guys, please, please, enough, okay?
00:47:35.000 Trump has said he's lost, it's over.
00:47:38.000 Then they do this.
00:47:39.000 Then Twitter comes out and does this, and they do more, and they do more.
00:47:42.000 Now they're purging people left and right.
00:47:44.000 And I think many of those people who are probably like, yeah, you know, that was probably dumb at the Capitol, and I guess it's over, now they're enraged.
00:47:50.000 Now they're angry.
00:47:51.000 You took, like, it's like, here's how I imagine it.
00:47:54.000 You've got a guy at a bar, right?
00:47:57.000 And he's disparaging you.
00:47:59.000 And then you're about to fight.
00:48:00.000 And then he goes, you know what?
00:48:01.000 You know what?
00:48:01.000 I'm not doing this.
00:48:03.000 I'm outta here.
00:48:04.000 He turns around and walks away, so you throw a can at his head.
00:48:07.000 And then he turns around and says, that's it.
00:48:09.000 That's what we had.
00:48:10.000 People were right.
00:48:11.000 Not everybody was walking away, for sure.
00:48:12.000 A lot of people were saying crazy things.
00:48:14.000 But a lot of people were like, alright, alright, walking away.
00:48:16.000 And then Twitter was like, not yet.
00:48:18.000 Whipped him in the head.
00:48:19.000 And then they turned around, and now I think they're gonna explode.
00:48:22.000 And now my concern is, after seeing this, there is real fear about multinational corporations shutting down the president.
00:48:32.000 It should not be allowed, in my opinion, by U.S.
00:48:34.000 law, that our executive, our chief executive, could be shut down on a major communication platform that has essentially seized the commons in terms of communication.
00:48:44.000 This is a major power grab.
00:48:45.000 We have to understand, historically, when an entity or a force goes after power and gets it, they go after more.
00:48:52.000 You know, there's an expression, if you give your pinky, you're gonna give up your whole hand eventually.
00:48:56.000 Give an inch to take a mile.
00:48:57.000 Exactly.
00:48:58.000 So, this is a slippery slope.
00:49:00.000 This is the snowball that's been happening ever since 2008.
00:49:05.000 And it's been snowballing and it's getting big and it's only going to get bigger because who's going to stand in the way?
00:49:10.000 Who's going to promote free speech?
00:49:12.000 Who's going to uphold it?
00:49:13.000 Who's going to protect it?
00:49:14.000 What institutions are out there that could actually make a stand here?
00:49:18.000 I don't see Fox News doing that at all.
00:49:20.000 I don't see any other website.
00:49:22.000 All of the means of communication have been hijacked and are in the hands of special interests and few people.
00:49:29.000 Is there any hope, guys?
00:49:30.000 Let me tell you something.
00:49:32.000 When Trump won, there were a lot of Trump supporters being mean and snide and mocking and belittling as people screamed and cried and memes went crazy.
00:49:41.000 And now we're seeing the same.
00:49:42.000 We're seeing a lot of the same on the other side.
00:49:44.000 But now it's not coming from random people on the Internet.
00:49:47.000 It's coming from CNN.
00:49:49.000 We have Asha Rangappa.
00:49:51.000 She is, I'm pretty sure, let me make sure, yes, FBI, former FBI special agent and CNN analyst said, I'm not even going to screenshot what he's tweeting from the POTUS account.
00:50:00.000 Twitter has already taken it down, but boo-boo mad.
00:50:03.000 This is the kind of dismissive and insulting and humiliating content coming from prominent institutions, which will trigger mass rage.
00:50:13.000 I don't care if random Twitter user 123 tweets nasty things at me.
00:50:19.000 I don't care.
00:50:20.000 I don't know who you are.
00:50:21.000 I don't care to make YouTube videos about random Twitter users who say dumb things.
00:50:26.000 But when it's someone who works for a major cultural institution or politics and they have power and influence, I think it needs to be talked about and called out.
00:50:34.000 They disagree.
00:50:35.000 And the funny thing is, this is actually a left-wing principle.
00:50:37.000 They say, don't punch down.
00:50:39.000 Then why is it that mainstream media is punching down, calling regular Americans stupid, mocking the way they live, mocking the way they work, mocking their lives?
00:50:49.000 Now I get it.
00:50:50.000 Mocking the president is not punching down, by all means.
00:50:53.000 Mock him.
00:50:54.000 My concern is when CNN comes out and attacks the people who are angry.
00:50:59.000 You can't, also another thing, you can't really fight fascism with fascism.
00:51:03.000 Like that's something also a lot of people need to realize here.
00:51:06.000 You can.
00:51:07.000 Except those who don't like fascism are going to be really angry about it.
00:51:10.000 You're going to get fascism.
00:51:11.000 That's the thing.
00:51:12.000 Right, right, listen.
00:51:13.000 If authoritarian leftists want authoritarianism, and they just don't want authoritarian rightists to win, well then there you go.
00:51:19.000 Authoritarianism is the battle.
00:51:20.000 I don't think a lot of people realize the larger kind of implications here.
00:51:23.000 I don't think a lot of people realize where this is going and the
00:51:27.000 historical precedent that this is setting as well.
00:51:29.000 So augment what you said, look, you can fight fascism with fascism, but
00:51:32.000 you can't defeat fascism with fascism.
00:51:34.000 Yes.
00:51:35.000 Or that's better.
00:51:37.000 You can fight fascism with fascism, but in a fascism fight,
00:51:40.000 All you'll get is fascism.
00:51:41.000 There you go.
00:51:43.000 So to talk a little bit about the, quote unquote,
00:51:46.000 fact checking strategies at Facebook. What they're doing is basically bringing in a small group of think tanks, and it's not a real system.
00:51:55.000 So, what we're actually talking about doing is working with this group, Ground.News.
00:52:01.000 I don't know if you've ever heard of them.
00:52:03.000 For every story, their algorithm grabs all the articles covering it and shows the coverage on both sides.
00:52:09.000 And they sort of are doing the best, it seems, good faith effort to show all the coverage on both sides.
00:52:16.000 And so that when you're scrolling down your feed, you can get context to what you're seeing.
00:52:23.000 And, you know, that's where we need to be going, giving people access to all of the data around the post so that they have the most information, not just saying this is true or this is false.
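To make the context feature Bill is describing concrete, here is a minimal sketch of the general idea of grouping one story's coverage by each outlet's political lean. This is not Ground.News's actual system; the outlet names, lean ratings, and field names are all invented for illustration.

```python
from collections import defaultdict

# Hypothetical lean ratings; a real service maintains a much larger, sourced list.
OUTLET_LEAN = {
    "Outlet A": "left",
    "Outlet B": "center",
    "Outlet C": "right",
}

def coverage_by_lean(articles):
    """Group articles covering the same story by the publishing outlet's lean,
    so a reader can see how each side framed it instead of a single verdict."""
    grouped = defaultdict(list)
    for article in articles:
        lean = OUTLET_LEAN.get(article["outlet"], "unrated")
        grouped[lean].append(article["headline"])
    return dict(grouped)

story_coverage = [
    {"outlet": "Outlet A", "headline": "New policy hailed as a breakthrough"},
    {"outlet": "Outlet C", "headline": "New policy slammed as overreach"},
    {"outlet": "Outlet B", "headline": "What the new policy actually does"},
]

print(coverage_by_lean(story_coverage))
# {'left': [...], 'right': [...], 'center': [...]}
```

Attaching output like that to a post is the "context, not verdicts" approach being described here.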
00:52:34.000 Facebook knew what they were doing.
00:52:36.000 Facebook knew early on that they were feeding, you know, hyper-partisan content to different sides.
00:52:42.000 In fact, there are some news organizations that knew this too and created two different versions of their content because one would feed to the left and one would feed to the right.
00:52:51.000 And there was a thing they would do, it's just general journalism A-B testing,
00:52:55.000 where they'll write an article and in some regions they would use certain titles,
00:52:59.000 different in other regions, total different framing, to see how it would perform to maximize
00:53:05.000 the amount of ads they would get, the amount of clicks they would get, and in turn the revenue.
00:53:09.000 Because they knew that somebody who lived in Texas would probably see, you know,
00:53:13.000 rather see an article that says Nancy Pelosi is bad. And the people who lived in San Francisco
00:53:18.000 probably would like to see an article that says Nancy Pelosi is good. They were doing experiments on users,
00:53:22.000 Facebook was, where they would feed them emotionally charged articles to see if it
00:53:26.000 would produce more clicks.
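For what it's worth, the headline A/B testing described here is simple to sketch. This is a generic illustration of the practice, not any outlet's real code; the variants, regions, and numbers are made up.

```python
import random

# Two framings of the same article, assigned to readers at random.
VARIANTS = {
    "A": "Nancy Pelosi is bad",
    "B": "Nancy Pelosi is good",
}

def assign_variant(impressions):
    """Pick a headline framing for an incoming reader and log the impression."""
    variant = random.choice(list(VARIANTS))
    impressions[variant] += 1
    return variant, VARIANTS[variant]

def click_through_rate(clicks, impressions):
    """Compare how each framing performed; the better-clicking one gets pushed harder."""
    return {v: clicks[v] / impressions[v] for v in VARIANTS if impressions[v]}

# Invented numbers standing in for logged reader behavior in one region.
impressions = {"A": 1000, "B": 1000}
clicks = {"A": 180, "B": 60}
print(click_through_rate(clicks, impressions))  # {'A': 0.18, 'B': 0.06}
```

Run per region, the "winning" framing in Texas can be the opposite of the one in San Francisco, which is exactly the split framing being criticized.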
00:53:28.000 Like mice in a maze.
00:53:29.000 Now they've got, you know, artificial intelligence or whatever this algorithm is that's private, you know, tweaking and deciding what people see and what gets flagged.
00:53:39.000 So I'll tell you, you know who our government really is?
00:53:41.000 It's the robots.
00:53:42.000 It's who controls the narrative, yeah.
00:53:44.000 What is to govern is the control, the mind control of the system right now.
00:53:49.000 If Facebook uses an AI to feed content to people, I think it's AI.
00:53:54.000 It might be based on machine learning.
00:53:55.000 I'm not sure.
00:53:57.000 Listen, okay.
00:53:58.000 Speaking colloquially, Facebook creates an algorithm that feeds certain content.
00:54:02.000 All that matters is, this is what I've said, Jack Dorsey has swallowed so much of his own refuse, I think he's actually been radicalized by what he's created.
00:54:13.000 If you look at him compared to where he was when Twitter launched to where he is now, he's a dramatically different person.
00:54:17.000 And why does he believe the things he believes?
00:54:19.000 He created a platform that incentivizes rage content, And then he started eating that content, which changed his brain, and then he had the keys to the castle, went in, and changed more of the platform to keep feeding into that... It's an insanity loop.
00:54:36.000 He created it.
00:54:37.000 It radicalized him.
00:54:38.000 And then he sold it.
00:54:39.000 And then he made it crazier and crazier to fit his new radicalized mind.
00:54:43.000 Think about what he said in the beginning.
00:54:44.000 We're the free speech wing of the free speech party.
00:54:46.000 But that was just a joke.
00:54:48.000 Think about what I said to him when I said, your misgendering rule is ideological because conservatives don't understand that view.
00:54:56.000 They have a totally different worldview.
00:54:58.000 But he couldn't see it because he had been swimming in the refuse he created.
00:55:03.000 Yep.
00:55:03.000 And he didn't just create it.
00:55:04.000 He made the world crazier in the instant that you brought it up.
00:55:08.000 But also another aspect to understand here, mental health has been in decline ever since we saw the rise of social media.
00:55:14.000 Many people say that is a massive correlation.
00:55:16.000 That's another way that it impacts people with depression, suicides, and a lot of other mental health issues that are skyrocketing right now as we're speaking.
00:55:26.000 So if you go out, you know, on an average day, if you talk to a neighbor or if you talk to a stranger, they're not as crazy as they are online.
00:55:35.000 You will talk to a random person.
00:55:36.000 You have a lot more in common.
00:55:38.000 You used to, but now it's becoming less and less common.
00:55:42.000 More on average, if you go out there, if you talk to your neighbor, you're more likely to get along with them than not.
00:55:47.000 The world is not as crazy as it's purported to be, but reporting it as being that crazy makes it crazy.
00:55:52.000 I think that's how it used to be.
00:55:54.000 I think you're wrong now.
00:55:55.000 I think the radicalization of social media has now led to people just outside at random being tribalized and radicalized.
00:56:03.000 And not to mention people will barely talk to each other because they have a mask on their face.
00:56:07.000 And it's like a whole other level of isolation in your own house.
00:56:12.000 So even if you're not on social media, you're not going out and being social.
00:56:18.000 Is he going to shake my hand?
00:56:19.000 Am I allowed to touch this guy's hand?
00:56:21.000 It's so weird.
00:56:22.000 Or think about when you're in a city where there's a political event going on, and they think you look like a Trump supporter, or they think you look like Antifa.
00:56:31.000 Where can you walk based on how you appear?
00:56:33.000 Imagine walking around New York City wearing... Actually, people have done this.
00:56:36.000 Walk around L.A.
00:56:37.000 wearing a MAGA hat.
00:56:37.000 See what happens.
00:56:38.000 Blaire White did it.
00:56:39.000 She got attacked.
00:56:39.000 A fingernail ripped off.
00:56:40.000 I would say a lot of the chaos today, and the mental health issues, come from text, and people attempting to communicate through text, which is a new form of communication for humans.
00:56:49.000 They used to send letters before that.
00:56:51.000 You know, written language, they all communicated with words and sounds.
00:56:56.000 And we've lost so much of our ability to communicate with our words.
00:57:01.000 I find, I have so much, you know, faith and love for people that make internet video, because you speak your mind with your words and your sound and your vibration, and it's completely different than etching something onto a stone for someone else to attempt to interpret it.
00:57:15.000 But even...
00:57:17.000 Like, this is one of the reasons we don't do Skype here, and we never... First of all, we're not set up for it.
00:57:22.000 There could be maybe some exception in the future, but we don't, because it doesn't work.
00:57:27.000 Even hearing their voice, you gotta see their face.
00:57:30.000 You have to be able to... I interrupted you, Ian.
00:57:32.000 You know what I mean?
00:57:33.000 I did notice that, yes.
00:57:34.000 To jump in and make that point.
00:57:35.000 Good.
00:57:36.000 That's what we're supposed to do.
00:57:37.000 You can't do that on a Zoom call.
00:57:38.000 In text, it wouldn't work.
00:57:40.000 Because the digital overlap, a lot of times there's a problem with delay.
00:57:42.000 The time delays and the awkwardness of, you know, you're standing at a camera, you're not standing at a screen, you don't see their face.
00:57:47.000 People don't realize that.
00:57:48.000 They think when you're doing a Skype debate or a Zoom call that you can see each other's face and look into each other's eyes.
00:57:54.000 No, you have a camera.
00:57:55.000 When I would do a Fox News or, say, in the past when I did go on MSNBC, or some of these other networks,
00:58:03.000 I'm staring into a black hole.
00:58:06.000 That's it.
00:58:06.000 I don't see myself.
00:58:07.000 I don't see them.
00:58:08.000 And you know what the worst thing about these cable networks is?
00:58:10.000 You can never hear anything.
00:58:11.000 There's a delay.
00:58:13.000 Yeah, you get an earpiece and it's so quiet and you're like, can you turn it up?
00:58:16.000 I can't hear anything.
00:58:17.000 And then they're like, uh, so we're going to go live.
00:58:20.000 The producer's always loud.
00:58:21.000 We're going to go live in about 10 seconds.
00:58:22.000 And then you hear the house go...
00:58:25.000 And I'm like, dude, I can't hear you.
00:58:27.000 You're ready to talk.
00:58:27.000 OK.
00:58:29.000 You need to be sitting down with someone to have a real meaningful conversation.
00:58:33.000 But at the same time, we're seeing all the news networks now are doing remote video chats.
00:58:38.000 And like I agree with you, it's not as good.
00:58:40.000 But I mean, to Ian's point, a video message is more effective.
00:58:44.000 Or a phone call.
00:58:45.000 Compared to a text.
00:58:46.000 A text message is dangerous.
00:58:48.000 You cannot get emotion from it.
00:58:49.000 And then people freak out because they over-interpret what they're reading.
00:58:53.000 Like sarcasm is completely lost in text.
00:58:55.000 Well, even just in relationships, like, people's girlfriends send them messages, and it's like, oh my god.
00:59:00.000 Dude, and we live in this world of it now that's radicalizing and making people insane.
00:59:04.000 I don't know how to overemphasize how dangerous it is to communicate with text and rely on it as your form of communication.
00:59:12.000 We're vibrating monkey bodies that speak words for a reason.
00:59:17.000 I think text is a great way to relay information.
00:59:20.000 Text was a revolution.
00:59:20.000 But not to communicate feelings.
00:59:23.000 It is.
00:59:24.000 Yeah, it is.
00:59:24.000 It can be.
00:59:25.000 It can be, but it depends on the context.
00:59:28.000 I think that if I want to, I can write something with a feeling in it, but if you read it, you're going to interpret your own feeling.
00:59:33.000 But if I say something to you with a feeling, you're going to feel what I'm feeling.
00:59:39.000 I was going to pull up a tweet based on what you said.
00:59:41.000 I tweeted this.
00:59:43.000 If history has taught us anything, it's that you should trust the government in times of emergency to do what's right, keep us well informed of the ongoing legitimacy of the threat, and give up emergency powers once the crisis is averted.
00:59:55.000 I intentionally just said it, but I call it a filter.
00:59:59.000 I call tweets like that a filter.
01:00:01.000 Clearly, to anybody who is hearing me talk, if I was to say this in real life, I would say something like, if history has taught us anything, it's that you should trust the government in times of emergency to do what's right, keep us well informed of the ongoing legitimacy of the threat, and to give up emergency powers once the crisis is averted.
01:00:18.000 Clearly knowing I'm being facetious.
01:00:20.000 In a tweet, people thought it was real.
01:00:22.000 Now most people retweeted it laughing.
01:00:24.000 They understood the context.
01:00:26.000 Tim's being sarcastic, facetious, etc.
01:00:28.000 But a lot of people saw that and they were like, dude, what is wrong with you?
01:00:31.000 And those people can become crazy dangerous if they don't understand.
01:00:34.000 I have a friend who's a prominent leftist who told me that people don't understand my tweets are jokes.
01:00:40.000 And I'm like, what am I supposed to do when I say something so absurd and ridiculous?
01:00:44.000 Like, I have another one.
01:00:47.000 Put a smiley face at the end.
01:00:48.000 I said, or a goofy face.
01:00:50.000 Yeah, that's all you gotta do.
01:00:51.000 So, you know what I do now is I actually reply with, hello, you must be new to Twitter.
01:00:55.000 This is a joke.
01:00:56.000 I'm not serious.
01:00:57.000 I said, the good news is now that, now with Democrats in full control, we can finally lock down the country for a couple of years to make sure COVID goes away.
01:01:04.000 And a lot of people are like, what, are you crazy?
01:01:06.000 What's wrong?
01:01:07.000 Some people really think that stuff.
01:01:09.000 They believe it.
01:01:09.000 Right.
01:01:10.000 They'll tweet that out.
01:01:10.000 Serious.
01:01:11.000 So people are like, is he, has he switched?
01:01:12.000 Has he, has he changed?
01:01:14.000 Dude, Ryan Long just tweeted.
01:01:16.000 It's the funniest thing.
01:01:17.000 Not that I, you know, want to be looking at Twitter right now, but Jack Dorsey just yelled, I'm a golden God, before jumping into his pool.
01:01:29.000 Yeah.
01:01:29.000 So there's this phenomenon where you can sell your company.
01:01:32.000 That's kind of a problem.
01:01:33.000 You create this phenomenon like Twitter, Jack did, and then he sold it.
01:01:36.000 He owns like 6% of it now.
01:01:38.000 He gave up power, gave up control.
01:01:41.000 Google was started by Larry and Sergey.
01:01:43.000 They've become monsters.
01:01:44.000 They're gone.
01:01:44.000 They're not even part of the company anymore, as far as I know.
01:01:47.000 Well, they're Alphabet.
01:01:48.000 Yeah.
01:01:48.000 Well, you said they were even off the board of Alphabet?
01:01:50.000 I heard that.
01:01:51.000 Really?
01:01:51.000 Yeah.
01:01:52.000 Maybe confirm.
01:01:54.000 They silently stepped away.
01:01:55.000 And the odd thing to me about Dorsey is that he's all about Bitcoin.
01:02:00.000 He's tweeting about Signal, which is actually a great open source.
01:02:03.000 You know, Elon was tweeting about Signal the other day.
01:02:05.000 Like, amazing project.
01:02:07.000 That was Signal.
01:02:08.000 Was that Moxie Marlinspike?
01:02:09.000 Yeah.
01:02:09.000 Interesting.
01:02:10.000 Yeah.
01:02:10.000 And so it's like he's aware of this.
01:02:13.000 But for some reason, the speech thing, he doesn't get.
01:02:17.000 Who?
01:02:17.000 Who?
01:02:18.000 Dorsey.
01:02:18.000 Dorsey.
01:02:19.000 Why does he care about Bitcoin and Signal?
01:02:20.000 Well, you have to ask yourself, is he really in charge?
01:02:22.000 He's not.
01:02:22.000 I don't think so.
01:02:23.000 I think he's a figurehead.
01:02:24.000 And so he got fired a long time ago.
01:02:27.000 And they brought on Dick Costolo.
01:02:29.000 And then when he left, I can't remember why, I think he may have been fired, Dorsey became the CEO.
01:02:33.000 I think it was Costolo.
01:02:34.000 Maybe I'm getting my people mixed up.
01:02:35.000 And Dorsey, I think, was just brought on to appear to be in control, and really he's not.
01:02:42.000 I don't think he is.
01:02:43.000 I think he owns 6%, yeah.
01:02:45.000 Well, another thing to really kind of consider here, when you look at a lot of the big tech companies,
01:02:50.000 they either have direct involvement with the startup of them and intelligence agencies,
01:02:55.000 or they have ongoing contracts and cooperations with continued government agencies that they are working hand-in-hand
01:03:02.000 with.
01:03:03.000 Case in point, Amazon, and the CIA, and the Department of Defense, Facebook, and what was that startup connected to the intelligence agency that was integral in their start?
01:03:14.000 We have Google, and of course Google Maps, working with of course the US Pentagon to make that happen, and there was another one, In-Q-Tel I think?
01:03:23.000 No, I don't know.
01:03:24.000 That's a lot of factoring we have to do for a lot of that stuff.
01:03:26.000 Yeah, I'm gonna have to look into that stuff to talk more clearly on it.
01:03:29.000 Let's talk about... Yeah, this says Dorsey owns 2% of Twitter's outstanding shares.
01:03:34.000 Wow.
01:03:34.000 Does that mean total stock?
01:03:35.000 Outstanding shares?
01:03:37.000 Worth $531 million.
01:03:38.000 2%?
01:03:39.000 He's not involved at all in that company.
01:03:40.000 Yeah, he runs Square.
01:03:41.000 Square is his company.
01:03:42.000 He splits.
01:03:43.000 He's CEO of both.
01:03:45.000 Right.
01:03:45.000 Yeah.
01:03:45.000 But I think he's just... He owns 13% of Square.
01:03:47.000 Oh, really?
01:03:48.000 Yeah.
01:03:50.000 So really, he might be CEO in title, but that doesn't mean he runs the company.
01:03:54.000 Well, let's talk about where the escalation has brought us.
01:03:57.000 From Fox 4, Josh Hawley speaks out, arguing Biden called him a Nazi when talking to reporters.
01:04:04.000 U.S.
01:04:04.000 Senator Hawley fired back Friday, saying President-elect Joe Biden compared him to a Nazi propagandist.
01:04:10.000 He didn't say he did.
01:04:12.000 Joe Biden did.
01:04:13.000 From the Dallas News, Joe Biden likens Ted Cruz to Nazi propagandist Goebbels for helping Trump spread big lie about election fraud.
01:04:21.000 It wasn't just Cruz.
01:04:22.000 It was also Josh Hawley.
01:04:24.000 Now Hawley's firing back, saying, President-elect Biden has just compared me and another Republican senator to Nazis.
01:04:31.000 Think about that for a moment.
01:04:32.000 Let that sink in.
01:04:34.000 Hawley argued he raised lawful questions about the way elections were conducted, just as Democrats did in previous years, but saw a much different outcome.
01:04:42.000 This is undignified, immature, and intemperate behavior from the president-elect.
01:04:46.000 It is utterly shameful.
01:04:48.000 He should act like a dignified adult and retract these sick comments.
01:04:51.000 The president-elect made the comments while answering reporters' questions in Washington, D.C.
01:04:55.000 Friday afternoon.
01:04:56.000 A reporter asked Biden if Senators Hawley and Cruz should resign after a violent mob contesting the election results stormed the Capitol.
01:05:03.000 Biden said the two senators should be flat-beaten in their next elections.
01:05:07.000 Biden then referred to the big lie and said that those like Goebbels, Hawley, and Cruz kept repeating the lie.
01:05:13.000 Goebbels was a member of the Nazi Party and a Reich Minister of Propaganda under Adolf Hitler during World War II.
01:05:19.000 It's exactly the kind of rhetoric everybody would want to hear from the incoming president-elect, right?
01:05:23.000 The one who's calling for unity?
01:05:24.000 No.
01:05:25.000 This is... this is...
01:05:28.000 A level of depravity and insanity.
01:05:30.000 So what should he do in his last few weeks?
01:05:33.000 Trump?
01:05:34.000 A few days.
01:05:36.000 I don't know, but they said they're going to impeach him on Monday.
01:05:39.000 The Democrats are going to impeach him.
01:05:41.000 I don't know if he'll get removed because if Republicans and Democrats split 50-50, then Mike Pence breaks the tie.
01:05:46.000 But what if Mitch McConnell says, nah, I'll break the tie.
01:05:48.000 And he decides Trump's gotta go.
01:05:51.000 And then he votes him out.
01:05:52.000 Are we going to get any declassified files?
01:05:54.000 No, I don't think so.
01:05:56.000 I don't.
01:05:57.000 Trump has been unable to get anything declassified.
01:05:59.000 They don't listen to him.
01:06:00.000 He's talking about firing people.
01:06:01.000 I don't know what he's gonna do.
01:06:02.000 But listen.
01:06:04.000 You know, I was mentioning this earlier.
01:06:06.000 When you have the people who are willing to walk away kind of conceding with their tail between their legs and then you throw something at the back of their head.
01:06:12.000 This from Joe Biden is like they're pouring fuel on the fire.
01:06:16.000 Why?
01:06:17.000 Why would he say this about these senators?
01:06:19.000 Why?
01:06:20.000 Why would he tell?
01:06:22.000 Look, it was over.
01:06:23.000 Right, right, right.
01:06:24.000 Yeah, it's done.
01:06:25.000 And I'll tell you what's what's really crazy about the scenario.
01:06:28.000 What Ted Cruz and Josh Hawley did was entirely constitutional.
01:06:32.000 Yeah, they were allowed to do it.
01:06:34.000 It was not out of the ordinary.
01:06:36.000 It's happened before.
01:06:36.000 It happened 2005.
01:06:37.000 It happened a bunch of times.
01:06:39.000 I mean, 2016 was crazy.
01:06:41.000 And the end result would have been Biden getting certified as president.
01:06:44.000 It would have given Trump supporters their voice on the Electoral College count floor.
01:06:50.000 It would have satisfied many, not all, to be like, well, at least the American people can hear what we have to say, and we weren't denied that opportunity.
01:06:57.000 Now, unfortunately, it was the Trump supporters who stormed in and stopped that from happening.
01:07:02.000 But to criticize Josh Hawley and Ted Cruz as though they're Nazi propagandists, or in any way like him, simply because they wanted to say, here's what's happening, and here's why we have concerns about this, that's... I think that should be evidence that people don't want unity, and that this is likely going to escalate, and escalate faster than you realize.
01:07:21.000 It reminds me of what the Nazis did, because they would demonize the communists, and then he's like, we need a new national crackdown on terrorism.
01:07:31.000 Those people are acting like the Nazi party from old.
01:07:34.000 It's like, that's what the Nazis did to the communists.
01:07:37.000 They cracked down and they said that they were the evil from, you know, the 10 years ago in Russia or whatever.
01:07:42.000 I'm sorry.
01:07:42.000 The first sentence is you're a Nazi.
01:07:44.000 The second sentence is unity.
01:07:46.000 Everyone come together.
01:07:47.000 Like, Really?
01:07:49.000 Are you really trying to act like you're bringing people together if you're using hyperbolic language like that?
01:07:56.000 Well, so we have had for years people tweeting things about killing Nazis and punching them, but then they go and call literally everyone Nazis or compare everyone to Nazis.
01:08:04.000 And so, what are people supposed to think?
01:08:07.000 You want to hurt us, you want to attack us, because it's not about attacking Nazis, it's about using the worst possible smear you can against those you don't agree with.
01:08:14.000 That way, when you advocate for some- like, listen, there are people on Twitter who are saying, you know, kill Nazis or whatever.
01:08:21.000 Twitter allows it.
01:08:22.000 Then, once that's been approved, they then tack onto it, here's a list of who the Nazis are, and they grab random people they don't like.
01:08:28.000 And now Twitter's approved that.
01:08:30.000 So I'll tell you, when Joe Biden says that the protesters are domestic terrorists, those that stormed the Capitol, he says these senators are basically propagandists for the insurrection.
01:08:41.000 The Wall Street Journal reports, Mr. Biden has said he plans to make a priority of passing a law against domestic terrorism, and he has been urged to create a White House post overseeing the fight against ideologically inspired violent extremists and increasing funding to combat them.
01:08:59.000 Is it called the Enabling Act?
01:08:59.000 Yeah, you know, I said it's gonna be called like the SAFE Act, like Securing American Freedoms, you know, Enhanced or something like that.
01:09:02.000 Oh, jeez.
01:09:02.000 The SAFE Act.
01:09:02.000 The SAFE for everybody.
01:09:16.000 He's in his ear.
01:09:17.000 He's an establishment candidate.
01:09:19.000 He's a lobbyist.
01:09:20.000 And he's going, what they're probably thinking, these establishment people, is once we get power, we better make sure these people never win again.
01:09:27.000 And he's not just talking about Trump.
01:09:29.000 He's talking about Bernie Sanders, too.
01:09:30.000 And that's why I think it's hilarious.
01:09:32.000 Many of these leftists walked right into this.
01:09:33.000 It should have been obvious.
01:09:35.000 I said it.
01:09:36.000 If the establishment gets back in, they're going to lock the doors and no populist will ever see it, whether it's left or right.
01:09:43.000 Yeah, they would do the same thing to Cortez, I would think.
01:09:47.000 Cortez is trying to play the game right now, calling for people's censorship.
01:09:50.000 Another scary aspect of this is that, you know, Biden, he kind of showed that he's not really there.
01:09:56.000 He's not on it.
01:09:58.000 He doesn't have the ball in front of him.
01:10:01.000 It looks like someone else has the ball and is carrying this whole program here.
01:10:01.000 When you look at his speech, when you look at his mindset, competent doesn't come to mind. And when you have that, you also understand that this is the person that sold out to special interests almost more than any other president before him.
01:10:20.000 He argued with Barack Obama, saying that there needed to be more special interests inside of the Obama administration, and Obama had to tell him no.
01:10:29.000 So when we have big tech executives inside of the Biden administration, Goldman Sachs, the military-industrial complex, and you have unlimited power, you look at that entire recipe, there's no one and nothing that could check him.
01:10:43.000 Another thing that I kind of wanted to bring up is that if you remember Chuck Schumer literally brought up that if you mess with the, this is not his exact phrase, but he said when Donald Trump messes with the intelligence agencies, they have six ways to Sunday to get back at him.
01:11:00.000 In relation to that, I also want to bring up this CBS News article that is literally titled, Social Media is a Tool of the CIA, Seriously.
01:11:09.000 That is the title of their article on CBS News, and they start off by saying, quote, you don't need to wear a tinfoil hat to believe that the CIA is using Facebook, Twitter, Google, and other social media companies to spy on people.
01:11:24.000 That's because the CIA published a helpful list of press releases on all the social media ventures it sponsors via its technology investment firm In-Q-Tel.
01:11:35.000 So again, that's that's the firm that I brought up here previously before.
01:11:38.000 So there is a lot of things to talk about there.
01:11:40.000 There's a lot of room for kind of speculation here, even though I don't like doing that.
01:11:45.000 But we have to understand when we look at these big tech companies, they're not just outside entities outside of the government.
01:11:51.000 They are entities that work with the government hand in hand, not just spying on you, but in more severe ways than we even know.
01:11:59.000 And this is truly an emerging power that can't be unchecked.
01:12:03.000 And Amazon.
01:12:04.000 It's not considered a social network, but they have that computer that everyone's got in their
01:12:09.000 house that you can command and that listens. Well, this is another thing with Amazon.
01:12:13.000 They're working on new technology that will break encryption.
01:12:20.000 They've got it. Well, yeah. Quantum supremacy, you know about this,
01:12:25.000 right?
01:12:25.000 Oh, yeah, but we don't know the exact levels. Yeah, we don't know the exact details here.
01:12:31.000 They were heavily criticized for developing facial recognition technology that was used by ICE, but that's only just the tip of the iceberg, comparatively, to all the other big deep state projects that they're working on, that they're developing, that need to be brought up. Have you seen Go-Big Show on TBS?
01:12:49.000 Oh Uh, we were slowly... I was watching two minutes of it when you were watching it.
01:12:52.000 Why can't you just let them have the power?
01:12:55.000 Didn't you want to see the man do the backflip on the tricycle?
01:12:58.000 I was too busy working out.
01:13:00.000 There was another guy I heard, Luke, who got a football to the groin.
01:13:04.000 Now, wouldn't you much rather just order a pizza, sit back, watch a football to the groin show, and leave the Democrats to have their power and let them do what they want?
01:13:13.000 Wasn't that in Idiocracy, where they had a show where the guy was just getting hit in the balls?
01:13:18.000 Really?
01:13:19.000 The show that you were watching?
01:13:20.000 Was there really a segment?
01:13:21.000 No, no, no.
01:13:22.000 I wouldn't be surprised.
01:13:23.000 I wouldn't be surprised either.
01:13:23.000 Yeah, Idiocracy, man.
01:13:25.000 Mike Judge nailed it.
01:13:26.000 Yeah.
01:13:26.000 I mean, to be honest, Donald Trump's in the WWE Hall of Fame, and then Camacho, the president, was a wrestler, so... Beavis and Butt-head's coming back, I heard.
01:13:35.000 Oh, it comes back every so often, doesn't it?
01:13:37.000 I think we're headed for dark days, man.
01:13:40.000 Because what's happening is happening faster and faster.
01:13:44.000 And what we saw at the Capitol was... I think in terms of the political ramifications, it was serious.
01:13:51.000 And the craziest thing is when you look at photos and there's a photo going around of a guy with zip tie handcuffs, and people are like, what were they planning on doing with that?
01:13:57.000 Like taking hostages?
01:13:59.000 And then there's a picture of a grandma who's just like waving a little flag and she has no idea what's going on.
01:14:03.000 It's really, really weird what we're seeing, but the media is treating this like the apocalypse.
01:14:08.000 It's exactly what the left, the establishment, the cultural institutions needed to take action to start purging everybody.
01:14:14.000 So now there were reports earlier before the show that Steve Bannon's show has been deleted from YouTube.
01:14:19.000 So I think, you know, and they said it was for election related misinformation.
01:14:22.000 Yeah, that wouldn't surprise me.
01:14:24.000 And there's a very famous meme going around now that says, quote, we spend $750 billion annually on defense and the center of American government fell in two hours to the Duck Dynasty guy.
01:14:37.000 And the guy in the Chewbacca bikini.
01:14:40.000 And they have a photo of the guys in costume.
01:14:43.000 You know what's really funny?
01:14:45.000 As this purge is going on, I do think some people are leaving Twitter.
01:14:48.000 That they're deactivating their accounts and they're going to Parler because the president has been removed.
01:14:52.000 But I think a lot of people are being banned and a lot of people are noticing.
01:14:55.000 I think it's probably a lot of people purposefully leaving.
01:14:58.000 I wonder if the majority is actual bannings.
01:15:01.000 But my Twitter following is like, it goes down and it spikes really high.
01:15:04.000 Because I'll tweet something and then the people who remain will start following me.
01:15:07.000 But then as people are leaving it goes down.
01:15:09.000 A lot of people are down, like I saw one tweet just now, 16,000 people.
01:15:14.000 That's huge.
01:15:17.000 This is a mass purge.
01:15:21.000 This can't just be people leaving.
01:15:23.000 Twitter is going through networks.
01:15:25.000 They're probably looking at a network and just removing people.
01:15:29.000 Well, so there's a Twitter bot that will tell you when someone in the Trump network follows
01:15:35.000 or unfollows. And it was this massive lift saying Rudy Giuliani unfollowed this person,
01:15:39.000 and this person, and this person, and just a huge list of people saying Rudy Giuliani unfollowed
01:15:42.000 these people. And I'm like, did Twitter just pull up Rudy Giuliani's following list and just delete
01:15:47.000 everybody he followed? Because they all got nuked. He was trying to protect people, maybe?
01:15:52.000 No, I don't know.
01:15:52.000 Rudy Giuliani is fighting for Trump.
01:15:54.000 So, you're removed.
01:15:55.000 When I was down there in the bathroom, I just thought about how we need to break up these
01:15:59.000 corporations again.
01:16:00.000 This is this monopoly on public speech.
01:16:03.000 And I just don't see a value to shattering the corporations into a bunch of proprietary
01:16:08.000 networks. Like, why would we break Facebook into Facebook Prime and Instagram again, when
01:16:12.000 Zuckerberg owns both of them and the code is still proprietary?
01:16:14.000 So I keep going back to the way we would break up a social network's monopoly is by freeing
01:16:19.000 their software code after they reach a certain level of user base.
01:16:23.000 And people's argument is, why would I give up all this, my life's work, if I've attained 100 million followers?
01:16:30.000 Now I lose my code.
01:16:32.000 And I'm like, well, your code going free doesn't mean you lose the network. You still own Facebook, you can still profit off of all that activity on Facebook, but the code... Yeah, code should be like an idea.
01:16:43.000 I think it should be open sourced, and I think if we did have open source technology, the world would be a lot better and freer.
01:16:49.000 And the network effects that you can achieve.
01:16:51.000 My opinion.
01:16:52.000 The network effects and growth you can achieve, not to mention because your community will trust you more now, because you're being transparent with them, but the reason Bitcoin is exploding right now is because it's open.
01:17:03.000 There's not going to be a closed system.
01:17:05.000 But I understand that you guys view the code as our code.
01:17:08.000 I think that sometimes the code can be my code, and you're all just dirty commies who think that.
01:17:13.000 And I actually believe in private property, so I disagree.
01:17:16.000 But in all seriousness, no, I think there can be, I think a lot of things need to be open, depending on what they are.
01:17:21.000 Probably social media.
01:17:22.000 If it's something that has a serious impact on our politics, civics, then we should probably understand how that works.
01:17:27.000 But if it's a proprietary service, I don't think that code should be forced open.
01:17:30.000 Like a city?
01:17:30.000 You know, not forced, but like it's in your interest to do it.
01:17:35.000 Well, I'm suggesting...
01:17:36.000 Yes, absolutely.
01:17:37.000 We need to know how they're working, why they're working, and we should be able to watch.
01:17:41.000 Because think about it this way.
01:17:42.000 If we could see the vote counting going on in real time and how the code worked, and
01:17:45.000 then something weird happened and a vote flipped, everybody would see it.
01:17:49.000 The problem, I guess, is that it's connected to the internet, but then everyone's watching
01:17:52.000 So, I don't know.
01:17:53.000 There's challenges to this.
01:17:55.000 It's just accountability.
01:17:56.000 Why can't we have accountability?
01:17:58.000 We should have accountability for so many different things in our society that would clear things up.
01:18:04.000 If you're gonna say that there was Russian collusion with Donald Trump, show us the evidence.
01:18:08.000 It took them a while to reveal absolutely nothing.
01:18:12.000 And then, in the meantime, they slandered and discredited and threw people under the bus, including myself and WeAreChange.
01:18:20.000 We're talking about, you know, the voting that just happened.
01:18:22.000 Be transparent.
01:18:23.000 Do investigations.
01:18:24.000 Look into it.
01:18:25.000 Show us the evidence.
01:18:27.000 Again, that would have proved and solved so much angst.
01:18:30.000 That would have proved and solved so much of the uncertainty.
01:18:33.000 And again, when these companies make these large decisions, banning people, destroying people's voice, they're doing it in a way where there's no accountability for that.
01:18:43.000 There's no way to appeal it.
01:18:44.000 We don't even know why the decision was made.
01:18:47.000 We don't know exactly what even led up to it.
01:18:50.000 It's just a totalitarian saying, that's it.
01:18:52.000 I get my way.
01:18:53.000 I don't even have to say why I did what I did.
01:18:56.000 And that's a dangerous, unaccountable power that surely...
01:19:01.000 I was saying this years ago, is going to be abused, is being abused right now.
01:19:05.000 A lot of people want to talk about civil war and stuff.
01:19:08.000 And they think the right has some tremendous advantage because they're the tough guys, because they're the survivalists and all that stuff.
01:19:13.000 But I said, listen, man, they'll sever the lines of communication in two seconds before anything starts.
01:19:19.000 And then you'll be sitting there looking at your phone saying, I wonder what's going on.
01:19:21.000 The lines of communication are being severed.
01:19:23.000 It's what they're doing.
01:19:25.000 Yeah, Zuckerberg's kind of like a mayor of a city, of a town.
01:19:28.000 And right now, it's like a private town.
01:19:31.000 He's not a mayor.
01:19:31.000 Well, it's kind of like he's an internet mayor or an internet governor.
01:19:35.000 Well, have you ever seen those old westerns where, like, the guy rides into the town and he's like, it's my town!
01:19:40.000 Sheriff, you work for me!
01:19:41.000 It's my town now.
01:19:42.000 And he owns the town.
01:19:44.000 So our government is in place to make sure that no individual owns these cities.
01:19:48.000 Like, no one owns New York.
01:19:49.000 It's controlled by all of us.
01:19:51.000 And I think that Facebook has gotten to the strength of power.
01:19:53.000 Well, someone governs it that's put into power.
01:19:56.000 Look at what De Blasio's doing.
01:19:58.000 Look, his wife's got a $2 million staff while the city burns.
01:20:00.000 The way the law is built, I'm just talking about, is that I think that Facebook is powerful enough and influential enough that we should treat it like a city and not a piece of ownership of something that someone can own.
01:20:14.000 I'm just talking about the code.
01:20:15.000 He can still own the domain and people can still use Facebook and he can have stores and everything.
01:20:22.000 Publicly owned and open, with guaranteed rights, and we don't need to worry about making money on it. Or they could have a utility. It could still be private, and all the code could be a utility that we could use to build another network.
01:20:33.000 That is a utility with the same code. They could integrate with Twitter. Yes.
01:20:37.000 Because if you shatter it into a bunch of proprietary networks, it wouldn't stop the monopoly on the behavior.
01:20:43.000 Right.
01:20:44.000 I think we need platforms that are free speech, you know, open, publicly owned.
01:20:50.000 And that's just me.
01:20:50.000 Look, maybe I'm lefty, huh?
01:20:52.000 Taxpayer funded, nationalized with guaranteed rights.
01:20:55.000 You break the law, someone reports you, it's a criminal offense.
01:20:57.000 You broke the law.
01:20:58.000 If you say a nasty opinion, you block them and say, don't want to see you.
01:21:01.000 That simple.
01:21:02.000 What do you do?
01:21:03.000 Harassment laws apply.
01:21:05.000 Harassment is a crime.
01:21:06.000 This is how the big networks grew, under that premise.
01:21:11.000 Their content policies were always pretty restrictive, but to a certain degree they rode the whole wave of letting people say most of what was okay, and now they're doing the bait-and-switch.
01:21:26.000 So the company thing, that's a big problem too. You can make a company, make it huge and popular, and then sell it to some totalitarian dictator, and then all of a sudden a hundred million people are now being driven by this guy that now owns the city. You basically handed the keys of the city to this next guy. So yeah, I agree.
01:21:41.000 I don't think that these networks should be controlled by the bait-and-switch.
01:21:46.000 Even the potential for the bait-and-switch shouldn't exist.
01:21:54.000 You would just expect that executives with billions of dollars, thousands of developers at their disposal, could come up with realistic solutions for breaking echo chambers.
01:22:07.000 Here's recommendations of stuff that you might disagree with or from people from the other side of the spectrum.
01:22:13.000 Here's recommendations for this.
01:22:15.000 Here's how to curate your algorithm so you know, so you're getting a balanced diet of information.
01:22:20.000 Like they just literally, it is intentional.
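As a sketch of the counter-recommendation idea just floated, here is what deliberately mixing in posts from outside a user's usual viewpoints could look like. This is purely illustrative and assumes posts carry a viewpoint label and that we know which viewpoints a user already follows; no platform's real recommender is anywhere near this simple.

```python
def balance_feed(candidate_posts, user_viewpoints, mix_ratio=0.3):
    """Blend posts from viewpoints the user does not already follow into the feed,
    so recommendations are not a pure echo chamber."""
    familiar = [p for p in candidate_posts if p["viewpoint"] in user_viewpoints]
    unfamiliar = [p for p in candidate_posts if p["viewpoint"] not in user_viewpoints]
    n_extra = int(len(familiar) * mix_ratio)  # cap the cross-spectrum additions
    return familiar + unfamiliar[:n_extra]

posts = [
    {"id": 1, "viewpoint": "left"},
    {"id": 2, "viewpoint": "left"},
    {"id": 3, "viewpoint": "right"},
    {"id": 4, "viewpoint": "libertarian"},
]

# A user who only follows left-leaning accounts gets one cross-spectrum post mixed in.
print(balance_feed(posts, user_viewpoints={"left"}, mix_ratio=0.5))
```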
01:22:23.000 But you know the problem with that is, for like Twitter, is that if you're somebody who's like, is a far leftist, and they say, why don't you follow Tim Pool?
01:22:31.000 He's a, you know, moderate individual who believes in free speech and liberty.
01:22:36.000 Then they're going to start spamming me and insulting me and it's going to be really annoying.
01:22:39.000 The problem is, ultimately in the end, you have many different kinds of people but there seems to be two overarching kinds of people.
01:22:47.000 The if someone is bothering me I'll block them group and the if someone's bothering me I demand Twitter block them from everyone group.
01:22:54.000 And so there's no negotiating.
01:22:56.000 It's one of the things that I think Jack Dorsey actually said in one of his testimony, in his Senate testimony, he was like, we have people who are demanding on the left that we ban people for this reason, and then the right demands that we don't ban them for this reason, and we have to figure out, like, we have these both, you know, both groups screaming in our ears.
01:23:15.000 Now I guess ultimately, because the cultural institutions and the media are controlled by the left, these big tech companies know exactly who butters their bread.
01:23:21.000 They want to sell advertisements, right?
01:23:23.000 Well, if a news story comes out in the Wall Street Journal that YouTube does bad, then YouTube says, we're so sorry, Wall Street Journal, please.
01:23:30.000 And then they cave.
01:23:31.000 That's what happened with PewDiePie in the first adpocalypse.
01:23:35.000 And the crazy thing is these news outlets know YouTube is their competition.
01:23:38.000 So they're doing it on purpose for probably for a financial gain.
01:23:41.000 But they also learned that people like to hear their own thoughts regurgitated to them, so they created echo chambers through the algorithm. And I remember back in the day when the Internet was still an amazing and beautiful place, and was a free place, because it didn't have any algorithms, it didn't have any news feeds, it didn't have any curated timelines with these corporations deciding what you should hear. If you subscribed to something, you would actually see it, you would actually hear it. This curation has essentially led to these larger echo chambers, to these larger radicalizations, and has pushed people further and further apart on the political spectrum.
01:24:17.000 Now we are in a situation where people are at each other's throats, and we have to wake up and realize that this was done by social media. So what makes you think giving all your power to social media is going to fix it? This is such a frustrating thing. You see the direct fingerprint, whether it's the mental health crisis, whether it's the algorithm, whether it's the echo chambers, whether it's them colluding with intelligence agencies and government agencies, when you see this problem.
01:24:46.000 And they're a part of it.
01:24:47.000 And now you have people saying, they're going to fix it all if you just give them all of your power.
01:24:51.000 And people are falling for it, celebrating this today?
01:24:54.000 You've got to be freaking kidding me.
01:24:55.000 It's like the monkey's paw.
01:24:57.000 You know that story?
01:24:57.000 It's like you get three wishes, but then it twists your wish.
01:25:00.000 These leftists are like, yeah, censorship, yeah.
01:25:02.000 They're going to get censored.
01:25:03.000 Censor all the bad people.
01:25:05.000 The definition of bad changes.
01:25:06.000 It's like a Twilight Zone episode, where it's like, it was time now.
01:25:09.000 It's not fair.
01:25:10.000 It's like, why am I being banned?
01:25:12.000 No!
01:25:12.000 Everyone was finally banned and I could finally have peace!
01:25:15.000 And then they drop their phone and the phone shatters.
01:25:17.000 No!
01:25:18.000 You know that episode, right?
01:25:18.000 No.
01:25:19.000 It sounds awesome.
01:25:19.000 It's the episode where the guy just wants to read and the world ends.
01:25:22.000 Oh, and he breaks his glasses.
01:25:23.000 Yeah, and he's got big, thick glasses.
01:25:24.000 So I'm imagining it's like a leftist demanding everyone be banned.
01:25:27.000 And then finally, once everyone's banned, he has his phone.
01:25:29.000 He's like, now I can look.
01:25:30.000 And he drops his phone.
01:25:31.000 No!
01:25:32.000 Well information, people need to understand, information is key during war.
01:25:35.000 One of the first things that was done during the Iraq war, from some of the reports that I heard from frontline soldiers, is that there were leaflets dropped on populations.
01:25:43.000 Oh yeah, of course.
01:25:44.000 Saying Americans are coming, they're here to liberate you and here to free you.
01:25:48.000 So this has been done as part of psychological warfare many times throughout many important battles, and this is the information war ramping up to huge, just astronomical levels, where even fifth-generational warfare doesn't hold a candle to what's happening now.
01:26:06.000 And that's another term that people should look up and should research themselves when they want to understand what is deeply happening and what is going to happen from here.
01:26:12.000 In Vietnam, they used to blast audio in the jungle.
01:26:16.000 This, like, demonic sounds at night.
01:26:18.000 No, no, no, no, no.
01:26:19.000 And it would scare the Vietnamese because they were all emotional and they wouldn't come out that night.
01:26:23.000 No, I gotta correct you.
01:26:24.000 Talk about informational warfare.
01:26:25.000 It was just like... Let me, let me, let me... You know, medulla oblongata.
01:26:29.000 Let me correct you.
01:26:29.000 Please do.
01:26:30.000 During Vietnam, the U.S.
01:26:32.000 would blast audio of wailing Vietnamese saying, Why did I do it?
01:26:37.000 I made a mistake.
01:26:38.000 Run while you still can or you'll be trapped here for eternity like I am.
01:26:43.000 Because their religious belief was that if they weren't properly buried, they were trapped to roam the area where they died forever.
01:26:50.000 The problem was it was too effective.
01:26:52.000 And the South Vietnamese, I believe it was the South who was working with us, panicked and ran when they heard it.
01:26:57.000 But imagine you're in the jungle, in the dead of night with your gun, and then you hear a wailing ghostly voice crying and begging you, saying, don't become trapped like I am, run while you still can.
01:27:07.000 Psychological warfare is crazy stuff, man.
01:27:10.000 You know that old fake story about the general and the pig's blood?
01:27:14.000 Apparently it's not a real story, but they talk about this general who, after killing a bunch of Muslim soldiers in the Middle East, poured pig's blood on them and left one alive and let him leave, so that he went and told them and then they all stopped fighting.
01:27:30.000 I believe that story's not true, but people tell it all the time, the idea being that he's like, oh no, this is bad, it's against their religion, and so he panicked, told everyone, and then they refused to engage.
01:27:39.000 Psychological warfare.
01:27:41.000 You know what?
01:27:41.000 It's simple.
01:27:42.000 Pen is mightier than the sword.
01:27:43.000 That's what they say.
01:27:43.000 The first step in it is to control communication.
01:27:47.000 Once you control communication, once you control what people can and cannot listen to, you have such a severe advantage over your supposed enemies, or even over people you think are the Nazis.
01:27:57.000 You know what Trump's mistake was?
01:27:58.000 He didn't watch Revenge of the Sith.
01:28:00.000 Because I just watched Revenge of the Sith.
01:28:02.000 That was his mistake.
01:28:03.000 And you know what Palpatine did?
01:28:04.000 Smart.
01:28:05.000 Palpatine feigned an assassination attempt, you know, when Mace Windu comes in and then Anakin's there and then he's like, don't let them kill me!
01:28:14.000 I'm too weak!
01:28:15.000 And then Anakin, you know, ends up killing Mace Windu and everything.
01:28:18.000 I'm kidding, by the way.
01:28:19.000 But it is, it is just like, you know, Trump is sitting there. I think, to be honest...
01:28:25.000 They're saying Trump incited all this stuff. I don't think so.
01:28:28.000 I don't think Trump intended for this to happen. We had Jack Murphy on recently, and he was saying that it
01:28:32.000 sounded like when Trump was giving his speech, it was a concession speech.
01:28:34.000 He said sometimes it takes more courage to do nothing, and
01:28:37.000 he was like, what does that mean?
01:28:38.000 You know, yeah, like Trump was trying to wind things down.
01:28:41.000 Well, he said it's gonna be up to Mike Pence. Yeah.
01:28:44.000 So he made sure he had no responsibility at all.
01:28:49.000 And everyone was like, OK, let's see what Mike Pence is going to do.
01:28:51.000 And he released that statement, which got around.
01:28:55.000 Somebody tweeted something really funny.
01:28:56.000 They said, how long until Trump uses the presidential alert system to send a message?
01:29:00.000 I was thinking about that when he tweeted he's going to presidential alert his new Parler account.
01:29:05.000 Yes.
01:29:05.000 Well, there's even some scuttlebutt of them creating their own social media networks.
01:29:10.000 Well, he said that.
01:29:11.000 That's what they should do.
01:29:12.000 They should use the Minds code.
01:29:13.000 They should take the code there.
01:29:15.000 Take the code and just build your own network with the Minds code.
01:29:17.000 We need a government site, man.
01:29:19.000 More diversity.
01:29:20.000 I mean, Tim, you've known.
01:29:21.000 You've known to back up your social media situation the whole time.
01:29:24.000 Of course.
01:29:24.000 You have to.
01:29:25.000 You have to protect yourself.
01:29:26.000 And the fact that, you know, he didn't, it's...
01:29:30.000 There are troubling questions and troubling, dark times ahead.
01:29:34.000 And you know, Joe Biden, he warned us.
01:29:36.000 He said a dark winter.
01:29:37.000 That's what he called it, a dark winter.
01:29:39.000 And I think the challenge for us is, well, I should rephrase this.
01:29:47.000 I don't think there is a path towards de-escalation.
01:29:50.000 And I've said this quite a long time ago.
01:29:52.000 And it's really interesting, the left repeatedly claimed that I was wrong and hyperbolic and fear-mongering and all that for simply saying, look what happened.
01:30:00.000 This is freaky.
01:30:01.000 Why would it stop?
01:30:02.000 I'm worried about this.
01:30:03.000 And then when it does happen, they're like, why were you talking about it?
01:30:05.000 You shouldn't have mentioned that The Atlantic wrote about a coming civil war.
01:30:09.000 Like, how dare you read what the mainstream media is saying?
01:30:12.000 The issue, there was a tweet earlier from someone who writes in the Washington Post, I think it was Margaret Sullivan.
01:30:17.000 She said it was Tucker's fault.
01:30:18.000 It was Hannity's fault.
01:30:19.000 It was Fox News's fault.
01:30:21.000 And I'm like, if I recall, they were condemning the violence all year.
01:30:25.000 And it was CNN who said protests have to be peaceful.
01:30:29.000 But when you see tweets like that, when you see Democrats calling for expulsion, calling for escalation, then I would be more than happy to have everybody just be like, we don't want to do this anymore.
01:30:40.000 Who wants to go see a movie?
01:30:41.000 I'd be like, I'm down.
01:30:42.000 I don't care what your politics are.
01:30:43.000 Let's go grab pizza and a beer and hang out and just stop all this.
01:30:46.000 But you've got a constant berating and beating and suppression happening where they didn't just go for the president today.
01:30:55.000 They're going for his supporters and they're nuking everybody.
01:30:58.000 Steve Bannon's War Room deleted.
01:31:01.000 The YouTube channel's gone.
01:31:02.000 They're making sure they're purging this aspect of American culture.
01:31:06.000 I think there's no linear path to de-escalation, maybe.
01:31:09.000 Like you said earlier, it's a compounding or an exponential escalation in any direction.
01:31:16.000 Because of the way the system works now, it started with radio and television that you could speak for an hour, but then people could listen to it for 10,000 hours.
01:31:26.000 Even though you only spent an hour of your time, 10,000 hours of listening could accrue.
01:31:32.000 Or a hundred thousand or a million.
01:31:33.000 And now with internet video, it's not just on for an hour a day.
01:31:36.000 It's on there permanently for like exponentially more listening hours or potential.
01:31:41.000 So change can happen exponentially in any direction, including a deescalative function.
01:31:47.000 Totally agree, man.
01:31:47.000 It could happen within days with the right powers in place.
01:31:52.000 Bill, you brought up a good point.
01:31:53.000 You got to be prepared for this stuff.
01:31:54.000 You guys were preparing.
01:31:55.000 You guys have an alternative.
01:31:57.000 I was preparing.
01:31:57.000 I've been collecting emails on, you know, wearechange.org because I knew something was coming.
01:32:04.000 Donald Trump supporters are screaming about this, saying this is going to happen to you.
01:32:08.000 You need to do something.
01:32:09.000 You need to prepare.
01:32:11.000 Some of us have prepared, but essentially, you know, let's just be honest here.
01:32:15.000 He, I mean, It's not just that he wasn't prepared.
01:32:19.000 He was sometimes ignoring individuals telling him directly, this huge censorship hammer is coming your way.
01:32:25.000 Well, how long ago was it that we were at the White House?
01:32:28.000 A year or something?
01:32:29.000 A year?
01:32:30.000 Longer than a year.
01:32:30.000 It was like a year and a half.
01:32:32.000 And Trump was just like, what platform should I use?
01:32:34.000 Didn't think to ask someone?
01:32:35.000 Well, he invited us.
01:32:38.000 Yeah, but he was resisting because he has a very mainstream approach to things.
01:32:43.000 He's an old guy and he's very TV, watches Fox and all that.
01:32:50.000 I don't understand why Dan Scavino didn't.
01:32:52.000 He's savvy.
01:32:53.000 Or at the time it was Brad Parscale.
01:32:57.000 Why didn't any one of these people say, let us run the account for you?
01:33:00.000 We'll set up a Parler, we'll set up a Minds, whatever.
01:33:02.000 I mean, what really worries me is just the normal people sitting on social media watching the preaching happen and, you know, glorification of this kind of event.
01:33:11.000 And like, people with good intentions actually do think that this is helping.
01:33:16.000 And that's what's really scary.
01:33:18.000 It's like people genuinely believe That this path is going to make the world safer.
01:33:24.000 I'll tell you one thing.
01:33:27.000 You take a look at the people with connections, the people with resources, and the people of great success, and what have they been doing?
01:33:35.000 They've been buying Bitcoin.
01:33:37.000 I wonder why.
01:33:38.000 They've been buying land in rural areas and fleeing cities for some time.
01:33:43.000 They certainly must know something.
01:33:45.000 Maybe they don't all think the same thing, but they think something similar.
01:33:48.000 And when you see people like, I don't know, a couple years ago, going on a major podcast and warning about a coming conflict due to censorship, and then saying, I'm going to build a van, so, you know, a bug-out van.
01:33:58.000 And now here we are with the Capitol being stormed.
01:34:01.000 You know, I don't expect, like, a plumber to be fully tuned into what's happening, but there are people who base all their investments on just, they wait for Warren Buffett to do something.
01:34:12.000 Like, he must know something, so I'll just buy what he buys, right?
01:34:15.000 Well, when you see all of the people who have access to government officials and media institutions and intelligence agencies fleeing to rural states, red states and rural areas and buying up swathes of land, it should make you think something about what's going on and what you should maybe consider.
01:34:30.000 Another thing to really kind of deep dive into and to really think about is we're also seeing something that is curated for us.
01:34:38.000 So the algorithm, the newsfeed, the curated timeline, they're showing you people celebrating, but that doesn't necessarily mean that a lot of people are celebrating, and those viewpoints can be manipulated.
01:34:50.000 Our perceptions can be manipulated by what we're selected to see, and already there have been studies done showing how the timeline could manipulate your emotions and how they can make you feel different emotions just by deciding what to show you.
01:35:06.000 They could do that on so many other different levels.
01:35:10.000 Do you know about Confessions of an Economic Hit Man?
01:35:12.000 Yeah, of course.
01:35:14.000 Yeah, one of the things he talks about is that in order to stage a coup in a foreign country, one of the things you do is you hire about 1,000 people to protest and film it, and then from those tight camera angles showing massive crowds, you say it's 100,000.
01:35:28.000 And then people believe it, and they think the country is in chaos, and then it makes regular people freak out.
01:35:37.000 We've seen a lot of things done through various intelligence agencies with sock puppet accounts, which is, you know, bots.
01:35:42.000 Sock puppet accounts, basically one person will have 50 accounts with fake pictures, with fake names, and they'll post things attempting to manipulate and influence people.
01:35:50.000 And this is an ongoing problem around the world.
01:35:52.000 The US used to do it all the time to manipulate foreign countries.
01:35:55.000 The US government and the Israeli government have admitted that they have government agents trying to sow a particular viewpoint and a particular narrative, working on tax dollars to push the government's agenda.
01:36:11.000 So this is something also talked about by Cass Sunstein, Obama's former information czar, who talked about how there needs to be an effort to undermine individuals who are affected by 9-11, like family members who had questions about that event.
01:36:25.000 He specifically talked about how the online community needs to be infiltrated.
01:36:30.000 needs to have people who go on there and make everyone else look bad, so people don't take some of these
01:36:36.000 legitimate questions seriously.
01:36:38.000 And he talked about pretty much informational warfare about how to undermine any legitimate form of criticism of
01:36:44.000 government.
01:36:45.000 Well, there are, I'm not going to name the companies for, you know, legal reasons, probably litigious, but they
01:36:51.000 dominate Reddit.
01:36:52.000 There are political organizations that have training manuals on how you derail conversations and manipulate opinion.
01:36:59.000 So when a post pops up and says, you know, Donald Trump does backflip, someone will comment, I like backflip.
01:37:04.000 And they tell you, if someone says, I like what Trump is doing, respond with this.
01:37:09.000 And they have a script.
01:37:11.000 And so these people, their whole job is to go on and comment on each other's posts.
01:37:14.000 There was a really funny incident where two people who clearly worked for two different companies were commenting on each other.
01:37:21.000 And someone pointed out, it was robotic, almost.
01:37:25.000 And they were like, this is where you can see where the tangle happens.
01:37:28.000 when two companies trying to promote, I think it was promoting Democrats, collided, and
01:37:32.000 it created this weird loop of nonsensical comments, comment after comment, because they weren't actually
01:37:37.000 responding. They were like, well, the chart says if they say this, I say
01:37:40.000 this, and so then it creates like a feedback loop. You know, I mean, it makes you really wonder, because I'm
01:37:45.000 right now on Twitter and I'm seeing a lot of people celebrating
01:37:48.000 I'm seeing a lot of people. I'm seeing a lot of the reply guys who are gonna be out of work now
01:37:53.000 Yeah, I know, I love it.
01:37:55.000 And let's be honest, some of it is legitimate, but then I also wonder, maybe some of the celebration, maybe some of the victory lap that's promoting this larger narrative and agenda, could potentially be manipulated.
01:38:06.000 I don't know.
01:38:07.000 I haven't proved it, but I just know for a fact
01:38:09.000 that it happened before, with other agendas, with other special interests that manipulated the system to procure a perception that leads you to be programmed in a way that is beneficial for those doing the programming.
01:38:23.000 And that's exactly what we have to understand here.
01:38:25.000 When we're giving our attention, when we're looking at this timeline, we are giving a part of ourselves into this larger company that could now take so much from us.
01:38:36.000 When something's free, you are the product.
01:38:38.000 And they could twist it, turn it, and they could be like, well, you know, maybe he does need to buy this or this. They already know so much about you. It's absolutely terrifying. So what can they do with that information? The possibilities are endless, and we should not be kidding ourselves that those possibilities aren't being used and institutionalized and implemented right now. Well, that's the problem with proprietary code, in my opinion: you don't know what the algorithm is doing, which is why I'm obsessed with Minds.
01:39:06.000 I didn't even know when I met you, you were like, hey, let's build a social network.
01:39:09.000 My first thought was, another one?
01:39:11.000 Why?
01:39:11.000 There's already Facebook, Twitter and YouTube.
01:39:13.000 We don't need one.
01:39:14.000 But then you started telling me about free software and the power of knowing what the algorithm is doing to you.
01:39:20.000 So it's literally doing things to us. So it is you, Bill, who radicalized him into screaming free the code? I did it.
01:39:25.000 I introduced him to Richard Stallman and Linus Torvalds. But I mean, it is true, like Linux, for example, which
01:39:33.000 took over the whole operating system infrastructure of the whole
01:39:36.000 financial system, the whole world, because it's free. It's free. I mean it, and that is going to happen with everything.
01:39:42.000 Well, so let me just explain something real quick to people and understand
01:39:46.000 Linux is an operating system.
01:39:47.000 It's free.
01:39:48.000 It's great.
01:39:49.000 And people realize that if you're using some of these traditional operating systems, which you probably know, like Windows, you gotta pay for licenses.
01:39:56.000 Linux is free.
01:39:57.000 So just put Linux on all your servers.
01:39:59.000 Yeah, you go.
01:39:59.000 You save a lot of money.
01:40:00.000 Yeah.
01:40:00.000 So even the biggest, even Facebook and Twitter and Google, use Linux heavily.
01:40:06.000 Yeah.
01:40:06.000 But then they build on top of it and they don't share their little secret sauce.
01:40:10.000 And that's because it's open-source code and not free code.
01:40:13.000 Like, aren't there software licenses where, if you build on top of the code, you have to use different licensing structures?
01:40:18.000 And so the difference is, with open-source software like Linux, you can build on top of it and make it private and then call that whole thing private.
01:40:25.000 But with a free software code, you can build on top of it.
01:40:27.000 It has to remain free.
01:40:29.000 And so any changes you ever make going forward remain free.
01:40:33.000 And yeah, it's called copyleft.
01:40:36.000 Copyleft, the principle of having to share the copyright.
01:40:39.000 Yeah.
01:40:40.000 Copyright.
01:40:40.000 Yeah.
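To make the copyleft distinction concrete, here is a minimal, hypothetical sketch of the two kinds of license notice a source file might carry; the file names and wording are illustrative assumptions, not taken from Minds or any specific project.

    # permissive_module.py -- hypothetical file under a permissive license (e.g. MIT).
    # A company can modify this file, keep its changes private, and ship the result
    # inside a closed-source product, as long as the original notice is preserved.
    #
    # Copyright (c) 2021 Example Author
    # Permission is hereby granted, free of charge, to any person obtaining a copy
    # of this software, to deal in the software without restriction...

    # copyleft_module.py -- hypothetical file under a copyleft license (e.g. GPLv3).
    # Any distributed work built on top of this file must be released under the
    # same license, so every change made going forward has to remain free.
    #
    # This program is free software: you can redistribute it and/or modify it
    # under the terms of the GNU General Public License as published by the
    # Free Software Foundation, either version 3 of the License, or (at your
    # option) any later version.

In practice this is why a platform can build proprietary features on top of permissively licensed components, while code released under a copyleft license keeps its derivatives free.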
01:40:41.000 So, you know, and everyone should be able to do whatever they want to do.
01:40:44.000 But that's the funny thing that, you know, it's sort of a left principle.
01:40:50.000 I mean, both are good, but it's everything switching.
01:40:53.000 And what you were saying about, like, where you find the source of truth.
01:40:57.000 When you look at what the left and the right agree on, like the progressives and libertarians, like find those people who can talk.
01:41:03.000 You know, Dennis Kucinich made a really interesting statement where he's like, you know, me and Ron Paul are like best friends, basically.
01:41:10.000 Back in the day.
01:41:11.000 Back in the day.
01:41:11.000 And he said that on many votes, if you look back in the record, it was always like hundreds to two.
01:41:20.000 And it was, it was them.
01:41:21.000 It was the two guys from the government, both sides of the spectrum.
01:41:24.000 So if it was something involving civil liberties or surveillance or war, it would be those two guys who voted together.
01:41:31.000 And so, you know, that's where the truth is.
01:41:33.000 And that's why, like, you know, some people don't like Tucker Carlson, but like Glenn Greenwald will get on there and talk to Tucker.
01:41:40.000 I mean, you have like, that's an important, so yeah.
01:41:43.000 So there are these people who are willing to have the conversations but have radically different political beliefs, and that's why they're all called right wing.
01:41:49.000 Right.
01:41:49.000 Now the weird thing is they can't call Jimmy right wing, so they smear him in other ways.
01:41:54.000 They call him a shill or whatever because he's not right wing.
01:41:57.000 He's like screaming, we gotta have Medicare for all and they're not fighting for us.
01:42:00.000 It doesn't work.
01:42:01.000 But he criticizes the Democrats all the time.
01:42:02.000 Now Tulsi Gabbard, she's not right wing.
01:42:05.000 She's also for... She's for universal healthcare with private insurance.
01:42:10.000 That's what I agree with.
01:42:12.000 And they attack her for it, for not being left enough.
01:42:14.000 It's crazy!
01:42:15.000 Like, most countries in the world that have universal healthcare, when the left is like, oh, all the countries do it, yeah, they have private health insurance on top of it.
01:42:20.000 And that's what she's for, but they call her right-wing.
01:42:22.000 They call all- Glenn Greenwald has been right-wing for a long time.
01:42:25.000 It's hilarious.
01:42:26.000 It's easy to slap that label on somebody.
01:42:27.000 But the funny thing is, with Biden, he's always been generally moderate, hasn't he?
01:42:33.000 Traditionally, was he a radical left?
01:42:36.000 No, he wasn't.
01:42:37.000 The better way to put it is, he says what he thinks he needs to say to reach the lowest common denominator.
01:42:41.000 But we also have to understand when we're talking about individuals like Tucker or Greenwald or Tracy, these are individuals who also criticize Trump, right?
01:42:49.000 These are individuals who actually have, you know, virtues, who actually have principles, who actually have ideas that they believe in that they don't flip-flop on no matter what the political alignment is.
01:43:01.000 They'd rather go on merit than political ideology, which is something that's extremely rare and should be promoted more, but sadly we're seeing less and less of it, and we're going to see a lot less of it, especially because behavior like that was not incentivized by the algorithms. People knew that if they wanted more followers, if they wanted more engagement, they would metaphorically take a dump on their political opposition, and they were dunking on them, and everyone was celebrating, and they were fighting, until the fighting got so out of hand that here we are today, in this censorship.
01:43:34.000 They know it's going to spread more paranoia.
01:43:37.000 They know it's going to spread more fear.
01:43:39.000 They know it's going to spread more conspiracy theories disinformation and false news and it's going to make it worse.
01:43:46.000 So the fire is being fueled.
01:43:48.000 It's it's out of hand already.
01:43:50.000 And it's going to get a lot hotter in here.
01:43:53.000 So that's my two cents.
01:43:54.000 Yeah, it's burning up.
01:43:55.000 And it's it's crazy that, you know, there are internal wars happening at these companies as well.
01:44:01.000 Like one of the anomalies that I am trying to understand is like Peter Thiel, for instance, is on the board of Facebook.
01:44:08.000 He was a big Trump supporter.
01:44:11.000 I didn't realize he was with Facebook.
01:44:13.000 Yeah, he was.
01:44:14.000 Peter Thiel was Facebook's first outside investor.
01:44:16.000 Do you see the movie The Social Network?
01:44:17.000 Yeah.
01:44:17.000 You know, when he walked in that guy's office and he gave him his first hundred K check or whatever?
01:44:21.000 That was Peter Thiel.
01:44:22.000 Peter Thiel is like a traditional libertarian guy.
01:44:24.000 And granted, he does invest in companies that I think have really bad privacy abuses.
01:44:28.000 I mean, Facebook, Palantir, these kinds of things.
01:44:31.000 But he's also big on Bitcoin and he's playing both sides of the spectrum.
01:44:36.000 And so it's just like unbelievable that they're not seeing the long game here.
01:44:41.000 Yeah, his argument with starting Palantir was that there was going to be a 21st century spy tech, and so it may as well be us.
01:44:48.000 Because we have good intentions for people.
01:44:51.000 That was his mentality.
01:44:52.000 I hear a lot of dictators use that same kind of logic.
01:44:56.000 We should go to Super Chats!
01:44:57.000 So, uh, man, we got so many super chats in today that once we get too many, YouTube's actually removing a bunch of them.
01:45:04.000 Well, no, that's bad.
01:45:05.000 I mean, it means, yeah, too many come in and then the old ones just fall off.
01:45:13.000 I'm just saying, I feel bad for people whose messages, you know, ended up getting erased.
01:45:20.000 Yeah, you know, so about half an hour's worth of super chats, you know.
01:45:23.000 Sure, tweet any ones that you got on.
01:45:25.000 Yeah, tweet them at us.
01:45:26.000 Destroy, tweet them at us.
01:45:27.000 Let's see what we got here.
01:45:28.000 So, if you haven't already, smash the like button, subscribe, hit the notification bell, and if you want to support the show, you can go to timcast.com slash donates to donate directly, assuming, you know, eventually something happens to this channel, but we should have the full site up and running soon.
01:45:41.000 Yes!
01:45:42.000 And by soon, I mean like a few days or so.
01:45:45.000 So, That'll be good.
01:45:47.000 And then we'll have, you know, exclusive content on the website, and it'll be great.
01:45:52.000 But let's read some of these superchats.
01:45:53.000 Let me also suggest you smash that gorilla and buy a t-shirt.
01:45:57.000 Oh, and I'll let you guys know, the I Am A Gorilla t-shirt is officially up on the Teespring store.
01:46:02.000 But YouTube has to approve of it, so if you go to the Teespring store, which I think might be linked below, I don't know.
01:46:07.000 Oh, it is, yeah.
01:46:08.000 Then there's the I Am A Gorilla shirt, it's in there.
01:46:10.000 I saw you guys bought a bunch of Harumph t-shirts last night, that was exciting.
01:46:13.000 Oh, did they?
01:46:14.000 Yeah.
01:46:14.000 They did.
01:46:14.000 Well, next we're going to put the Gorilla one up.
01:46:16.000 Ooh, yeah.
01:46:16.000 And then we have the I Am A Gorilla Love Yourself, which will be fun.
01:46:18.000 I like it.
01:46:19.000 Correct.
01:46:19.000 Daniel Maxwell says, they want to prevent the center and right from organizing and planning out a political counterattack.
01:46:25.000 The problem is doing this is going to force us closer to a violent solution, which is not going to end well for anybody.
01:46:32.000 I don't think there's an organized effort necessarily, other than they don't like the other; the other is bad.
01:46:38.000 I think both sides think the other is bad.
01:46:40.000 And one side is calling for censorship and one side is calling for free speech.
01:46:44.000 The censorship side is winning because these people are squeaky wheels that never stop complaining.
01:46:49.000 When, you know, well, they control news organizations, but I'll leave it there.
01:46:55.000 Somebody mentioned that Trump tweeted using the government account, everyone go read, which we did read.
01:46:59.000 Yeah, I deleted it.
01:47:01.000 Let's see.
01:47:02.000 Oh, that was an important one.
01:47:03.000 Omega Blade says, can you bring the puppy back in to help promote a healthy safe space during this time of strife?
01:47:09.000 Jim doesn't want the puppy in the house.
01:47:11.000 You can bring it in the show if you want to go bring the puppy.
01:47:13.000 Luke doesn't want to get up.
01:47:13.000 He's too lazy.
01:47:16.000 The puppy peed too many times?
01:47:18.000 Yeah.
01:47:18.000 A few.
01:47:18.000 It's a lot of work.
01:47:19.000 We're training.
01:47:20.000 He's chewing up the carpet too.
01:47:22.000 Running around shaking all flopping all happy.
01:47:25.000 Making everyone joyous.
01:47:27.000 All right, let's see.
01:47:31.000 S-Head says, I've seen too many people calling for revolution and all the actions that will be the precursor to civil war.
01:47:36.000 Everyone needs to watch the Peter Capaldi Doctor Who speech about revolution.
01:47:39.000 Maybe then they will understand where it all leads to.
01:47:42.000 I wrote a song about that!
01:47:44.000 What's it called?
01:47:44.000 It's called Will of the People.
01:47:45.000 Oh yeah!
01:47:46.000 You guys should listen to it.
01:47:47.000 On this YouTube channel.
01:47:48.000 So it's actually one of the top videos now, because it's got like 700,000 views.
01:47:51.000 That's crazy.
01:47:52.000 I didn't think, you guys are awesome.
01:47:53.000 Get it to a million.
01:47:54.000 But, uh, that would be great if those who are watching should check it out.
01:47:57.000 But if you haven't seen the video and I, and I, look, it's a video I made.
01:48:00.000 It's a song I wrote and performed.
01:48:01.000 It was produced by Nishra Allman.
01:48:04.000 It is about the cycle of revolution and how these people who think they're fighting for a better future will not get what they think.
01:48:11.000 And, uh, as the saying goes, be careful when fighting monsters, lest ye become one.
01:48:15.000 For when you gaze into the abyss, the abyss gazes back.
01:48:19.000 But yeah, check it out.
01:48:19.000 Will of the People on YouTube, on this YouTube channel.
01:48:21.000 You can search for it.
01:48:22.000 I'm also hearing the quartering was just taken down.
01:48:24.000 What?
01:48:25.000 No, but that looks like he deleted it.
01:48:27.000 Okay.
01:48:28.000 That's the initial reports.
01:48:29.000 Unverified now.
01:48:30.000 What I'm seeing on Twitter says the account doesn't exist.
01:48:32.000 I got a message.
01:48:33.000 As opposed to this account has been suspended.
01:48:35.000 Jeremy got rid of his YouTube channel?
01:48:38.000 No, Twitter.
01:48:38.000 Oh, Twitter.
01:48:39.000 But he did it before.
01:48:40.000 He's done it before.
01:48:40.000 Yeah.
01:48:41.000 Yeah.
01:48:41.000 It's not new.
01:48:44.000 All right, let's see.
01:48:45.000 Timothy Hediger says, terms of service greater than First Amendment.
01:48:49.000 Well, that's the problem now, isn't it?
01:48:51.000 Yeah, does Minds' terms of service strictly adhere to the First Amendment?
01:48:55.000 Mostly, but like there's certain things like malicious spam that isn't in the First Amendment.
01:49:01.000 So I've been talking a lot about rewriting a bill of rights, like an internet bill of rights.
01:49:05.000 There have been documents.
01:49:06.000 I mean, there's like the Manila principles.
01:49:08.000 There's the Santa Clara principles.
01:49:09.000 There's a number.
01:49:10.000 Like, yeah, go to manilaprinciples.org, actually.
01:49:13.000 It was drafted by the EFF, who, you know, in some ways... I think it's a similar ideology coming out of them as the ACLU.
01:49:23.000 It's like they sort of start... John Perry Barlow... They're authoritarians.
01:49:27.000 But also not.
01:49:28.000 Also not.
01:49:29.000 No, I mean, like... They're pro-censorship.
01:49:31.000 There are very good people that I know at the EFF who are not pro-censorship.
01:49:35.000 They're not pro-censorship, though.
01:49:37.000 As an organization, they tweet pro-censorship stuff all the time.
01:49:40.000 They used to be, I used to be a big advocate, I donated.
01:49:43.000 Go to the hacker conventions, EFF.
01:49:45.000 I actually fundraised for the ACLU at one point.
01:49:48.000 Now they're pro-censorship.
01:49:50.000 They advocate for removing people's right to speech.
01:49:53.000 Yeah, repeatedly.
01:49:53.000 Well, they're standing up for 230.
01:49:56.000 Standing up for 230?
01:49:56.000 Yeah.
01:49:57.000 Yeah, but we need 230 reform to protect people's right to speak.
01:50:01.000 Not just blanket keep it or leave it.
01:50:04.000 Yeah, nah, I'm not a fan.
01:50:06.000 Anyway, Daniel Nelson says, to make matters worse, we can't even go to open mics right now.
01:50:11.000 Our literal public town square for locals is currently not available.
01:50:15.000 I am trying not to be very frustrated right now.
01:50:17.000 And that's a very important point.
01:50:19.000 None of us can go out to public squares, or to town hall, or even church, where we normally communicate, and you're forced into these ideological bubbles.
01:50:26.000 Now they're banning, they tell you you can't go to church, then they ban you from social media.
01:50:30.000 These people are gonna burst, man.
01:50:31.000 And the church and the pubs have historically been a place for organizing rebellions.
01:50:36.000 Now they're shut down.
01:50:37.000 Coffee houses.
01:50:37.000 He goes on to say, anyway, which foot should I be catching pop shoves?
01:50:41.000 Good question.
01:50:42.000 What, what, what?
01:50:42.000 That's not a good question.
01:50:43.000 It's your front foot.
01:50:44.000 End of story.
01:50:44.000 I don't understand.
01:50:46.000 I guess if you're doing like a nollie pop shove it, you do your back foot.
01:50:50.000 Left foot or right foot.
01:50:51.000 Was that the question?
01:50:52.000 I am clueless.
01:50:52.000 Everyone in the room but Tim.
01:50:53.000 Do you know anything about skating?
01:50:54.000 regular shove it. So are you regular goofy? What are you talking about? You do a
01:50:57.000 nollie shove it, you could maybe use your back foot, but I guess a front foot would
01:51:00.000 still be cool if you did a nollie pop shove it. Some jargon there for all of
01:51:03.000 you who have no idea what I'm talking about. I am clueless.
01:51:05.000 Everyone in the room but Tim.
01:51:06.000 Do you know anything about skateboarding? I have no idea what that's like. That's like Ewok talk.
01:51:11.000 Kelty skateboard jargon?
01:51:13.000 Come on.
01:51:14.000 Kelty said, my company in Seattle just announced it will hunt and fire outwardly racist people online or in private life participation.
01:51:20.000 Chilling.
01:51:21.000 Who judges outwardly?
01:51:23.000 NRA, Republican?
01:51:24.000 Who are the fascists?
01:51:26.000 If you're not, if you're, look.
01:51:29.000 You ever see that episode of Rick and Morty where the giant heads come and then the guy forms a religion where they wear clay heads?
01:51:36.000 Show us what you got.
01:51:37.000 I think so.
01:51:38.000 And then, like, whatever the faces do, they interpret it in some ridiculous way.
01:51:42.000 And when Rick and Morty's parents are like, we don't want to be involved in this, then they tie them to balloons and prepare to send them to their deaths.
01:51:49.000 Like, that's basically what it is.
01:51:51.000 We don't know what we're talking about.
01:51:52.000 It's very Bronze Age tribal religious type behavior.
01:51:56.000 Just adhere to the tribe, do as you're told, no wrong think.
01:52:00.000 You know?
01:52:01.000 Yeah, saying racist is a weird term.
01:52:03.000 It's not even about racist, though.
01:52:04.000 It's just about, are you a member of the tribe or not?
01:52:07.000 Will you conform or not?
01:52:09.000 Otherwise, they'll chase you out.
01:52:11.000 That's about it.
01:52:13.000 All right, let's see some more super chats here.
01:52:16.000 Timmy Rice says, my question is for everyone in the room, Lyds included.
01:52:19.000 Would you sideline your social beliefs for free speech and freedom?
01:52:24.000 What does that mean?
01:52:24.000 You could be quiet.
01:52:26.000 Would you sideline your social beliefs?
01:52:27.000 Yeah.
01:52:28.000 Like, I'm not going to push some personal narrative to destroy your ability to speak freely.
01:52:35.000 Yeah.
01:52:35.000 Wait, I think it's a hard question.
01:52:37.000 Would I, my beliefs for freedom of speech, like would I ever give up on my belief in freedom of speech?
01:52:43.000 Would you give up your beliefs in order for other people to have free speech?
01:52:47.000 But my beliefs are for free speech.
01:52:48.000 Sort of an oxymoron.
01:52:48.000 But what if your beliefs were, maybe they're not, maybe you think they are, but would you put down what you truly believe?
01:52:54.000 Those who would give up a little bit of freedom in exchange for security deserve neither.
01:52:58.000 We'll lose both.
01:53:01.000 Sam Good says, Hey, how does Ian get things done?
01:53:04.000 If you have to break things down to complete individuality.
01:53:10.000 Tim, you get angry too east?
01:53:12.000 Too easily.
01:53:13.000 Too easily.
01:53:14.000 I'm a social liberal and a financial conservative.
01:53:17.000 Okay.
01:53:18.000 Depends on get done what?
01:53:19.000 Sometimes I write it down.
01:53:21.000 People are warning about what's going to happen on the inauguration day.
01:53:25.000 People are posting things online about the 20th.
01:53:28.000 I think Tim and Ian arguments are becoming a meme in themselves.
01:53:31.000 Well yeah, but that's partly the point.
01:53:35.000 Tim reminds me of a lot of people I've known throughout my life.
01:53:38.000 Well, it is a meme.
01:53:39.000 Like when you weren't here the other day, people were posting, no Ian, no peace.
01:53:43.000 And then they were posting free the code.
01:53:44.000 Dude, you should hear us sing together.
01:53:46.000 It's magical.
01:53:47.000 It's great.
01:53:48.000 Yeah, it's awesome.
01:53:49.000 No, but people were posting free the code in chat because you weren't here.
01:53:52.000 And the Fed and stuff.
01:53:55.000 I can't believe I radicalized you.
01:53:58.000 I say we get boxing gloves and a live feed.
01:54:00.000 Yes, let's do it.
01:54:01.000 Oh, Lord.
01:54:01.000 I'm here for it.
01:54:02.000 Oh, God.
01:54:03.000 Dan Orlowski says, the FCC sent out a message stating that broadcast stations have an obligation to play the messages put out by the emergency alert system.
01:54:10.000 Wait, really?
01:54:11.000 I don't know.
01:54:12.000 Never heard that.
01:54:13.000 Akapot says, Ian, you're way smarter than people give you credit for.
01:54:16.000 You rock, man.
01:54:17.000 Keep it up.
01:54:17.000 Thanks to all of you.
01:54:18.000 Keep it up.
01:54:18.000 I'm just an amalgam of all the people I know.
01:54:21.000 Your comment on the limitation of right-left is astute.
01:54:24.000 Google political circle.
01:54:25.000 It's not a line.
01:54:27.000 Actually, Gavin McInnes drew a picture and posted it once, showing the political circle.
01:54:31.000 But it's just another way to interpret certain beliefs in a spectrum.
01:54:36.000 If you talk about left and right economically, it's easily aligned.
01:54:39.000 When you talk about tradition versus progress, it's easily aligned.
01:54:43.000 When you talk about authoritarianism, then you can make it a circle because there are certain groups that align with each other, but based on ideological differences like racism or something, they'll agree completely.
01:54:54.000 Like, there are alt-right people who are for universal healthcare and left-wing politics, but they're racist.
01:54:59.000 So it's like their politics, their market ideas, are very similar, but they have weird, you know, cultural ideas.
01:55:05.000 Or like people that want Medicare for all, but they're authoritarian about it, versus people that want it, but they're libertarian about it.
01:55:12.000 Like ban private health insurance versus let people buy private health insurance.
01:55:15.000 It's the authoritarian versus the libertarian.
01:55:17.000 Bernie Sanders says, ban that.
01:55:19.000 Take that away from people.
01:55:20.000 They have no right to choose.
01:55:21.000 That's very authoritarian of them.
01:55:23.000 It is, yeah.
01:55:23.000 Only the government can give you your health care.
01:55:25.000 The libertarian approach is, we will create the option for universal health care, and then you can choose to get private insurance, because then you can have something, you know, supplemental if you can afford it, or if you need it.
01:55:37.000 I don't understand the logic of taking away people's right to choose.
01:55:40.000 That makes no sense.
01:55:43.000 DTRJr says, radios equal free speech.
01:55:47.000 Bring radio broadcasting back.
01:55:49.000 Okay.
01:55:50.000 Have you considered doing terrestrial?
01:55:52.000 I don't know anything about it.
01:55:53.000 That'd be hilarious.
01:55:54.000 I mean, they're a lot of fun to do.
01:55:56.000 They do terrestrial.
01:55:57.000 We were looking at getting a ham radio, shortwave radio.
01:56:00.000 Yeah, we should do that.
01:56:01.000 We'd have to get like a band and then we'd be able to broadcast.
01:56:04.000 There are certain licenses you need, I think.
01:56:06.000 We have way too many Super Chats today, guys.
01:56:08.000 I'm so sorry.
01:56:10.000 So let's see.
01:56:11.000 Corey Blair says WikiLeaks just dumped.
01:56:13.000 Link on Parler.
01:56:16.000 I'm not going to read any of that based on what he's saying because I don't know if it's true, but there's a lot in there.
01:56:20.000 So I don't know if they actually released anything, but you know, there you go.
01:56:23.000 Oh, more people are saying it.
01:56:24.000 WikiLeaks just dumped all their classified files on Clinton emails.
01:56:27.000 Interesting.
01:56:28.000 Really?
01:56:28.000 I had to look that up.
01:56:30.000 Wow.
01:56:30.000 Curious now.
01:56:33.000 Alright, let's see.
01:56:34.000 Travis Ruiz says, Hey Tim, thanks.
01:56:35.000 I was an active DNC supporter until I saw what you were talking about.
01:56:38.000 Facebook is building a data center in Huntsville, Alabama.
01:56:41.000 The FBI is also building here.
01:56:43.000 And it's... And then he made an emoji face.
01:56:45.000 Hmm.
01:56:46.000 Yeah, I don't like the Republican Party, obviously.
01:56:49.000 I don't like the Democrats.
01:56:51.000 That's why she didn't vote for these people.
01:56:52.000 You know, Donald Trump was different.
01:56:55.000 But not like Donald Trump is necessarily a good president.
01:56:57.000 Look, I think in terms of certain issues that I've talked about, particularly war and dealing with critical race theory, he's been a lot better than any president in my lifetime.
01:57:06.000 Like, no new wars.
01:57:07.000 Yep.
01:57:08.000 He's had problems with some conflicts, but it's been, you know.
01:57:11.000 Once I found out about the drone war and his escalation and secretization of the drone wars, I lost a lot of respect for Donald Trump.
01:57:18.000 And that was only like a few months ago when I was talking to Luke about it on the show.
01:57:22.000 I didn't realize that he had secretized government authorization of drone strikes now.
01:57:26.000 It's like on the high command of the military.
01:57:29.000 Dan Scope says, Order 66 has been called.
01:57:32.000 The Jedi, defender of law and order, have become enemies of the Republic and must be removed.
01:57:36.000 Next step is transfer emergency powers to the Chancellor to get us through the crisis.
01:57:40.000 I just watched Revenge of the Sith the other night.
01:57:45.000 Man, it's so different watching that movie now that I'm older.
01:57:47.000 I can't remember, when did that come out?
01:57:49.000 Like 2000s?
01:57:49.000 2001 is when the first one came out.
01:57:53.000 The first prequel?
01:57:53.000 Yeah.
01:57:54.000 But man, the dialogue is so corny and like, oddly acted.
01:57:57.000 I mean, what's his name?
01:57:59.000 George Lucas?
01:58:00.000 Just not a good dialogue writer.
01:58:01.000 I don't think so.
01:58:02.000 And his directing, making him act that way was weird.
01:58:05.000 But it is, I don't know, I think it's an interesting, you know, analog in a sense, or analogy.
01:58:12.000 Like, Order 66.
01:58:14.000 They're now purging people.
01:58:16.000 I wouldn't necessarily call the people being purged Jedi, you know, because it's, you know, it's nuanced politics here.
01:58:21.000 Force sensitive?
01:58:23.000 No.
01:58:24.000 Well, depending on who's getting banned, you could say that there are people who are perspicacious.
01:58:29.000 What's that mean?
01:58:31.000 An acuteness to comprehension of reality.
01:58:33.000 Like, oh, that's awesome.
01:58:34.000 Yeah.
01:58:35.000 Perspective.
01:58:36.000 Yeah.
01:58:36.000 Right.
01:58:36.000 Yeah.
01:58:37.000 Perspicacious.
01:58:38.000 Yes, let's see.
01:58:40.000 Nate Hammer says, removal from office after impeachment requires 67 senators, two-thirds, to vote for it.
01:58:44.000 That's right.
01:58:45.000 So a simple majority would not be enough.
01:58:47.000 It's not going to happen.
01:58:48.000 I mean, I really don't think so.
01:58:50.000 There are people posting something.
01:58:52.000 I've seen some Democrats post this.
01:58:54.000 They said something to the effect of, Josh Hawley and these other Republican members of Congress will gladly accept a Trump fundraiser down the line.
01:59:05.000 And I'm like, yes, they will.
01:59:08.000 What do you mean, Trump fundraiser?
01:59:08.000 What they're trying to imply is it's a bad thing that must be stopped.
01:59:10.000 What do you mean, Trump fundraiser?
01:59:12.000 Like in the future Trump will hold a fundraiser for political candidates.
01:59:15.000 And then they'll go and shake his hand and all his supporters will be there and some
01:59:19.000 outlets have said he'll be a kingmaker.
01:59:20.000 He'll choose the winners and losers when it comes to the Republican Party because they
01:59:23.000 love Trump.
01:59:25.000 And the argument from these people who are posting about it is that it's a bad thing
01:59:27.000 that must be stopped.
01:59:29.000 What did ABC say?
01:59:31.000 ABC wrote an article saying, how do you cleanse the Trump movement from the Republican Party or whatever?
01:59:37.000 Yeah, that's scary language.
01:59:39.000 Exactly, right.
01:59:41.000 Key Lo says, if they expel all the senators and reps that supported Trump, how do you think the states they represent will react?
01:59:49.000 Hmm.
01:59:49.000 Secession?
01:59:50.000 No taxation without representation?
01:59:52.000 If they expel representatives from a state that supported Trump, that would be crazy.
01:59:56.000 The senator?
01:59:57.000 There's no representation for the state.
01:59:58.000 It would be outrage.
01:59:59.000 What were you talking about before about sort of the governmental sort of check that they're doing on people's beliefs with cops?
02:00:07.000 Oh, Luke brought that up.
02:00:08.000 Luke brought that up.
02:00:08.000 Okay.
02:00:09.000 Well, I'm just looking at my Twitter.
02:00:10.000 Sorry, I was distracted.
02:00:11.000 There was something going on.
02:00:12.000 I just lost a thousand followers just now.
02:00:14.000 Mass purge going on.
02:00:15.000 It's crazy.
02:00:15.000 But you were saying that the cops are going to be vetted now?
02:00:18.000 They're going to go and check them?
02:00:19.000 Well, there was a new report that the Capitol Police officers will be investigated for ties with white supremacy after Debbie Wasserman Schultz, Congresswoman, released a statement saying that she thinks that there was insider... It was an inside job?
02:00:33.000 Yeah.
02:00:34.000 She thinks the U.S.
02:00:36.000 Capitol was an inside job?
02:00:37.000 No, she thinks that there was officers who helped people get in.
02:00:40.000 They did.
02:00:40.000 It's on video.
02:00:42.000 And she thinks that some of them had ties to some of the protesters and now we're getting information that all the officers will be investigated for, quote, ties of white supremacy.
02:00:50.000 But what does white supremacy mean?
02:00:52.000 We've heard that argument before.
02:00:53.000 We've heard people called Nazis for just the simplest, smallest, littlest microaggressions.
02:01:00.000 So again, who knows?
02:01:02.000 I'm seeing rumors that Cloudflare has removed 4chan, but I'm not... I saw it earlier, I wasn't able to confirm it.
02:01:10.000 Cloudflare, that's like Amazon Web Service, right?
02:01:12.000 They're like a CDN.
02:01:14.000 This is slowly becoming China when it comes to their control of the internet.
02:01:19.000 If you look at what happens when a small group of people control the internet, you essentially have China.
02:01:24.000 Then you essentially have the social credit score.
02:01:26.000 Then you essentially have them literally using American Twitter to talk about how great it is that Uyghur women are no longer baby-making machines and that it's great for gender equality that they have pretty much essentially concentration camps for them.
02:01:41.000 It's so important to contextualize how the rest of the world is looking at us right now.
02:01:47.000 OK, yes, people are probably disgusted with what happened at the Capitol, but like people in oppressive authoritarian government regimes are looking at this censorship and like being like, what are you doing?
02:02:00.000 I mean, our problems are just so much less than what's going on in these countries where they can't even go on the Internet at all.
02:02:09.000 And we're banned.
02:02:10.000 Our companies, our private companies, are banning people from the Internet.
02:02:14.000 We've got major.
02:02:15.000 Major breaking news.
02:02:16.000 I don't know if you guys saw this earlier, but Olive Garden put out a statement about Sean Hannity and banning him from the Never Ending Pasta Pass.
02:02:23.000 Oh no!
02:02:23.000 His viscous attacks.
02:02:25.000 Uh, we have a statement from Sean Hannity saying, I never signed up for Olive Garden's never-ending pasta pass.
02:02:31.000 Hannity says it's fake news.
02:02:33.000 How can you ban someone from something they didn't have?
02:02:35.000 I don't know.
02:02:35.000 I don't think Olive Garden ever actually said it.
02:02:37.000 I think someone made a graphic because it was hilarious, this idea, like, we're- They misspelled vicious.
02:02:41.000 Oh, what in the- To viscous, yeah.
02:02:43.000 Oh, to viscous.
02:02:43.000 Everyone was mocking it.
02:02:44.000 Matt Taibbi was mocking it.
02:02:46.000 Interesting.
02:02:47.000 That's like who got banned from some service from like monkey something they didn't have.
02:02:53.000 MailChimp.
02:02:53.000 Yes.
02:02:53.000 Who got banned from MailChimp?
02:02:54.000 Enrique, yeah.
02:02:55.000 They didn't even have a MailChimp.
02:02:56.000 Yeah, so they just claimed it because activists claimed he did.
02:02:59.000 So then they announced they banned him even though he was never with the service.
02:03:02.000 Yeah.
02:03:03.000 Let's see.
02:03:04.000 Airsoft Master says, Hey Tim, if Google, Facebook, and Twitter kind of companies keep heading the way they are going in regards to limiting speech, do you think we will ever be able to backtrack to before all of this, or is it to the point of no return?
02:03:16.000 Look, if you only ever ban more and more people, then eventually there's no more people left to ban.
02:03:22.000 And it'll be Jack Dorsey sitting in a small room going like, I think, um, um, there's, my opinion is bad.
02:03:29.000 I'll ban myself.
02:03:29.000 And then it's an empty server with nobody in it.
02:03:32.000 They're going to build AI for you to interact with.
02:03:35.000 If you want a social network where they're still populated, it'll just be a bunch of artificial bots.
02:03:39.000 I'm wondering when they're going to build Love Simulator, the video game.
02:03:42.000 Some people say TikTok was allegedly doing that, but who knows?
02:03:47.000 Wow.
02:03:49.000 When you ban everybody, then nobody is banned.
02:03:51.000 I guess technically that doesn't work because then nobody will be on the platform.
02:03:54.000 But imagine Twitter just bans everyone.
02:03:57.000 And then what happens when the left can't actually get to an argument anymore?
02:04:00.000 So they start going to Parler.
02:04:02.000 Because they've started doing it.
02:04:03.000 They actually go there.
02:04:03.000 And then they post screenshots laughing about, you know, owning the cons or whatever.
02:04:07.000 And like the stupid things they post.
02:04:08.000 And I'm like...
02:04:09.000 There you go!
02:04:10.000 So it'll be like digital drive-by arguments where like Twitter will be left-wing and Parler will be right-wing and then someone on Twitter will go to Parler and then say something and screenshot it and go back to Twitter and post it.
02:04:21.000 It looks like they banned Rush Limbaugh.
02:04:23.000 Whoa.
02:04:24.000 Yeah.
02:04:24.000 Yep.
02:04:25.000 Good times.
02:04:26.000 Yeah.
02:04:26.000 He's totally gone.
02:04:26.000 I don't think we're ever going to be able to go back to where we were.
02:04:29.000 No way.
02:04:30.000 We'll be able to move forward to a different dimension, like a different way of Internet.
02:04:36.000 Like, I don't think the centralized proprietary services are going to be the future of social media.
02:04:42.000 It'll be more of a, you know, decentralized.
02:04:44.000 I like this blockchain, you know, mesh net type.
02:04:50.000 I think there is a large group of people who do want to talk to people who are different from them and, you know, rational Democrats, Republicans, people on the left or right who want to go and find someone that's different from them.
02:05:02.000 There's a pocket of people that exist like that.
02:05:04.000 It's probably a small group.
02:05:06.000 That's more what we're trying to do on Minds.com.
02:05:08.000 Have the conversation cross-spectrum, be open to both sides, not just one side or the other. But, you know, it is definitely, you know, you can either ride the divide, and that's what all of these big networks are doing, and that's what some alternatives are doing, or, you know, you can try to bridge people together, but it's way harder.
02:05:26.000 We got a super chat here from Christopher Yager.
02:05:28.000 He says, Tim vastly overestimates the degree to which the right would be using social media and internet to organize in a civil war scenario.
02:05:35.000 Not necessarily.
02:05:36.000 I just think it's a powerful tool.
02:05:38.000 But with that being said, there's a really interesting story I read once that, I don't know if it's true or not, but I read it in the context of nonfiction.
02:05:44.000 I think it was in a magazine or something or some website.
02:05:46.000 They talked about how there's, you know, modern warfare, and then there's the archaic forms of warfare we used to have.
02:05:53.000 And they were doing a training scenario where they brought in a retired, you know, general or high-ranking officer.
02:05:58.000 To lead a group to do a war game scenario against the current, you know, military, the modern warfare.
02:06:06.000 And the modern group, with all their new technology and everything, lost to rudimentary and old school tactics and technologies.
02:06:13.000 And what they did was, the linchpin for how the retired guy defeated the modern army was that the modern groups were relying on digital technology for communication.
02:06:22.000 And so the old school guy slipped a note into the pocket of a guy on a motorcycle to transfer the orders, and they didn't know how to track what was being done or what they were saying or what they were going to do, and they were trying to monitor communications through radio, and it was just a guy on a bike with a note in his pocket.
02:06:37.000 Gave him the orders, and then took him by surprise.
02:06:39.000 It could be just an apocryphal story about not forgetting your fundamentals.
02:06:44.000 Maybe it's not true.
02:06:45.000 But it was a much, much, much longer story.
02:06:48.000 Maybe someone online has heard that story before.
02:06:50.000 But it's really interesting.
02:06:51.000 Because it makes sense.
02:06:52.000 You get caught up in what you expect, and you ignore the simple solutions.
02:06:57.000 So yeah, ham radio.
02:06:58.000 People talked about that.
02:06:59.000 And ham internet.
02:07:00.000 Do you guys know about that?
02:07:01.000 No.
02:07:02.000 I don't know a whole lot about it, but people using ham radio to get really...
02:07:07.000 Some internet signals.
02:07:08.000 Yeah, but it's real slow, but you can send like characters and text and stuff.
02:07:12.000 Yeah, using ham.
02:07:13.000 Yeah.
02:07:14.000 Yep.
02:07:15.000 Deniz Atik says, Luke, where did you get your shirt?
02:07:18.000 Oh, that's such a nice, great question.
02:07:20.000 I really appreciate that.
02:07:22.000 Yes.
02:07:22.000 You could get my shirt on wearechange.org forward slash shirts, and they go towards keeping me free and independent and here.
02:07:29.000 So thank you guys so much for, uh, where, you know, buying and wearing my shirts.
02:07:33.000 It means a lot to me.
02:07:34.000 Dennis also says, Tim, you would make my day if you said, I am a gorilla in a deep voice.
02:07:39.000 I did my best.
02:07:40.000 Um, we do have the I am a gorilla shirt coming.
02:07:42.000 It is done.
02:07:43.000 It's on the Teespring store.
02:07:44.000 In the description below is the link to the Teespring store.
02:07:46.000 It should be there.
02:07:47.000 You should be able to see it.
02:07:48.000 And then YouTube has a separate approval process.
02:07:51.000 And then once it's there, we'll feature it.
02:07:52.000 And then the next one coming is the I am a gorilla love yourself.
02:07:55.000 See, you know, the shirts we're making are silly.
02:07:57.000 Like me with a bubble pipe saying Harumph.
02:07:59.000 And Luke's got these very serious, like the world, the apocalypse is here and we're all doomed.
02:08:03.000 Essentially, but they sell pretty good and I think it's a great way to meet people.
02:08:08.000 When I wear the shirt, especially the toilet paper one, it starts conversations. I have one that shows the pyramid of control, and on top is the toilet paper manufacturers,
02:08:17.000 then underneath is the Illuminati, and then underneath is the CIA and the media.
02:08:23.000 But again, it starts conversations, which is important because then you can see someone is a part of your tribe.
02:08:28.000 Someone is thinking the way you are.
02:08:30.000 And it's a great way to build a community.
02:08:32.000 I mean, when I'm walking around, people are like, man, I love that shirt.
02:08:35.000 I love that hat.
02:08:36.000 And I'm able to talk to them and know someone in the community that is thinking the way that I am.
02:08:41.000 So it's a great way of bringing people together.
02:08:43.000 Garhentz says, Tim, the military one is the Millennium Challenge 2002, and it's Lieutenant General Paul Van Riper.
02:08:50.000 I'm sure I got a lot of the story wrong, because it's like 2002.
02:08:52.000 I probably haven't read it in a long time, but that's the gist of it.
02:08:55.000 So I'll look into that to see if that's the story, because it was a really cool story when I read it.
02:08:58.000 Let's see.
02:08:59.000 Scott Brumley says, any truth to Google and Apple banning the Parler app?
02:09:04.000 Yeah.
02:09:04.000 Yeah.
02:09:05.000 I don't know if Apple's done it yet, but I know Google has.
02:09:07.000 Yeah.
02:09:07.000 And you've already talked about that.
02:09:09.000 Minds has been through this before.
02:09:10.000 Yeah, you have to basically... To the Parler people, here's my recommendation for how to get out of the Google thing.
02:09:18.000 Well, email them and say, hey, did you see... Well, for us, we got banned because of an explicit image of a woman naked.
02:09:26.000 It was behind a blur, and I just emailed them back after six months, and I was like, you realize Twitter has full porn?
02:09:33.000 And they were like, oh yeah.
02:09:35.000 And so I think you just gotta find someone inside.
02:09:39.000 Honestly, it's the only way.
02:09:40.000 But they were accusing you guys of not moderating or something?
02:09:43.000 Yeah.
02:09:43.000 But you do moderate.
02:09:44.000 Yeah, we do, of course.
02:09:45.000 We have deep NSFW filtration tools, and I don't know if Parler has that, but you need that.
02:09:52.000 I'm pretty sure Parler has pretty strict rules.
02:09:55.000 Yeah, I thought that they used to be more strict.
02:09:57.000 They were doing like FCC policy.
02:09:59.000 That's what I thought they were doing.
02:10:00.000 Then they switched.
02:10:01.000 Oh, yeah, I didn't know that.
02:10:03.000 Yeah, so it was... people were getting banned for stuff that wasn't that crazy.
02:10:06.000 Like it was worse than Twitter.
02:10:08.000 Like swearing.
02:10:09.000 Yeah, they were banning people for, like, dropping F-bombs in comments.
02:10:11.000 Yeah, because it was FCC broadcast standard.
02:10:14.000 And it made sense.
02:10:14.000 Like I got the idea.
02:10:15.000 I don't think it makes sense.
02:10:16.000 No, I mean, someone had the idea: what if we use a TV standard? That way people can say things you can't say on Twitter, but we still have moderation, and that should be satisfactory, because TV does it.
02:10:27.000 Like, you can turn on TV shows and they, I'll tell you this, Cobra Kai, right?
02:10:31.000 You guys ever see Cobra Kai?
02:10:32.000 A little bit.
02:10:33.000 I watched the first season, it was awesome, I just stopped watching after that because, you know, I don't really watch a whole lot of TV.
02:10:37.000 But they said things in that show I can't say on YouTube.
02:10:40.000 But it was a YouTube original on YouTube!
02:10:43.000 How does that make sense?
02:10:44.000 Pre-vetted.
02:10:45.000 That's crazy.
02:10:46.000 I like the guy that plays Johnny.
02:10:47.000 He's a good actor.
02:10:48.000 Yeah, it's a good show.
02:10:49.000 It's a good show.
02:10:51.000 TJL431 says, I'm just joining and I know it's late, but have you discussed Google has taken down Parler?
02:10:55.000 It doesn't work on my Android anymore.
02:10:56.000 Wait, it doesn't work on your Android?
02:10:58.000 I know you can't go to the store and get it anymore.
02:11:00.000 It wasn't working because it was overloaded.
02:11:03.000 So Lauren says, Tim, you said to buy Ethereum and the price jumped 0.15%.
02:11:07.000 Well, just before the show, we were talking about crypto like before we went live and Bill was like, oh, Ethereum, man, you got to buy it.
02:11:13.000 And I was like, you think I should?
02:11:14.000 And you're like, oh yeah.
02:11:15.000 And I was like, okay.
02:11:16.000 And then I bought it.
02:11:17.000 So then we were talking about crypto.
02:11:18.000 I was like, you know, Bill mentioned buying Ethereum.
02:11:20.000 I have Ethereum.
02:11:21.000 I'm not a big fan of talking a lot about crypto, but it needs to be talked about.
02:11:25.000 But the risk is there's a lot of people who try and like talk about something because they want the price to go up.
02:11:30.000 That's stupid.
02:11:31.000 Can you explain the proof of work to proof of stake that they're doing?
02:11:34.000 Yeah.
02:11:34.000 So, I mean, Ethereum is proof of work like Bitcoin.
02:11:37.000 All the miners are running around the world and basically securing the network and the miners are earning money from that process.
02:11:45.000 But they're moving towards more of a proof of stake.
02:11:46.000 So you will in the future be able to mine Ethereum on your laptop.
02:11:49.000 Oh, I see.
02:11:50.000 So you can stake your 32 ETH on your laptop.
02:11:53.000 And it's more decentralized, theoretically more secure.
02:11:57.000 But some people like proof of work.
02:11:58.000 Bitcoin is, you know.
02:12:00.000 It basically just means, like, mining is the way Bitcoin produces coins.
02:12:05.000 Ethereum is... It's the same, currently, but it's going to be transitioning.
02:12:10.000 By holding a certain number of the coins, you facilitate the Ethereum network.
02:12:13.000 But you have to stake them into the protocol and run a program on your computer.
02:12:17.000 Oh, but that takes up energy and costs money.
02:12:20.000 Yeah, but you can run it on a laptop.
02:12:22.000 The thing with Bitcoin mining, you need serious equipment to be able to have cost benefit.
02:12:27.000 What is the program called?
02:12:29.000 It's just the ETH protocol.
02:12:31.000 An ETH full node.
02:12:32.000 You'll have to explain it to me, I guess.
02:12:36.000 Oh, and basically you earn interest.
02:12:38.000 You earn like 6% a year by staking your ETH.
02:12:42.000 And you need 32 of it.
02:12:44.000 Yeah.
02:12:45.000 Where do you need to put it in order to start staking it?
02:12:47.000 I'm honest.
02:12:48.000 I honestly don't know.
02:12:49.000 You get 6%.
02:12:50.000 Yeah, you get 6%.
02:12:51.000 That's huge.
02:12:52.000 Yeah.
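For anyone who wants to see the arithmetic behind the staking numbers mentioned here, a minimal Python sketch follows. It assumes a flat yield of roughly 6% compounded once a year on the 32 ETH minimum; real validator rewards vary with total staked ETH, uptime, and network conditions, so the figures are illustrative only.

# Minimal sketch of the staking math discussed above (illustrative only).
# Assumptions: flat ~6% annual yield, compounded once per year, on the
# 32 ETH solo-validator minimum. Real rewards vary with network conditions.

STAKE_ETH = 32.0
ANNUAL_YIELD = 0.06

def staking_balances(stake_eth: float, annual_yield: float, years: int) -> list[float]:
    """Return the staked balance at the end of each year, compounding annually."""
    balance = stake_eth
    out = []
    for _ in range(years):
        balance *= 1 + annual_yield
        out.append(balance)
    return out

for year, bal in enumerate(staking_balances(STAKE_ETH, ANNUAL_YIELD, 5), start=1):
    print(f"year {year}: {bal:.2f} ETH")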
02:12:53.000 And there's actually a cool, this is a little bit of a name drop, but particularly for Bitcoin, you can earn interest at this site BlockFi.
02:13:02.000 This guy, Anthony Pompliano, is on their board.
02:13:04.000 Definitely anyone interested in Bitcoin, check out Pom.
02:13:07.000 He's an animal.
02:13:08.000 He's amazing.
02:13:09.000 I went on his podcast a couple years ago.
02:13:11.000 But with BlockFi, you can earn interest on your Bitcoin and ETH just by holding it there. Now, granted,
02:13:17.000 I'm not necessarily recommending that, because you're putting custody with them. But then, I mean,
02:13:21.000 Gemini and Coinbase, you're giving them custody too. So, right, the beauty of crypto
02:13:24.000 is you can hold it on your own device, you know. And you're saying you've got a Ledger. Yeah, which is great.
02:13:28.000 You can do that, put into cold storage and you can have sovereignty or you can use these services that can give you interest.
02:13:35.000 But with ETH, it's moving towards more of like decentralized finance and there's all these protocols like Uniswap and whatnot where you can plug into and earn interest by providing liquidity.
02:13:45.000 And it's like a whole new financial system that is blowing up.
02:13:49.000 There's even like decentralized insurance protocols and lending protocols.
02:13:54.000 You know.
02:13:55.000 Have you considered doing it with the Mines token?
02:13:57.000 Minds is an Ethereum-based token, and it's on... yeah, we are doing that.
02:14:02.000 So you'll be able to earn interest in Minds tokens by storing it and by staking it?
02:14:06.000 You will be able to in the future.
02:14:07.000 Excellent.
02:14:08.000 Wait, really?
02:14:09.000 Look at this guy.
02:14:10.000 He's got a smirk on his face when I said that.
02:14:11.000 It's not out yet.
02:14:12.000 That's a teaser.
02:14:13.000 That's a teaser.
02:14:14.000 But definitely check out Minds.com slash token if you want to learn more about that.
02:14:17.000 Or Minds.com, which is a platform you can use for social media as the great purge takes place.
02:14:22.000 Yeah, and our whole thing has been to help pay people.
02:14:24.000 And we're doing rev shares as well with Minds Plus.
02:14:27.000 Minds.com slash plus.
02:14:28.000 We're taking 25% of the revenue of the company and proportionally sharing it with all of the users who submit content to that.
02:14:34.000 Wow.
02:14:35.000 And, yeah, so, you know, fiat, crypto.
02:14:38.000 That's the biggest thing that helps, you know, YouTube maintain its position, is that it's where you can have a job.
02:14:44.000 It's where you can make money.
02:14:45.000 So that's the big challenge.
02:14:47.000 Most of these networks can't handle it.
02:14:48.000 Google subsidizes it.
02:14:50.000 Absolutely.
02:14:50.000 That is the amazing thing about YouTube.
02:14:52.000 I mean, maybe crypto will be the key.
02:14:54.000 Maybe the value of crypto is going to go up so much that it makes a new... Yeah, the LBRY token went from like $0.02 to $0.10 in like the last couple months.
02:15:02.000 I can imagine, as the Minds utility token starts to gain... What's the trajectory of the Minds utility token?
02:15:08.000 What's your plan for the next couple of years for it?
02:15:12.000 Coming soon.
02:15:13.000 Oh, interesting.
02:15:14.000 But you can buy it.
02:15:15.000 Somebody just made a comment.
02:15:17.000 Stephen Vuro says, just sold my Ethereum and will buy Twitter options.
02:15:21.000 Puts, Twitter will tank soon.
02:15:23.000 Let's make some money, boys and lids.
02:15:26.000 I don't know if that was a good bet.
02:15:27.000 Listen, I don't have any stock in Twitter, but I wouldn't be surprised if Twitter tanks with Trump being gone.
02:15:34.000 And maybe it's not thousands of people being censored.
02:15:37.000 Maybe it's thousands of people leaving.
02:15:39.000 That's what I was saying.
02:15:40.000 Yeah.
02:15:40.000 Yeah.
02:15:40.000 Trump's gone.
02:15:41.000 So they're like, I'm out.
02:15:42.000 Yeah, I didn't even want to mention Twitter earlier in the show, just because of all this, what we're talking about today.
02:15:46.000 I was like, tweet me out.
02:15:48.000 I feel so dirty saying that.
02:15:49.000 You know what?
02:15:50.000 You feel better when you leave.
02:15:51.000 Come to the... Oh, I'm on Minds.
02:15:54.000 All the Trump reply guys, people who built careers off of waiting for Trump to tweet to say something dumb.
02:15:59.000 They're out of jobs.
02:16:01.000 All of these journalists.
02:16:02.000 There was one journalist from BuzzFeed.
02:16:04.000 She tweeted something like, my mornings were haunted by Trump: at 5 a.m. my phone would buzz and I'd have to see what it was.
02:16:09.000 Well, you are now being relieved of duty.
02:16:11.000 Congratulations.
02:16:11.000 Trump is gone.
02:16:12.000 You don't all have to wake up early in the morning when Trump tweets.
02:16:14.000 I feel bad for some of these journalists.
02:16:16.000 They're not all the Trump haunting, you know, rage bait.
02:16:19.000 There's like some legit reporters who are like, they're going to make me write about this, aren't they?
02:16:23.000 This is so dumb.
02:16:24.000 And they're going to be like, look, people want to know what Trump is saying.
02:16:26.000 It's all over.
02:16:27.000 It's gone.
02:16:28.000 The idea that people are going to chase him just to complain about him.
02:16:33.000 Yeah, it's going to happen.
02:16:35.000 Dude, I worry about his mental health.
02:16:36.000 I'll be honest.
02:16:37.000 I feel like he gets, he gets bullied hard and he is a bully.
02:16:41.000 And so he, he asked for it full out.
02:16:43.000 But the way that people treat him, it's like your family member whose attitude you just can't stand.
02:16:48.000 There's something about them where you just can't talk to them, because they're so annoying and they always need
02:16:52.000 to win. He is that. But at the same time, people
02:16:57.000 need to realize that that's his personality, and... yeah.
02:17:00.000 Well, he's also on social media a lot, and
02:17:02.000 anyone who's on social media and doesn't take a break, that has an effect on your mental well-being.
02:17:08.000 I always recommend, and I always do this personally myself: once a year, at least, take one week or two weeks.
02:17:13.000 No cell phone, nothing. No Facebook, no Twitter, no Instagram, no e-thotting, nothing.
02:17:19.000 Two weeks, clear.
02:17:21.000 Somebody just made a comment, we were talking about this earlier.
02:17:24.000 Aurora Diaz says, in Rwanda, the media called on the public to kill their Tutsi neighbors and the moderates that defended them.
02:17:30.000 Are we heading towards genocide?
02:17:33.000 You know what's funny?
02:17:35.000 People always think it can't happen here.
02:17:37.000 And I was reading about... You mentioned this before the show, Ian.
02:17:41.000 The Jews fleeing Germany.
02:17:42.000 Yeah.
02:17:44.000 And I was reading about... A lot of them did.
02:17:46.000 A lot of Jews in Germany left once they saw things getting crazy.
02:17:49.000 I used to wonder, why didn't they all just leave before when they saw it getting crazy?
02:17:53.000 And I'm not a historian or anything, but I was reading an article that said they thought it can't happen here.
02:17:58.000 And so they just...
02:17:59.000 Dude, if Don Lemon, no offense Don, I'm not parrying you, but if someone went on the news and said that, to go kill people, it would happen.
02:18:06.000 That's crazy that people would go out there and hunt them down.
02:18:09.000 The people are that animalistic.
02:18:11.000 Listen, look, when Cuomo is on, as CNN is basing their ratings, like predicating their strategy upon demonizing 75 million people, they know they've lost the audience and they've given up.
02:18:25.000 So instead of saying, let's make a network that is more balanced so we can communicate to as many people as possible to make money, they said, now we're not going to get those people, just get the others.
02:18:33.000 What happened was the polarization started getting so extreme that networks started picking a side that would make them money.
02:18:39.000 Because the center don't pay that well.
02:18:40.000 Let's be real.
02:18:41.000 That is the problem.
02:18:42.000 That's honestly our, the demon that haunts us.
02:18:45.000 It's like we're trying to play the center role and not polarize.
02:18:49.000 But people love the drama.
02:18:51.000 They love the extremes.
02:18:52.000 Well, the algorithms love it too, and they incentivize it.
02:18:55.000 And I've been saying... You have to train yourself to want to see people's opinion on the other side.
02:18:59.000 It's actually a reflex that you have to build.
02:19:01.000 I've always followed left and right.
02:19:03.000 And that's one of the biggest problems they have.
02:19:06.000 The left doesn't follow the right.
02:19:08.000 The right follows the left.
02:19:09.000 That's a common theme we see on major social platforms.
02:19:12.000 And so I'll see BuzzFeed writers tweet about Trump, and then I'll see a Trump supporter tweet about Trump, and I'll be like, Ah, I see what they're saying.
02:19:18.000 And then when I tweet, the craziest thing is, I'll tweet something like, you know, Ted Cruz condemned the violence and called for reconciliation.
02:19:27.000 And then AOC immediately responded with, you should be expelled and resigned.
02:19:31.000 But the left doesn't see what the right is talking about.
02:19:33.000 And they're like, but AOC is right.
02:19:35.000 He should be.
02:19:35.000 And I'm like, yes, you're part of the, you're the problem.
02:19:37.000 They call that the demand for escalation.
02:19:40.000 And what's really interesting is even during Obama's first administration, when he was still the hope and change guy, end the wars, bring back privacy for the individuals, I had a subsection of my audience that was like, just admit it, you should be an Obama supporter.
02:19:53.000 I was like, no, some of the things he's saying and promising is good, but it won't happen.
02:19:58.000 Same thing with Donald Trump.
02:19:59.000 People are like, just support Donald Trump.
02:20:01.000 Just do it.
02:20:01.000 I'm like, no.
02:20:03.000 He's sitting down with Kissinger.
02:20:04.000 I criticized him throughout his presidency, but now people are like, he's not doing anything.
02:20:09.000 They're not doing anything regarding Joe Biden.
02:20:12.000 There is no spirit of hope and change.
02:20:14.000 There is no spirit of, of people that baseline support him.
02:20:19.000 So it really, really, really makes you wonder what's going on.
02:20:22.000 He's the meh president.
02:20:23.000 Yeah.
02:20:24.000 No, I think it works out really well for the far left.
02:20:28.000 They didn't like Trump.
02:20:29.000 They don't like the populist right.
02:20:31.000 They were able to get rid of the populist right while putting in a very weak president, which they say is easier for them to overthrow.
02:20:40.000 So what they did was they created a universal enemy for the populist faction.
02:20:46.000 But he didn't sound weak today when he compared US representatives.
02:20:50.000 No, he did.
02:20:50.000 He did.
02:20:51.000 He actually did.
02:20:52.000 Weak?
02:20:52.000 You think that was weak?
02:20:52.000 He was mumbling and like... He always mumbles.
02:20:55.000 When doesn't he mumble?
02:20:57.000 Him sounding weak isn't the issue.
02:20:59.000 It was the demonization.
02:21:01.000 It was very strong words.
02:21:02.000 Yeah.
02:21:02.000 Indeed.
02:21:03.000 You know, I went down, I, I went down, uh, I, you know, the, the shop, I, I went down the shop, uh, I was at the shop.
02:21:07.000 not hearing him. What the journalists do is they translate for Biden. Joe Biden could
02:21:11.000 say like, you know, I went down, I went down, I went down the shop, I was at the shop. And
02:21:18.000 then the article will say, quote, I was at the shop. They cut out all the struggling.
02:21:24.000 And then, and then they say, but he has a stutter.
02:21:26.000 It's like, come on, dude.
02:21:27.000 That's not stuttering when you say the things he's saying.
02:21:30.000 The speeches he was given just a few years ago compared to the speeches now, you see a big, big difference.
02:21:36.000 Big time.
02:21:37.000 So look, the rhetoric is, they bring up Rwanda and the Tutsi and stuff.
02:21:43.000 You know what, man?
02:21:45.000 We're not there right now, but they've been saying for a long time to kill the Nazis.
02:21:51.000 And Twitter allows this.
02:21:53.000 Do they say that?
02:21:54.000 Kill them?
02:21:55.000 Yes.
02:21:55.000 Really?
02:21:56.000 Yes.
02:21:56.000 That explicitly.
02:21:57.000 That's like illegal, constitutionally illegal, right?
02:21:59.000 Well, it started with punch.
02:22:01.000 Punch, yeah, I remember that.
02:22:02.000 And so the issue is, when I was on with, you know, it was two years ago now, with Joe and Jack.
02:22:09.000 There was a tweet from Antifa explicitly advocating for violence, and I said, this has been reported hundreds of times probably, so it won't be removed.
02:22:17.000 And then Joe pulls it up, and it's like, oh yeah, wow, they're like explicitly telling people to go take an action and go do something illegal.
02:22:23.000 Twitter won't remove it.
02:22:24.000 So what happens when, it's not so much about people going on TV and doing it, but on Twitter, they're literally doing it right now.
02:22:30.000 I bet you can go on Twitter, you can pull it up, you'll find it.
02:22:32.000 Right.
02:22:32.000 And then what happens is they're going to start saying, and they've always been saying that Trump supporters
02:22:37.000 are Nazis.
02:22:38.000 Trump is Hitler.
02:22:39.000 And so what happens when they go on and they say, it's not the Tutsis this time, it's the Nazis.
02:22:44.000 They go on the media and say, you have to go, stop these people before it's too late.
02:22:49.000 So if you say to go kill a type of person, and then you say that guy is that type of person,
02:22:56.000 you're essentially saying- But Twitter allows it.
02:22:59.000 Is it not constitutional?
02:23:00.000 I bet if you posted, you know, to take action against a communist, you'd be nuked in two
02:23:07.000 seconds.
02:23:08.000 But Nazis, different.
02:23:09.000 It's the Brandenburg test.
02:23:11.000 The Brandenburg test is the legal precedent for imminent violence.
02:23:17.000 What is it?
02:23:18.000 It's just, is it imminent or not?
02:23:21.000 Yeah.
02:23:21.000 And that means, is it true?
02:23:22.000 You said it was changed now to, is it a true threat of violence?
02:23:25.000 Yeah, it seems to be sort of changing.
02:23:28.000 I think in the state law of Pennsylvania, they were using the language true threat, but the Supreme Court precedent is the Brandenburg test.
02:23:37.000 Whether or not they're actually telling someone to do something right now.
02:23:40.000 That's so weird.
02:23:42.000 Well, we can take a couple more Super Chats, see what's going on over here.
02:23:46.000 People are talking about Rush Limbaugh getting suspended.
02:23:50.000 Let's see, Woody would like me to read his Super Chat.
02:23:52.000 Let me see if I can find it.
02:23:54.000 So look, I apologize to a lot of people.
02:23:56.000 When we get slammed with Super Chats, it's huge.
02:23:59.000 And then we can't actually track everything.
02:24:02.000 So I'll try and see if I can find this Super Chat.
02:24:07.000 And when you have thousands, it just becomes... It's a lot tonight.
02:24:11.000 Thank you, guys.
02:24:12.000 Oh, here we go.
02:24:13.000 Woody says, Tim, I understand non-violent civil disobedience, but the point of 2A, to bear arms, is to provide that check on government tyranny.
02:24:21.000 My question is, when is such action necessary?
02:24:23.000 A misguided attempt from the mostly peaceful protesters, for sure, but where's the line?
02:24:28.000 Um, we had Vosch on the show, and he mentioned Nazi Germany.
02:24:32.000 And I think everybody would agree, if they're rounding people up onto trains to bring them to concentration camps to be, you know, max genocided, you'd probably have to fight back.
02:24:39.000 You have no choice.
02:24:40.000 Crazy thing is they didn't know where those trains were going.
02:24:42.000 Yep.
02:24:43.000 And then we have that bill coming out of New York.
02:24:45.000 Have you seen this one?
02:24:46.000 Was it A14 or something like that?
02:24:49.000 That says that they can remove and detain people suspected of having contact with someone who may have a communicable disease.
02:24:57.000 Any disease.
02:24:58.000 Now it's not passed, but it's been introduced.
02:25:00.000 Basically it would allow Cuomo the power, and anyone who signs the power, to remove anyone without legit cause.
02:25:07.000 Now, of course, the bill says they must have clear and present evidence of a communicable, you know, public health threat, epidemic, or contagion, or whatever.
02:25:14.000 What does that mean?
02:25:16.000 It means they're gonna be like, your delivery guy tested positive for COVID, so you are coming in the truck.
02:25:21.000 Yeah, and like, where is the truck going?
02:25:23.000 They didn't, that's the thing, the Jews didn't know.
02:25:25.000 They were like, hey, we're gonna put you on trains and take you to a resort.
02:25:28.000 They were telling them we're gonna take you to like another town or another place to set you up.
02:25:32.000 They didn't tell them they were gonna go take them, throw them in ovens.
02:25:35.000 So, yeah, maybe they should have used weapons to defend themselves, but they didn't know. And that's the problem.
02:25:41.000 What happens when someone comes to your house and says, we'd like you to come with us, sir.
02:25:45.000 And you just say, okay.
02:25:46.000 So that's, it's tough.
02:25:47.000 I don't know, man.
02:25:48.000 It's crazy that it's almost the perfect storm of reasons and rationale, with all the COVID stuff and all of the political stuff as well.
02:25:57.000 It's like there's these two major reasons that people are sort of getting isolated into these groups, and it's crazy that both are happening at the same time.
02:26:07.000 Yeah.
02:26:08.000 Well, I think it's fair to say it's going to get worse, not just because of the political collapse of the right, but because of the oncoming tsunami of financial consequences that are going to be there because of the lockdowns, because of this kind of larger idea of the Great Reset, which the Biden administration is going to be pushing, admittedly.
02:26:29.000 John Kerry said it.
02:26:30.000 Yeah, John Kerry, a part of Joe Biden's administration, admitted that the Great Reset is going to come faster and quicker than many people expected, and it's going to be done under the Joe Biden administration.
02:26:42.000 So when that happens, that's going to be another major ramification.
02:26:45.000 The major efforts to take away people's Second Amendment is going to be another major clash point, and we're headed towards a trajectory that is really, really dangerous for everyone, even if you're in the middle.
02:26:58.000 Especially, and again, not just if you're in the middle, but for people on the left as well: it's going to be against anyone not toeing the official line, not loving the government, not being obedient to them in every possible way.
02:27:11.000 So keep that in mind.
02:27:12.000 So we had somebody comment saying that it was really easy to, you know, advocate for peaceful, non-violent civil disobedience when you haven't had your life destroyed by the lockdowns and all that stuff.
02:27:23.000 And that's a fair point except, you know, a fair point in terms of the stress.
02:27:28.000 I just don't think what they did at the Capitol will actually make things better; it's going to make things worse.
02:27:32.000 It'll justify the lockdowns.
02:27:33.000 Like Gretchen Whitmer said when they protested the lockdown, she goes,
02:27:36.000 well, now we got to extend the lockdowns because you all came outside.
02:27:39.000 That's what you get.
02:27:40.000 So I guess what I advocate for is self-sustainability and independence and security.
02:27:46.000 Protect yourself, protect your family and your friends, learn how to survive and be self-reliant.
02:27:51.000 Try and get away from the cities to the best of your ability.
02:27:54.000 But I do feel that, you know, with the talk from Fauci and Bill Gates about this extending into 2022, going through another, what, year and a half or two years of this?
02:28:04.000 I think that statement alone is them telling us they intend for violence.
02:28:09.000 Because they know.
02:28:11.000 We've seen the mass rioting already, now on both sides.
02:28:13.000 The rage that came from this lockdown.
02:28:15.000 Also, that would insinuate that they're going to print another $30 trillion.
02:28:24.000 If they're saying the lockdowns are going to... Specifically, they said, new normal.
02:28:28.000 We'll be in this.
02:28:29.000 Normalcy won't come back until 2022.
02:28:32.000 Which means, yes, people either are going to get their stimulus checks, or they're going to get crazy.
02:28:37.000 Yep, it's almost like UBI is here.
02:28:39.000 Yeah, to a degree.
02:28:40.000 Like, they might just keep doing it on a regular basis.
02:28:43.000 And the idea might be, by giving people a UBI, but taking away their ability to work and produce things, you end up with people only being able to buy bare necessities.
02:28:54.000 It's almost like, if I were to imagine, It being on purpose, I'm not saying it is.
02:29:00.000 But it's almost like trying to sweat out a fever.
02:29:02.000 You ever hear that?
02:29:03.000 You know, you get a fever, so you throw all the blankets on and just sweat as much as you can to just end it.
02:29:08.000 But then you gotta rinse that salt off your skin.
02:29:10.000 Well, so the idea would be, they think the world is being depleted of resources, and it's true to an extent.
02:29:15.000 It is.
02:29:16.000 And there's fishery collapses, there's insect population collapses, there's some really scary stuff going on.
02:29:20.000 They fear climate change, and now the Great Reset explicitly talks about this.
02:29:24.000 And so the idea is simple.
02:29:26.000 Interesting.
02:29:26.000 It's really weird.
02:29:27.000 Give them only just enough to survive, to eat, and recalibrate them.
02:29:32.000 That's what they mean by the reset.
02:29:34.000 Interesting.
02:29:35.000 So everybody says they don't care about the movies.
02:29:37.000 Verizon tweeted the other day, do movie theaters have a place in the new normal?
02:29:41.000 And I quoted, I said, this is a really weird tweet.
02:29:44.000 It's really weird.
02:29:45.000 They deleted it.
02:29:46.000 It was really weird.
02:29:47.000 Yeah.
02:29:49.000 And I think most people said no.
02:29:51.000 It was a Twitter poll.
02:29:52.000 So, ultimately, they will probably have lockdowns going on for a few years.
02:29:58.000 They'll give people only just enough, and it'll be fought over relentlessly in Congress, and then I think some people will snap.
02:30:05.000 If the left doesn't get their $2,000 per month, they'll snap.
02:30:08.000 If the right sees the country printing and just essentially devaluing the dollar like crazy, and then they can't run their businesses and fulfill their own lives and purposes and have freedom, they'll snap.
02:30:20.000 So, I'm not optimistic about the future unless everything comes back to normal, which is probably not going to happen.
02:30:25.000 I don't think so.
02:30:26.000 We're at $28 trillion in debt, or uh... Close.
02:30:30.000 $27.7 trillion right now.
02:30:32.000 Closely approaching $28 trillion.
02:30:33.000 They had to print like $6 trillion this year.
02:30:36.000 In order to match that, it would have to be more next year because the dollar is worth less.
02:30:40.000 So you're looking at at least, like, 10 trillion next year. But we weren't even locked down this whole year.
02:30:47.000 So it'd be like 10 to 12 trillion next year.
02:30:50.000 And then that would extrapolate into 2022 to like another 18 or 20 trillion.
02:30:54.000 So like your dollar might be worth.
02:30:58.000 It's not just three times less.
02:31:00.000 It's like.
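To put rough numbers on the "your dollar might be worth" point that trails off here, a small sketch, assuming purchasing power scales inversely with the money supply and using the round printing figures thrown out in the conversation. The roughly $19 trillion starting supply is an assumption for illustration, not a number from the show.

# Illustrative dilution arithmetic, not a forecast.
# Assumption: purchasing power scales inversely with the money supply.
# BASE_SUPPLY (~$19T) is an assumed starting point; the yearly "printing"
# figures are midpoints of the round numbers mentioned in the conversation.

BASE_SUPPLY = 19e12
NEW_MONEY = {2020: 6e12, 2021: 11e12, 2022: 19e12}  # ~$6T, ~$10-12T, ~$18-20T

supply = BASE_SUPPLY
for year, printed in NEW_MONEY.items():
    supply += printed
    old_dollar = BASE_SUPPLY / supply  # value of a pre-expansion dollar
    print(f"{year}: supply ~${supply / 1e12:.0f}T, old dollar worth ~{old_dollar:.2f}")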
02:31:02.000 We should open the show with a debt clock calculator every time.
02:31:06.000 Put it in the corner, pop it up like the smash the like button.
02:31:10.000 The federal debt to GDP is now at 130.51%.
02:31:13.000 It was at 126 two weeks ago.
02:31:17.000 Yeah.
02:31:19.000 124.
02:31:19.000 And then Joe Biden announced his economic team today and his larger economic plans of spending more money to help deal with this.
02:31:26.000 An immediate, he says, an immediate $2,000 stimulus check.
02:31:29.000 To every person, to every 200 million people.
02:31:32.000 Um, 330 because they give kids, they give money for kids.
02:31:35.000 So like if you, if you're a family of five, you, your wife, and then all your kids get compensation.
02:31:39.000 I think kids will get less.
02:31:40.000 That's like, you know, a couple hundred bucks per kid.
02:31:42.000 But what is that going to do when the dollar's worthless?
02:31:45.000 People who have bought Bitcoin will be fine. In the land of the collapsed dollar, the man with Bitcoin is king.
02:31:49.000 So that's like $600 billion a month is what they're looking at.
02:31:53.000 $2,000 a month is what they're looking at.
02:31:55.000 I mean, Ilhan Omar has called for $2,000 a month.
02:31:58.000 Think about the mass amount of money they're printing every month if that was the case.
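As a quick sanity check on the monthly figure cited above, here is the arithmetic in a few lines. The 300 million recipient count is an assumption splitting the difference between the roughly 200 million and 330 million figures mentioned, and it ignores that kids might receive less.

# Quick check on the ~$600B per month figure (illustrative assumptions only).
RECIPIENTS = 300_000_000   # assumed recipient count
CHECK = 2_000              # dollars per person per month

monthly = RECIPIENTS * CHECK
print(f"~${monthly / 1e9:.0f}B per month")       # ~$600B
print(f"~${monthly * 12 / 1e12:.1f}T per year")  # ~$7.2T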
02:32:01.000 I think Canada implemented a kind of similar system.
02:32:03.000 I have to look that up, though, to be honest with you.
02:32:05.000 What's interesting is a lot of people are saying, you know, the other day I mentioned I bought Bitcoin and they're like, Tim's buying the top.
02:32:10.000 You know, you should wait till it goes down.
02:32:11.000 And I'm like, well, you can look at the massive spikes of Bitcoin in the past where it's broken all-time high, broken all-time high, and then it does fall down.
02:32:19.000 I think Bitcoin may go down, that's possible, but I also think those dips didn't happen right after a bunch of people stormed into the U.S.
02:32:29.000 Capitol building, and there was a chaotic transition and mass purging on social media, and 66% of all U.S.
02:32:38.000 dollars being printed in one moment.
02:32:40.000 So I kind of think people are buying Bitcoin in fear.
02:32:42.000 You have MassMutual, an insurance company, putting $100 million of their treasury into Bitcoin.
02:32:49.000 No, really?
02:32:50.000 You have Fidelity.
02:32:51.000 You have MicroStrategy, a publicly traded company, putting $500 million of their corporate treasury in because it is digital gold.
02:33:02.000 Ian, you mentioned the crypto market cap is now a trillion.
02:33:05.000 The market cap of gold is $9 trillion.
02:33:08.000 We're going there.
02:33:10.000 Bitcoin is eating the financial system.
02:33:13.000 It is the new printing press and Elon Musk, one of the world's, if not the world's richest man, hinted at even investing in it with Tesla.
02:33:21.000 I'd like to tell everybody something.
02:33:23.000 Okay.
02:33:25.000 Uh, what was it?
02:33:28.000 Bitcoin was at, what, like a dollar?
02:33:30.000 You could have walked outside.
02:33:33.000 Excuse me, sir.
02:33:33.000 Might I have a dollar from you?
02:33:35.000 I will pay you back.
02:33:36.000 Sure.
02:33:37.000 I don't care.
02:33:37.000 Keep the dollar.
02:33:38.000 Okay.
02:33:39.000 Bought one Bitcoin, and then just walked away.
02:33:41.000 You'd have $40,000 right now. $40,000!
02:33:46.000 You could buy a Tesla with one Bitcoin.
02:33:47.000 You could buy a Tesla with one Bitcoin.
02:33:51.000 Okay, in November, it was at 13.
02:33:53.000 It is at 40 now.
02:33:55.000 But the thing is, like I said, we are seeing people storm into the Capitol building.
02:33:59.000 We are seeing mass printing of the dollar.
02:34:02.000 It is not the same as it was last time.
02:34:04.000 Some people did sell off when it hit like 42, because it's like, whoa, they get scared.
02:34:09.000 And then I'm like, I'm buying the dip.
02:34:09.000 It's a big market money.
02:34:11.000 How can you have faith in a financial system that literally is just hitting zero on the keyboard?
02:34:16.000 Wow.
02:34:17.000 Wow.
02:34:18.000 One problem with crypto is the entire market is a trillion right now.
02:34:21.000 The entire crypto market cap of the world is one trillion.
02:34:23.000 And so it's going to get really big.
02:34:25.000 The US government just printed 6 trillion, so they could have bought, for all we know that market could have been co-opted, and it's being controlled, it probably is, by big money.
02:34:33.000 I was talking about this, we were talking about this before the show, the likelihood the US had the ability to just buy 51% of Bitcoin to control the network, perhaps, and you mentioned we won't know, and that's a good point.
02:34:43.000 I'll put it this way.
02:34:44.000 I am, I personally am confident in Bitcoin.
02:34:46.000 And I'll tell you this, if you mentioned, what are these companies?
02:34:49.000 Mutual Insurance?
02:34:49.000 Mass Mutual.
02:34:51.000 Fidelity.
02:34:51.000 JP Morgan just predicted that Bitcoin will hit like 150K.
02:34:56.000 I think Bitcoin's going to go over a million.
02:34:58.000 Yes.
02:34:59.000 Yeah.
02:34:59.000 Yeah, absolutely.
02:34:59.000 And I think not, I'm not going to say that I think it's going to happen.
02:35:03.000 I don't know when it'll happen, but I certainly think so.
02:35:05.000 Because if we're talking about a one trillion market cap for Bitcoin, and entropy, meaning a lot of Bitcoin just doesn't exist anymore...
02:35:12.000 It's already out of the supply.
02:35:13.000 It's capped at 21 million.
02:35:15.000 It's one trillion for the entire, all the cryptos combined.
02:35:17.000 Oh, all the cryptos.
02:35:18.000 Bitcoin's at like 600 or 700 billion.
02:35:22.000 That actually is better for my point.
02:35:23.000 My point is, if these big companies are hedging in Bitcoin and it's nowhere near enough compared to the size of the U.S.
02:35:30.000 economy, then Bitcoin has to become worth more because of the finite amount of Bitcoin available.
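One way to make the scarcity argument concrete is to divide a target market cap by the roughly fixed supply. The approximately 18.6 million circulating supply figure and the market-cap targets below are assumptions for illustration; the transcript only cites the roughly $9 trillion gold figure and the "over a million" speculation.

# Implied price per bitcoin at a given total market cap (illustrative).
# CIRCULATING_BTC is an assumed approximate circulating supply for early 2021.
CIRCULATING_BTC = 18.6e6

def implied_price(market_cap_usd: float, supply: float = CIRCULATING_BTC) -> float:
    """Price per coin implied by a target market cap and a fixed supply."""
    return market_cap_usd / supply

targets = [
    ("~$0.7T (roughly where it was)", 0.7e12),
    ("~$9T (gold parity)", 9e12),
    ("~$18.6T (about $1M per coin)", 18.6e12),
]
for label, cap in targets:
    print(f"{label}: ~${implied_price(cap):,.0f} per BTC")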
02:35:36.000 I wonder if Bitcoin's an inferior coin and it's only super valuable because it's popular.
02:35:42.000 First and best dressed.
02:35:43.000 Yeah.
02:35:43.000 A lot of people have said that.
02:35:46.000 It is the fairest system that we have.
02:35:50.000 What about Litecoin?
02:35:51.000 It's a fork of Bitcoin.
02:35:52.000 I know.
02:35:53.000 And it's like 86 million coins.
02:35:55.000 So they used to say Litecoin was the silver to Bitcoin's gold.
02:36:00.000 Yeah, I don't know.
02:36:01.000 I'm more just bullish on ETH and Bitcoin particularly.
02:36:07.000 Mostly Bitcoin.
02:36:08.000 When Ethereum came out, people were like, this is a revolution.
02:36:12.000 It was like, Bitcoin was a revolution and Ethereum is like a revolution on top.
02:36:15.000 Because the smart contracts, the programmability of Ethereum, essentially.
02:36:20.000 Bitcoin was like the Bible.
02:36:21.000 When the printing press came out, it revolutionized.
02:36:23.000 And the first thing it printed was the Bible to show that it was a revolution.
02:36:27.000 Bitcoin is the Bible.
02:36:28.000 It's the first printing on this new revolution of the blockchain.
02:36:32.000 The blockchain is the printing press.
02:36:35.000 Yeah.
02:36:36.000 So I'm not a big fan of these like alt coins.
02:36:39.000 These other, you know, I think it's silly.
02:36:41.000 And that's why I do have a couple, like, on my website, you can donate to a couple of addresses.
02:36:46.000 There's a lot of scams out there.
02:36:47.000 There's also a lot of predatory behavior.
02:36:49.000 You have to be aware of that.
02:36:50.000 And you always have to be super careful.
02:36:52.000 And it could go back the other way.
02:36:54.000 It's a new technology.
02:36:55.000 It will go back the other way.
02:36:56.000 And it could be used to track, trace, database, and spy on you.
02:36:59.000 And some people even believe it's a honeypot.
02:37:01.000 Who knows?
02:37:02.000 That's why there are privacy coins like Zcash and Monero, which actually are interesting.
02:37:07.000 They just got banned from some U.S.
02:37:09.000 exchanges for, like, probably surveillance reasons.
02:37:11.000 Yeah, the IRS released a statement that they're trying to crack and break down Monero.
02:37:16.000 So that led to a lot of people investing in Monero, essentially.
02:37:21.000 I saw Ripple got shredded.
02:37:22.000 They're proprietary, dude.
02:37:24.000 And the CEO sold $1.6 billion of it without notifying the SEC, I think, got in trouble.
02:37:28.000 Yeah, I am not a fan of Ripple.
02:37:31.000 Ripple was kind of the big brother, big bank coin.
02:37:34.000 And they got chewed up by the big establishment that they were trying to cozy up to.
02:37:39.000 So we got a funny super chat.
02:37:40.000 Let's read this.
02:37:41.000 Minute Man says, Elon Musk said he's more worried about population collapse than too many people.
02:37:47.000 If you work out the math of population in Earth's livable land, there's not too many people.
02:37:51.000 P.S.
02:37:51.000 Got pulled over by a cop who's a fan of yours.
02:37:54.000 How did that come out?
02:37:55.000 Like, you got pulled over?
02:37:55.000 He was listening on the podcast?
02:37:58.000 Yeah.
02:37:58.000 Or like, were you listening in the car?
02:37:59.000 And he was like, oh, you're listening to this?
02:38:01.000 Oh, nice.
02:38:01.000 Probably.
02:38:02.000 Dude, there's a hilarious visualization.
02:38:04.000 Read the code and then he fist bumps him.
02:38:06.000 Of all the humans on earth in a pile.
02:38:09.000 That was an actual visualization.
02:38:11.000 It fits in like an area.
02:38:15.000 Look up the visualization of all the humans in a pile, please.
02:38:18.000 It's funny.
02:38:20.000 But look, the issue is not the livable land.
02:38:23.000 It's the impact of a person per, you know.
02:38:26.000 And transportation of goods.
02:38:28.000 Well, so we had on Chris Martinson, a PhD, and he said insect populations are collapsing, and that's the bottom of the food chain.
02:38:35.000 It's going to affect birds, it's going to affect a bunch of other things, and we're going to see that.
02:38:38.000 It's going to be bad for us, plus pollination of plants.
02:38:41.000 So that's a serious crisis.
02:38:43.000 Yeah.
02:38:44.000 There's a great video called Overpopulation is a Myth that brings out some scientific data suggesting a lot of different things you don't really hear in the mainstream media, which people should check out, in my opinion.
02:38:55.000 I think that Elon's boring company is way bigger than people realize right now.
02:38:59.000 I was trying to invest in it, but I think Tesla owns it.
02:39:02.000 It's still owned by Tesla right now.
02:39:03.000 How boring.
02:39:04.000 Or it's owned by him.
02:39:04.000 It's not public.
02:39:06.000 But once we start living underground, like if we can have livable tunnels, we've just doubled our land space without going very far.
02:39:14.000 Yeah.
02:39:15.000 It is a beautiful thing.
02:39:15.000 Or tripled or quadrupled or quintupled.
02:39:17.000 That a guy trying to move the world to sustainable energy is now the richest man in the world.
02:39:21.000 And he wants to do implantable microchips in your head.
02:39:25.000 Yeah, that's the odd part.
02:39:27.000 I don't think so.
02:39:27.000 They're not chips, yeah.
02:39:28.000 They're just tethers.
02:39:29.000 It's a health benefit for people who are struggling.
02:39:33.000 That's one of their use cases.
02:39:35.000 Like, if we had brain-computer interface on the scale of USB, I think it would be amazing.
02:39:39.000 The challenges are encryption, security, and making sure that when you plug something into your brain, you can't be compromised.
02:39:45.000 Anybody who's a fan of Ghost in the Shell?
02:39:47.000 You guys familiar with Ghost in the Shell?
02:39:48.000 I haven't seen it.
02:39:50.000 Long story short, in the future, people have cyberized brains.
02:39:53.000 People can hack your brain.
02:39:55.000 So that's a consideration for if you get a Neuralink.
02:39:57.000 I'm not saying that... I think a lot of people get scared of it, like, I would never get it.
02:40:01.000 And I'm like, well, look... If you were losing your memory and you could... Right.
02:40:05.000 Technology is neutral.
02:40:07.000 Somebody... If you have a parent who has Alzheimer's and they said, with Neuralink, we can use a USB that would act as a memory backup and make sure their brain, you know... People would be like, absolutely, yes.
02:40:18.000 Yeah.
02:40:19.000 It's just an issue of security.
02:40:20.000 Technology is neutral.
02:40:20.000 It's the application.
02:40:21.000 And so there are risks, for sure.
02:40:23.000 Did you see the pig demo of Neuralink?
02:40:25.000 I saw just like snippets.
02:40:26.000 I didn't know.
02:40:27.000 There's a pig.
02:40:28.000 Oh, you gotta watch it.
02:40:29.000 There are three pigs with, yeah.
02:40:31.000 It's the demo.
02:40:32.000 And you can hear the data.
02:40:34.000 It's read-only right now, so it's just transmitting what the pig's smelling and seeing and showing you as like... What's it, Dolly?
02:40:40.000 I think is her name.
02:40:42.000 I want to say.
02:40:43.000 Dolly was the sheep that they cloned.
02:40:43.000 Oh, Dolly's the sheep.
02:40:44.000 It's like the Matrix.
02:40:46.000 I probably shouldn't read that.
02:40:47.000 I want to read this.
02:40:48.000 Read it.
02:40:49.000 Sparky the Pyro says, apparently 4chan is now trying to get everyone to change their name to Donald Trump and use his picture.
02:40:54.000 It's called Operation Spartacus.
02:40:56.000 Love it.
02:40:57.000 I love it.
02:40:58.000 I can't stop them all.
02:40:59.000 Timothy Hediger says, wrong dollar equation.
02:41:02.000 Why?
02:41:02.000 Turnover is at zero.
02:41:04.000 When money turnover goes from one to two, watch out.
02:41:08.000 I don't know.
02:41:14.000 432 cycles per second says, all Bitcoin maximalists are now Ethereum maximalists because they are not stupid.
02:41:19.000 Also, don't count on the blockchain failing except for solar flare or nuclear war.
02:41:24.000 Interesting.
02:41:25.000 You could also store the blockchain in glass in orbit, so maybe a flare wouldn't affect it.
02:41:29.000 I don't know.
02:41:30.000 Did you guys hear this?
02:41:31.000 Phoenix says, Tim, the US seized Pirate Bay's Bitcoin, billions worth, and it just moved recently.
02:41:36.000 Look it up.
02:41:37.000 Anyone hear about that?
02:41:37.000 No?
02:41:39.000 Didn't know that.
02:41:39.000 But any governments who are smart are stockpiling it for the purposes of their treasury. And you were saying that there is risk, and there is, for sure.
02:41:47.000 I think I've been, you know, I have some Bitcoin.
02:41:52.000 I wish I bought way more, you know, back in the day.
02:41:55.000 And I regret not listening to Max Keiser.
02:41:56.000 Max and Stacey.
02:41:57.000 Max and Stacey.
02:42:00.000 I love that meme.
02:42:00.000 Have fun staying poor.
02:42:01.000 And it's like him drinking a martini or whatever.
02:42:03.000 Did you hear Sean Lennon's intro to their podcast?
02:42:07.000 Oh no, he did one.
02:42:07.000 Yeah, he did it for them.
02:42:08.000 It's really good.
02:42:09.000 Oh, it's awesome!
02:42:10.000 I didn't know that was Sean.
02:42:11.000 That was Sean, yeah.
02:42:12.000 I'm just gonna say it again.
02:42:13.000 If you listen to Max and Stacy, you'd be super rich right now.
02:42:17.000 History repeats itself.
02:42:19.000 I know, I know.
02:42:21.000 He's been right about so much for a long time.
02:42:23.000 And this was back in like 2012.
02:42:24.000 He was like, you gotta buy Bitcoin!
02:42:27.000 And it was like a couple bucks.
02:42:29.000 I love the way he just went for the throat of the banking system after the collapse in 2008.
02:42:33.000 Man, he was vicious.
02:42:35.000 And just right on about how criminal they were, about the money they took, and Obama bailing them out.
02:42:40.000 We're hoping to get them on the show soon.
02:42:42.000 They don't want to travel for obvious reasons.
02:42:44.000 But Max and Stacey are amazing.
02:42:46.000 Eventually.
02:42:46.000 Do you love Max and Stacey?
02:42:47.000 Yeah.
02:42:48.000 And I wish I listened more to them.
02:42:50.000 They're actually advisors of Minds.
02:42:51.000 Oh, really?
02:42:52.000 Yeah.
02:42:52.000 Oh, cool.
02:42:53.000 Yeah, they're good people.
02:42:54.000 Well, you've been on Max's show.
02:42:55.000 Yeah.
02:42:56.000 Was it on the Orange Pill podcast?
02:42:58.000 Yeah.
02:42:58.000 Right on.
02:42:59.000 Yeah.
02:43:00.000 No, no, no.
02:43:00.000 It was on his RT show.
02:43:02.000 Oh, yeah.
02:43:02.000 I haven't been on that.
02:43:03.000 That's a new show, the Orange Pill.
02:43:04.000 Are they still doing the RT stuff?
02:43:06.000 I don't know if they are.
02:43:07.000 I don't know.
02:43:07.000 They're doing the orange pill podcast and it's like the orange pill.
02:43:10.000 There's a colored pill for everything.
02:43:11.000 But it's Bitcoin.
02:43:13.000 And I think he's right, man.
02:43:16.000 He's right.
02:43:16.000 Yeah.
02:43:17.000 If you haven't already, smash that like button.
02:43:18.000 We'll do one more.
02:43:19.000 We'll do one more Super Chat here.
02:43:22.000 Let's see, we'll do two more Super Chats.
02:43:23.000 MickeyThe4th says, There's an amazing channel called Radical Liberation focusing on in-depth analysis of geopolitics doing weekly streams.
02:43:31.000 They recently did an episode on how the scientific theories of overpopulation have been here to justify stuff for 200 years.
02:43:38.000 Must watch.
02:43:39.000 Interesting, we'll check it out.
02:43:40.000 And we'll just do one more.
02:43:41.000 It's the perfect segue as we begin to sign off.
02:43:44.000 Julie Simone says, Love the addition of Luke to the show.
02:43:48.000 Hit me up if you need any puppy training tips.
02:43:50.000 Congrats on becoming a dog dad.
02:43:52.000 Oh, thank you.
02:43:52.000 I'm doing a lot of training.
02:43:54.000 It's a lot of work.
02:43:56.000 And keeping up my own independent media organization and coming in here, it's a lot of stuff.
02:44:01.000 But we'll see where it goes.
02:44:03.000 The cuddles are worth it.
02:44:04.000 Yes.
02:44:06.000 Does she bite your feet in the middle of the night?
02:44:08.000 No, we're crate training her.
02:44:10.000 That's good.
02:44:12.000 You want her to like the crate?
02:44:13.000 Yes.
02:44:14.000 As sick as that sounds.
02:44:15.000 They feel like it's their own home and they actually prefer it.
02:44:20.000 My friend had a dog, and the way they would train her, when she would bark, you'd grab her throat and pinch it.
02:44:25.000 And then she's never barked.
02:44:26.000 Well, you want to use positive reinforcement.
02:44:28.000 So whenever she does something good, like when I call her, she comes to me.
02:44:31.000 I give her a treat.
02:44:32.000 When she potties outside, I give her a treat.
02:44:34.000 You don't want to really use negative enforcement, especially with breeds that are known to attack their owners.
02:44:39.000 You kind of want to keep them.
02:44:40.000 They say that works with humans too.
02:44:42.000 Yes, there we go.
02:44:44.000 You want to keep things happy.
02:44:45.000 So the dog's always happy and always pumped up and always, you know, the tail is wagging.
02:44:50.000 So that's what I've been doing.
02:44:51.000 Maybe I'm making a mistake.
02:44:52.000 Let me know.
02:44:53.000 So.
02:44:54.000 Right on.
02:44:54.000 My friends, smash the like button, subscribe, notification bell.
02:44:57.000 It really, really does help.
02:44:58.000 Engagement is good for YouTube.
02:44:59.000 YouTube loves it.
02:45:01.000 And now the fun part.
02:45:02.000 If you want to follow me, I'm on Twitter, Instagram, and Parler at Timcast, presumably for the foreseeable future, but you never know because things are getting absolutely crazy.
02:45:12.000 So you can also check out my other YouTube channels for the time being, hopefully for the foreseeable future.
02:45:17.000 YouTube.com slash Timcast and YouTube.com slash Timcast News.
02:45:21.000 And we do the show Monday through Friday live at 8 p.m., so subscribe.
02:45:24.000 Give us a good review on iTunes.
02:45:25.000 It really, really does help.
02:45:26.000 And if you haven't checked us out there, you can check us out on all podcast platforms.
02:45:29.000 Bill, thanks for coming and hanging out.
02:45:31.000 Thanks for having me, man.
02:45:32.000 I love you guys.
02:45:33.000 Not only do you have social media accounts, you have an entire social media network.
02:45:37.000 Yeah, yeah.
02:45:39.000 Tim is one of the rock stars over at Minds, so don't forget to shout out Minds and your little reg list.
02:45:47.000 You know, it's funny with the little social media icons that people put on their website.
02:45:50.000 It's like, you know, the trendy ones and, you know, you just gotta, yes, all the alternatives.
02:45:56.000 Cover yourself.
02:45:57.000 Cover your own bases, everyone out there.
02:45:59.000 Join them all.
02:46:00.000 Just do what you can.
02:46:01.000 So, minds.com slash ottman, O-T-T-M-A-N.
02:46:04.000 Right on.
02:46:05.000 Luke.
02:46:06.000 You sell shirts?
02:46:07.000 Yes, I sell shirts.
02:46:08.000 I was gonna say check out my small independent mom and pop media organization on the YouTube channel We Are Change, but I think it's more imperative you go to wearechange.org and definitely sign up on that email list so we could talk together without some head honcho oligarch standing in the way between me and you.
02:46:24.000 There's also wearechange.org forward slash donate, which you could support my independent voluntary efforts here.
02:46:30.000 And there's like 20 different ways where you could get involved.
02:46:32.000 And I really, really, I mean, I got to admit it, like the people you have here, the people you've been able to galvanize, top A, amazing individuals.
02:46:40.000 Some of them are like facetious, and they make a lot of funny comments.
02:46:43.000 But seriously, seriously, this is one of the best communities that you've been able to foment and build here.
02:46:48.000 Awesome, amazing human beings.
02:46:50.000 Thank you guys so much.
02:46:52.000 Thank you for coming to my channel too and checking it out as well and spreading the support and love.
02:46:57.000 It truly is crucial and important more than ever that we get the word out now.
02:47:02.000 While we can.
02:47:05.000 Exactly.
02:47:05.000 Oh, well, hello, Tim.
02:47:07.000 Thank you.
02:47:08.000 Yes, you can follow me, Ian Crossland, at most social networks: Twitter, YouTube, Facebook, which I don't really check, Instagram, and Minds, of which I was a co-founder with Bill.
02:47:18.000 And it could be very well the future of social media if we maintain its free software methodology.
02:47:24.000 Also, smash that gorilla.
02:47:26.000 And share this.
02:47:27.000 Share this.
02:47:27.000 I don't think Tim mentioned yet to share this, but I want to encourage you to share this content because in the day of, you know, computer simulated algorithms that are deciding what people see, you still have the power to show people things you like.
02:47:41.000 And Ian, if every single person who tuned in today shared this, we would be bigger than CNN.
02:47:47.000 Okay, then it's possible.
02:47:49.000 Shares are more powerful than ever.
02:47:51.000 That's right.
02:47:52.000 You can also follow at Sour Patch Lids.
02:47:53.000 You can.
02:47:54.000 You can follow me on Twitter.
02:47:55.000 I'm on Twitter at Sour Patch Lids.
02:47:57.000 Sour Patch L-Y-D-S.
02:47:59.000 And I post random stuff.
02:48:00.000 Is today Friday?
02:48:01.000 Today is Friday.
02:48:02.000 Wow.
02:48:02.000 Yes.
02:48:02.000 We'll be back Monday.
02:48:04.000 I will be back tomorrow morning on my channel over at youtube.com slash timcastnews.
02:48:08.000 But we'll be back with this show live 8 p.m.
02:48:10.000 Monday.
02:48:11.000 So again, smash the like button, subscribe, check us out on all podcast platforms.
02:48:14.000 Thanks for hanging out and we will see you all next time.