Timcast IRL - Tim Pool - August 17, 2020


Timcast IRL - Virginia Democrat Hit With Felonies Over Tearing Down Statue, The Hammer Hath Fallen


Episode Stats

Length

2 hours and 21 minutes

Words per Minute

185.6

Word Count

26,207

Sentence Count

2,326

Misogynist Sentences

18

Hate Speech Sentences

25


Summary

Bill Ottman, CEO of Minds, joins us to talk about censorship and the audio problems that have been plaguing us for a while now. We also cover a 76-year-old Virginia state senator who is facing two felonies for conspiring to tear down a Confederate monument.


Transcript

00:00:00.000 The hammer has finally dropped.
00:00:27.000 A Virginia state senator is hit with two felonies for conspiring to tear down a Confederate monument in Virginia.
00:00:37.000 And, you know, this one's a tough one.
00:00:38.000 This lady is actually, like, 76.
00:00:40.000 Like, what do you do with a 76-year-old?
00:00:43.000 How's it going, everybody?
00:00:44.000 We got a couple stories for you.
00:00:46.000 We got a bunch, actually.
00:00:47.000 But today, we're getting back into guests, and I'm hanging out with Bill Ottman.
00:00:52.000 Hey, hey.
00:00:53.000 Let's do this.
00:00:54.000 Who are you, Bill?
00:00:56.000 I'm Bill.
00:00:57.000 I am of minds.
00:00:59.000 I am of the planet Earth.
00:01:01.000 Planet Earth.
00:01:02.000 Bill, you're like the co-founder and CEO of Minds.
00:01:04.000 Yep.
00:01:04.000 So this is perfect because we just had a big wave of censorship, which is why I segued into introducing Bill.
00:01:10.000 Babylon Bee got suspended.
00:01:12.000 Quickly reversed, however, but this is big.
00:01:14.000 Bill Mitchell, big Trump supporter, got suspended.
00:01:17.000 Our connection looks good?
00:01:18.000 The account also got hit now. Andrew Doyle, the person who runs the Titania account,
00:01:24.000 I don't even know if he can get in. And so, censorship. I mean, you run a social network, Minds.
00:01:30.000 Yeah, everybody's—everybody's been saying the audio's messed up.
00:01:34.000 Audio sounds choppy? What's it say? Our connection looks good. I don't know, that's
00:01:41.000 really weird. It still says audio is good.
00:01:47.000 Are people lying?
00:01:48.000 I can't imagine.
00:01:50.000 Oh, that was weird.
00:01:51.000 F audio.
00:01:53.000 Audio going to hell.
00:01:55.000 Uh... Interesting.
00:01:56.000 That is unexplainable.
00:01:59.000 I appreciate you guys telling us the audio's messed up.
00:02:01.000 That's weird, it's never happened before.
00:02:02.000 Let's see if it gets better.
00:02:04.000 Mic sounds crackly?
00:02:07.000 Loose audio cable.
00:02:08.000 Alright, please hold.
00:02:10.000 Oh!
00:02:13.000 What's the cat?
00:02:14.000 Possibly.
00:02:15.000 He walked through underneath the desk and knocked everything over.
00:02:18.000 My mic is bad?
00:02:20.000 That's weird and clear.
00:02:23.000 It's really hard to figure out because everyone's saying the mic is messed up so I can't tell.
00:02:26.000 That's a bummer.
00:02:27.000 Can you open the feed and see if you can find, like actually play the YouTube video and see what it says?
00:02:32.000 Oh yeah, hold on.
00:02:34.000 Mic is choppy.
00:02:35.000 This might be a YouTube thing, man.
00:02:38.000 I wonder.
00:02:38.000 It says our connection's really good.
00:02:41.000 Yeah.
00:02:44.000 I can't tell.
00:02:44.000 Everybody's just saying the same thing.
00:02:46.000 I can't read the chat because it's still choppy.
00:02:49.000 So, well, if you can hear.
00:02:52.000 I use the same setup for recording all day.
00:02:55.000 So... It is choppy and I don't know how to fix it.
00:02:57.000 Just my mic.
00:03:00.000 I have no idea.
00:03:01.000 It is choppy?
00:03:02.000 Yeah.
00:03:03.000 Weird.
00:03:03.000 What happened?
00:03:03.000 I don't know.
00:03:06.000 I don't know.
00:03:06.000 I have no idea how to fix it.
00:03:10.000 Well, that's a bummer.
00:03:13.000 Let's see, I'm trying to see what people are saying.
00:03:14.000 Somebody says, switch Tim's cable.
00:03:18.000 Tim and Lydia sound bad.
00:03:19.000 Yeah.
00:03:20.000 Do I sound bad?
00:03:21.000 I think you sound fine.
00:03:23.000 Can't hear.
00:03:30.000 Let's hear another mic.
00:03:31.000 No YouTube, not choppy.
00:03:33.000 So what just happened?
00:03:34.000 You know what?
00:03:34.000 It was probably the cat.
00:03:36.000 Yeah.
00:03:37.000 He walked underneath the desk and he knocked like everything over.
00:03:40.000 Like you're talking underwater.
00:03:48.000 Okay.
00:03:49.000 You know, man.
00:03:53.000 Both mics choppy.
00:03:54.000 Huh.
00:03:58.000 I have no idea.
00:04:02.000 There's no... Turn off your mixer, turn back on.
00:04:07.000 We're troubleshooting live. You guys get to enjoy this with us.
00:04:10.000 One, two, three, four.
00:04:25.000 What about now?
00:04:29.000 All mics are bad.
00:04:32.000 Frustrating.
00:04:34.000 Reboot!
00:04:37.000 It was Boku.
00:04:39.000 It was.
00:04:41.000 Like talking into a fan, restart the feed.
00:04:53.000 Well, I have no idea.
00:04:56.000 We will restart the feed.
00:04:57.000 I guess we have to.
00:04:58.000 Yeah.
00:04:59.000 Okay.
00:05:00.000 All right.
00:05:00.000 Okay.
00:05:23.000 We restarted.
00:05:25.000 Can you hear me now?
00:05:26.000 I have no idea.
00:05:28.000 I'll wait until, uh, we did a sound check.
00:05:30.000 We did everything.
00:05:31.000 The mic is broken.
00:05:35.000 We'll have to, uh, wait a second to see what you, what, what the chat says.
00:05:39.000 Cause we did a test on our end and it sounded fine.
00:05:42.000 So let us know if it's working now.
00:05:46.000 It could have been the computer itself.
00:05:49.000 Better.
00:05:49.000 Good.
00:05:49.000 Better.
00:05:50.000 Oh, we fixed it.
00:05:50.000 Okay.
00:05:53.000 We're going to start over.
00:05:55.000 Yeah.
00:05:55.000 We're starting over.
00:05:56.000 All right.
00:05:57.000 All right.
00:05:58.000 How's it going, everybody?
00:05:59.000 It looks like everything is working.
00:06:02.000 So I know exactly what happened.
00:06:05.000 I just swapped out the cameras to 4K.
00:06:09.000 And I guess it wasn't actually an audio problem.
00:06:11.000 It wasn't really an audio problem.
00:06:13.000 It was a computer problem that was negatively impacting everything.
00:06:17.000 So everything was stuttering.
00:06:18.000 I guess it's good now.
00:06:20.000 So we're back.
00:06:21.000 We got big news.
00:06:23.000 Oh man, I was really excited with that opening before when I said the hammer hath fallen.
00:06:27.000 They charged a Virginia Democrat with two felonies for conspiring to tear down a statue, a Confederate monument.
00:06:35.000 The only thing is, she's 76, so I'm not quite sure how you actually, you know, what do you do?
00:06:41.000 You lock up a 76-year-old?
00:06:43.000 Anyway.
00:06:43.000 So we'll talk about this and where we're going.
00:06:45.000 There's also another big story where they tore down a statue of George Washington again.
00:06:50.000 But this was last week, and now they've caught the people who did it.
00:06:53.000 We got six arrests.
00:06:54.000 The other big news we have is censorship.
00:06:56.000 And the good news there is we're joined by Bill Ottman.
00:07:00.000 I introduced you a second ago, but you're being introduced again.
00:07:03.000 So there's Bill.
00:07:05.000 Bill, you're the co-founder and CEO of Minds, I guess?
00:07:08.000 I guess.
00:07:09.000 I guess.
00:07:10.000 I don't know.
00:07:10.000 We just did this, but then the thing broke, so it's like...
00:07:13.000 Yeah, psyched to be here, man.
00:07:15.000 Right on!
00:07:16.000 Yeah, so we had a bunch of big accounts get shut down.
00:07:19.000 Babylon Bee, which is a very famous parody website, much like The Onion, very political, got shut down.
00:07:27.000 I thought it was a total ban, but then they restored it very quickly.
00:07:31.000 And then we had Bill Mitchell, so this is a couple days ago, he's a huge Trump supporter, and he got shut down.
00:07:35.000 Then we saw Titania McGrath, very famous parody account, mocking woke people.
00:07:43.000 Now Andrew Doyle, who runs the account, is locked out.
00:07:45.000 And this censorship plays a huge role in why everything seems to be going nuts, because if the people who challenge the status quo aren't allowed to speak, then it just runs off the rails, so.
00:07:56.000 You know, we got Bill here.
00:07:57.000 People have talked a lot about Minds.
00:07:59.000 Did they say what happened specifically to Babylon Bee?
00:08:01.000 What the reason was?
00:08:02.000 Nope.
00:08:03.000 Count Dankula tweeted that a bunch of accounts that challenge the left got shut down.
00:08:08.000 And so it might have been caught up in this big sweep where, look, it's just the way it is, man.
00:08:13.000 No comedy.
00:08:14.000 No comedy allowed.
00:08:15.000 No.
00:08:15.000 Yeah, because everything's offensive.
00:08:17.000 It's like Fahrenheit 451.
00:08:19.000 If it's offensive, we gotta go.
00:08:21.000 So, we've also got some other, in that vein, DC Comics getting woke and going broke.
00:08:27.000 I don't think that's really your forte, but we're gonna thrust you.
00:08:30.000 Hey, I used to have Marvel Dex.
00:08:33.000 Not DC Dex, Marvel Dex.
00:08:35.000 And then, this is actually really cool, because we're just gonna have fun with it at the end.
00:08:39.000 We got some aliens, and there's a house that's kind of going viral.
00:08:43.000 It's got a prison in it.
00:08:44.000 It's got a full, nine-cell prison.
00:08:47.000 You walk into this house, and you're like, what a really nice house.
00:08:49.000 And then you walk in the basement, and you're like, this dude's a murderer.
00:08:51.000 But it looks really cool.
00:08:52.000 It's just like your prison.
00:08:55.000 No, downstairs is really nice.
00:08:58.000 No one's supposed to know about that.
00:08:59.000 Yeah.
00:09:00.000 So, uh, and also of course, Lydia's hanging out.
00:09:02.000 Yeah.
00:09:02.000 I'm over here in the chair too.
00:09:04.000 Producing again.
00:09:04.000 It's great.
00:09:05.000 I got such big news, man.
00:09:07.000 We're going to do a show.
00:09:08.000 If you haven't already, smash that like button, subscribe, hit the notification bell.
00:09:12.000 We're live Monday through Friday at 8 PM.
00:09:15.000 We, listen, I apologize for the technical difficulties.
00:09:18.000 We totally upgraded the studio just the other night.
00:09:21.000 And if only you could see it.
00:09:22.000 It's amazing.
00:09:23.000 It's cool.
00:09:24.000 Bill knows.
00:09:24.000 There's a very big screen extra.
00:09:26.000 It's a very big TV.
00:09:28.000 So this led ultimately to the hiccups we just had where the audio cut out, so I apologize for that.
00:09:34.000 But the news is, boy do we got a bunch of guests coming.
00:09:38.000 So Bill's here, and I'm stoked.
00:09:40.000 We've worked on stuff, I've known Bill for a long time, and Minds is one of the potential solutions to the censorship crisis.
00:09:47.000 That's, you know, I will also briefly mention it's going to start thunderstorming.
00:09:51.000 So if the power goes out, you know, I got backup batteries.
00:09:54.000 They last about 10 minutes.
00:09:55.000 So we'll be able to like sign off while the thing screeches.
00:09:57.000 The power is shutting off.
00:09:59.000 But it is thunderstorming.
00:10:00.000 But we got a bunch of guests.
00:10:01.000 I don't know who I can announce.
00:10:03.000 The redheaded libertarian Josie will be here next week.
00:10:07.000 And Carrie Smith is going to be here tomorrow.
00:10:09.000 She wrote an article.
00:10:10.000 I actually read part of it for our segment.
00:10:13.000 She is a liberal voting for Trump.
00:10:16.000 And then we have Jack Murphy the next day who literally wrote the book on Democrats voting for Trump.
00:10:20.000 So we've got a bunch of guests lined up.
00:10:22.000 Oh, man.
00:10:23.000 And I'm not going to say, I guess, who the later guests are.
00:10:26.000 Tune in on Friday?
00:10:27.000 Tune in on Friday.
00:10:28.000 Yes.
00:10:28.000 Because I have to clear it with some of the guests before I announce they're going to be here.
00:10:32.000 I'm stoked.
00:10:33.000 Yeah, but we got some cool people.
00:10:36.000 For some reason, this week is all about liberals voting for Trump.
00:10:38.000 Don't ask me why.
00:10:39.000 It's just how it played out.
00:10:41.000 But let's do this.
00:10:41.000 Let's jump over to the first story.
00:10:43.000 The one that you guys are very interested in.
00:10:45.000 Here's the top of Tim's head.
00:10:47.000 This is what I was talking about with the thing breaking.
00:10:51.000 So you're going to watch me fix this in real time.
00:10:55.000 I told you it was breaking, right?
00:10:57.000 We fixed this before the show, actually.
00:10:59.000 And then it broke and we had to reset it.
00:11:00.000 Then here we are doing it again.
00:11:03.000 We, I assure you, we did fix all these things and then it crashed just now.
00:11:08.000 This is, it's not my fault.
00:11:10.000 I take, I take only a little bit of the responsibility.
00:11:14.000 So, oh man, this is so much fun.
00:11:16.000 It can be clipped as a how-to.
00:11:17.000 That's right.
00:11:18.000 Yes.
00:11:19.000 Here is how you fix the problem when you upgrade your cameras to 4K and then the whole thing breaks.
00:11:28.000 Is someone saying there's an echo now?
00:11:30.000 Oh, for Pete's sake.
00:11:32.000 Oh, man.
00:11:32.000 Well, that's a bummer.
00:11:36.000 Thunder or lightning.
00:11:37.000 Echo, echo.
00:11:38.000 I can't control the echo.
00:11:39.000 I have no idea what's causing that.
00:11:41.000 That's really weird.
00:11:43.000 I can, uh... I don't see a lot of that.
00:11:48.000 What are people saying?
00:11:49.000 Uh, none of that.
00:11:50.000 Reverb, not echo.
00:11:50.000 Yeah, I'm not seeing any.
00:11:51.000 Oh, weird.
00:11:52.000 Yeah.
00:11:52.000 Well, look how crazy the camera looks.
00:11:54.000 We just... Whatever, man.
00:11:55.000 We're gonna wing it.
00:11:56.000 We're gonna go with it and hopefully...
00:12:00.000 You can still hear it?
00:12:01.000 People are saying no echo.
00:12:03.000 I don't hear echo.
00:12:03.000 I think we're okay.
00:12:04.000 We're fine.
00:12:05.000 Here's the big story.
00:12:07.000 Senator Lucas charged with two felonies for June incident at the Portsmouth Confederate Monument.
00:12:13.000 This is out of Virginia.
00:12:14.000 This was back in June.
00:12:16.000 We've seen a wave of these, you guys know this, the riots where they're going in, they're tearing down statues.
00:12:21.000 Well, now the hammer is dropping.
00:12:24.000 We got two big stories about people being arrested for it.
00:12:27.000 The crazy thing about this one is that this lady's actually an elected Democrat.
00:12:31.000 They say the Portsmouth Police Chief Angela Green announced during a Monday afternoon press conference that State Senator Louise Lucas has been charged with two felonies for an incident at the city's Confederate monument on June 10th.
00:12:43.000 She, among others, is facing charges of conspiracy to commit a felony and injury to a monument in excess of $1,000.
00:12:50.000 Portsmouth officials held the briefing Monday afternoon to announce that several warrants have been secured against individuals more than two months after the incident at the city's Confederate monument.
00:12:59.000 So it sounds like it's not just her.
00:13:00.000 It's gonna be a bunch of other people.
00:13:03.000 Green issued a statement, but did not take any questions as the investigation is ongoing, city officials said.
00:13:08.000 The Confederate monument was vandalized and broken apart by protesters,
00:13:12.000 which culminated with a protester being seriously injured when part of the statue fell on him. So I remember this.
00:13:18.000 Did you ever see this story?
00:13:19.000 No. They pulled the statue down on a dude's head.
00:13:22.000 I saw that. Yup. This is why I always try to tell my friends when they're
00:13:27.000 complaining about... I have a bunch of progressive friends and they'll be like,
00:13:29.000 well they're confederate statues, you know.
00:13:32.000 And my response is always, it fell on someone's head, you know?
00:13:36.000 Look, if you want to remove it, I get it, but at the very least, you've got to have some kind of safe... safety.
00:13:42.000 And honestly, even after it happened, people weren't crowding around helping.
00:13:46.000 It seemed like people were still just sort of standing there, like, oh, did that just hit somebody, but still, like, not immediately going to help them?
00:13:53.000 Yeah, that's because they're not paying attention and they're just throwing ropes and pulling it down and they're, I think they're emulating what they see in like the Middle East and other countries.
00:14:01.000 And so none of them really know what they're doing.
00:14:03.000 It's just, it's all fun and games.
00:14:06.000 And then people get seriously hurt.
00:14:08.000 Let's read a little bit more.
00:14:08.000 They say, since then, a team of investigators compiled evidence, including video from that day.
00:14:14.000 As a result of the investigation, detectives determined that several individuals performed felonious acts.
00:14:20.000 And have taken out warrants against them, including Lucas, as well as a Portsmouth school board member.
00:14:25.000 Wow.
00:14:25.000 Lakeisha S. Clue Atkinson, members of the NAACP and members of the Public Defender's Office.
00:14:31.000 Wait, wait, wait, whoa!
00:14:33.000 They took, wait, hold on.
00:14:34.000 They got warrants against members of the NAACP and the... Yeah, I saw that.
00:14:38.000 No, no, no, this is wrong.
00:14:39.000 I'm reading this wrong, aren't I?
00:14:41.000 Several individuals performed felonious acts, even the Public Defender's Office?
00:14:45.000 Wow.
00:14:46.000 Man, I didn't realize that.
00:14:48.000 The initial source I had was The Hill.
00:14:50.000 I like to pull up the original sources, though, and then I go through, I'm like, yep, this all adds up.
00:14:54.000 Wow.
00:14:56.000 So they have a full press conference on it.
00:14:58.000 Here's the full list of those facing charges of conspiracy to commit a felony and injury to a monument in excess of $1,000.
00:15:06.000 State Senator Louise Lucas, James Boyd of the NAACP, Louie Gibbs, NAACP.
00:15:13.000 Then we have LaKesha Hicks, NAACP.
00:15:16.000 LaKesha S. Clue, a school board member.
00:15:19.000 These next people I'm not going to name because it doesn't really say anything about what they do, but check this out.
00:15:23.000 Here's the list of individuals facing a felony charge of injury to a monument in excess of $1,000.
00:15:29.000 Brenda Spry, public defender.
00:15:31.000 Alexander Stevens, public defender.
00:15:33.000 Meredith Kramer, public defender.
00:15:36.000 Wow, man.
00:15:38.000 Now my mind is blown.
00:15:39.000 That's crazy.
00:15:39.000 Wow, dude.
00:15:40.000 Green asked that anyone with an active warrant turn themselves in.
00:15:44.000 Public defenders?
00:15:45.000 The police department is asking for help identifying 13 additional people.
00:15:49.000 Detectives are asking that the public take a look at these photos and reach out if you recognize them.
00:15:54.000 Call the Portsmouth Crime Line.
00:15:56.000 Wow, man.
00:15:58.000 The hammer's fallen.
00:15:59.000 You've been seeing a lot about this, right?
00:16:00.000 I mean, it's been happening in my town.
00:16:03.000 It seems so unnecessary because towns are voting to take down the statues.
00:16:07.000 So, you don't need to tear it down.
00:16:10.000 It seems like there's plenty of legal votes happening to take it down.
00:16:14.000 Some of them, they keep defending the statues.
00:16:17.000 But you know what really bothers me is the whole conversation has been nothing but Confederate.
00:16:22.000 That's all I hear.
00:16:24.000 So, you know what I always see?
00:16:27.000 And I don't know if you can answer to this or add to this as someone who runs a social network.
00:16:32.000 On Facebook, I get inundated with these memes, and the left has these memes.
00:16:38.000 It seems to be where they get their news.
00:16:39.000 I mean, factually, there was a study done by a group called Newswhip that found the left gets their news from Occupy Democrats, which is a meme farm, whereas the right, it's Fox News, which is just a conservative news source.
00:16:51.000 So, there's a meme right now where it's like, here's why Trump and the Republicans are defending Confederate statues, and these memes go viral, and it's all people see, when the real argument is actually about George Washington, Ulysses S. Grant, Hans Christian Heg, and about not committing felonies because sometimes statues fall on people's heads.
00:17:11.000 So I don't know, you know, if just in terms, because maybe we should save this for more of the censorship segment, but I mean, Facebook's only feeding people this fake news.
00:17:21.000 You know what I mean?
00:17:22.000 Yeah.
00:17:23.000 The statues need to be re-contextualized so that people can understand their place in history and time.
00:17:34.000 And like, there's just other ways to do it.
00:17:38.000 Well, I mean, like, I guess what I was trying to ask is about how do we get people to realize—actually, look, I'll show you this.
00:17:44.000 This is what I was trying to get to.
00:17:46.000 Six Black Lives Matter protesters are arrested for tearing down and defacing George Washington's statue at LA City Hall as cops recover gas mask, laser pointer, helmets, goggles, arm protectors, and change of clothing to conceal identity.
00:18:02.000 First of all, they're rioters.
00:18:03.000 But I guess what I was trying to get to is we have all of these news stories about how they're tearing down actual founding fathers.
00:18:10.000 But whenever you go on social media, you just see the left sharing memes about Confederate statues.
00:18:15.000 Right.
00:18:15.000 It made that leap and nobody really acknowledged it.
00:18:18.000 Trump did.
00:18:19.000 And when he did, the media lied and said, Confederate, Confederate, Confederate.
00:18:23.000 Now, I think I know why the media does it.
00:18:25.000 They don't want to admit that they entertained these riots.
00:18:30.000 And now they're tearing down George Washington.
00:18:32.000 But I guess my question to you, if you can speak to this, is how do we break that where the left only sees the fake news?
00:18:41.000 You know what I mean?
00:18:42.000 Right.
00:18:43.000 It's like building echo chambers, how to break echo chambers.
00:18:45.000 So you need to...
00:18:49.000 I do believe in chronological feeds.
00:18:52.000 In a sense, I believe in having the right to be able to create your own echo chamber and not forcing people to see stuff that they don't want to see.
00:18:59.000 But I personally like to break my own echo chamber and I think providing tools to let people see the other side is super important.
00:19:09.000 So there could be mechanisms for that.
00:19:12.000 The issue I see with Facebook is that, while I agree with you, I follow who I want to follow
00:19:17.000 and yeah, I think if you're a smart, reasonable person, you're trying to follow as many different
00:19:21.000 voices as possible.
00:19:23.000 Twitter periodically will switch me back to algorithmic mode.
00:19:27.000 So I'll just try to explain this for those that aren't familiar.
00:19:30.000 You said reverse chronological.
00:19:32.000 That's basically whoever you follow when they post, you see it.
00:19:35.000 What all of these sites have been changing to, like YouTube included, is algorithmic.
00:19:40.000 Meaning they're going to show you what they think you want to see.
00:19:43.000 Is that the right way to explain it?
00:19:44.000 Yeah.
00:19:45.000 What they know you're more likely to engage with.
00:19:49.000 Yeah, so sometimes it's fake news.
00:19:50.000 It's just so annoying on Twitter.
00:19:52.000 They revert you back every time.
00:19:53.000 There's that little star icon on the top, so you can go over to Chronological.
00:19:57.000 I mean, a lot of sites don't even allow you to do that, but people are just not going to click that button very often.
00:20:03.000 I have to check every so often.
00:20:05.000 Man, do I get angry.
00:20:06.000 Because for me it's really important, following news, that I get the latest news up to date.
00:20:11.000 And then yeah, every so often I'll be like, that's strange.
00:20:14.000 Didn't this story happen a day ago?
00:20:16.000 You son of a... And then I gotta click it and switch it back.
00:20:19.000 They won't let you keep it.
00:20:21.000 Right.
00:20:21.000 It keeps trying to switch it on you.
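[Editor's note: the feed distinction described in this exchange, reverse-chronological versus engagement-ranked, can be sketched in a few lines. This is a hypothetical illustration, not any platform's actual ranking code; the `Post` fields and engagement scores are invented for the example.]

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int                # seconds since some epoch (illustrative)
    predicted_engagement: float   # platform's guess at how likely you are to interact

def chronological_feed(posts):
    """Whoever you follow, when they post, you see it: newest first."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def algorithmic_feed(posts):
    """Show what the platform thinks you're most likely to engage with."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

posts = [
    Post("news_account", timestamp=300, predicted_engagement=0.2),
    Post("meme_account", timestamp=100, predicted_engagement=0.9),
    Post("friend",       timestamp=200, predicted_engagement=0.5),
]

# The same three posts produce two very different orderings.
print([p.author for p in chronological_feed(posts)])
print([p.author for p in algorithmic_feed(posts)])
```

The "switch me back" complaint maps to the default sort function being `algorithmic_feed` unless the user keeps re-selecting the chronological one.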
00:20:23.000 So listen, in reference to what we're seeing here, there's two big things that I think explain why a woman would tear this down and, you know, face felony charges.
00:20:31.000 First, I'll say this.
00:20:33.000 She's 76.
00:20:34.000 She probably thought, I'll go to prison, whatever.
00:20:39.000 I'm 76, what else am I gonna do?
00:20:41.000 Tear it down.
00:20:42.000 And now, there you go, right?
00:20:46.000 Younger people, they got their whole lives ahead of them, so the older activists probably jump in.
00:20:50.000 But I think a lot of these younger people, because they're only seeing what they want to see, they go out and they join these, you know, these riots, and they end their lives.
00:21:00.000 Like, I mean, like with prison.
00:21:02.000 Do you see what happened in New York with the Molotov cocktail couple?
00:21:05.000 No.
00:21:06.000 I know this is out of your wheelhouse, you're a social media guy, so I'm just gonna, you know, I guess talk at you for a second.
00:21:10.000 There's a couple people, they had Molotovs, they're two lawyers, early 30s, pull up, launch them at police vehicles, hand them out or something to that effect, now their lives are over.
00:21:20.000 A lot of these people are going to prison and they're getting wrapped up in this like fake, I call it a paranoid delusional state.
00:21:29.000 But which people are getting let off?
00:21:31.000 Like what is the consensus around which states and cities are just letting the rioters off versus not?
00:21:40.000 Well, I guess if you throw a Molotov, you're not getting let off.
00:21:43.000 Yeah, New York let a bunch of people go for low-level protest offenses like violating COVID lockdown.
00:21:50.000 Fort Worth, I'm pretty sure they're different, released people for literal rioting charges.
00:21:55.000 I think some of these may have been felonies.
00:21:57.000 And then in Portland, they're just straight up like, you can go.
00:22:00.000 Just...
00:22:02.000 Yeah, so... Yeah, that is causing their police force to quit en masse, right?
00:22:06.000 And isn't their public defender friends with Antifa, dude?
00:22:10.000 Who?
00:22:10.000 I saw something like that.
00:22:11.000 The Portland Public Defender?
00:22:12.000 Well, he did an interview, apparently.
00:22:13.000 The Portland Public Defender did, like, an interview on some far-left show talking about how he's gonna be releasing these people.
00:22:20.000 So I think you see, you know, with, I'm sorry man, I didn't, I didn't realize public defenders, like this is, this is crazy.
00:22:27.000 These are people in government who are breaking the law, tearing things down.
00:22:31.000 It's like the rule of law is just crumbling.
00:22:34.000 And I think it has a lot to do with only being fed certain information.
00:22:40.000 We've actually talked about this quite a bit over the past week.
00:22:42.000 If the only thing people see is far left, then that's the only direction they can go, you know?
00:22:48.000 So it's kind of like COVID, where you keep getting inundated with news about how the world is ending, and no one can see otherwise.
00:22:57.000 So then businesses stay locked down.
00:22:59.000 Well, Facebook got exposed for the Princeton—the secret Princeton study they did trying to alter people's moods.
00:23:05.000 What?
00:23:05.000 Yeah, they, this was like five years ago, but basically they, without telling anybody, injected both positive and negative emotional content into people's feeds and found that they could change their emotions through what they fed to them.
00:23:21.000 Dude.
00:23:22.000 And so, yeah, obviously what you consume is what you become unless you make sure to consume a diverse amount of content.
00:23:32.000 Yeah, well, but on Facebook, it's not, I mean, you sort of can't.
00:23:36.000 You can't.
00:23:37.000 Because out of the, look, on Facebook for me, I'm maxed out, because I just, I get so many friends requests, friend requests, you can't even, I don't even think you can actually request.
00:23:45.000 You're so cool, dude.
00:23:46.000 Yeah, yeah, yeah, yeah, yeah.
00:23:47.000 It's, you know, some people don't do this.
00:23:49.000 They just deny everybody, make their accounts private, so it's only their friends and family.
00:23:53.000 Very early on, I just didn't care and started hitting accept.
00:23:56.000 So now my feed is just random garbage.
00:23:59.000 And I—you know what I see more than anything, though?
00:24:01.000 I almost exclu—I would say 80% of the memes I see are left-wing fake news conspiracy stuff.
00:24:07.000 Things like, you know, there's one going around from Occupy Democrats.
00:24:11.000 I think it's Occupy Democrats saying, not a single Democrat is for open borders, wants to take away your guns, or, you know, is trying to kill babies or whatever.
00:24:19.000 And I'm like, Yeah, it was like, not a single Democrat wants non-citizens to vote.
00:24:24.000 And I'm like, all of these things are very easily disproven with like mainstream acceptable sources that say, yeah, there are Democrats who want these things.
00:24:33.000 But then I personally only see on Facebook.
00:24:36.000 Well, I shouldn't say only, but like 80% of the time, the memes are left-wing.
00:24:39.000 Very few right-wing.
00:24:40.000 Maybe that says something about what I interact with, so... You know what it might be?
00:24:45.000 I'm like, commenting about it's not true, so then Facebook just sends me more of the same.
00:24:48.000 Well, there's no way we can know, because we have no idea what the algorithm's doing, so it's just...
00:24:53.000 I mean, thousands of variables are determining what you're seeing.
00:24:57.000 And for sure, they're punishing memes.
00:25:01.000 They can even read the language in the meme without any text associated with it, with all of their, you know, image recognition.
00:25:08.000 There's a, uh, for YouTube, you can put an image.
00:25:11.000 I forgot what it's called.
00:25:12.000 You know what it's called?
00:25:13.000 The Google program where you put an image search?
00:25:15.000 No, no, no, no.
00:25:16.000 You can load an image and then it will read any text found in the image.
00:25:20.000 Yeah, so they're shadowbanning memes by detecting the language and the symbolism in it.
00:25:28.000 It's just happening.
00:25:30.000 Do you think that Facebook is intentionally manipulating information for political or financial reasons?
00:25:37.000 Yes, but I will caveat that with, it is weird that there are, you know, Peter Thiel's on their board.
00:25:46.000 Yeah.
00:25:47.000 Who's, you know, voted for Trump.
00:25:48.000 Right, yeah, he's anti-SJW for sure.
00:25:50.000 Yeah, so it's not black and white. Like, these companies—there's warfare happening within these companies, but I think it's clearly dominated by super draconian content policies that don't allow a huge spectrum of speech.
00:26:05.000 It's not just against conservatives.
00:26:07.000 It's also against, you know, anti-authoritarian progressives, LGBTQ, anti-war, any sort of more edgy material.
00:26:17.000 So it's, you know, people like to... I would agree that conservatives get the brunt, but it's not that black and white.
00:26:25.000 For sure.
00:26:26.000 Yeah, there was a report by Project Veritas that found—it's pronounced Live Action, right?
00:26:31.000 It's Live Action?
00:26:32.000 Yeah, I believe so.
00:26:33.000 Yeah, Live Action is a pro-life organization.
00:26:36.000 They were being just outright censored on Pinterest.
00:26:40.000 Veritas released this blacklist document and they also had The Anti-Media, which is a progressive, anti-war, you know, for the most part anti-police-brutality outlet.
00:26:49.000 And I saw that too and I'm like, this is interesting because a lot of Occupy Wall Street accounts got shut down too.
00:26:55.000 So it's not just conservatives, it's basically anti-establishment.
00:26:59.000 Yeah, The Anti-Media crew, I knew them.
00:27:02.000 They wrote, you know, they were very diligent in writing articles.
00:27:07.000 Like, you know, maybe they got some things wrong, but like, those guys worked hard.
00:27:10.000 But why blacklist anyone?
00:27:13.000 Yeah, exactly.
00:27:14.000 If you subscribe...
00:27:17.000 Get the content!
00:27:18.000 Just let me decide who I'm seeing, and if they're annoying me, I can unfollow them.
00:27:24.000 It's just that simple.
00:27:25.000 You need that option.
00:27:26.000 But that sounds naive.
00:27:29.000 They know that.
00:27:30.000 I'm not trying to drag you.
00:27:30.000 I'm saying these companies, they know this.
00:27:33.000 They're doing it because it empowers them.
00:27:36.000 So here, let's do this.
00:27:38.000 Let's jump right over to the next bit.
00:27:40.000 We have a tweet from Mr. Count Dankula.
00:27:43.000 Count Dankula's awesome, by the way.
00:27:44.000 I hope you guys follow him.
00:27:47.000 He says Twitter just did a mass banning of parody accounts that make fun of far-left rhetoric, but none of the accounts that mock right-wing rhetoric.
00:27:55.000 So, first...
00:27:56.000 This was before all this went down.
00:27:59.000 Pro-Trump pundit permanently suspended from Twitter.
00:28:02.000 The conservative pundit Bill Mitchell has been permanently suspended from the social media platform, and this is confirmed to The Hill.
00:28:08.000 Mitchell has been permanently suspended for violating the Twitter rules by using one account to evade the suspension of another account, a Twitter spokesperson said in an email.
00:28:17.000 Mitchell confirmed suspension in a post on social media app Parler, though he asserted
00:28:22.000 he was booted from Twitter over his stance on wearing a mask amid the coronavirus pandemic.
00:28:25.000 Twitter just suspended me for opposing masks. Who knows if I'll ever be back, Mitchell said.
00:28:31.000 I'm sure the decision wasn't political at all. I have a quick question for you.
00:28:34.000 You run a company, so I'm sure you have a legal department.
00:28:38.000 If you accused somebody of doing a thing for which you banned them, and it was not true, you clearly face legal liability, right?
00:28:49.000 Like, if you said, we banned Tim Pool for manipulating the platform with multiple accounts, and I didn't do that, you're lying.
00:28:54.000 That's like, just like, slander, libel, defamation.
00:28:56.000 Yeah, I mean, we could get into some of the Patreon lawsuit.
00:29:01.000 Oh!
00:29:03.000 Well, I mean... Why is that getting no attention?
00:29:05.000 The Patreon lawsuit?
00:29:05.000 Yeah.
00:29:06.000 I think we're waiting for updates.
00:29:06.000 I've seen not a single major article about that.
00:29:09.000 This is bad news for Patreon.
00:29:11.000 Because they're based in California.
00:29:11.000 Yeah.
00:29:12.000 I mean, not a single mainstream article about it.
00:29:16.000 Of course, it's Silicon Valley.
00:29:17.000 They're in trouble.
00:29:18.000 It's their own rules, you know?
00:29:20.000 But we'll see when the... I mean, the Daily Dot covered it.
00:29:20.000 Yeah.
00:29:23.000 For those that aren't familiar with what happened, Patreon got sued.
00:29:29.000 There was a request for arbitration because they banned people.
00:29:33.000 Then they sued those people trying to stop this because they would have to front millions of dollars.
00:29:37.000 They lost the suit and have to front millions of dollars.
00:29:41.000 And now, because of this, it sparked a big wave of attention.
00:29:44.000 It was a huge mistake.
00:29:45.000 Streisand effect, in fact. They're getting a wave of people like Sargon, Lauren Southern, you know, a couple other people.
00:29:51.000 All of their fans are now going after Patreon.
00:29:53.000 Patreon's gonna have to front, what, 10, 20, 30, 40, 50, who knows how many millions.
00:29:58.000 But the reason I asked you this specifically is, Twitter likes to play these games, in my opinion, I don't know if it's true or not, when they suspend people and say something like, you know, this person was using multiple accounts.
00:30:11.000 But if they're not really, like, how do we know that's true?
00:30:15.000 I want to see proof, you know what I mean?
00:30:16.000 Yeah, I think there would be grounds for legal action, and that's why you need to provide a path to redemption.
00:30:24.000 That's why we rolled out the jury system, to keep ourselves in check in case we make mistakes.
00:30:27.000 People can appeal it to the jury on Minds.
00:30:31.000 It's just like, you make mistakes, but there's no talking to a human at Twitter or Facebook when you get banned.
00:30:38.000 I mean, for some people there is.
00:30:40.000 For some people.
00:30:40.000 Yeah.
00:30:41.000 Yep.
00:30:42.000 If you're wealthy, successful, famous, you're a big media company, you pick up that phone, no problem.
00:30:46.000 Or your friend's a moderator.
00:30:47.000 Yep.
00:30:49.000 It is that incestuous.
00:30:51.000 That's weird.
00:30:53.000 So, you know, Jack actually talked about a path to redemption, Jack Dorsey of Twitter, a long time ago.
00:30:59.000 Because the way it's explained, right?
00:31:02.000 Let's say you're on Twitter.
00:31:03.000 People use this to connect with politicians, their local council, things like this.
00:31:08.000 And you say a naughty word that you didn't realize was a bad word, like hashtag learn to code or something.
00:31:15.000 Now you're banned.
00:31:16.000 Forever.
00:31:17.000 You can never open a new account.
00:31:18.000 You can never use the platform again.
00:31:19.000 That's like a digital death sentence for a minor infraction.
00:31:24.000 So, you know, Jack Dorsey's talked about a path to redemption.
00:31:28.000 They've never done anything.
00:31:30.000 Not that I know of.
00:31:31.000 You guys have a jury system on Minds.
00:31:32.000 Yeah.
00:31:33.000 Yeah, what does that do?
00:31:34.000 Yeah, basically, if you get tagged or, you know, flagged NSFW and you think that it was wrong, then it goes to a randomized group of 12 active users.
00:31:46.000 We are thinking about expanding that number to reduce the potential for abuse.
00:31:55.000 But it goes to 12 active users, they vote, and it can get overturned.
00:31:58.000 And you can go and check the analytics.
00:32:01.000 We've made some mistakes, but people got back.
00:32:03.000 Yeah.
00:32:04.000 Yeah, I remember like around the time you implemented it, there were people accusing you of bias for banning them.
00:32:09.000 They didn't even get banned.
00:32:10.000 They got flagged NSFW because... Oh, there you go.
00:32:13.000 But you can't even get banned for posting NSFW content on Minds.
00:32:16.000 You can get your channel marked, which in order to... We want NSFW content.
00:32:21.000 But we need to be able to put blurs in order to have that so that people who don't want to see that aren't seeing it.
00:32:27.000 Anyway, I did a live stream with this one guy, Axe17Apologetic, YouTube channel, and he criticizes Islam and he said, you know, he was sensational on purpose when it happened and he sort of knew and he said he was going to come back and it was cool.
00:32:46.000 You know, I understand people are sensitive, especially for platforms who say that they're free speech focused and then for bannings to happen.
00:32:55.000 That's definitely scary.
00:32:56.000 So, straight up, if something happens to you on Minds, or to someone you know on their channel, just email info at minds.com.
00:33:04.000 We'll work it out.
00:33:06.000 We'll try to figure it out.
00:33:07.000 I get emails all the time from people.
00:33:09.000 They're like, Tim, you complain about censorship.
00:33:11.000 Why won't you promote Minds and BitChute?
00:33:13.000 And I'm like, I promote Minds.
00:33:15.000 Well, Bill's here now, so Bill, promote Minds.
00:33:18.000 Yeah, the jury system, I think, was really, really smart.
00:33:20.000 I remember when you told me you were rolling it out.
00:33:22.000 This way, if something happens, it's like it was on 12 random users, not necessarily you guys.
00:33:27.000 And the mandate is to vote with the First Amendment-based policy, not to just vote on your opinion.
00:33:35.000 But I'm sure a lot of people... Some people can try to game it, and that's probably happening, but you just have to make a good faith effort.
00:33:41.000 This is why I think you should expand it to a lot more people.
00:33:44.000 How many?
00:33:45.000 I don't know.
00:33:46.000 A hundred?
00:33:46.000 A thousand?
00:33:47.000 Depends on how many active users you can pester.
00:33:50.000 You know what I mean?
00:33:51.000 But, you know, the thing is, what happens if it's just a bad day and you get six random bad people?
00:33:56.000 It's a lot easier to get bad people when you have a smaller pool.
00:33:59.000 It's much easier to avoid that.
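The intuition here, that a smaller jury is easier for bad actors to capture, can be made concrete with a quick binomial calculation. This is a back-of-the-envelope sketch, not Minds' actual selection code, and the 30% bad-actor fraction is invented purely for illustration.

```python
from math import comb

def bad_majority_prob(jury_size: int, bad_fraction: float) -> float:
    """Probability that more than half of a randomly drawn jury is
    'bad', modeling each juror as independently bad with the given
    fraction (a binomial approximation of sampling active users)."""
    need = jury_size // 2 + 1  # smallest count that forms a majority
    return sum(
        comb(jury_size, k)
        * bad_fraction**k
        * (1 - bad_fraction) ** (jury_size - k)
        for k in range(need, jury_size + 1)
    )

# With a hypothetical 30% bad-actor pool, a 12-person jury is
# captured by a bad majority orders of magnitude more often than a
# 100-person jury would be:
print(bad_majority_prob(12, 0.3))   # roughly a few percent
print(bad_majority_prob(100, 0.3))  # vanishingly small
```

Growing the jury makes the vote concentrate around the true fraction of bad actors, so as long as they are a minority of active users, a large random jury almost never hands them a majority.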
00:34:01.000 But there's a couple other things I want to highlight real quick, too, so we can talk more about this.
00:34:05.000 The Daily Caller reported, many reported, the Babylon Bee.
00:34:09.000 So this is in line with what Count Dankula was saying.
00:34:11.000 Channels that mock the left.
00:34:13.000 And I mean, the Babylon Bee is general satire and parody.
00:34:16.000 It's not even necessarily just targeting the left, though it kind of is.
00:34:19.000 They're technically a Christian site.
00:34:20.000 Right, right, right.
00:34:22.000 And then we have Kyle Mann.
00:34:24.000 He confirmed it for spam and manipulation.
00:34:27.000 That's really, really weird.
00:34:28.000 But they were brought back.
00:34:30.000 They say, we're back.
00:34:31.000 Twitter destroyed our headquarters with a drone strike, but we are being assured it was an honest mistake.
00:34:36.000 I love it.
00:34:37.000 It's fantastic.
00:34:38.000 Yeah, so we also have Titania McGrath.
00:34:41.000 Check this out.
00:34:42.000 Are you familiar with Titania?
00:34:43.000 Titania?
00:34:44.000 I always thought it was Titania.
00:34:45.000 It's Titania.
00:34:46.000 Titania.
00:34:46.000 Yeah, named for, what was it named for, the fairy queen?
00:34:48.000 The Queen of the Fairies, yeah.
00:34:49.000 The Queen of the Fairies.
00:34:49.000 Yeah, didn't you say that a bookstore just mistakenly... Yeah.
00:34:53.000 Yeah, that's great.
00:34:53.000 That was so good.
00:34:55.000 So, Titania is locked now.
00:34:57.000 You gotta, you know, you've gotta click accept.
00:34:59.000 Actually, I'm pretty sure, oh, actually, I don't know if Titania actually has the photo
00:35:05.000 of the bookstore, but there is, yeah, yeah, yeah, okay.
00:35:09.000 That was James.
00:35:10.000 Yeah, it was because they got locked out, James Lindsay posted it.
00:35:13.000 Here's Andrew Doyle, who runs it.
00:35:15.000 He says, it looks as though it was Titania McGrath's thread about medical science that
00:35:19.000 got her locked out of Twitter.
00:35:20.000 All it was doing was satirizing the tortuous logic of critical theory and how it promotes
00:35:25.000 dangerous ideas in the name of social justice.
00:35:28.000 How is that worthy of a ban?
00:35:31.000 Because they've banned a bunch of parody accounts, it would seem.
00:35:34.000 Like they're going after them.
00:35:35.000 You know what I think it is?
00:35:37.000 For the longest time, the meme was, quote, The Left Can't Meme.
00:35:42.000 All of a sudden now there's a Reddit forum called The Right Can't Meme, and it mocks these fake memes that are, like, just not real.
00:35:49.000 Like, somebody's purposely making bad memes that attack the left to blame the right, when an MIT Technology Review story and study found that The Donald and 4chan were the most prolific generators of memes.
00:36:04.000 Yet all of a sudden now it's being reversed.
00:36:07.000 So they're banning right-wing parody accounts.
00:36:11.000 Satire is crucial, man.
00:36:14.000 I'll die without it.
00:36:15.000 Well, you know what some people say is that ridicule is one of the most effective weapons.
00:36:20.000 Dude, there's no point to the internet without comedy.
00:36:26.000 I mean, I guess it's commerce.
00:36:27.000 Well, yeah.
00:36:28.000 No, yeah.
00:36:30.000 Thank you.
00:36:31.000 Thank you.
00:36:33.000 I can order my underwear off Amazon.
00:36:36.000 But no jokes.
00:36:37.000 That's too far.
00:36:40.000 Yeah, we'll go ahead.
00:36:41.000 So do you think this is just the next step in their prep for November?
00:36:45.000 Do you think that if we remove a little bit, do you think people will stop noticing that the comedy is only coming from the right if they nuke some of these satire sites?
00:36:53.000 And why did they let the Babylon Bee back up, do you think?
00:36:57.000 Because I really don't know.
00:36:58.000 I mean, it's just arbitrary, it seems.
00:37:02.000 There's no consistency, so we have no idea what's going on.
00:37:06.000 Tim knows better than anybody.
00:37:08.000 I mean, he grilled them himself.
00:37:10.000 Oh yeah, I'd love round two.
00:37:13.000 We need round two.
00:37:13.000 Yeah, seriously.
00:37:15.000 What follow-up was there?
00:37:17.000 Um, a single thing?
00:37:18.000 Periodically, like, I'll have a conversation with Jack, and he just, you know, he's like, he's like a... I don't know how to describe it.
00:37:27.000 He's just abusive, you know?
00:37:31.000 He whispers all these sweet nothings into your ear, he massages your shoulders.
00:37:33.000 Emotionally abusive.
00:37:34.000 And he's like, no, no, it's okay, it's okay.
00:37:36.000 We're gonna, we're gonna fix the platform.
00:37:38.000 Come on, just come back.
00:37:40.000 I won't suspend you again.
00:37:42.000 I won't ban your friends.
00:37:42.000 I won't shut you down.
00:37:44.000 It was a mistake.
00:37:45.000 It was a mistake.
00:37:45.000 I promise.
00:37:47.000 Every single time.
00:37:48.000 But the policy hasn't changed.
00:37:49.000 Nothing's changed.
00:37:50.000 It's gotten worse!
00:37:52.000 They're trying to ban, you know, satire.
00:37:55.000 I think, you know, I think we have a hysteria problem in this country.
00:38:00.000 That's where I think it's coming from.
00:38:01.000 I think it has to do with the algorithms.
00:38:03.000 I think the algorithms, it's really funny that we started seeing this phenomenon, and maybe you know more about this than I do, where it polarized everybody.
00:38:11.000 So all of a sudden, there was no middle ground anymore.
00:38:14.000 It was all either super left or super right, you know?
00:38:17.000 Then they banned all of the far right.
00:38:19.000 So now everyone kind of moved over to a center right position.
00:38:22.000 They don't want to get banned.
00:38:23.000 So it's far left and center right.
00:38:25.000 Yep.
00:38:26.000 They are engineering polarization, radicalization, extremism.
00:38:30.000 And I think they know it because there's data scientists who work there and there's dozens of studies about how censorship increases violence, increases radicalization.
00:38:43.000 So they act like they're being moral crusaders, but they are creating extremists.
00:38:50.000 And now, I'll, you know, keep myself in check and say obviously people can get radicalized on social media and people do get, you know, whether it's Islamic propaganda and whatever it is, KKK.
00:39:07.000 But you need the open forum in order to give people... No one is changing their mind if you kick them off.
00:39:14.000 It's not happening.
00:39:15.000 They will never change.
00:39:17.000 So that's pretty much the end of the argument.
00:39:19.000 They can't ever get better.
00:39:21.000 So you have to think of it more from like a mental health perspective.
00:39:26.000 I don't know, man.
00:39:27.000 They're so powerful and they know this information.
00:39:32.000 Then they're doing it on purpose.
00:39:33.000 They're doing it on purpose.
00:39:34.000 So what is it?
00:39:35.000 They're trying to win an election?
00:39:37.000 Yeah.
00:39:37.000 And like, the weird thing is if you look at Google, or I mean, or Twitter, because they have different policies in different countries.
00:39:43.000 So, you know, the Twitter policy in Pakistan is totally different.
00:39:46.000 It's way more restrictive than the Twitter policy in the US.
00:39:49.000 And, you know, they're all like, oh, well, we won't go into China.
00:39:53.000 You know, Google won't buckle for China, but Google is buckling for all of these other countries with super oppressive laws.
00:40:02.000 And Google Search has totally different censorship policies than YouTube, and it's like, what are you guys doing?
00:40:09.000 I'm censored on Google.
00:40:11.000 My main channels, you cannot Google search.
00:40:14.000 This channel, however, you can.
00:40:16.000 Yes.
00:40:17.000 It's almost like it was on purpose that we made a new channel.
00:40:19.000 Oh, weird.
00:40:20.000 If you search for Timcast, like my actual channel, you can take, this is the craziest thing, you can take the full title of any one of my main channel videos from YouTube.com slash Timcast, put it in Google, and Facebook comes up.
00:40:34.000 Because I upload also to Facebook, the Facebook URL gets displayed on Google, not YouTube.
00:40:40.000 Now how does that make sense?
00:40:42.000 Maybe, maybe, now they're gonna be like, there's no, there's no trust problem here, you know, you don't need to break up the company, see, we're promoting a competitor!
00:40:49.000 Yeah, do a search.
00:40:51.000 Yeah, Timcast News and Timcast won't appear, Timcast IRL does.
00:40:55.000 And this happened to a lot of other channels around the time we all started getting smeared.
00:40:59.000 The left, in my opinion, I think many of these people in media know full well exactly what's going on.
00:41:07.000 There's a radicalization, initially, I think because of Facebook, a radicalization in both directions.
00:41:13.000 But there's very few, like, real, you know, quote-unquote, far-right people in the U.S., you know, thousands out of millions.
00:41:20.000 And I noticed one of the things that happens when you start to isolate all the right-wing people is that they just find the little corners that make them happy, and they go off and find people who think just like them.
00:41:20.000 Yeah.
00:41:30.000 And that is, that's dangerous.
00:41:32.000 Like, I really don't like the idea of pushing someone off a public platform so they can go find people who think just like them.
00:41:37.000 Yeah, I mean, they don't care about the internet.
00:41:39.000 The internet is... Wait, who's they?
00:41:41.000 The extremists?
00:41:43.000 The big tech censorship platforms.
00:41:46.000 I mean, you have to think of the internet as a community.
00:41:49.000 It's not... I mean, it's a finite space.
00:41:52.000 So they know what's going on and they're just playing games and honestly they will probably... I feel like inevitably the data is going to prove this.
00:42:04.000 That they're engineering radicalization.
00:42:08.000 On purpose?
00:42:10.000 Or it, or accidentally.
00:42:11.000 Maybe some people on purpose, maybe not.
00:42:13.000 I don't want to speculate about like how malicious it is.
00:42:16.000 Maybe they're doing it with good intentions, but just, and the people making decisions don't have an understanding of the data, or maybe they're too scared to face it.
00:42:26.000 Like what would it be like if Facebook and YouTube just suddenly started allowing like super radical content with the understanding that, like, put it this way: you take 50,000 content moderators that each of these companies have.
00:42:40.000 Instead of having them just go ban crazy every day, have them actually be reaching out to people and talking to people and engaging in conversation.
00:42:50.000 Guess what?
00:42:50.000 It takes 10 years to de-radicalize somebody.
00:42:53.000 Is that a study confirmed 10 years or something?
00:42:55.000 Oh, well, I mean, we— Are you just giving hypotheticals?
00:42:57.000 I mean, Daryl would say, Daryl Davis would say that it takes years.
00:43:01.000 I mean, I don't think that there's a set amount of time.
00:43:03.000 It probably could be faster or slower for different people, but just imagine, like, tens of thousands of moderators actually trying to help people.
00:43:11.000 What's interesting is the more extreme, ban-happy platforms, I feel like, make huge mistakes.
00:43:19.000 They lose all leverage.
00:43:21.000 You know, there was a... You're familiar with Sargon of Akkad?
00:43:24.000 Yeah, of course.
00:43:25.000 For those that aren't, I assume many of you are, he's a YouTube commentator.
00:43:29.000 And he spoke on a stream, you know, like early 2018 or something.
00:43:34.000 It was a small channel.
00:43:36.000 He ironically used the n-word to make a point about people he thought were racist.
00:43:41.000 And so, a year later, like nine months later, somebody shows that in the middle of a two-hour, you know, livestream in a small channel with only a few thousand views, Patreon nukes him instantly.
00:43:53.000 And I was like, if Patreon just gave him a warning, then they could've been like, hey, don't do it again.
00:43:59.000 Sargon would've been like, okay.
00:44:00.000 And that would've been it.
00:44:01.000 He'd be like, oh yeah, I was just trying to make a point.
00:44:03.000 I get it, I won't do it.
00:44:04.000 Instead, they terminated his income.
00:44:06.000 And so when I was talking to the company, to Patreon, I'm like, why would you do that?
00:44:11.000 It's counterproductive.
00:44:12.000 You're guaranteeing these people can only go in one direction.
00:44:15.000 You know, my thing is, like, if someone is behaving in a certain way, what Twitter is doing is actually kind of scary.
00:44:21.000 It's social manipulation.
00:44:23.000 It's mass social engineering.
00:44:25.000 They tell people, here's what is and isn't acceptable, and then the scared people who don't want to lose their followers fall in line and won't say naughty things.
00:44:33.000 I got no problem tweeting articles from Dr. Harvey Risch, MD, PhD of Yale, when he, you know, when he talks well of hydroxychloroquine.
00:44:42.000 And I've done several videos on that stuff, and I know that's a banhammer.
00:44:46.000 But I have a line, you know what I mean?
00:44:48.000 I'm not gonna let them keep pushing and pushing and pushing.
00:44:50.000 Because what ends up happening is, the more censorship we have like this, the more we end up with people who only speak about the other side.
00:44:58.000 That's what I was saying before.
00:44:59.000 So basically, you look at COVID.
00:45:00.000 There was a story.
00:45:01.000 I don't know if you heard this.
00:45:03.000 What was it?
00:45:04.000 The Department of Natural Resources or something?
00:45:06.000 Yeah, I think so.
00:45:07.000 They told people to wear masks when they're on Zoom calls, even if they're home alone.
00:45:12.000 Right.
00:45:13.000 It's like, why?
00:45:14.000 For show?
00:45:16.000 Exactly.
00:45:17.000 For show.
00:45:18.000 How did 15 days to slow the spread turn into 15 times 10.133 days to slow the spread so far?
00:45:26.000 Because anyone who says anything like, okay, was that enough?
00:45:30.000 Shut down.
00:45:31.000 The media won't allow it.
00:45:33.000 Our culture doesn't allow it.
00:45:35.000 I'm not sure it's on purpose, man, to be honest.
00:45:39.000 To a certain degree, I think so.
00:45:40.000 I think we're trapped in a hysterical mob running around with pitchforks and you can't reason with anybody.
00:45:46.000 Do you think people who wear masks in their car driving are mostly doing it because they're actually scared or because they're trying to signal?
00:45:55.000 Both?
00:45:55.000 To people, yeah, both.
00:45:56.000 I think it's both.
00:45:57.000 I mean, to be fair, I'm sure someone's seen me driving and I had my mask on.
00:46:01.000 Because, like, when we go out and I'm like, we're like a block out from the Walgreens or whatever, I just put, yeah, it's like, gear up, mask on!
00:46:09.000 Just like that, it's that cool.
00:46:10.000 From the goggles, you know.
00:46:11.000 Pull up the vest, you know, strap.
00:46:13.000 All the good stuff.
00:46:14.000 It's super cool.
00:46:14.000 I just put the mask on because, you know, I put over my ears and then, like, we're about to get out of the car in a few minutes.
00:46:18.000 Get comfortable in it.
00:46:19.000 But I've seen people driving on the highway with masks on.
00:46:23.000 And then I'm just kind of like, No, no, I don't get that.
00:46:26.000 I think Joe Rogan was ragging on it and he got made fun of for it.
00:46:30.000 I think I figured it out.
00:46:31.000 I think people are just lazy and they just don't want to take it off between trips or whatever.
00:46:35.000 I think it's weird.
00:46:37.000 I find it uncomfortable.
00:46:38.000 I got no problem wearing them.
00:46:40.000 But anyway, the point is not to get into a big mask conversation.
00:46:43.000 Like, I got no problem wearing a mask.
00:46:44.000 Someone sent me this really cool mask, got a beanie on it.
00:46:46.000 Yeah, it's so cool.
00:46:47.000 A little beanie on the mask.
00:46:49.000 Yeah.
00:46:50.000 Go ahead.
00:46:51.000 I was going to reiterate my point to kind of get back on track.
00:46:53.000 It's that when you see 10 articles per day, the end is nigh, the end is nigh, the end is nigh.
00:46:59.000 And then you see one article on the right pop up saying, perhaps the end is not so nigh as we first thought.
00:47:04.000 Banned!
00:47:05.000 Then the only thing anybody sees is the end is nigh.
00:47:08.000 Then the next day it's the end is nigh-er.
00:47:10.000 And the next day it's the nigh-est.
00:47:11.000 And this is it.
00:47:12.000 This is the end.
00:47:13.000 So now we're on day 152 since Donald Trump tweeted out 15 days to slow the spread.
00:47:19.000 Because you can't challenge the mob.
00:47:22.000 I think what we might be seeing is that these big social media companies, Facebook, Twitter, even YouTube, they're beholden to the mob the same as everybody else, not realizing they're the biggest players in directing it.
00:47:34.000 You know?
00:47:36.000 How do you convince people?
00:47:38.000 First of all, like, it's like you need YouTube, Twitter and Facebook and every other company to just make a hard stand and say, no, we're allowing this.
00:47:47.000 We're done.
00:47:48.000 And then allow the conversations to happen.
00:47:50.000 So this is what we've been advocating for quite a bit.
00:47:52.000 I mean, mostly me, but I think, you know, Lydia agrees.
00:47:55.000 Reform 230.
00:47:57.000 Reform Section 230.
00:47:59.000 You're familiar with Section 230?
00:48:00.000 Yeah, of course.
00:48:01.000 So how so?
00:48:02.000 In what sense?
00:48:03.000 So right now, you saw what Trump did with the executive order.
00:48:06.000 Yeah.
00:48:08.000 Can you explain to people what Section 230 is?
00:48:10.000 Yeah, so digital intermediaries who host user-generated content have immunity, a certain degree of freedom from liability over the content.
00:48:21.000 Um, but it doesn't actually say that you can't moderate.
00:48:26.000 It says that you can.
00:48:28.000 You can, yeah.
00:48:29.000 In good faith, which is a problematic term to put into law, I would say.
00:48:35.000 It says, essentially, you know, you can't hold these platforms liable for what other people say.
00:48:40.000 Right.
00:48:40.000 And these platforms can moderate so long as it's in good faith that they're trying to remove objectionable, lewd,
00:48:46.000 lascivious, violent, you know.
00:48:48.000 Right.
00:48:49.000 Or in other ways, objectionable content.
00:48:52.000 Mm-hmm.
00:48:52.000 And so all of a sudden now, you get Twitter saying, hashtag Learn to Code is objectionable.
00:48:56.000 Well, I just don't like the bait and switch.
00:48:59.000 The thing is that they all started as, you know, Twitter slogans, free speech wing and the free speech party.
00:49:05.000 Not anymore.
00:49:06.000 Not anymore.
00:49:06.000 That was a joke, they said.
00:49:07.000 You know, Google, don't be evil.
00:49:08.000 Facebook, revolution starting in the Middle East from Facebook.
00:49:13.000 Everybody, you know, thought that these were for free speech.
00:49:15.000 And so they put years of their lives into building up followings there, thinking that they, and they could say all this stuff for years.
00:49:22.000 You know, they could actually, to a certain... I mean, the content policies were always pretty bad, in my opinion.
00:49:28.000 They never were, like, First Amendment-focused, but they were way better, and so, to me, it's more false advertising, and that is definitely grounds for some legal action.
00:49:40.000 I have to imagine...
00:49:44.000 When Twitter started banning people for saying, learn to code, there's no reasonable person who would consider that objectionable.
00:49:51.000 No reasonable person.
00:49:52.000 They'd be like, I don't understand.
00:49:53.000 What does that mean?
00:49:54.000 It's like, oh, it's a reference to getting a job in the coding industry.
00:49:57.000 Like what's wrong with that?
00:49:58.000 So how does, how does Twitter still have Section 230 protection if they already are, you know, removing content outside of the realm of what the law allows?
00:50:08.000 Does someone just need to sue them to like make it, to get started?
00:50:12.000 I think people are trying.
00:50:13.000 Yeah, but I know, so Trump has this executive order.
00:50:17.000 They want to define what these terms mean specifically.
00:50:20.000 Maybe that's a first step.
00:50:21.000 What I'm thinking for Section 230 is we definitely don't want to get rid of it.
00:50:26.000 We definitely need it.
00:50:27.000 But it should also add a provision saying illegal speech, right?
00:50:33.000 And that's specifically declaring threats, you know, incitements to direct violence, you know, announcing that you are literally going to go commit a crime or something.
00:50:42.000 But if someone says naughty words and stupid opinions, that's protected speech.
00:50:47.000 I don't think, I think, let me know what you think, because my assumption is if they amended section 230 to say you can remove content that is deemed, you know, illegal, everything else, it's fair game.
00:51:01.000 Yeah, I mean that is one path, though then you are sort of forcing, you know, like a Christian blog to suddenly have to keep content it doesn't want. I do think that platforms should be able to do what they want to do, but it's all about the context with which billions of people joined.
00:51:27.000 Seriously.
00:51:28.000 Actually, let's expand on that, because that was a really good counterpoint.
00:51:31.000 You're saying, if we change the law this way, then a Christian blog that's really small, with only a small handful of users, gets inundated by a bunch of people posting porn, and they can't do anything about it.
00:51:41.000 Right.
00:51:42.000 Yeah.
00:51:42.000 Yeah.
00:51:43.000 I think that, that... but I also, you know, look, we're trying to build out, like, blockchain permaweb storage so the content can't even get deleted.
00:51:52.000 So you have the option to post to, um, this is a teaser for what's probably coming out in the next, next month or so.
00:51:59.000 But, uh, there's a really cool, uh, blockweave project called Arweave. So you'll be able to post to a totally decentralized database that cannot get taken down.
00:52:10.000 Now the nodes in that network can choose to ignore certain content, and they do, and they actually have a pretty strict content policy, but certain nodes can always access.
00:52:22.000 They've already backed up like all Wikipedia and archive.
00:52:24.000 Wow.
00:52:26.000 Bro, that's kind of scary though, man.
00:52:27.000 It is, it is, but it's sort of... We're not forcing that, but I do feel like you should have the option.
00:52:38.000 Sure.
00:52:38.000 I mean, put it this way.
00:52:39.000 You sort of have to decide.
00:52:41.000 It's like a tattoo, man.
00:52:42.000 What's scarier?
00:52:43.000 1984 in the context of...
00:52:47.000 Everything is permanent and known or everything can get burned?
00:52:53.000 Fahrenheit 451 or 1984?
00:52:55.000 Is that what 1984 was?
00:52:56.000 Like everything was known?
00:52:58.000 Yeah, it's more everything was known.
00:53:00.000 The panopticon.
00:53:01.000 Look man, what if you are, you know, 20 and you're like, I think, you know, what's it?
00:53:08.000 What's it?
00:53:09.000 I can't think.
00:53:10.000 What's it?
00:53:10.000 I think Nickelback is great!
00:53:13.000 And now, you know, that was a long time ago.
00:53:14.000 Today you're sitting there and you're like, I can't believe that will exist forever.
00:53:19.000 Yeah, I mean, it exists in time and space, so, I don't know.
00:53:24.000 But you shouldn't have posted it to... I mean, here's the thing, it's already on the internet. So, you know, it just came out that Instagram wasn't deleting anything that people were deleting, for a year.
00:53:37.000 Really?
00:53:37.000 That just came out.
00:53:38.000 I didn't hear about that.
00:53:40.000 Yeah, so people thought that it was deleted, and it was not deleted, and some hacker researcher figured it out. And so, you know, even in centralized databases, like, sometimes it's hard to delete data.
00:53:56.000 Can you even delete data?
00:53:58.000 Exactly.
00:53:59.000 Because I remember there's tombstones, there's all this stuff with the way that the internet works.
00:54:03.000 You think that when you're walking around the internet, you have some sort of right to just delete actions, like rights to databases?
00:54:13.000 Yeah.
00:54:14.000 It's not that easy, even if you wanted to.
00:54:16.000 I do think you should have the ability to delete your content if you want to.
00:54:20.000 But I mean, there were hard drives in the World Trade Center that were smashed and burnt to a crisp, and they got data off them.
00:54:28.000 Right.
00:54:29.000 I see these stories.
00:54:30.000 I remember I once did, like, a mass purge on, like, an old computer.
00:54:35.000 And I, like, rewrote it, formatted it, got some special software.
00:54:38.000 Because I was doing this at a hackerspace with, like, some friends.
00:54:42.000 And then he assured me, like, this program is going to be able to pull stuff off of it.
00:54:46.000 And we were able to pull off videos and stuff.
00:54:47.000 And I'm like, I thought... And it was probably more to do with the fact that these programs that are supposed to actually wipe the hard drives don't.
00:54:54.000 So even if Google, even if Facebook, you're like, I would like to delete my data, they go, you got it, and then it exists in other forms.
00:55:01.000 You know about Facebook's shadow profiles?
00:55:04.000 Yeah.
00:55:05.000 So, basically, even if you don't use the platform, they've collected so much data on you from other people, you've got a shadow profile.
00:55:14.000 So how is that any different from what TikTok was doing with their collection of the phone numbers and everything?
00:55:19.000 Giving it to the communists?
00:55:19.000 It's not that different.
00:55:21.000 Oh, right.
00:55:21.000 Giving it to the communists.
00:55:22.000 But that's the thing.
00:55:23.000 The TikTok ban is... It's like you have to have standards.
00:55:29.000 Like, why?
00:55:30.000 Just because it's... I mean, I agree they're doing super shady things, but there's plenty of U.S.
00:55:35.000 companies that are easily selling data to China.
00:55:38.000 And so it's like, if you're going to have surveillance standards, have standards.
00:55:42.000 don't just, like, pick an app.
00:55:43.000 But they can control... like, so if there's a US company doing that, if they want to, they can step in and just crush them. With TikTok operating outside...
00:55:52.000 They could crush them, sort of, in the US, like banning them, which is basically what they're trying to do.
00:55:56.000 Yeah, and now Trump trying to get Microsoft to buy it.
00:56:00.000 Well, Twitter talked about buying TikTok.
00:56:01.000 That's gotta be the stupidest thing I ever heard.
00:56:03.000 They shut down vine.
00:56:04.000 And then they already had.
00:56:05.000 Yeah, why are they so dumb?
00:56:07.000 But I guess TikTok is still a little different.
00:56:10.000 I honestly don't trust that many of these new social networks are real anyway.
00:56:13.000 I think they use that technique where you create fake profiles, then you give young people fake followers so they
00:56:20.000 think they're, they get addicted to it.
00:56:21.000 They see the number going up and they're like, I'm doing it, I'm doing it, and then they brag to their friends, look how
00:56:25.000 many followers I got, and their friends join.
00:56:27.000 And it's just a really easy and cheap way to trick people into signing up.
00:56:31.000 Because I've seen, without naming some of these apps, I've seen it very clearly botting.
00:56:35.000 And then I've seen people dedicate their businesses to this stuff.
00:56:38.000 Remember what Facebook was doing with Facebook videos?
00:56:41.000 Oh, just giving views based on nothing?
00:56:44.000 Yeah, so, well, I'll say my understanding and opinion, for legal reasons.
00:56:48.000 Yeah, basically, so you put up a video, the number would be ridiculous.
00:56:53.000 And then they would brag about how their video views were bigger than YouTube, and it was just vanity numbers, it wasn't real.
00:56:57.000 It was a quick scroll by.
00:56:59.000 Yep.
00:56:59.000 Yeah.
00:57:00.000 So I remember sitting down with this big production company out of San Francisco.
00:57:03.000 I'm talking like one of the biggest networks.
00:57:05.000 And I said, YouTube, man.
00:57:06.000 I was like, YouTube's where it is.
00:57:07.000 YouTube's where it's at.
00:57:08.000 And I don't see that changing.
00:57:10.000 And they were like, I don't know, man.
00:57:11.000 I'm looking at these Facebook numbers.
00:57:13.000 You gotta understand how crazy this is.
00:57:15.000 And then I remember that actually came to when I worked for Fusion, which was the ABC News company.
00:57:21.000 Straight up, they had the conversation, and I said, not Facebook. And this guy's like, yeah, but listen, man, you know these Facebook numbers.
00:57:29.000 We're looking at, like, a million views on this video we just did. And I'm like, there... it's not... you're not getting views, bro.
00:57:33.000 Then they dedicate all this money into Facebook, building out infrastructure for production, and then overnight it crumbled in front of them. That's the scary thing. And they just strangled the reach. Yeah.
00:57:47.000 YouTube is one of the few that has maintained... it seems like their default feed, it is annoying, they do not go to chronological, but they do have the subscriptions feed.
00:57:57.000 But it seems like the organic reach on YouTube is better than other social networks. And, you know, it's smart for any network to give lots of organic reach.
00:58:09.000 You're gonna retain people longer in the long run.
00:58:16.000 If you created a reverse chronological social network, wouldn't it just devolve into lunacy and extremism?
00:58:25.000 Like, if that's all it was?
00:58:27.000 I mean, it's based on who you subscribe to.
00:58:30.000 But like, people would exploit the system, people would personally choose to create the most bombastic and extreme content, that would get more shares, and then people would choose to subscribe to the craziest people, you know?
00:58:42.000 Like Twitter, basically.
00:58:45.000 Yeah, but from a news perspective, what you said earlier, that's where I'm at in terms of needing to feel confident that you're going to have access to the information that you've been expecting to be able to get.
00:58:58.000 I, you know, I think about the challenges of trying to run these big networks.
00:59:03.000 And I think there is, it is fair, you know, to be fair.
00:59:07.000 It's not, there's no simple solution.
00:59:08.000 No, man.
00:59:09.000 And algorithms are not bad.
00:59:11.000 It's like, that's like saying math is bad.
00:59:12.000 You can't.
00:59:13.000 Do you know, do you know what YouTube used to be like in the early days?
00:59:17.000 It was all thumbnails of women in bikinis.
00:59:19.000 I'm like half kidding, but it basically was.
00:59:22.000 A lot of it was these short clips that were just, I mean, it was kind of like Vine almost, you know, or TikTok, these short clips of funny viral moments.
00:59:30.000 Charlie bit my finger and things like that, right?
00:59:33.000 Or, you know, goats screaming like humans.
00:59:36.000 Now it's 10, 20, 30 minute videos.
00:59:39.000 That's doing well because YouTube wanted to promote more substantive content.
00:59:43.000 So, podcasts started doing way better.
00:59:45.000 Politics started doing way better.
00:59:47.000 And then the other issue is, early on with thumbnails, it was all about: if you get clicks, we show you, right?
00:59:53.000 The people who put up the bikini women on their thumbnails got more clicks than anybody else.
00:59:56.000 But then you click the video and it'd be some dude in his room talking about, you know, I don't know, Barack Obama.
01:00:00.000 Well, that's the thing.
01:00:01.000 People are going to yell at whatever social network it is no matter what they do.
01:00:05.000 So if people, you know, even if you don't have some crazy algorithm that feeds one political side and you just feed more of the video of the creator that you're watching, Some people yell at the network, you're just, you know, leading them down the path of that person.
01:00:21.000 Yeah.
01:00:21.000 It's like basically any recommendation is going to potentially have problems associated with it.
01:00:29.000 Yeah.
01:00:30.000 But I don't know.
01:00:30.000 I think keeping it simple.
01:00:32.000 What do you think is the best recommendation?
01:00:34.000 For like how to run the system of delivering content?
01:00:36.000 Yeah.
01:00:37.000 I don't know, man.
01:00:38.000 You know, I think social media is just a busted system.
01:00:43.000 Maybe we need to re-imagine it, rebuild it.
01:00:46.000 Because right now, you know, we started this conversation based on the tearing down of statues, and how many of these people, many on the left who support this, don't realize they're tearing down George Washington, they're tearing down, you know, priests, they're defacing Jesus.
01:01:04.000 I'm sure many of them don't care about that, but some of these people... Hans Christian Heg, for instance, died fighting to free slaves.
01:01:11.000 They tore him down.
01:01:12.000 Someone tore down Frederick Douglass.
01:01:14.000 One of the most epic, you know, anti-slavery dudes, period, worked with Harriet Tubman.
01:01:18.000 This guy was awesome.
01:01:20.000 They don't know this because the algorithm won't show them.
01:01:23.000 And so if you only do reverse chronological, people still have confirmation bias.
01:01:28.000 They probably still won't engage, you know?
01:01:31.000 Yeah, I think that alternative feeds that aren't reverse chronological can still be really valuable.
01:01:37.000 And maybe there could be echo chamber breaking algorithms that could feed you from both sides.
01:01:43.000 Libertarian, socialist, Democrat, Republican, centrist, like, you know, and you can...
01:01:50.000 You can train feeds and have people train feeds.
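The "echo chamber breaking" feed floated here could, in its simplest form, interleave posts round-robin from several perspective buckets instead of ranking purely by engagement. A minimal sketch of that idea; the bucket labels and the `mixed_feed` helper are illustrative assumptions, not anything Minds has actually announced:

```python
from itertools import chain, zip_longest

def mixed_feed(buckets, limit):
    """Interleave posts round-robin across perspective buckets so that
    no single viewpoint dominates the top of the feed."""
    # zip_longest pads shorter buckets with None; filter those out.
    interleaved = chain.from_iterable(zip_longest(*buckets.values()))
    return [post for post in interleaved if post is not None][:limit]

# Hypothetical example: three self-declared perspective buckets.
buckets = {
    "libertarian": ["L1", "L2"],
    "socialist":   ["S1", "S2"],
    "centrist":    ["C1"],
}
print(mixed_feed(buckets, 4))  # → ['L1', 'S1', 'C1', 'L2']
```

A real system would presumably still rank within each bucket, but the round-robin step is what keeps any one faction from crowding out the rest.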
01:01:54.000 We're trying to work on like a decentralized reputation system.
01:01:57.000 You've been really focused on figuring out how to do fact checking as well.
01:02:01.000 And I think that if you can start to build credibility in certain tags, then like, do you think that your vote should be higher than somebody else's vote in journalism topic on a certain social network?
01:02:20.000 Me personally?
01:02:21.000 Yeah, probably not.
01:02:23.000 No?
01:02:23.000 Not than some random person?
01:02:26.000 Probably a little bit more.
01:02:27.000 It's tough because...
01:02:31.000 I don't know how to answer that other than I don't like the idea because what ends up happening is you create a system where the New York Times is considered more credible and then what do we see with the Covington kids?
01:02:31.000 I don't know.
01:02:43.000 A wave of fake news.
01:02:44.000 None of them did any work and we're supposed to trust these people.
01:02:46.000 They're all verified.
01:02:48.000 So Twitter has created essentially this.
01:02:49.000 The verified accounts are considered more credible.
01:02:51.000 But what if it was based purely on like peer-to-peer voting?
01:02:54.000 It wasn't based on like... Partisanship.
01:02:58.000 I would have an extremely low rating.
01:03:00.000 No, you wouldn't.
01:03:01.000 I'd have a 50-50.
01:03:01.000 I would.
01:03:03.000 Yeah, the left would come to brigade me and say false, and moderates and the right, the anti-SJW crowd.
01:03:08.000 The two culture war factions.
01:03:11.000 They'd both be battling it out on my page or someone else's with negative comments.
01:03:16.000 And then it would be like 50-50.
01:03:18.000 But then you'd end up with some random guy at the New York Times with a hundred followers and all of his, you know, colleagues will give him a thumbs up and it'll say a hundred votes credible.
01:03:26.000 Tim Pool, 56,000, 50-50.
01:03:27.000 And so people wouldn't be able to determine whether or not... they would be like, well, we know Tim Pool's popular, I guess?
01:03:35.000 Like, people know who he is?
01:03:37.000 But half the people don't like... I think it'd be... to be fair, a lot of leftists wouldn't actually...
01:03:42.000 I'd probably end up with like a 65 or 70, because a certain amount of leftists would just be
01:03:46.000 spamming me because they hate me. But mostly they don't bother with me.
01:03:50.000 Do you think that it would be worthwhile information to have?
01:03:56.000 Not to say that necessarily that was the, you know, end-all be-all voting metric for, you know, what is featured on the site, but do you think it would be interesting to let the community vote on different people's... It might be fun to try.
01:04:16.000 It might be like an interesting secondary metric.
01:04:18.000 It might be fun to try.
01:04:19.000 Yeah, just to see what happens.
01:04:21.000 I've talked about it in the past, you know.
01:04:23.000 And that was the first thing that was brought up to me was, you would just get the left attacking any conservative.
01:04:29.000 They have these, you know, email lists, they have these Twitter accounts.
01:04:34.000 You'd get one of these Twitter accounts with 400,000 followers and they would say, everybody tweet negative on this guy.
01:04:40.000 And then all of a sudden one day you got a guy with 10,000 followers, 8,000 thumbs up, 1,000 thumbs down, really great rating.
01:04:47.000 And then one day, 30,000 people from one of these activist sites or Twitter accounts just thumbs down and now it's like 10% credibility.
01:04:55.000 And so people are like, whoa, this guy's really bad.
01:04:57.000 Yeah, there would have to be some sort of decaying mechanism for mobs.
01:05:04.000 It'd have to be something, you know, how do you check?
01:05:04.000 Yep.
01:05:09.000 It's tough because they're going to brigade people they like.
01:05:12.000 And I don't know if there's a way an algorithm can determine unless like you make an algorithm that searches for certain words that only appear on the left.
01:05:18.000 So you know how the stock exchange freezes if it drops more than a certain number of points in a certain range of time?
01:05:23.000 You could install something like that and just say, if you drop more than a certain amount over the course of maybe a day, then we can pause it.
01:05:31.000 Mob detector.
01:05:32.000 Exactly.
01:05:32.000 Yeah, like the mob.
01:05:35.000 I don't know, man.
01:05:35.000 Nice.
01:05:36.000 That's my thought.
01:05:36.000 I have no idea.
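The stock-exchange analogy above maps onto a simple check: freeze a score that falls too far within a look-back window. A minimal sketch of such a "mob detector," assuming a 0-to-1 credibility scale; the `CredibilityCircuitBreaker` class, thresholds, and window are hypothetical illustrations, not a feature of Minds or any real platform:

```python
import time
from collections import deque

class CredibilityCircuitBreaker:
    """Hypothetical 'mob detector': like a stock-exchange circuit breaker,
    freeze a user's public credibility score if it drops too far, too fast,
    the signature of a coordinated brigade rather than organic votes."""

    def __init__(self, max_drop=0.20, window_seconds=86400):
        self.max_drop = max_drop      # largest tolerated drop (0-1 scale)
        self.window = window_seconds  # look-back window, e.g. one day
        self.history = deque()        # (timestamp, score) samples
        self.frozen = False

    def record(self, score, now=None):
        """Record a new score sample; return True if the breaker trips."""
        now = time.time() if now is None else now
        self.history.append((now, score))
        # Discard samples that fell out of the look-back window.
        while self.history and now - self.history[0][0] > self.window:
            self.history.popleft()
        oldest_score = self.history[0][1]
        self.frozen = (oldest_score - score) > self.max_drop
        return self.frozen

breaker = CredibilityCircuitBreaker(max_drop=0.20, window_seconds=86400)
breaker.record(0.80, now=0)              # healthy rating, breaker stays open
paused = breaker.record(0.40, now=3600)  # a 0.40 crash within an hour trips it
```

Once tripped, the score could be displayed as "under review" until the window decays, which is one way to implement the decaying mechanism mentioned above.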
01:05:37.000 We kind of went off on a tangent on that one about algorithms, though.
01:05:40.000 But it was a good conversation.
01:05:41.000 How about we get back to, I guess, a little bit more silliness, huh?
01:05:44.000 Cool.
01:05:44.000 Oh, yeah.
01:05:45.000 So, you know, I used to talk about stories like this on more of my secondary channel, but we've been inundated with so much news about chaos and calamity, I figured the get-woke-ago-broke stuff would probably be more fun to have just like a chill hangout conversation and kind of make fun of this stuff.
01:06:03.000 But for those that didn't see the story, this came out from Bounding into Comics, August 14th.
01:06:08.000 DC Comics publisher Jim Lee reveals 25% of company's publishing line wasn't breaking even, commits to diversity and inclusivity.
01:06:18.000 Amazing.
01:06:19.000 They've decided that they weren't working, so they would just roast it, burn it to the ground.
01:06:26.000 Bounding into Comics says DC Comics publisher and Chief Creative Officer Jim Lee recently discussed the recent layoffs at the company and the future of DC Comics.
01:06:34.000 Speaking with The Hollywood Reporter, Lee stated, This week has been a really heavy, difficult time not just for me, but for the entire organization.
01:06:41.000 We've said goodbye to people that have been huge contributors and who have helped define and make DC what it is today, he stated.
01:06:48.000 And then he got a bunch of comic pictures.
01:06:50.000 He would specifically be asked if DC Comics is still publishing comics.
01:06:53.000 Lee answered, absolutely 100%.
01:06:55.000 It is still the cornerstone of everything that we do.
01:06:58.000 The need for storytelling, updating mythology, it is vital to what we do.
01:07:02.000 He then added, The organization leans on us to share and establish the meaningful elements of the content that they need to use and incorporate on all of their adaptations.
01:07:11.000 We think about reaching global audiences, and we see comics as helping drive that awareness, and that international brand is very much a part of our future.
01:07:20.000 Then, Lee shockingly revealed just how bad their lineup had been performing.
01:07:25.000 He also detailed they would be reducing the lineup significantly.
01:07:28.000 Lee stated, That said, we will be reducing the size of the slate, but it's about looking at everything and looking at the bottom 20%, 25% of the line that wasn't breaking even.
01:07:39.000 or losing money. Lee would try and put an optimistic spin on the dismal state of the lineup.
01:07:44.000 It's about more punch for the pound, so to speak, and increasing margins of the books that we are
01:07:48.000 doing. I think we're starting to see something really interesting here, especially when he gets
01:07:52.000 into... So this is me, I'm not reading anymore. The diversity and inclusivity specifically.
01:07:56.000 They're losing money, but I don't see the get... I don't think this is a get woke, go broke.
01:08:03.000 I think this is a get broke, go woke.
01:08:05.000 It's kind of backwards.
01:08:06.000 It's backwards, right?
01:08:07.000 I think it could be.
01:08:09.000 Basically, you know, different initiatives in other areas, blah, blah, blah.
01:08:13.000 Let me get straight to the part where he talks about diversity and inclusivity.
01:08:16.000 He says, he then noted, they plan to align the comics with franchise brand content.
01:08:21.000 It's unclear what this means and Lee doesn't really explain it.
01:08:23.000 He states, it was about aligning the books to the franchise brand we've developed and making sure that every book we put out, we put out for a reason.
01:08:30.000 When asked about promoting Marie Javins and Michelle Wells to interim editors-in-chiefs, Lee committed the company to diversity and inclusivity.
01:08:39.000 He told The Hollywood Reporter, we thought it would be great pairing, a great pairing to bring
01:08:44.000 them together to help draft and organize the content we're doing along these lines,
01:08:48.000 across digital, across global. We want to make sure we have diversity and inclusivity,
01:08:53.000 and making it in a way that we have authenticity to the storytelling that we're doing.
01:08:57.000 It's really about consolidating all of our efforts and having every editors involved in all these
01:09:02.000 directives and also organizing, broadly speaking, in content that is for kids 6 to 11, and then
01:09:08.000 12 to 45.
01:09:09.000 It's about consolidating format and oversight to a smaller, more concentrated editorial group," he elaborated.
01:09:14.000 Okay, I don't care about the nitty-gritty for the most part of, like, the comics, you know, and their business function.
01:09:20.000 But, have you seen stories like the Go Broke kind of stuff?
01:09:24.000 Yeah, of course.
01:09:25.000 This, to me, I think one thing that often gets overlooked, these companies are already failing.
01:09:30.000 I think they're trying to get woke in a desperate attempt to sell more comics.
01:09:35.000 Like, you know, you've got a bunch of, a specific group of people who are reading your comics.
01:09:41.000 They then start thinking, if we're only getting young men, how do we get young women of color?
01:09:48.000 Diversity and inclusivity.
01:09:49.000 Then they do this big launch.
01:09:51.000 It doesn't work.
01:09:53.000 Then they lose all their money.
01:09:54.000 You know?
01:09:55.000 Yeah, I mean, the idea that by narrowing the types, I mean, it's broadening the type of content, but it's also narrowing it a lot.
01:10:07.000 So Crowder got remonetized, which was interesting, but I mean, the idea that there aren't conservative brands, there aren't advertisers who want to advertise on conservative content is just ridiculous.
01:10:22.000 I mean, so there's actually more money For being open to more content.
01:10:29.000 More controversial content.
01:10:29.000 Yeah.
01:10:32.000 You're limiting your company's revenue if you limit the amount of content that ads are going to run over.
01:10:40.000 I don't think they see it, though.
01:10:41.000 I think they're sitting there saying, we have a dwindling sales in this one demographic.
01:10:48.000 We need to reach another demographic.
01:10:51.000 I remember, you know, when I worked for Fusion, I remember talking to them and I asked them, because of the content they were producing, are most of the people that watch and read these articles, are they women?
01:11:02.000 And the marketing guy said, no, it's evenly split.
01:11:04.000 And I was like, evenly split among what, ten people?
01:11:07.000 Yeah, there was like nobody watching.
01:11:09.000 And so, you have these companies that are like, In order to reach more people, we need diversity.
01:11:15.000 And then what ends up happening is they do reach an equal number of people of all different types, substantially less.
01:11:22.000 All of a sudden now they're getting no clicks, they're getting no traffic.
01:11:26.000 It's just artificial.
01:11:27.000 Inclusion is good, but inclusion for the sake of inclusion is just...
01:11:34.000 You know, it's like seeing a cheesy commercial with, like, one person from every group... you can tell that it's contrived.
01:11:41.000 They're all like that.
01:11:42.000 No one is drawn to that because that's not authentic.
01:11:45.000 Isn't it weird?
01:11:46.000 It's funny because there's like, there are these, I don't know what you call them, far right, I guess.
01:11:51.000 I don't know if that's the right way to explain who these people are, but there are conspiracy theorists who think that these commercials that have like mixed race families and stuff are conspiracies to, you know, I don't know, spread some agenda.
01:12:03.000 Right.
01:12:04.000 Yes, over the top.
01:12:05.000 It's literally like Corn Pops saying, we want Latino families, black families, and white families to all buy Corn Pops.
01:12:11.000 I'm not saying Corn Pops did this, I'm just referencing a random brand.
01:12:16.000 So random brand, you'll see a commercial, and it'll be a family of four different ethnicities, and then all of a sudden you get these conspiracy theories online that they're trying to get woke.
01:12:26.000 They're trying to sell their product to everybody, and it's not culturally relevant to everybody.
01:12:31.000 It would probably be more effective to just run four different commercials, so it's not so obvious and in your face.
01:12:38.000 Yeah.
01:12:39.000 Because, I mean, I can see that.
01:12:42.000 I think that, you know, like Jordan Peele's stance on, you know, how he's casting his movies.
01:12:49.000 Have you, have you read about that much?
01:12:51.000 What, he doesn't want white men, right?
01:12:52.000 Yeah, he just, you know, he's making a deliberate effort to do that, but he's not...
01:13:00.000 The casting isn't contrived like, OK, he's doing that.
01:13:04.000 He should be able to do that.
01:13:05.000 He can do whatever he wants.
01:13:06.000 And he makes good movies.
01:13:09.000 It's not one person of every race in just a shallow way.
01:13:13.000 You know, I haven't seen any of his movies.
01:13:15.000 You haven't actually.
01:13:16.000 Maybe I've seen one.
01:13:18.000 Um, get out.
01:13:19.000 Nope.
01:13:20.000 Didn't see it.
01:13:20.000 No, no, no.
01:13:21.000 It's all right.
01:13:22.000 I heard it was okay.
01:13:23.000 It's good.
01:13:24.000 The reason I didn't, I didn't see it.
01:13:25.000 I didn't see that other one.
01:13:27.000 I can't remember.
01:13:27.000 It's because I was told it was preachy.
01:13:30.000 I was told that, and I was like, look, man, I mean, no disrespect to Jordan Peele, right?
01:13:35.000 He's a funny guy, but I saw birds of prey and it scarred me.
01:13:39.000 Well, to be fair, you also got a similar warning about Knives Out, which you then ended up really liking.
01:13:44.000 That's true.
01:13:44.000 I don't know.
01:13:45.000 Dude, Key and Peele is pure genius.
01:13:47.000 I loved Key and Peele.
01:13:49.000 Oh, of course.
01:13:49.000 Absolutely.
01:13:50.000 One of the best segments they did was when, you know, Key is playing this very flamboyant gay guy.
01:13:59.000 And Jordan is playing this like normal guy who keeps asking him, please tone it down.
01:14:04.000 And he keeps getting called a homophobe.
01:14:06.000 And then it ends with him saying, this is my boyfriend or whatever.
01:14:09.000 And then, you know, Key's character realizes he's not a homophobe, he's just being kind of a dick.
01:14:14.000 You know what I mean?
01:14:14.000 That was like one of the best segments.
01:14:16.000 Because it made a point about, you know, what does it really mean for someone to be gay?
01:14:20.000 What are they supposed to act- what are the stereotypes?
01:14:22.000 It was hilarious.
01:14:23.000 It broke down the stereotypes while pointing out that, yeah man, a lot of these people who claim to be anti-racist or whatever are actually just mean, bad people who do violent things.
01:14:32.000 Yeah, and they play other characters, other orientations all the time.
01:14:37.000 Like, they don't go so far to say, oh, you have to be, you know, gay to play a gay person.
01:14:42.000 You have to be... Oh, we're there, man.
01:14:44.000 Yeah, but Key and Peele don't buy into that.
01:14:46.000 I don't think that they... They understand, I think, to a degree that acting is acting.
01:14:51.000 And it's actually incredible to watch somebody who is not mentally retarded to play that character.
01:15:01.000 Like, Rain Man is necessary.
01:15:04.000 We need that movie.
01:15:05.000 Yeah, that's the r-word.
01:15:06.000 That's that's that's I think it's bannable.
01:15:08.000 Yeah, I was like, oh no, you do.
01:15:11.000 That's that's something that such ship game works, baby.
01:15:13.000 Whatever, man.
01:15:15.000 Yeah, but the issue I see with with Jordan Peele is that he straight up says, I'm not gonna cast white dudes.
01:15:21.000 I'm like, dude, I get it if you, you know, look, I get it, I really do.
01:15:25.000 We had all the Marvel movies, the first thing that come out, it's like, why dude, why dude, why dude?
01:15:29.000 And now they're like, we want to do movies that can speak to other, you know, audiences or whatever.
01:15:34.000 I get it, I do.
01:15:35.000 I think the issue is when it's, like, fake.
01:15:38.000 Like you were saying, it's forced.
01:15:40.000 Because I'll preach to no end how awesome Spider-Man to the Spider-Verse is.
01:15:44.000 I love that movie.
01:15:45.000 It's a great movie.
01:15:46.000 And it's diverse.
01:15:48.000 It doesn't need to be a thing.
01:15:49.000 That's why it's like when Peele comes out and he says it, I'm just like, I don't know, man.
01:15:53.000 Because I often see these movies that try to do it are really bad movies.
01:15:57.000 Like Ghostbusters and Birds of Prey.
01:15:59.000 Just off the top of my head, those are the ones I can name.
01:16:01.000 And I've seen other movies.
01:16:03.000 Where they're like, we're going to be diverse.
01:16:04.000 And then it's like, you've, you've sacrificed too much of your budget towards ideology instead of a good movie.
01:16:11.000 Yeah.
01:16:11.000 And in that instance, they really buried the lede by talking about how diverse and inclusive they want to be.
01:16:16.000 I think that they've not done enough market research because if they actually cared about
01:16:19.000 their market, they would want to be coming up with interesting stories, for example,
01:16:24.000 not just rerunning and doing sequels and trying to build on previous universes.
01:16:28.000 They might actually pause and think, what would make a really great story? Because Tim can sit
01:16:33.000 down and tell you like eight different stories off the top of his head. I'm sure that they have
01:16:36.000 people like that there who are just as creative and interesting. They're kind of being stifled.
01:16:40.000 I think they feel like they're probably a little bit handicapped by having to put all of these
01:16:46.000 barriers on who they can cast for certain things.
01:16:48.000 I think it's stupid.
01:16:50.000 I think it's dumb.
01:16:51.000 It is dumb, for sure.
01:16:51.000 Honestly, I think that context is... I think these companies are realizing they have to detect context.
01:16:58.000 Like, that's why they remonetized Crowder.
01:17:01.000 I think that they know where he's coming from and that you can say certain words in a way that can be understood to not be offensive.
01:17:12.000 And like, that's their biggest challenge because it's a super difficult problem with the A.I.
01:17:18.000 just running through all the language, parsing it in from audio, and trying to detect context and intention.
01:17:27.000 Because we cannot live in a world where we can't say certain words in a respectful context.
01:17:34.000 It's just insane.
01:17:34.000 You can't, though.
01:17:35.000 You can't.
01:17:36.000 That's what I was saying.
01:17:36.000 Is the R word really?
01:17:38.000 I slipped.
01:17:39.000 Oh, totally.
01:17:40.000 Sorry, dude.
01:17:41.000 Yeah, yeah.
01:17:42.000 Isn't that crazy?
01:17:43.000 Yeah.
01:17:43.000 That's stupid.
01:17:43.000 That's so crazy.
01:17:45.000 I mean, I don't know what'll happen, you know.
01:17:46.000 Yeah.
01:17:47.000 But yeah, that's the R word now.
01:17:49.000 And that's where we're headed.
01:17:50.000 That's where we're at, man.
01:17:51.000 Yup.
01:17:52.000 That's how insane everything is getting.
01:17:54.000 Like, specifically with what they're doing with these comics, with these movies.
01:17:57.000 Soon it's going to be like people wearing grey jumpsuits with shaved heads, or wearing hoods, you can't see their faces.
01:18:04.000 Everyone will have a big box they'll hide in, so you can't tell how tall they are, or what they sound like, or if they're a man or a woman.
01:18:10.000 Because as soon as someone does, they get offended, it's not fair, there's privilege, they start banning everything already.
01:18:16.000 You can't control what offends people.
01:18:19.000 But what we're seeing, with the shutting down of certain content specifically, is that it's mass hysteria.
01:18:25.000 And because the initial bias was towards the left, that's the direction it went.
01:18:30.000 And look, man, I think it's happened historically with other countries in either direction, right, left, whatever, religious.
01:18:38.000 Once regular people don't stand up, once they're sitting back
01:18:43.000 and all of this starts happening, people get scared to actually push back once that tide is reached.
01:18:50.000 And I think the number I was reading is like 10%.
01:18:53.000 Once about 10% of the people have an ideology, it takes over.
01:18:59.000 And we're already past that with the far left.
01:19:02.000 So now, Coke, Pepsi, right?
01:19:04.000 You know, all these big brands, the biggest brands, the biggest advertisers, have fully embraced this stuff.
01:19:09.000 The left is pumping out crazy conspiracy theories like, the post boxes are being stolen!
01:19:13.000 You've seen that?
01:19:15.000 It's totally nonsensical.
01:19:16.000 That's where we're at?
01:19:18.000 And you can't, you can barely challenge any of it.
01:19:21.000 So yeah, Crowder gets re-monetized.
01:19:23.000 Of course, the left exploded, screaming, no, he's harassing me!
01:19:27.000 All too bad.
01:19:28.000 He won that one.
01:19:29.000 Straight up, he won it.
01:19:30.000 But they took income away from him for a long time.
01:19:32.000 Yeah.
01:19:32.000 He was out of the partner program.
01:19:34.000 Yeah, I mean, we got banned from the Play Store for like nine months.
01:19:38.000 And then I sent them a link of all of the porn on Twitter and they were like, OK, they banned you for porn.
01:19:43.000 No, just... well, it was like a half-naked image which had an explicit blur over it.
01:19:48.000 Wow.
01:19:48.000 And then I just emailed them and they put it back.
01:19:52.000 So it's just totally arbitrary and insane that they'll just destroy businesses like that.
01:19:57.000 Have you seen this post from... This guy is GrantB911.
01:20:03.000 He's one of the founders of Breaking911.
01:20:06.000 And he tweeted, My daughter just started second grade at metro schools.
01:20:09.000 I will be pulling her out immediately.
01:20:12.000 Her first English lesson of the year is teaching her that white people are bad, mean, and racist against African Americans and Mexicans.
01:20:19.000 My daughter is seven, is not racist, nor is her family.
01:20:22.000 This stuff has become so pervasive across the board that I feel like, you know, the position I'm in right now is the only thing that stops this, and I don't even know if it will, is a straight Republican supermajority victory to just push this insanity out.
01:20:39.000 This is what happens when the only thing you're allowed to say is this.
01:20:44.000 Like, you can't—think about what 4chan did with that campaign.
01:20:47.000 You ever see that campaign, It's Okay to Be White?
01:20:50.000 Yeah.
01:20:50.000 The goal of that was to point out that the establishment, our mainstream society, is so insane that you can't even say it's okay to be white.
01:21:00.000 No, they call it white supremacy.
01:21:02.000 Straight up, there's a funny photo of, it's like three Hispanic dudes, a white guy and a black guy, and the far left posts the photo saying white supremacists.
01:21:11.000 It's... this is... our brains are broken. Like, not ours; the brains of society have completely fractured at this point, as far as I can tell.
01:21:21.000 I'm scared about the upcoming school season.
01:21:24.000 I mean, you know... We got pictures.
01:21:27.000 Check this out.
01:21:28.000 The white kids told the Mexican girl to go back to the Mexican school, it says.
01:21:31.000 And they have these images.
01:21:33.000 This is what they're teaching kids in school.
01:21:35.000 At least in this one school.
01:21:36.000 I mean, this is really crazy stuff, man.
01:21:40.000 I mean, look at these photos they're showing kids.
01:21:42.000 It's like a bunch of... Why are they showing kids this?
01:21:46.000 You know, I grew up on the south side of Chicago.
01:21:49.000 I grew up in a classroom full of Hispanic people from different backgrounds.
01:21:53.000 Same, South Norwalk.
01:21:54.000 I grew up with people, some black people, because it's south side of Chicago.
01:21:58.000 Kids, I shouldn't say people, just a bunch of kids.
01:22:00.000 All different types.
01:22:01.000 Filipino kid, kid from Poland, black kid, Mexican kid, one kid spoke Spanish, one kid spoke Polish, didn't mean anything.
01:22:10.000 And guess what?
01:22:11.000 When we were growing up, we were like, racism is bad, because these are my friends.
01:22:15.000 Now they're just, you know, jamming this into the face of kids, and it's like cult zealotry.
01:22:21.000 It's just some of the weirdest stuff I've ever seen.
01:22:25.000 But you've got, I don't know how personal you want to get.
01:22:28.000 Yeah, sure.
01:22:28.000 What are your thoughts on this?
01:22:29.000 Because I think this affects you personally.
01:22:30.000 I mean, I'm scared for...
01:22:34.000 You know, my daughter to have to just physically wear a mask all day to me is torturous.
01:22:42.000 Like, I feel so much empathy for people who have to do that and have to go to jobs and wear that all day.
01:22:49.000 That just seems insane.
01:22:52.000 But then the creepy part that we talked about is that the whole class is going to be live-streamed.
01:22:58.000 Like, they're in class?
01:22:59.000 They're in class, but the kids who are remote, who are not coming to school, can still participate, so there's definitely cameras on all day, and they're all on their... Where's it streaming to?
01:23:14.000 Zoom.
01:23:15.000 And it's being recorded, probably?
01:23:17.000 So you can Zoom-bomb it.
01:23:19.000 I don't know what the deal is with how locked down it is.
01:23:23.000 Yeah, I don't know.
01:23:23.000 I know that there was, we mentioned the Patreon court case earlier.
01:23:27.000 Somebody accidentally had their microphone on, and so the judge is like, who is that?
01:23:31.000 Turn that off.
01:23:31.000 I gotta kick somebody out of the room.
01:23:33.000 Like, how crazy is that?
01:23:34.000 I mean, I get it, you can walk into a courtroom and start screaming and they'll throw you out.
01:23:38.000 But you could have these kids in these Zoom classes, and all of a sudden, some crazy random stranger jumps in and starts posting, like, horrifying things, and the kids are gonna see it.
01:23:48.000 Yeah, that happened.
01:23:49.000 It did happen?
01:23:49.000 Yeah, there was a little Jewish family, I think, that had their little homeschooling pod and somebody got in and started bombing anti-Semitic stuff at them.
01:23:56.000 The scarier thing to me, though, is the idea that you can't really have troublemakers in class and that, you know, the troublemakers are going to be on camera.
01:24:04.000 That's just not how it's supposed to be.
01:24:06.000 Kids are supposed to be a little rambunctious.
01:24:08.000 Yeah.
01:24:09.000 Learn what it's like in the real world.
01:24:10.000 They have no idea.
01:24:11.000 They don't even know what it means to be streaming.
01:24:14.000 Did you see there was this viral thread from a teacher saying, I'm worried about parents finding out now what we're teaching their kids?
01:24:22.000 I didn't see.
01:24:23.000 That's the scariest thing.
01:24:24.000 And he like locked his account afterwards.
01:24:26.000 He did.
01:24:26.000 It was a full thread of this guy being like, I'm worried about the conservatives.
01:24:31.000 They'll start seeing what we're teaching their children.
01:24:34.000 I'm also worried about the liberals, too.
01:24:36.000 And it's like, these teachers know full well they're indoctrinating children with zealous fanaticism.
01:24:43.000 And they're scared people are going to find out now because of COVID.
01:24:46.000 I think, man, time to get out of the city.
01:24:49.000 It's time to pull your kids out.
01:24:51.000 Homeschooling is intense.
01:24:53.000 It's a lot of energy, but I do, you know, I'm very open to it.
01:24:58.000 I think that if you look at how rapidly kids learn, I mean, there's a ton of value to going to school.
01:25:06.000 Yeah.
01:25:06.000 For social reasons.
01:25:08.000 But, you know, when they all have to stand in a little bubble, apparently they're going to force the kids to play games in little bubbles outside at the local school.
01:25:16.000 In bubbles?
01:25:17.000 So like, there's no recess.
01:25:18.000 There's no just, Going out and playing on the playground.
01:25:21.000 That's not allowed.
01:25:21.000 The playgrounds are off-limits, but they're gonna put them outside into these little circles, and they have to play, like, a board game?
01:25:27.000 Outside?
01:25:28.000 And not wear a mask?
01:25:29.000 This is some of the craziest stuff, you know?
01:25:31.000 I remember when I was little.
01:25:34.000 Not when I was little, but when I was, like, younger, as a teenager.
01:25:36.000 I always thought, you know, times change, and you don't want to be that person who gets stuck in the past.
01:25:41.000 You know?
01:25:42.000 Older people become conservative, and they talk about how things used to be so much better.
01:25:47.000 And I always thought, like, you know what?
01:25:48.000 Times change.
01:25:49.000 But this is some kind of ridiculous psychosis.
01:25:53.000 You know, it's one thing I'm growing up and it's like, by the way, you know, gay people can get married now.
01:25:58.000 And I'm like, this affects me in no way.
01:25:59.000 I don't care.
01:26:00.000 Now it's literally like, you can't go to the movies, you can't go to a bar.
01:26:03.000 What's that?
01:26:04.000 You went to a park and you weren't wearing a mask?
01:26:05.000 You're under arrest.
01:26:07.000 We're going to kick your door and we're going to shut your business down.
01:26:09.000 Now the kids are going to be in bubbles.
01:26:12.000 Whatever, I don't know.
01:26:13.000 Circles.
01:26:13.000 It's like a painted circle on the ground.
01:26:16.000 But I mean, I wouldn't be surprised if there were bubbles.
01:26:19.000 I mean, they're putting up Plexi between the desks.
01:26:23.000 And they have to wear masks.
01:26:25.000 This is insane.
01:26:26.000 Yeah.
01:26:27.000 For what?
01:26:29.000 Look, I understand there's COVID, but I mean, you look at the metrics, it does not justify what everyone is doing.
01:26:38.000 Sweden didn't lock down.
01:26:39.000 And they had some problems, and now things are kind of slowing down.
01:26:43.000 It really does look like, early on, we had a problem.
01:26:47.000 And we did the 15 days to slow the spread.
01:26:49.000 We certainly did.
01:26:50.000 Now we got a bunch of cases and nobody's going to the hospital.
01:26:54.000 It's hysteria.
01:26:55.000 It's hysteria driven by... We are locked in this culture where the left is at the wheel, and nothing can check them.
01:27:04.000 So they're just spinning the wheel as crazy as possible.
01:27:06.000 And the left thinks Trump's driving, but he's not.
01:27:09.000 And the switch from COVID to protests slash riots, with no question... To me, there was so much dissonance in my head. I was bugging out for, like, a few days, just nobody else seeming to care. And I thought... I mean, I saw you made some posts and you were just like, it's over, I don't care anymore.
01:27:32.000 Oh, I'm done.
01:27:33.000 No, it's straight up done.
01:27:34.000 The moment the riots, the protests, I'm like, you cannot make me care anymore.
01:27:39.000 Now I can care about the authoritarian lockdowns.
01:27:43.000 Sorry, man.
01:27:43.000 Look, we are taking precautions here.
01:27:46.000 You know, Bill came down.
01:27:47.000 We've got sanitizer.
01:27:48.000 We're, you know, we're distanced and all that stuff.
01:27:50.000 And we're being careful just because I think it's responsible.
01:27:53.000 I don't want to, I'm not going to be one of these.
01:27:55.000 You see these stories of these guys who are like, it's all fake.
01:27:57.000 Then they get sick and they die.
01:27:58.000 I'm like, no, no, no, no.
01:28:00.000 Look, I recognize there's, you know, we've got something, but at this point, I think we've probably, we've probably developed herd immunity or something.
01:28:09.000 Why are we at 152 days of lockdown?
01:28:12.000 Because we can't talk about it.
01:28:14.000 I run the risk of getting banned for simply saying this.
01:28:16.000 I'm not even kidding.
01:28:16.000 Yeah.
01:28:17.000 You had... you saw Facebook ban those doctors for holding a... I'm sorry, they banned Breitbart for filming doctors
01:28:27.000 hosting a press... putting on a press conference hosted by a Republican.
01:28:30.000 Yeah, NewsGuard changed their status.
01:28:32.000 Negative?
01:28:32.000 Yeah, they put it into, uh, like, uh, they were considering it again.
01:28:36.000 Right, right, right.
01:28:37.000 They've removed their green check and they're like, oh, we're not sure about Breitbart anymore.
01:28:40.000 So if I, if I film a press conference, I could get, I could get shut down.
01:28:44.000 So I mean, I feel like you've already sort of taken the stance that you're going to, within reason, talk about what you want to talk about despite it being controversial.
01:28:54.000 And it seems like, I honestly think that based on your intention, which is pretty clear,
01:29:00.000 that you're just trying to get information out, you're having an honest take on what's
01:29:03.000 going on, that you just have to hope that the tech overlords are going to just get it.
01:29:12.000 That, you know, coming at it from a good place, and you have to be able to talk about these
01:29:15.000 things.
01:29:16.000 So with YouTube, I have a direct contact.
01:29:20.000 And when my videos get demonetized, I basically send them in, and I would say 95 to 99% of my videos are monetized.
01:29:31.000 There's been big changes over the past few years.
01:29:32.000 It's been fantastic.
01:29:35.000 Just recently, in the past few months, I've been finally granted on my second channel, TimCastNews, I've been granted what's called self-certification, which means now almost every video I do is getting monetized.
01:29:46.000 However, this is what's really messed up about the whole system.
01:29:50.000 I have like, I don't know what, 1,500 videos on one channel, and 29 of them are incorrectly certified.
01:29:58.000 So here's how it works.
01:30:00.000 I upload a video, it says, do any of these things appear in your video?
01:30:03.000 I put no.
01:30:04.000 I don't swear, I don't show graphic images.
01:30:06.000 Well someone, for some reason, thought that me criticizing Black Lives Matter was hate speech.
01:30:12.000 So they flag it.
01:30:14.000 Now I have 29 out of 1,500, so now YouTube's put me in this thing where they have to do a pending ad review.
01:30:20.000 Like, they put you in a 20-minute holding pattern every time you upload.
01:30:24.000 So that's actually been very detrimental.
01:30:27.000 So then whenever I get one of these false flags, I gotta send a huge list to Google like, all of these are wrong.
01:30:33.000 Like, there's no hate speech in any of my content.
01:30:36.000 And they can't do all of them, because it's a combination of an automated system, and then certain people get access to, you know, Google employees who will do an override, and they still can't do every single one.
01:30:49.000 Well, mines.com slash TimCast is fully monetized.
01:30:53.000 Yeah, yeah, yeah.
01:30:53.000 Well, we've got TimCast.net coming up.
01:30:55.000 TimCast.net.
01:30:58.000 That's going to be the way to do it.
01:30:59.000 So, for those that aren't familiar, TimCast.net used to redirect to my main YouTube channel, Now it's set up to redirect to, essentially, I don't know how to describe it.
01:31:10.000 It's a site powered by Minds.
01:31:12.000 Right.
01:31:13.000 So it's basically, we've talked about this before, that we're going to be setting up a standalone website for the podcast, for my other shows, that you can go, you can become a member, get exclusive content and all that stuff, and it's being built through the Minds Pro backend, I guess.
01:31:27.000 And I don't want to speak too much as to how it works, but people who are signed up can use Minds or whatever.
01:31:33.000 Yeah, you can log in to TimCast.net with your Minds Creds.
01:31:38.000 Honestly, anyone out there who wants to monetize, Minds is open for monetization.
01:31:42.000 So you have ads now running and everything?
01:31:44.000 No, not ads yet, but we're essentially sharing our revenue with the pro creators who help us drive traffic.
01:31:51.000 So we can't have advertisers come to us and say, you have to demonetize this content, because we're sharing our subscription revenue, Minds Plus and Minds Pro,
01:32:01.000 with the creators who are helping drive traffic, and we're giving competitive RPMs.
01:32:05.000 So, you know, check out minds.com slash pro if you're interested.
01:32:09.000 And yeah, man, I mean... The general idea for what I'm trying to do is creating something that's standalone.
01:32:16.000 So if, you know, they ban me, they ban any of my channels, well, I'll still have TimCast.net.
01:32:21.000 I'll still exist in some form and not just simply get wiped out.
01:32:24.000 And so, you know, Minds being a much better, safer system, in my opinion, for, you know, for speech, that's what's being built on.
01:32:32.000 And then you guys have added, like, YouTube Sync and stuff.
01:32:34.000 Yep, and we also have a peer-to-peer advertising system, which I honestly think is sort of the future of where brand-to-brand advertising is going, where actually people right now can send you offers on Minds of any amount of dollars or crypto, saying, hey Tim, here's a thousand bucks, share my post to your followers.
01:32:56.000 And he'll get a notification that says, hey, do you want to accept this offer or reject it?
01:33:01.000 So it's direct between brands as opposed to having to go through us for advertising.
01:33:07.000 So if someone gets demonetized on YouTube, guess what?
01:33:13.000 For everybody who gets demonetized on YouTube, there are thousands of brands who probably would send them a direct offer, and then they could run the content.
01:33:23.000 Yeah.
01:33:23.000 And YouTube doesn't need to be getting involved.
01:33:26.000 Right.
01:33:27.000 I mean, I can put ads in my videos.
01:33:29.000 No, yeah, yeah.
01:33:30.000 Right, that works too.
01:33:31.000 But if there was a system that it was automated for people to send you offers on YouTube, that would be sweet.
01:33:37.000 The big brands have that.
01:33:39.000 Right.
01:33:39.000 On YouTube, if you know.
01:33:41.000 I'm not going to name any of the big companies, but the big companies apparently have direct access to the ads that run on their platform.
01:33:46.000 It's ridiculous.
01:33:47.000 You know, they should have opened it up a long time ago, but that would have avoided the adpocalypse problem, I guess.
01:33:54.000 I think a lot of these big companies really are scared that if they do nothing, the platform goes insane.
01:34:00.000 You get extremists across the board and just weird content of like, you know, Hitler dancing with the Incredible Hulk, like we've seen.
01:34:06.000 And if they try to do something, then they're invariably going to be favoring some political ideology based on their own views, or there's no real way to—other than just let it go.
01:34:16.000 It's such a complex problem, man.
01:34:18.000 I was just listening to a podcast with Sam Harris and this New York Times reporter who reports on child... I don't even want to say it.
01:34:27.000 Trafficking?
01:34:28.000 Trafficking, yeah.
01:34:29.000 Yes.
01:34:30.000 And basically, you know, acknowledging that you need encrypted solutions, but that basically
01:34:39.000 some ridiculous number, like over 40% of all child trafficking reports, comes from Facebook
01:34:47.000 Messenger.
01:34:48.000 Really?
01:34:49.000 Yeah.
01:34:49.000 Wow.
01:34:50.000 And so they're scared that if you encrypt everything, then there's going to be no access to those people.
01:34:57.000 But obviously, you need to encrypt everything because if you compromise encryption, it makes everybody less safe.
01:35:03.000 So, it's like, how do you actually deal with this?
01:35:06.000 But the answer is not create a backdoor.
01:35:09.000 The answer is not censor everything that has this word in it.
01:35:13.000 It has to be a more nuanced solution, and we just have to have a more open conversation about it.
01:35:20.000 The platforms are so powerful that they can step up and make the decisions.
01:35:30.000 I was just on a call, a live stream with a bunch of, like the president of the ACLU.
01:35:35.000 Really?
01:35:35.000 A couple high-level people from the ACLU.
01:35:38.000 They've lost it.
01:35:39.000 They've lost it in the sense of their social media, but they do still, from what they were saying, uphold these values.
01:35:45.000 And we were actually, there were some de-radicals, former radicals on the call, former jihadi recruiter was on the call.
01:35:53.000 Wow.
01:35:53.000 who works with Darrell Davis.
01:35:55.000 Wow.
01:35:56.000 And he basically started this group, Parallel Networks, which is a de-radicalization group
01:36:00.000 that goes on social networks and tries to help bring people back from the edge.
01:36:05.000 But they were agreeing with it.
01:36:08.000 And so I do think that the smartest people in the world know, I'm not saying these are the smartest people in the
01:36:18.000 world, I'm saying, but all the cybersecurity encryption experts know that you have to encrypt everything.
01:36:26.000 And all of the de-radicalization experts know that you can't ban everybody.
01:36:31.000 And so there'll be.
01:36:33.000 They're beholden to the mob.
01:36:34.000 That's it.
01:36:35.000 Everybody knows you can't do this.
01:36:37.000 The mob, like you're saying, the mob doesn't even actually want deradicalization.
01:36:43.000 Because if you look at the data and you actually want to minimize, even if you wanted to minimize hate speech, banning makes more hate speech.
01:36:53.000 Right.
01:36:54.000 And they don't get that.
01:36:55.000 Yeah, they don't get that.
01:36:56.000 They're like, look, we banned this person, they've gone away forever.
01:36:58.000 Have you changed anything?
01:36:59.000 No.
01:37:00.000 No, the number of people who have called for bannings have never changed anybody's mind.
01:37:07.000 They've been banned.
01:37:08.000 That's what's funny.
01:37:09.000 It's like, you know, there was a comic that got banned recently, and it was a black woman wearing a mask and her shirt said, I can't breathe.
01:37:19.000 And the white woman looks over and said something like, well, then take the mask off.
01:37:23.000 And that was the comic.
01:37:24.000 I got a chuckle out of it.
01:37:26.000 It got censored, I think... not from Instagram, from some company or something... from a newspaper.
01:37:32.000 That's what it was.
01:37:33.000 It was offensive.
01:37:34.000 And they started complaining about it.
01:37:35.000 And I'm like, welcome to the party.
01:37:37.000 You want offensive content removed.
01:37:39.000 Now you get removed.
01:37:41.000 It affects them, you know, and they don't learn.
01:37:43.000 But I look at these big tech companies.
01:37:45.000 Yeah, you're right.
01:37:45.000 They know all these things.
01:37:47.000 I don't believe the ACLU actually has civil liberties at heart.
01:37:50.000 They oppose civil liberties.
01:37:52.000 So how can I trust them to... I'll give you an example.
01:37:55.000 They've supported discrimination against minorities at universities.
01:37:59.000 Straight up, no questions asked, not hyperbole, not an exaggeration.
01:38:03.000 They say it is okay for universities to discriminate against someone based on their race.
01:38:07.000 That's not civil liberties.
01:38:09.000 So how can I trust them to actually do the right thing when, sure, they can be on the phone and they can say things like, oh, yeah, yeah, yeah, we know we're going to do the right thing.
01:38:17.000 And then they turn around and they spit on the Constitution or they spit on civil rights.
01:38:22.000 They do still represent some extreme racists in certain cases, but it's by far the minimum of the legal work that they're doing, and it seems like their social media has become totally polarized.
01:38:40.000 So, you know, when you talk to the ACLU and you grill them about these issues, they do still try to hold on.
01:38:49.000 But, yeah.
01:38:50.000 No, I'm over it.
01:38:52.000 It's like Jack Dorsey.
01:38:53.000 They whisper everything you want to hear into your ears and then do nothing.
01:38:57.000 And you look at how they act on social media and the things they inflame, the things they empower, and it's insanity.
01:39:03.000 And it's part of the ongoing problem.
01:39:05.000 We can't have honest conversations because of companies like the ACLU.
01:39:09.000 Because they won't stand up for free speech.
01:39:10.000 They're the ones who we need to be standing up.
01:39:12.000 They won't.
01:39:13.000 They turned their back on free speech.
01:39:15.000 They've straight up turned their back on it.
01:39:17.000 And you can look at these organizations that are advocating for... I think the funniest revelation or, you know, thing to happen was the FreePress.net, the free press organization, supporting censorship.
01:39:31.000 Like, they literally have multiple initiatives on censoring content.
01:39:35.000 Like, you're called free press, dude!
01:39:38.000 That's how insane everyone has gotten.
01:39:41.000 And I believe it's because everyone feels like they're forced to say certain things because everyone around them, you know?
01:39:49.000 It's the weirdest thing.
01:39:50.000 Yeah.
01:39:50.000 Like, nobody really wants this, but everyone's scared, everyone else wants it, I guess?
01:39:55.000 It feels like some sort of a thought virus.
01:39:58.000 It's so much fear.
01:40:00.000 People are just terrified of social backlash.
01:40:07.000 But I honestly think that, again, I'm a broken record, but ultimately the data is just going to destroy the arguments.
01:40:15.000 I want to believe that the data about censorship is just gonna prove itself, and people... and it's just like, no!
01:40:23.000 Data!
01:40:24.000 Data!
01:40:25.000 I don't think so.
01:40:26.000 I don't think, I don't think so.
01:40:28.000 Like, we've known these things for a long time.
01:40:31.000 We've fought for these rights, we've fought for, you know, to be able to speak freely, to be able to associate, to be able to communicate.
01:40:39.000 Journalists used to be able to sit down with warlords.
01:40:43.000 Now you do that and they call you a minion of the warlord.
01:40:46.000 And so you can have all the data in the world.
01:40:49.000 I can publish on Twitter all day and night, like, look at these things.
01:40:51.000 Doesn't matter.
01:40:52.000 Because, you know, listen, you look at the science and the data around COVID, you'll get banned.
01:40:59.000 You post about FBI crime stats, you get banned.
01:41:02.000 You can't talk about these things.
01:41:03.000 Dude, other countries think we're nuts.
01:41:05.000 They do, yeah.
01:41:06.000 We just got like a quarter million users from Thailand who are freaking out about their government and censorship from their government.
01:41:14.000 Like, we have more censorship from corporations in the U.S.
01:41:18.000 than our government.
01:41:19.000 Of course.
01:41:20.000 Every other country, it's like, no, the government is saying you cannot criticize them.
01:41:24.000 That's where we're at.
01:41:26.000 We have the First Amendment.
01:41:27.000 But these big companies have taken over the commons, and whatever the left is today is hilarious.
01:41:33.000 Donald Trump talked about pardoning Edward Snowden, and the ACLU tweeted out, this is one thing we say yes to, and they got attacked relentlessly.
01:41:43.000 How dare you defend the president!
01:41:46.000 The orange man is bad.
01:41:47.000 This should not be allowed.
01:41:48.000 It got roasted on Twitter saying, no, he's a criminal.
01:41:50.000 That's who they've attracted.
01:41:52.000 You know, and this is what really bothers me about people, the things they do, the things they say, the things they chase.
01:41:58.000 The ACLU should take a good long look in the mirror and look at the people they've attracted because they don't believe in civil liberties.
01:42:06.000 You know, you can complain that there are people who follow me and comment, and they say naughty words.
01:42:11.000 I believe in free speech.
01:42:12.000 As long as they're not breaking the law and inciting the violence, well, I don't appreciate what they say, but I think they have a right to say it.
01:42:18.000 ACLU is the opposite.
01:42:19.000 They're the anti-civil liberties union at this point.
01:42:22.000 I'm ranting about the ACLU!
01:42:24.000 But it's funny, though, that you say they did acknowledge that. So I would put them in sort of a part of the progressive realm, like Greenwald and Snowden even, who, you know... they have values, they do have some standards. I agree with you that they've gotten unhinged, but they did say they agree with that? Because they had to. Because they tweeted
01:42:52.000 like a couple years ago, and the tweet still exists and people highlighted it, laughing at them, and
01:42:56.000 they said, no, no, we believe that.
01:42:58.000 But look at what happened with Charlottesville. They came out and defended the free speech of
01:43:03.000 Charlottesville, got attacked, started bleeding subscribers, and then apologized and said, oh,
01:43:07.000 we're going to review our First Amendment, you know, approach from now on. Isn't that crazy?
01:43:12.000 Imagine the money that the ACLU would get if they got the actual free speech community to start supporting them.
01:43:19.000 The free speech community on the internet is not giving money to the ACLU.
01:43:23.000 And it's like, the people listening to this right now.
01:43:26.000 If the ACLU would stand up for free speech, they would get a huge surge of subscriptions.
01:43:31.000 But anti-Trump hate unites the factions of the left, man.
01:43:35.000 From progressive far-lefts to moderate corporate dems to passive liberals, whatever.
01:43:41.000 They all hate Trump, and that's the go-to.
01:43:43.000 Yeah, it's funny.
01:43:44.000 It's like, you sort of have to pick which crowd of monthly subscribers you want.
01:43:52.000 Well, yeah.
01:43:53.000 Or you just stay true to yourself and people will come and go.
01:43:57.000 And some people will complain.
01:43:58.000 And you get emails where they're like, you've changed, man.
01:44:01.000 We don't like the direction you're going.
01:44:03.000 And I say, I can only do me.
01:44:05.000 I do what I want to do.
01:44:05.000 You know what?
01:44:07.000 No one's going to tell me what I can do.
01:44:09.000 Well, within reason.
01:44:10.000 If they're going to ban me from social media, well then so be it.
01:44:12.000 I'm going to do my thing.
01:44:14.000 But nobody's going to tell me what I have to do.
01:44:16.000 I just do what I feel like doing, what makes me happy.
01:44:19.000 And that's all I've ever done.
01:44:21.000 Well, you, you are a principled person and you are just one person.
01:44:24.000 So it might be a little more complicated for the ACLU.
01:44:27.000 So I understand kind of where they're coming from, but if they were principled at all, they would just say, you know what?
01:44:34.000 We just support all free speech.
01:44:35.000 And if you want to stop supporting us because we support everyone's free speech, fine.
01:44:40.000 The fact that we support everyone's free speech means that more people will be along later to give us money.
01:44:45.000 It's the long game.
01:44:45.000 I think that would be great.
01:44:46.000 Yeah, exactly.
01:44:47.000 Yep, but everybody is... I think if I was going to try and paint a picture of what was happening, it's that everybody is sitting, staring at each other, side-eyed, panicked, like, which one's gonna be the one to get me?
01:45:00.000 Dude, I am terrified by the fact that even just on this stream, it's just like watching what we say.
01:45:09.000 You said the R word.
01:45:11.000 I said sorry to you because I felt bad because you could potentially lose monetization on it and it's just like that is not where the focus needs to be.
01:45:23.000 It's so stupid.
01:45:25.000 Yeah.
01:45:26.000 I mean, you were trying to use it in the proper context.
01:45:28.000 I don't think it would, yeah.
01:45:29.000 No, it doesn't matter.
01:45:30.000 Rick and Morty made a joke about it.
01:45:32.000 It's just become, like, it's Fahrenheit 451, bro.
01:45:35.000 You say something that someone is offended by, everything's gonna be burned.
01:45:38.000 Let's make a bet right now.
01:45:41.000 It definitely will.
01:45:41.000 It'll be demonetized.
01:45:43.000 Alright, I'll bet you ten bucks just because.
01:45:43.000 No joke.
01:45:46.000 I wouldn't be surprised.
01:45:48.000 I was actually thinking, like, are they gonna pull the stream?
01:45:50.000 Nope, they did not.
01:45:51.000 They didn't.
01:45:51.000 Have you ever had a stream pull?
01:45:53.000 Not yet.
01:45:53.000 Not us.
01:45:53.000 But it happens to people all the time.
01:45:56.000 All the time.
01:45:57.000 There was a funny thing that happened.
01:45:58.000 I think this was Keemstar, big YouTuber.
01:46:01.000 He said, and I'm gonna space this out very properly to make sure.
01:46:05.000 Yes.
01:46:05.000 Oh, I remember this.
01:46:06.000 The letter E was the first part of the word, dash, and then he said the word girl, and the reason I said that is because when you say it really fast, it sounds like a potential slur.
01:46:17.000 YouTube recorded it automatically with their speech-to-text algorithm as a slur, and he got demonetized.
01:46:24.000 So sometimes you might not even say anything, and they'll say you do, and they will nuke your channel.
01:46:28.000 They will shut you down.
01:46:30.000 That's the state of the world today.
01:46:32.000 But I'll tell you what, man.
01:46:34.000 The left basically has impunity.
01:46:36.000 Basically has.
01:46:38.000 Because there are certain factions of the left, like the anti-war progressives and anti-establishment, that don't.
01:46:42.000 But if you're an establishment leftist, you can say whatever you want.
01:46:47.000 Ho-Tep Jesus said it the best.
01:46:50.000 People hate Donald Trump so much that Joe Biden can say whatever he wants about black people and get away with it.
01:46:55.000 That's the gist of what, that's like a good example of what's happening.
01:46:59.000 So how do you solve it?
01:47:01.000 I guess you keep doing your thing, but I'll tell you what, I'm just lucky.
01:47:04.000 I'm a disaffected liberal.
01:47:07.000 I'm challenging the Democrats and the left-wing establishment and the old school Republicans who have joined them.
01:47:12.000 They've banned a ton of the right-wing channels.
01:47:15.000 Conservative channels.
01:47:16.000 Not even the worst of the worst.
01:47:18.000 And the only reason I'm still here is because, as the cliff erodes, I wasn't standing on the right.
01:47:24.000 So my time will come.
01:47:25.000 They'll ban this channel.
01:47:26.000 They'll ban my other channels.
01:47:28.000 I fully believe so.
01:47:30.000 Now, to be fair, I think, you know, with Crowder getting his monetization back, there may be some pushback happening.
01:47:36.000 This may be a good thing.
01:47:37.000 They may be trying to stabilize, and I do think they like me to a certain degree, with their goal being, like, let's make sure we support channels that play by the rules, that try and be family-friendly and advertiser-friendly and all these things.
01:47:49.000 I have my limits, though, man.
01:47:51.000 I did several videos on hydroxychloroquine.
01:47:53.000 Like, you know what?
01:47:53.000 If they ban me over this, so be it.
01:47:55.000 And they've banned other... I've seen whole channels get purged for one video.
01:47:59.000 Not even three strikes.
01:48:00.000 You dare challenge the orthodoxy on COVID, and they will nuke you in two seconds.
01:48:05.000 That's how crazy it's gotten.
01:48:08.000 So you're no-go for Jesse?
01:48:11.000 For Ventura?
01:48:13.000 Oh, to vote for him?
01:48:15.000 Look, man, at this point I'm basically a one-issue voter.
01:48:19.000 And the first issue was the riots.
01:48:23.000 Look, we have this Democrat in Virginia getting two felonies for pulling down the statue.
01:48:29.000 And I'm like, we need to stop.
01:48:31.000 Because I got family in Chicago.
01:48:33.000 They raised the drawbridges around the downtown area for like a week.
01:48:36.000 Mass looting.
01:48:37.000 The looting wasn't even Black Lives Matter.
01:48:38.000 I mean, it kind of was, because they were defended by the group, and some were kind of yelling at stuff.
01:48:43.000 But it's just come down to mass chaos.
01:48:48.000 At this point, it's like, I can't support the Democrats no way.
01:48:55.000 Clearly.
01:48:58.000 I don't think Trump is all that bad. I think he's done a lot of really good things and when
01:49:01.000 I look at the options, I think the best chance at shutting down whatever it is
01:49:05.000 the left is doing is to make sure they don't get in office.
01:49:07.000 There's just something about me that loves little glitches in the Matrix. I
01:49:10.000 feel like Jesse Ventura is a glitch because he was a libertarian, he's
01:49:14.000 running with the Green Party, and there's like that, you know, I feel like that's sort of
01:49:17.000 It's like this crossover between libertarian and progressive.
01:49:21.000 And it's just rational people who will talk about what's going on.
01:49:26.000 Honestly, that's it.
01:49:27.000 That's all. Like, we can't even get that.
01:49:30.000 It's not going to happen, but, like, nobody...
01:49:34.000 It's only black or white.
01:49:36.000 People won't even talk about it.
01:49:39.000 I've seen nothing about Jesse.
01:49:41.000 Yeah, I haven't either.
01:49:41.000 I actually didn't even know that he was involved.
01:49:43.000 No one knows because no one will talk about him.
01:49:45.000 They don't think it's possible.
01:49:46.000 No one ever thinks that a third party is possible.
01:49:49.000 I mean, Trump was possible.
01:49:53.000 I think this.
01:49:54.000 Look, I'm not going to vote third party.
01:49:56.000 I don't care.
01:49:58.000 Vote for who you think needs to win and never let anyone tell you otherwise.
01:50:01.000 If you think Jesse's the right guy, you go out and vote for him.
01:50:04.000 If you think it's Joe Jorgensen, you go vote for her.
01:50:07.000 If you think it's Joe Biden, you vote for Biden.
01:50:08.000 If you think it's Trump, you vote for Trump.
01:50:10.000 For me, I normally don't vote.
01:50:12.000 But I think we're looking at a serious existential threat.
01:50:15.000 We've got mass rioting going on for, I think we're on like 11 weeks or some ridiculous number.
01:50:19.000 11 weeks.
01:50:20.000 And 30 plus people dead.
01:50:22.000 And just the other night in Portland, some dude got punted in the face and banged his head on the ground.
01:50:27.000 But this is just one more incident.
01:50:30.000 Yeah.
01:50:30.000 And so if the Democrats won't do anything to stop it, then I'm like, well, then you know what?
01:50:34.000 Trump's got to come in and, you know, actually... It's not even necessarily Trump, it's a local politician.
01:50:39.000 Well, and I do agree with you that the third parties, if they're ever going to have a chance... I'm not even saying... I have no idea who I'm going to vote for, but...
01:50:47.000 Everyone has a responsibility to make themselves known.
01:50:50.000 Jesse clearly isn't doing a good enough job to make it clear that he's going to stop the riots and bring out some sort of serious civil discourse that gets us through this.
01:51:01.000 But you know what, man?
01:51:02.000 I've always said I hate voting against someone, but I'm absolutely voting against the Democrats.
01:51:09.000 The old school establishment Republicans fled the Republican Party in panic, became the Never Trumpers.
01:51:16.000 These people are vile.
01:51:18.000 They joined the Democratic establishment, and the Democratic establishment is vile.
01:51:22.000 And so now you've got your choice.
01:51:24.000 Bernie Sanders sold out, joined the establishment, and now you have your choice between letting the establishment back in, take control, and do their thing again, no way, or Trump.
01:51:34.000 He's a bull.
01:51:35.000 He's shutting them down.
01:51:37.000 And I'm like, eh, I'll take the bull.
01:51:40.000 Let them go through and do something because these people are nuts.
01:51:43.000 These are bad people, the establishment politicians.
01:51:47.000 And I think both parties are trash, but for me, we got mass riots.
01:51:53.000 We can't, what do we do?
01:51:54.000 We just sit back?
01:51:55.000 We got mass censorship?
01:51:56.000 I'm not even convinced the Republicans will actually do anything about censorship, but it's better than nothing, I guess.
01:52:03.000 Let's take some superchats because we're a little late on superchats.
01:52:07.000 Gareth Green says, the only porn video I've ever seen in my life was posted on Twitter for all to see almost three years ago.
01:52:14.000 Also, do you think Jack is secretly trying to help the right by making them look sane by comparison?
01:52:19.000 Yes.
01:52:19.000 This is actually something that I talked about before.
01:52:23.000 By only getting rid of the worst actors on the right and letting the left go crazy, it makes the left look awful.
01:52:30.000 You know, the left is now dominated by these crazy policy ideas and by crazy people, which inadvertently makes the right look clean and good.
01:52:38.000 I don't know what you think.
01:52:40.000 If you think, you know, let's read some more super chats.
01:52:45.000 Brett Stubbs says, "...homeschooling doesn't have to be hard.
01:52:47.000 We've done it for 10 years.
01:52:49.000 We built an online distance learning homeschool co-op, and I'd love to talk to you about it.
01:52:53.000 Don't care about the money.
01:52:54.000 I care about my four kids and the millions of others who got life shut down."
01:52:58.000 Feel free to shoot an email over to spintheufo at gmail.com.
01:53:02.000 Yeah, I'll check it out.
01:53:02.000 That's it.
01:53:03.000 Yeah.
01:53:03.000 And we'll take a look.
01:53:04.000 Let's see.
01:53:05.000 Kick-sack.
01:53:06.000 Kick-sack-quicks?
01:53:07.000 I can't pronounce this.
01:53:08.000 My sister is a music teacher in the public school system.
01:53:11.000 She had a Zoom class session that had someone get in and start posting sausage.
01:53:16.000 Wow!
01:53:17.000 She ended the session immediately.
01:53:19.000 Zoom is a horrible solution for homeschool.
01:53:21.000 Why did we all of a sudden start using Zoom?
01:53:24.000 What happened to Skype?
01:53:25.000 Is it not big enough?
01:53:27.000 I don't know.
01:53:27.000 Something about maybe the stability of the connection was just... I have no idea.
01:53:31.000 It does seem arbitrary.
01:53:32.000 Yeah.
01:53:33.000 All I know is somebody was like, oh, we're doing a meeting on Zoom.
01:53:36.000 I was like, okay.
01:53:37.000 Oh, no, no.
01:53:37.000 It was the Patreon case on Zoom.
01:53:39.000 So I downloaded Zoom.
01:53:39.000 It crashed the sound card on my computer, and I'm panicking.
01:53:43.000 I'm like, I can't do my job because the sound card wasn't working.
01:53:46.000 It's not necessarily the sound card, but it's like the audio inputs were all busted.
01:53:51.000 The drivers were busted.
01:53:52.000 The government is basically guaranteeing their financial success.
01:53:57.000 It seems overly reliant on one platform.
01:54:02.000 There's one open source encrypted video chat solution that Snowden did shout out to as well called Jitsi.
01:54:12.000 Which is fine.
01:54:14.000 It's good.
01:54:14.000 It's open source.
01:54:15.000 It's encrypted.
01:54:16.000 It works.
01:54:18.000 They're not funded well enough, but if they were funded to the degree that Zoom was... We have a Jitsi integration in groups on Minds.
01:54:25.000 It works fine.
01:54:27.000 I don't know, man.
01:54:29.000 I'll mention this, too, before we read some more Super Chats.
01:54:32.000 Bill came down because he's helping me set up my website, Timcast.net, which is going to be members, exclusive content, behind-the-scenes stuff.
01:54:41.000 We're aiming for.
01:54:42.000 There's a lot of work that has to be done and that's all basically built on the Minds infrastructure.
01:54:46.000 So I just want to give a shout out to that because I know I've been talking about expanding recently and we're going to have an actual standalone website.
01:54:53.000 As we talk about all the censorship, this is the reason, you know.
01:54:56.000 Join for 10 bucks a month on Timcast.net.
01:54:58.000 Well, soon.
01:54:59.000 I mean, you could, but it's not there yet.
01:55:02.000 We literally just set up the domains today.
01:55:04.000 Yeah.
01:55:05.000 James Jimerson says, I came here for the truth, the news.
01:55:08.000 Tim, Lydia, love y'all.
01:55:09.000 Keep up the fight for our nation.
01:55:10.000 Trump 2020.
01:55:11.000 Appreciate the support.
01:55:12.000 Thanks, guys.
01:55:13.000 Top Gandhi says, ACLU body cam streaming, Tim.
01:55:18.000 So, I don't know exactly what you mean, but there was a reference where the ACLU sued to stop the Portland police from live streaming.
01:55:25.000 I'm like, shouldn't the police be filming everything?
01:55:29.000 Well, they're filming people's faces.
01:55:30.000 I don't care.
01:55:31.000 I guess the idea is that body cam footage is private until necessary and stream footage is public all the time.
01:55:37.000 Can't they just run it through some sort of face blur system if it's released or something like that?
01:55:45.000 You'd think that would be pretty simple, right?
01:55:46.000 I don't know.
01:55:48.000 All right, we're gonna read some of the early superchats because many people are asking about Adam.
01:55:52.000 Johnny Mentology says, may we please have a eulogy for Adam?
01:55:56.000 He's doing his own show over at AdamCast IRL.
01:55:59.000 He broke 100k subs, gonna get his silver medal.
01:56:01.000 So we're stoked.
01:56:03.000 Yeah, we were talking a little bit earlier.
01:56:04.000 He's still here.
01:56:05.000 We're still hanging out.
01:56:06.000 He's just doing a show, man.
01:56:07.000 Sam Trendy J says, Hey Tim, I love the content.
01:56:10.000 I've been watching for the past month.
01:56:11.000 During your rant where you urged people to stand up to their bosses against anti-racism training, you mentioned you left a job at Disney.
01:56:18.000 What was that job?
01:56:18.000 I worked for a company called Fusion, which was an ABC News, Univision joint venture.
01:56:23.000 And I basically said, you know, this is, I don't want to do this and I don't want to be here, but I was under contract.
01:56:30.000 So, For that, they just said, well, you're under contract.
01:56:33.000 And I was like, yeah, golden handcuffs.
01:56:34.000 I got paid well, and that was about it.
01:56:37.000 A bunch of people saying the audio was messed up.
01:56:38.000 That was true.
01:56:39.000 That was true.
01:56:40.000 I hope it's not true anymore.
01:56:41.000 People are saying, turn the audio on and off again.
01:56:43.000 We did do that.
01:56:45.000 We got a lot of super chats from everyone saying the mics are bad.
01:56:47.000 It was very profitable.
01:56:48.000 And now more.
01:56:49.000 You know what?
01:56:49.000 We should do this more often.
01:56:51.000 I'll break the audio so that... Thanks, Puku.
01:56:54.000 It's also... I'm just going to mention it.
01:56:58.000 We had some beers on the show.
01:56:59.000 See, I don't normally drink.
01:57:02.000 It's very adult.
01:57:03.000 Yep, very adult.
01:57:04.000 Well, you guys don't have to owe me.
01:57:06.000 I was just saying, like, most people might not notice.
01:57:09.000 Dude, you thought we were gonna get demonetized for saying that?
01:57:12.000 For drinking?
01:57:13.000 Oh, I don't care.
01:57:14.000 I didn't say we couldn't, you know, I was like, we won't make a big deal out of, like, having some beers.
01:57:19.000 But I don't think it's a big deal.
01:57:20.000 The only other time I've seen you drink was when we actually got invited to Donald Trump's house.
01:57:26.000 Oh, the White House.
01:57:30.000 We had a drink before.
01:57:31.000 Bill and I both got invited to the White House, and we both went, and we went to a bar out front, and I'm like, I don't drink, but I'm having a drink.
01:57:39.000 I'm going to the White House, and I can't remember what I got.
01:57:41.000 It was like a margarita, right?
01:57:43.000 Vodka, maybe?
01:57:44.000 He has a vodka something.
01:57:46.000 Yep.
01:57:46.000 And then we went to the White House.
01:57:48.000 So the civic nationalist says, when this starts, it's 1 a.m.
01:57:52.000 over here. I watch you in the podcast to get a balanced news on issues going on over there.
01:57:56.000 Can you see the parallels with Germany before the rise of A.H. as what's happening in your
01:58:01.000 country? I mean, I think so, but I think it's you know, I was reading about the Spanish
01:58:06.000 I don't know if you've read any of this stuff.
01:58:08.000 It sounds a lot more, it sounds a lot like the Spanish Civil War.
01:58:11.000 I watched this really awesome YouTube documentary about it, and I was kind of like, wow, what's it called?
01:58:16.000 I can't remember, I can't remember.
01:58:18.000 But they talk about, you know, what people don't understand is that Civil War, from our perspective, is North versus South, because we had the Civil War here.
01:58:25.000 But they don't realize in many other countries, it was pockets.
01:58:27.000 It was like the cities turned blue, and the country turned red, and then they started fighting over territory, and then it split into dominant areas.
01:58:34.000 And in the Spanish Civil War, there were like... The left group was segmented in like three different areas at one point.
01:58:40.000 We could see something like that, I don't know.
01:58:42.000 I will say, I am trying to move from this.
01:58:45.000 We just totally upgraded the studio.
01:58:47.000 I'll post something on Instagram after we're done and do like a walk around to show you.
01:58:50.000 I just decided we're gonna fix everything up.
01:58:53.000 We've got a bunch of awesome guests coming.
01:58:55.000 We've got a bunch of people running for Congress.
01:58:59.000 I can't remember who.
01:59:00.000 We're talking to some people who haven't confirmed yet, but we've got some Republican candidates who are definitely gonna be coming down, which is really interesting.
01:59:07.000 And anyway, I'm trying to move out of here because I think it's going to be bedlam November to January or beyond.
01:59:15.000 I think we've already seen them go to residential neighborhoods.
01:59:18.000 I think it's going to hit the suburbs of every major city.
01:59:21.000 I could be wrong.
01:59:22.000 I am not giving you advice.
01:59:23.000 I am telling you what I see and what I'm going to do.
01:59:26.000 But I will say, I thought there were going to be riots earlier in the year.
01:59:29.000 Then we got mass riots.
01:59:31.000 I was worried about shelves running dry from food.
01:59:33.000 The shelves ran dry of food at many stores.
01:59:36.000 Things are still stable.
01:59:37.000 It's fine.
01:59:37.000 You go to the store, you can buy what you need.
01:59:38.000 It's not the end of the world.
01:59:39.000 That's why I've always said, the world's not going to end.
01:59:41.000 Just, you know, get extra beans and then have taco night if you don't, you know, if you're going to eat them.
01:59:45.000 But I think it's going to get nasty.
01:59:47.000 That's just my opinion.
01:59:47.000 Are you going to stock up harder in the new digs?
01:59:50.000 Definitely.
01:59:50.000 Lots of bullets.
01:59:52.000 I was advised by everyone, like a thousand rounds for every gun.
01:59:54.000 And I'm like, hmm, that seems low.
01:59:58.000 I'm kidding.
02:00:01.000 We already have some emergency food.
02:00:03.000 We're gonna have a ton of emergency food, mostly because we're upgrading to a place where we can have more people working.
02:00:09.000 So there's gonna be a lot of people in and out, and it's not so much about having a prepper haven with an underground bunker.
02:00:15.000 No, but we'll have food.
02:00:16.000 It's a well water system with great filtration and all the stuff.
02:00:20.000 And I'm not worried about the world ending.
02:00:23.000 I'm just worried about, you know, serious instability.
02:00:26.000 It's already really hard to buy certain equipment.
02:00:28.000 That surprised me.
02:00:30.000 It's already hard to buy certain clothing items.
02:00:33.000 Right for some areas like you can still basically get everything you need But I was shocked at how much rolled back throughout this year, so I'm like you know what man I'm never gonna be one of those guys with like you know a year's worth of beans in the basement But we're gonna have you know a pantry.
02:00:47.000 It's actually like we're moving the middle of nowhere So we're gonna have like a month on hand for the most part of canned goods and dry foods Just because we're not gonna drive two hours to the store every day or something like that Yeah, but I'm getting out of here.
02:00:57.000 Just cuz I'm like I Who knows?
02:00:59.000 You know what, man?
02:00:59.000 I lived in New York, I left, and then people were planting- I left the city, went to the Jersey side, people planted bombs, and then I'm like, I don't want to be here, man.
02:01:08.000 Now New York is a disaster zone.
02:01:10.000 If I stayed there, wow, that would have been bad.
02:01:12.000 Like, if I kept doing my show as I did it, and I went to New York or stayed in the New York metro, man, would I be unhappy.
02:01:21.000 And it'd be really hard to get out and move.
02:01:23.000 Standard of living just exponentially increases as you eject from highly concentrated and populated areas.
02:01:31.000 Yeah, it's pretty... So, why are people, like, what's the fetish with having this horrible standard of living and you, like, hardly go out and see anybody anyways?
02:01:41.000 You live in a gigantic concrete block on top of other people.
02:01:44.000 I did it for six years.
02:01:46.000 And everything smells like sour milk.
02:01:49.000 New York smells like sour milk, man.
02:01:51.000 Especially after you've been in the country for a long time and you come back.
02:01:54.000 People get, what is it called when you can't smell anymore?
02:01:57.000 Nose blind?
02:01:58.000 Nose blind, yes.
02:01:59.000 I don't know if that's like the scientific term.
02:02:00.000 You get nose blind when you show up and all of a sudden like, man, walking through Manhattan and seeing the milk running through the drain.
02:02:07.000 Right, it's like the reverse of smelling all the manure in the fields.
02:02:11.000 Yeah, yeah, yeah, right.
02:02:12.000 You lose that sense of smell when you live out there.
02:02:14.000 Or going to the beach and smelling the fish and the salt and the, you know.
02:02:18.000 Yeah, you go to the city, it's like the sour milk just becomes a natural part of the environment.
02:02:21.000 And it's just dirty, dirty, dirty.
02:02:24.000 When it rains, people don't know this, like when it rains in New York, it kicks all of the grime and chemicals and garbage from the street up into the air and you breathe it all in.
02:02:33.000 Ugh, New York's nasty.
02:02:34.000 Actually, one thing that Joe brings up all the time is the rubber, the brake pad dust in the streets.
02:02:45.000 You know, invisible. Dude, there's... think about everything in the streets. There's oil. There's gas. There's dirt. There's
02:02:49.000 metal shavings. When it rains... people don't know this. They're like, the smell of rain,
02:02:54.000 right?
02:02:55.000 It's dead plant matter. The rain hits the ground and it kicks all the dead plant stuff into the air, and you smell
02:02:59.000 it. Imagine that in a city, and you're like, oh, I love the
02:03:02.000 smell of rain. All the lead and, like, gas and oil just, like, going in your lungs.
02:03:07.000 City living, man. You know, when I was younger, I was really excited to be in the big city.
02:03:11.000 Now it's just like, nah.
02:03:13.000 I mean, they're probably going to end up being like all self-driving electric cars.
02:03:18.000 Yeah.
02:03:18.000 Would improve it.
02:03:19.000 They're changing a lot right now.
02:03:21.000 Forcibly.
02:03:22.000 So we have a super chat here from Craig Bragg.
02:03:24.000 It says, Hey Tim and Lids.
02:03:26.000 Tim, I was wondering what kind of guns you own.
02:03:29.000 There's room for both of you guys and Adam on YouTube.
02:03:32.000 There's room for small channels like mine too.
02:03:34.000 Shameless plug.
02:03:36.000 I'm not sure if I'm supposed to say what kind of weapons I have, but I have many.
02:03:40.000 You know, it's crazy.
02:03:40.000 Look, in January, I was like, no guns in the house.
02:03:43.000 None.
02:03:45.000 Now I've got a gun relatively close to where I'm sitting right now.
02:03:47.000 A very quick evolution.
02:03:48.000 Yes.
02:03:49.000 Very, very quick change.
02:03:50.000 When you have threats, someone trying to break in your house, a pandemic, and mass riots for 11 weeks, now we got, there's a, there's actually, we have a bunch of recurve bows mounted on the walls.
02:04:02.000 Yeah.
02:04:03.000 Um, we got a Hungarian composite traditional bow.
02:04:06.000 That's just for fun.
02:04:07.000 Cause we, we, we, you know, we wanted to just do, I have like foam, big foam, like you put water on them and just bounce it off the wall and stuff.
02:04:14.000 It's fun.
02:04:15.000 But as soon as I could, as soon as I got that license to go get armed, I did.
02:04:21.000 So I can't say.
02:04:22.000 I can't say.
02:04:23.000 But there's many.
02:04:24.000 There's enough.
02:04:26.000 John Spock says, what do you think about the Millie Weaver situation?
02:04:29.000 Also, please excuse the mom mentality shown in my first tweet.
02:04:32.000 Unless we're bringing back a version of Ugandan Knuckles.
02:04:35.000 Soy Knuckles?
02:04:37.000 I looked at the story on the Millie Weaver situation and so far it seems like she had a conflict with her mother.
02:04:42.000 If the story is true, the crazy thing about the Millie Weaver scenario is that the mom didn't want to press charges or anything and then they came out with felony charges anyway.
02:04:52.000 So it definitely seems like some kind of I don't know, man.
02:04:58.000 They apparently issued a warrant, you know, on July 20th, like a month ago or something.
02:05:02.000 But some people think she's being Assanged and they were looking for a reason to go after her.
02:05:06.000 I think that's over the top for me because her documentary has received way more attention because of the arrest.
02:05:13.000 I don't know if you heard about the Streisand effect.
02:05:15.000 Exactly, exactly.
02:05:16.000 Right, so.
02:05:18.000 JVJGG says, love everything you guys do.
02:05:20.000 Stay true.
02:05:21.000 Guest suggestion, Hodge twins and Steven Crowder could even do Skype calls or something similar.
02:05:26.000 We're trying to really avoid Skype calls because audio quality and there's something really just better about having people in person.
02:05:33.000 It's way more fun.
02:05:34.000 And you know, it was actually fortuitous, I guess.
02:05:37.000 All this weird censorship stuff happened with like Babylon Bee and Bill Mitchell while you were coming down to help set up the site.
02:05:42.000 So I was like, Bill, sit in the chair.
02:05:43.000 Let's, let's talk about this stuff.
02:05:44.000 Yeah.
02:05:44.000 Babylon Bee set up on, on Minds recently too.
02:05:47.000 Oh, cool.
02:05:48.000 That's exciting.
02:05:48.000 Oh, wow.
02:05:49.000 Right on.
02:05:50.000 Sarcastic Shadow says, Bill, is there any feature on Minds so that my entire YouTube library can be moved over?
02:05:56.000 Several hundred videos.
02:05:57.000 If so, I'd like to make the jump.
02:05:58.000 If not, can such a feature be made?
02:06:03.000 Yes.
02:06:03.000 Turn on, go to minds.com slash canary, turn on the experimental beta mode, go to your settings, go to other, and you'll see some stuff.
02:06:11.000 Cool.
02:06:11.000 So you can just click sync.
02:06:13.000 You can, but it's in beta.
02:06:15.000 There can be lag time, so be patient with it, but.
02:06:19.000 Cool.
02:06:19.000 Yeah.
02:06:19.000 Yeah.
02:06:20.000 You guys anticipated that.
02:06:21.000 That's great.
02:06:23.000 All right, let's see.
02:06:25.000 We'll jump down here, because we're going a bit over, but it's fine.
02:06:28.000 We can start talking a bit.
02:06:30.000 Brian S. says, Kafka traps.
02:06:32.000 SJW's favorite weapon.
02:06:33.000 Don't fall for it.
02:06:34.000 Will not.
02:06:35.000 Will Ferra says, who is the new Soy Jesus?
02:06:38.000 So Adam has his own show.
02:06:40.000 Adam cast IRL on YouTube, and he recently broke 100k subs.
02:06:43.000 This is Bill, who's just in town.
02:06:45.000 We have another guest coming tomorrow.
02:06:47.000 We have Kerry Smith.
02:06:48.000 I prefer other plant-based protein.
02:06:51.000 Yeah.
02:06:51.000 I don't like soy.
02:06:52.000 No, I'm not into it.
02:06:53.000 Like oat milk.
02:06:54.000 But tomorrow we have... Carrie Smith is her name, right?
02:06:57.000 Yeah, that is her name.
02:06:58.000 That's correct.
02:06:58.000 She wrote this article, liberal who's now leaving the left, voting for Trump.
02:07:03.000 And we're going to have her on and we're going to talk news and politics like we normally do.
02:07:06.000 And we've got a bunch of other guests coming out, too.
02:07:08.000 Jack Murphy on Wednesday.
02:07:09.000 So I'm really excited.
02:07:10.000 We've got someone on Friday who you guys are going to absolutely be stoked on.
02:07:16.000 But I'm not going to say it yet because, you know, maybe I should.
02:07:19.000 I don't know.
02:07:19.000 Whatever.
02:07:19.000 I'm not going to say it.
02:07:20.000 So Bill was here talking to us about censorship because he works with Minds.
02:07:23.000 He's the CEO of Minds, right?
02:07:25.000 Bill Ottman.
02:07:26.000 Am I getting this correct?
02:07:27.000 That's it.
02:07:27.000 Okay, just a second.
02:07:28.000 I just want to plug that real quick.
02:07:31.000 Let's see, Shu Shirako says, I don't do politics nor religion.
02:07:33.000 I wouldn't mind watching the world burn.
02:07:36.000 However, being in the know in MX, I'll pray for both Barr and Trump as they charge towards a Leviathan that traffics humans.
02:07:43.000 Well, that's that woman who, I'm not gonna say her name, but they absolutely are, yes.
02:07:49.000 And I wonder how deep that goes because there are deep connections to that woman.
02:07:54.000 You know who I'm talking about.
02:07:55.000 Yep.
02:07:55.000 I'll just say her, it's Ghislaine Maxwell, you know.
02:07:58.000 Let's see...
02:08:04.000 Step 1. Find God.
02:08:06.000 AI Unleashed will probably be this.
02:08:08.000 It is the 2 plus 2 equals 4 revealed by James Lindsay.
02:08:11.000 I have a really, I have a fun conspiracy theory.
02:08:14.000 It's not a real conspiracy theory.
02:08:15.000 It's just a fun thing to play around with.
02:08:16.000 So I don't believe it's true.
02:08:17.000 I'm just giving you that warning before they clip this.
02:08:20.000 But hit that like button if you enjoy the show.
02:08:23.000 You can follow me on Twitter, Instagram, Parler, and Minds at TimCast.
02:08:27.000 And Bill, you're on Mines, I think.
02:08:30.000 Yeah, minds.com slash ottman.
02:08:31.000 O-T-T-M-A-N.
02:08:32.000 And that's Minds, M-I-N-D-S.
02:08:34.000 M-I-N-D-S.
02:08:35.000 Mids.
02:08:35.000 Mids.
02:08:35.000 That's different.
02:08:37.000 Mids.com.
02:08:38.000 It's a very different thing.
02:08:39.000 M-I-N-D-S.
02:08:40.000 I feel like Homer when he said S-M-R-T.
02:08:43.000 I am so smart.
02:08:44.000 I just saw the voice actor of Homer.
02:08:46.000 Have you ever seen that guy?
02:08:47.000 Yeah.
02:08:47.000 And he does like five different characters?
02:08:49.000 Yeah, yeah, yeah.
02:08:50.000 It's so weird to see him do it in person.
02:08:51.000 Right.
02:08:52.000 Yeah.
02:08:53.000 Anyway.
02:08:53.000 Here's my fun conspiracy theory.
02:08:56.000 Caveat.
02:08:57.000 I am not saying this is true.
02:08:59.000 I was wondering, what are all of these things that are happening and why they're doing it?
02:09:04.000 This is a fun fictional idea.
02:09:05.000 You ever see War of the Worlds?
02:09:08.000 Yes.
02:09:09.000 The movie?
02:09:09.000 Or like, you know, the aliens died from human pathogens?
02:09:12.000 Well, people keep saying, right, that aliens are next?
02:09:16.000 What if they are?
02:09:17.000 And the reason we all have to wear masks and wash our hands is because we're getting rid of as many pathogens as possible.
02:09:23.000 Here's what I was thinking, right?
02:09:25.000 We're not just getting rid of COVID.
02:09:26.000 We're getting rid of everything, you know?
02:09:30.000 Common cold, flu, like if people aren't coughing on each other, people wearing masks, if people are social distancing, it's not just COVID that's going to go by the wayside.
02:09:39.000 It's going to be a ton of random viruses, pathogens, bacteria, whatever.
02:09:43.000 Right?
02:09:44.000 What if?
02:09:46.000 Because the aliens are coming and the aliens will get sick.
02:09:49.000 So, you know, to avoid a War of the Worlds scenario, the people of the world have to socially distance.
02:09:56.000 Yeah.
02:09:56.000 I'm half kidding.
02:09:57.000 But what makes viruses stronger?
02:10:01.000 What do you mean?
02:10:02.000 I mean, you know, in terms of building up immunities, I mean, what would be better for the aliens?
02:10:12.000 Not having us littered with random viruses popping out all over the place.
02:10:16.000 For us, we might be better, but the aliens would be better off with the least amount possible.
02:10:22.000 We would... I mean, arguably, we'd be better off with no viruses at all.
02:10:26.000 And, like, no... Like, if they didn't exist.
02:10:28.000 You know, not every virus kills us.
02:10:30.000 Some of them actually get along with us very well, to the point where we don't destroy them, our bodies don't know.
02:10:36.000 But you do want a strong immune system.
02:10:39.000 You do keep your immune system tough by having constant, you know, being constantly in battle.
02:10:45.000 But the joke I'm trying to bring up is it would be a funny thought.
02:10:48.000 But we have seen these weird, you know, UFOs.
02:10:50.000 You see these Miami videos, man?
02:10:52.000 What was that thing that Harry Reid said you were talking about earlier?
02:10:54.000 Yeah, Harry Reid, let's see.
02:10:57.000 He said something like- This is a quote from Harry Reid just about two weeks ago in the New York Times.
02:11:05.000 Mr. Reid said, more should be made public to clarify what is known and what is not.
02:11:10.000 Quote, it is extremely important that information about the discovery of physical materials or retrieved craft come out.
02:11:18.000 Oh, snap.
02:11:19.000 So, you know, whenever you think about that... There was a consultant who apparently testified that he believes they're off-world materials, like vehicles off-world or whatever.
02:11:31.000 You know what?
02:11:32.000 There's a correction.
02:11:33.000 Oh, there was?
02:11:33.000 There was a correction on that a little bit after he backpedaled.
02:11:38.000 An earlier version of this article inaccurately rendered remarks attributed to Harry Reid, the retired Senate Majority Leader from Nevada.
02:11:44.000 Mr. Reid said he believed that crashes of objects of unknown origin may have occurred and that retrieved materials should be studied.
02:11:51.000 Okay.
02:11:53.000 He did not say that crashes had occurred and that retrieved materials had been studied secretly for decades.
02:11:59.000 An earlier version also misstated the frequency with which the Director of National Intelligence is supposed to report on unidentified aerial phenomena.
02:12:07.000 It is 180 days after enactment of the Intelligence Authorization Act, not every six months.
02:12:14.000 I don't know, man.
02:12:15.000 I don't see how they could originally report that.
02:12:17.000 Like, that's a crazy quote.
02:12:19.000 I know.
02:12:20.000 Everybody went nuts.
02:12:22.000 I mean, to retract, like, that's a backtrack.
02:12:25.000 It feels like a backtrack.
02:12:26.000 Maybe they got the quote wrong, but it seems like that's so unlikely.
02:12:29.000 Here's what we all want to happen.
02:12:32.000 We want to have had happened that he was telling the truth, and in his old age slipped up, and then, you know, the secret government organization told him to walk it back, and he did.
02:12:41.000 What probably happened is that he said some things and the reporter screwed it up.
02:12:47.000 And the reporter- You think that's more likely?
02:12:49.000 Absolutely, dude.
02:12:50.000 You know- You know what the Gell-Mann amnesia effect is?
02:12:53.000 No.
02:12:54.000 You, let's say you were reading the news, and you saw an article about social media, and it said some ridiculous nonsense like, you know, by using the Twitter's ADI, you know, they're able to connect, you know, other programs on the web, and then you're like, ADI?
02:13:13.000 You mean API?
02:13:15.000 What is this?
02:13:15.000 Who wrote this?
02:13:16.000 They have no idea what they're talking about.
02:13:18.000 Then you click over the next link, and it's like, war in Syria, you know, president declares blah blah blah, and you go, wow.
02:13:25.000 That's the Gell-Mann amnesia effect.
02:13:26.000 That you didn't, you read something in which you're an expert, and notice it's fake news, and then assume the rest of it's all good.
02:13:34.000 So, when they come out with these quotes or whatever, what really happened is, he was talking, somebody was writing things down really fast, went back, couldn't read their handwriting, and said, mmm, here's what he said.
02:13:45.000 I feel like the New York Times is, especially with sensitive articles like this, which they only put out...
02:13:51.000 Every so often, you know, a big UFO piece out of New York Times is like, that's what everyone's waiting for these days.
02:13:55.000 And you just have to, I would think that the editors would be scrutinizing these quotes more than that initially.
02:14:02.000 And put it this way, even if they, even if you're right, the context of this is still like, wait a second, how is this not the only thing that's being, you know, this should be getting at least some percentage of constant airtime.
02:14:15.000 The fact that this is.
02:14:17.000 Yeah.
02:14:17.000 If I made a video titled Joe Biden drops out of the race, Hillary Clinton, you know, decides to run, I'd get a million views in an hour.
02:14:27.000 And then all I have to do is put correction.
02:14:31.000 None of it was true.
02:14:33.000 Yeah.
02:14:33.000 Let's say Hillary Clinton recently came out and said she's ready to serve in the Biden administration.
02:14:41.000 A news outlet could just write, Hillary Clinton announces she will be serving in the Biden administration, they'll get a million hits, they sell all the ad space, they make all that money, and then an hour later after they've milked it, they put, correction, she said she was willing to, not that she is.
02:14:54.000 That's our mistake.
02:14:56.000 They, I think they are bad at what they do.
02:15:00.000 It's not that... I totally agree that that is what would happen.
02:15:03.000 But, you know, this article, regardless of if there was a backtrack, is saying that UFO findings are becoming public and it's in the Intelligence Authorization Act that this is mandated.
02:15:20.000 So that's good news at the end of the day, regardless of backtracking.
02:15:24.000 I would love to believe the aliens are next, but I just don't.
02:15:29.000 It's been a crazy year, man.
02:15:31.000 It's been the craziest year, you know?
02:15:35.000 Who knows what's coming?
02:15:37.000 Joe Biden might fall asleep during the debates.
02:15:40.000 Some weird, crazy thing.
02:15:42.000 Aliens land.
02:15:43.000 Then they come out and they say, thank you for social distancing.
02:15:46.000 And now we won't be getting sick.
02:15:48.000 The War of the Worlds scenario.
02:15:49.000 I don't know.
02:15:50.000 Maybe, maybe everybody's just lost their minds because of social media.
02:15:56.000 You know what?
02:15:56.000 You know what the great filter is?
02:16:00.000 Fermi's paradox.
02:16:01.000 Yeah, Elon tweeted about it yesterday.
02:16:05.000 If life exists, why haven't we encountered it yet?
02:16:08.000 The great filter theory is that all great civilizations eventually wipe themselves out
02:16:11.000 because something filters them.
02:16:14.000 Maybe we didn't realize that social media would be the great filter.
02:16:19.000 We thought it was nuclear bombs.
02:16:20.000 Nope.
02:16:21.000 It was mass hysteria.
02:16:23.000 Humans weren't meant to operate on this kind of scale.
02:16:27.000 There's a lot of things humans weren't meant to do in terms of how we function and exist.
02:16:32.000 It doesn't mean we can't survive, because we have brains that can adapt very, very quickly, but this level of information is creating random pockets of insanity.
02:16:41.000 You've got conspiracy theories that persist on the right.
02:16:44.000 And these people are marginalized in seconds.
02:16:47.000 They're mocked and ridiculed.
02:16:48.000 And some of them stand up for it.
02:16:50.000 On the left, you have unhinged conspiracies running rampant for the past decade, non-stop.
02:16:55.000 I mean, Russiagate, Ukrainegate, now the Post Office?
02:16:58.000 It's all just ridiculous insanity.
02:17:00.000 So we're just losing our minds.
02:17:02.000 Maybe that's the great filter.
02:17:05.000 Every great civilization, once they get to a point where they have instant transmission communications, you know, the information flow is so rapid, you can't actually create a controlled system.
02:17:14.000 It's almost like we're living our- it's almost like social media is static.
02:17:19.000 There's no cohesive message.
02:17:20.000 It's just random, everything crashing into each other.
02:17:23.000 But here's the thing.
02:17:23.000 I feel like, and I'm not trying to say that we're, you know, we have, we're smarter or anything like that, but, you know, would you agree that people who are more unhinged have less access to information?
02:17:39.000 Like, are they absorbing the full spectrum of information available in order to be able to come up with an informed
02:17:45.000 decision-making ability?
02:17:48.000 It's not an issue of whether they have access to it, it's an issue of how human beings are and the system that's
02:17:56.000 being handed to them.
02:17:57.000 So when given the opportunity to explore information they choose what makes them feel better.
02:18:02.000 They go insane.
02:18:03.000 But they're also being engineered to believe these things and so you know you've done good work to educate yourself outside of just what is in your Twitter feed.
02:18:13.000 Right.
02:18:14.000 And so therefore you're able to make informed decisions.
02:18:16.000 So I would just hope.
02:18:19.000 I, you know, maybe it's not true, but the more access to information that we have, the more ability we have to make decisions about what the hell is going on.
02:18:27.000 You know, if we had access to what was really going on with aliens, the classified information, then we can start to understand why the hell we're here.
02:18:34.000 Some things are opinions.
02:18:35.000 You know, like, uh, some, some people say we should run this program to save people in this way.
02:18:41.000 And someone says that's a bad idea.
02:18:42.000 This is a better idea.
02:18:43.000 And even if you know, I mean, let's think about the questions of like the death penalty.
02:18:48.000 Some questions just don't have easy answers no matter what you know.
02:18:54.000 Ultimately, I think if we did synchronize with the great network to better understand everything.
02:18:59.000 There's an Outer Limits episode about this where everyone has a thing on their head that just gets the information.
02:19:04.000 We would still be polarized based on ideology.
02:19:07.000 I guess eventually one side will dominate and wipe out the other side.
02:19:12.000 Maybe that'll happen before the aliens actually arrive.
02:19:15.000 Or maybe the aliens are actually here right now.
02:19:18.000 We have gone over by about 20 minutes, so I'm gonna wrap it up there.
02:19:21.000 If you haven't already, you can hit the like button to really help out the channel.
02:19:24.000 Sharing the show really helps.
02:19:26.000 We are, uh...
02:19:28.000 Picking things up, getting ready for a new set of guests, show's a little bit more chill, because, you know, Adam was basically the hype man, so you can follow Adam on YouTube, AdamCastIRL, and his channel is linked on our channel, so definitely make sure you check out Adam if you're a big fan, and, you know, he's gonna do really, really well on his show.
02:19:48.000 We're gonna have more guests coming up.
02:19:50.000 So if you want, you can follow me on Twitter, Instagram, Parler, and Minds at TimCast.
02:19:54.000 You can also check out TimCast.net, which will be, you know, up and running soon.
02:19:58.000 We're getting there.
02:19:59.000 And you can check out YouTube.com slash TimCast and YouTube.com slash TimCast News.
02:20:04.000 Those are both my channels with way more content, because I put out a ridiculous amount, like 12 videos per day that are like, I don't know, I record like four, five hours every day.
02:20:14.000 I think I record more than any other political commentator in the world.
02:20:17.000 You're an animal, dude.
02:20:18.000 Yeah, seriously.
02:20:19.000 I have some kind of weird problem where I can't stop talking.
02:20:23.000 It's true.
02:20:24.000 I mean, they've noticed.
02:20:25.000 They're watching.
02:20:25.000 They're like, yeah, Tim doesn't shut up.
02:20:27.000 I know!
02:20:27.000 It turned into a job, you know?
02:20:30.000 You take the cards, you're dealt, and you play with them.
02:20:32.000 That's right.
02:20:32.000 So I don't know if you want to mention anything before we wrap up.
02:20:35.000 No.
02:20:35.000 Hit me up.
02:20:36.000 Minds.com slash ottman.
02:20:37.000 Let's do this.
02:20:38.000 One of the answers to censorship is competition in the market.
02:20:41.000 So, you know, Bill, glad to have you.
02:20:43.000 Thanks for doing what you do.
02:20:44.000 So much fun.
02:20:44.000 And then, of course, there's Lydia at Sour Patch Lyds.
02:20:46.000 L-Y-D-S.
02:20:47.000 You can follow her on Twitter and Parler.
02:20:50.000 Twitter and Parler.
02:20:51.000 Not Instagram.
02:20:52.000 Otherwise, you'll make some poor person really upset.
02:20:55.000 And Minds too.
02:20:55.000 Oh, I am on Minds.
02:20:57.000 Oh, my gosh.
02:20:57.000 OK.
02:20:58.000 I'm on Parler, Twitter and Minds.
02:21:00.000 Excellent.
02:21:01.000 Just so you know.
02:21:02.000 Diversify.
02:21:02.000 That's right.
02:21:03.000 Thank you all so much for hanging out.
02:21:04.000 We'll be back tomorrow with Carrie Smith.
02:21:06.000 She's a liberal who decided to vote for Trump, and I think, like, SJWs are, like, one of the biggest reasons.
02:21:11.000 So we'll see you all then.
02:21:12.000 And again, thanks for hanging out.
02:21:14.000 Adios.