Timcast IRL - Tim Pool - November 22, 2022


Timcast IRL - Elon Musk UNBANS THEM ALL, DJT, MTG, Veritas ALL BACK w-Darren Beattie & Mary Morgan


Episode Stats

Length

2 hours and 1 minute

Words per Minute

187.4682

Word Count

22,843

Sentence Count

1,806

Misogynist Sentences

54

Hate Speech Sentences

37


Summary

Elon Musk is unbanning everyone, Donald Trump is unbanned, Project Veritas is back, Marjorie Taylor Greene is back. But you know who's not back? Alex Jones. We talk about the pros and the cons of this and more on this week's Timcast IRL.


Transcript

00:00:00.000 Elon Musk is unbanning everyone.
00:00:24.000 First, Donald Trump is unbanned.
00:00:26.000 He has this Twitter poll.
00:00:27.000 There's this really funny news clip where a guy says, it wasn't the American people who voted, it was Russia.
00:00:32.000 It was them.
00:00:34.000 They got Trump reinstated.
00:00:35.000 These people are starting to realize that their narrative was fiction and it was propped up by the machine.
00:00:41.000 And now that regular people get to speak, hey, how about that?
00:00:44.000 They don't actually like the weird world you live in.
00:00:46.000 Not just Donald Trump, though.
00:00:47.000 Project Veritas is back.
00:00:49.000 James Lindsay is back.
00:00:50.000 Marjorie Taylor Greene is back.
00:00:53.000 But you know who's not back?
00:00:54.000 Alex Jones.
00:00:56.000 So we gotta talk about all this Elon stuff because, uh, look, a lot of it's really, really good.
00:01:00.000 Overwhelming.
00:01:01.000 Net positive.
00:01:02.000 But Elon Musk's refusal to reinstate Alex Jones is making him look like a hypocrite, and I know Elon understands this, so I kinda can't believe him when he says, oh, it's cause I have no mercy for people who exploit children in this way for fame or whatever.
00:01:19.000 Alex Jones was banned for making fun of Oliver Darcy.
00:01:21.000 He wasn't banned for saying anything about kids or anything like that, so what's Elon talking about?
00:01:26.000 Well, let's talk about it.
00:01:27.000 The pros and the cons.
00:01:28.000 There's a whole bunch of news to break down as it pertains to Twitter.
00:01:32.000 Then, of course, we have over in Arizona, the Attorney General is launching an investigation into election irregularities, and the assistant AG is refusing to certify the election.
00:01:41.000 I have no idea what YouTube thinks of that.
00:01:44.000 YouTube?
00:01:45.000 What?
00:01:46.000 No idea.
00:01:47.000 I don't know.
00:01:47.000 How do the rules—I have no idea.
00:01:48.000 So we'll talk about it anyway!
00:01:50.000 So, uh, before we get started, head over to TimCast.com, become a member.
00:01:53.000 If you would like to support our work, click that beautiful Join Us button, sign up, and as a member, you will help support all of our journalists, and you'll get access to the exclusive uncensored Members Only show.
00:02:04.000 We're gonna have one of those coming up for you tonight around 11 p.m.
00:02:08.000 So don't forget to smash that like button, subscribe to this channel right now, and share the video.
00:02:12.000 If you're watching on YouTube, take that URL, just post it everywhere you can.
00:02:16.000 Word of mouth is the best way to support us, and it helps us bypass the censorship, because sure enough, we're already getting people messaging us saying that there are no notifications, they're having trouble finding the video, so surprise, surprise.
00:02:29.000 But, you know, smash that like button.
00:02:31.000 Joining us today to talk about all of this and more is Darren Beattie.
00:02:36.000 Great to be here.
00:02:37.000 Thanks for having me.
00:02:38.000 Who are you?
00:02:39.000 I'm the founder-editor of a great news site called revolver.news, which has just published a really bombshell piece on the next FTX scandal, which I hope we'll have a chance to talk about.
00:02:53.000 Yeah, is that Tether?
00:02:55.000 Yes.
00:02:55.000 There's been a lot of talk about Tether for a while, but we'll definitely get into that.
00:02:58.000 That sounds very interesting.
00:02:59.000 And I'll also mention the last episode we had you on was our biggest episode we ever did for a while, until we had Joe Rogan and Alex Jones at the same time.
00:03:08.000 But like, come on, having those two guys at the same time, that shouldn't count.
00:03:13.000 Having you as a single guest and getting, it was like 2.4 million views or something.
00:03:16.000 Right, you have to divide it in half.
00:03:19.000 Actually, you know what?
00:03:20.000 I can tell you this.
00:03:22.000 Actually, with the live viewership and the VOD stuff we put on Rumble, it actually is still the biggest.
00:03:27.000 There you go.
00:03:28.000 Because we had like 300k, so I think it's like, if you were to add the YouTube views from the day plus Rumble, it would be 200k more than the AJ Rogan episode.
00:03:36.000 It's all how you count.
00:03:38.000 It's all how you count, it's all how you count.
00:03:39.000 Alright, right on, man.
00:03:40.000 Thanks for joining us.
00:03:41.000 And we're also hanging out with Mary Morgan.
00:03:43.000 Hello, everyone.
00:03:44.000 I'm sitting in Ian's seat tonight.
00:03:46.000 Happy to be here.
00:03:47.000 Who are you?
00:03:48.000 I'm on Pop Culture Crisis.
00:03:50.000 Ah, yeah.
00:03:50.000 That's a show.
00:03:51.000 Here at Timcast.
00:03:51.000 That's right.
00:03:52.000 So you should go subscribe to that.
00:03:53.000 That's a great show.
00:03:55.000 I think so, too.
00:03:57.000 You know what's another great show?
00:03:59.000 youtube.com forward slash WeAreChange.
00:04:01.000 Anyway, today I come here with one simple message.
00:04:04.000 Make Orwell fiction again.
00:04:06.000 It's just getting too real out there.
00:04:07.000 This is one of the shirts.
00:04:09.000 This is the message on my shirt.
00:04:11.000 It's one of the first shirts that we actually got out there to the general public.
00:04:14.000 It's one of the first ones that kind of went viral.
00:04:17.000 And I'm wearing it today.
00:04:18.000 If you want to wear it, you could get it on thebestpoliticalshirts.com.
00:04:21.000 Because you do, this is how you guys support me and my efforts here, so thank you so much for having me.
00:04:26.000 Splurge?
00:04:27.000 Back once again.
00:04:28.000 Hello, Luke.
00:04:29.000 Hello, Mary.
00:04:30.000 Good to have you guys here.
00:04:31.000 Are you okay with the fact that he calls you Splurge?
00:04:33.000 This is Scary Mary.
00:04:35.000 This is between me and Splurge here, okay?
00:04:36.000 I didn't know this was a thing.
00:04:38.000 Scary Mary.
00:04:39.000 I think Luke has a lot of nicknames for everybody here, so I'm cool with it.
00:04:42.000 That means we're friends.
00:04:43.000 Is it Lou now?
00:04:44.000 Yeah, that's right.
00:04:46.000 Call me Lou.
00:04:47.000 That's fine.
00:04:48.000 Lou.
00:04:49.000 Okay, let's talk about news.
00:04:51.000 Here's the first story.
00:04:53.000 Elon Musk's Twitter.
00:04:54.000 I like how they call it Elon Musk's Twitter.
00:04:55.000 Reinstates Rep. Marjorie Taylor Greene.
00:04:58.000 So this is from the past couple of hours.
00:05:00.000 Of course, Donald Trump has been reinstated.
00:05:02.000 Don't know if he's going to come back.
00:05:04.000 Project Veritas is back.
00:05:05.000 James Lindsay is back.
00:05:06.000 Who else is back?
00:05:07.000 I'm seeing everybody come back.
00:05:09.000 ALX.
00:05:09.000 Andrew Tate is back.
00:05:10.000 Andrew Tate is back!
00:05:11.000 Anybody else?
00:05:13.000 Marjorie Taylor Greene?
00:05:14.000 Don't know.
00:05:14.000 Elon Musk is just, you think he's going through a list and then just like looking at why they were banned and like okay we'll unban them and like who's this and like unban them.
00:05:22.000 But the graveyard of suspended accounts is just so large that there's no way to know who's never gonna come back.
00:05:29.000 Well, we're waiting for Sargon, Carl Benjamin, to be unbanned.
00:05:31.000 Yeah, I was just thinking of him.
00:05:33.000 Did he get unbanned?
00:05:34.000 I feel like I saw it today.
00:05:35.000 Or people were just calling for it.
00:05:37.000 It blends together.
00:05:37.000 Yeah, I think Michael Malice tweeted out that he should be unbanned.
00:05:41.000 Yeah, there's a bunch of people.
00:05:42.000 Rick Hader-Lost, still banned.
00:05:44.000 He's still banned.
00:05:44.000 Carey Wedler, banned.
00:05:46.000 Dr. Malone, Dr. McCullough.
00:05:49.000 Anti-Media, Free Thought Project.
00:05:51.000 We could keep going.
00:05:51.000 We could do a whole show just by naming all the people that have disappeared off the face of the internet.
00:05:55.000 I know Milo said he wouldn't come back to the platform or he wouldn't continue tweeting, but come on, how are you going to resist?
00:06:02.000 Milo has to come back.
00:06:04.000 I really want him to.
00:06:05.000 Yeah.
00:06:06.000 What do you think, Darren?
00:06:07.000 Who needs to come back, and do you think Trump will come back?
00:06:09.000 Oh, there are a lot of people that need to come back.
00:06:11.000 Trump, he desperately wants to, and Elon understands that.
00:06:15.000 Hence the recent tweets that he sent out expressing the temptation to which Trump is yet to give in.
00:06:24.000 He's just got to.
00:06:25.000 He needs to find the right opportunity to do so.
00:06:28.000 They say there are potential legal complications given his fiduciary responsibilities to Truth Social.
00:06:37.000 So that has to be worked out.
00:06:38.000 And as you know, the system will use any opportunity it can get to file another case against Trump.
00:06:46.000 There are already three or four in the works.
00:06:48.000 So it's complicated business with Trump.
00:06:51.000 He wants to, and if we subscribe to the modification of Occam's razor that Elon propounded on Twitter, namely that the most entertaining outcome is the most likely, I think we can all expect Trump to be on there and many more colorful figures to come.
00:07:12.000 We need to pull this meme from Elon Musk.
00:07:14.000 He tweeted, and lead us not into temptation, and it is the meme of the woman showing her private parts to the monk, and the monk is refusing to look.
00:07:23.000 The monk, of course, is Donald Trump, and the woman's derriere is Twitter.
00:07:28.000 So, uh, very good.
00:07:29.000 There's another one where it's Lois Griffin, all disheveled-looking, looking over at a bottle of prescription pills.
00:07:35.000 And, uh, uh, I just gotta point out, there's a woman.
00:07:38.000 She blocked me, though.
00:07:40.000 She quoted this and then wrote, this is rape culture.
00:07:44.000 A billionaire promoting rape culture on his platform, and then I was just like, men not wanting to have sex with women who want to is rape now?
00:07:52.000 I guess?
00:07:53.000 And then she blocked me for it, so.
00:07:55.000 No, but it's actually true.
00:07:56.000 The rape is actually him not giving in to the temptation.
00:08:01.000 Well, if anything, she's assaulting him!
00:08:05.000 He's praying, right?
00:08:06.000 He's minding his own business.
00:08:07.000 And then she comes along and she pulls up her dress and starts looking at him.
00:08:10.000 You know, this guy's minding his own business.
00:08:12.000 Yeah, he's not being a simp.
00:08:14.000 Maybe he should be praised.
00:08:16.000 Maybe that's what she meant.
00:08:19.000 Simps sink ships, okay?
00:08:23.000 This is a serious deal here.
00:08:25.000 So obviously, you know, you're portraying Donald Trump doing the right thing here, right?
00:08:29.000 Maybe that's what she meant.
00:08:30.000 Maybe I misunderstood.
00:08:31.000 Maybe she was saying the woman was raping the man.
00:08:36.000 Nowadays it's a little bit confusing.
00:08:38.000 Elon Musk was trying to imply that he was trying to seduce Trump.
00:08:41.000 But it's been nearly three days, and Trump still has the ability to tweet, but yet hasn't done it.
00:08:47.000 He can't.
00:08:48.000 And, I mean, he can't because of financial interests with Truth Social.
00:08:52.000 So we should explain it in kind of basic form here.
00:08:55.000 Trump is having the opportunity to make a lot of money with Truth Social, rather than, of course, have a voice on Twitter.
00:09:02.000 He did make a couple statements a couple months ago saying that he wouldn't return to Twitter, even if Elon Musk would buy it.
00:09:07.000 Uh, will he return?
00:09:08.000 I think he has to if he's running, if he's going to be running a successful presidential campaign because there's no other way to get out in front, to get your message across, but most importantly also protect yourself against the incoming attacks and slanders and all the fake news media bullcrap along with the DOJ indictments.
00:09:26.000 There's no better way to protect yourself than in the court of public day.
00:09:29.000 Uh, and, and, and that of course means using Twitter.
00:09:32.000 Will he use it?
00:09:32.000 What's the court of public day?
00:09:33.000 Something I just made up right now.
00:09:35.000 Court of public light.
00:09:36.000 Court of public opinion.
00:09:38.000 Thank you.
00:09:39.000 Do you think Elon should buy Truth Social then for way more than it's worth?
00:09:44.000 And then he technically still wins?
00:09:46.000 I got the solution.
00:09:47.000 Literally.
00:09:48.000 Seriously.
00:09:49.000 Donald Trump tweets a sentence with a link to Truth Social which contains the paragraph.
00:09:56.000 So that way, he's still getting his idea on Twitter, he's maximizing his audience, but he's actually helping Truth Social grow.
00:10:02.000 I think that's the best thing to do because more competition in this space is good, and if he can build another community, then we never have to worry about monopolistic censorship.
00:10:12.000 From what I read, Truth Social has a six-hour exclusivity window.
00:10:17.000 And therefore Trump must post on, this is what I read, it could be wrong, but Trump must post on Truth Social, then six hours later he could post the same thing on another social media platform.
00:10:28.000 And there's exceptions to the rule, like fundraising, so he can post on Twitter according to some of the mainstream media sources out there and their kind of larger agreement with Truth Social.
00:10:37.000 But at the end of the day, I mean, is he prioritizing money in his platform over this platform?
00:10:44.000 And will this kind of outcome be in his benefit?
00:10:47.000 I don't think it will.
00:10:48.000 Are you saying Truth Social has the possibility of becoming profitable?
00:10:52.000 Because I don't see that.
00:10:54.000 Oh, definitely.
00:10:55.000 Yeah, it could be profitable.
00:10:56.000 Who uses it?
00:10:57.000 Do any of us at this table use it?
00:10:59.000 No.
00:10:59.000 Well, I have one, and I will say I don't use it, but there's a lot of engagement on it.
00:11:06.000 It is significant.
00:11:08.000 Elon's probably trying to steal back that... Oh, the users that went over to it?
00:11:13.000 Yeah, I think Elon saw this as a business opportunity.
00:11:15.000 He probably was just like, look how many people they've kicked off the platform.
00:11:19.000 They're losing money because they're bad at business.
00:11:21.000 I can come in, post some spicy memes, everyone will start screaming and spitting and yelling, they'll all come back, and then we'll go public again in a few years.
00:11:29.000 Right.
00:11:30.000 And there's been a massive influx into Twitter since Trump's account was reinstated.
00:11:36.000 A massive number.
00:11:37.000 Look at this.
00:11:40.000 This tweet from Donald Trump from 2011.
00:11:43.000 I feel sorry for Rosie's new partner in love, whose parents are devastated at the thought of their daughter being with Rosie a true loser.
00:11:51.000 This history was wiped from existence by Vijaya Gadde and Dorsey.
00:11:55.000 And with Elon Musk coming in, we get the whole archive of Trump tweets.
00:11:58.000 It's a veritable Library of Alexandria.
00:12:01.000 A cornucopia of his one-liner tweets.
00:12:05.000 There's literal books of his tweets that were being sold on Amazon.
00:12:08.000 Wow, that's amazing.
00:12:09.000 I love all of Trump's old tweets telling Robert Pattinson to break up with Kristen Stewart.
00:12:16.000 I'm looking them up now because I just love him so much.
00:12:18.000 It's amazing!
00:12:20.000 See, here's the issue.
00:12:22.000 Even this Trump isn't on Truth Social.
00:12:25.000 True.
00:12:25.000 No.
00:12:25.000 It's not.
00:12:26.000 Truth Social posts are like paragraphs about like MAGA stuff.
00:12:30.000 Come on, Donald Trump, call somebody a horse face, right?
00:12:33.000 We need to bring back the snarky one-liners.
00:12:35.000 This was hilarious, man.
00:12:36.000 I can't believe it.
00:12:37.000 This stuff's 11 years ago.
00:12:39.000 Yeah.
00:12:40.000 Trump was sitting there on his golden toilet, just insulting people on his phone, and we all thought it was funny.
00:12:45.000 Yeah, like 3 in the morning.
00:12:46.000 Not just insulting people, but like, remember when he posted at his, like, taco bowl?
00:12:51.000 Yeah, that's great.
00:12:52.000 Like, that's so wholesome.
00:12:54.000 His taco bowl.
00:12:56.000 There's nothing so lighthearted on the app as when that went up.
00:12:59.000 Yeah, bring back Twitter Trump, I guess.
00:13:01.000 Man, all the crazy stuff on Trump's Twitter thread.
00:13:05.000 We have to go back in time and look at all the archives.
00:13:07.000 He's not letting me word search anything he's tweeted right now.
00:13:11.000 You know, there have been glitches on Twitter that I've noticed.
00:13:15.000 Mostly that you'll try and load a tweet, and then it'll say, um, this tweet is unavailable.
00:13:20.000 Or the replies are upside down.
00:13:22.000 What?
00:13:22.000 Yeah, that happens so much to me, too.
00:13:24.000 So, like, they usually are below a tweet, but they're above a tweet, and they're in a non-linear order above the tweet.
00:13:31.000 That happened before they slashed the number of employees.
00:13:33.000 Yeah, yeah.
00:13:35.000 It's happening a lot more for me.
00:13:37.000 They appear above the tweet.
00:13:38.000 Yeah.
00:13:39.000 Wait, what?
00:13:40.000 You have to scroll up.
00:13:41.000 And then it's confusing and then it's all messed up and discombobulated.
00:13:45.000 That doesn't happen to you!
00:13:46.000 If Twitter implodes and is wiped off the face of the earth, I'll take it as a win, too.
00:13:49.000 Like, you've got this platform that is clearly biased against anyone who opposes the establishment.
00:13:55.000 Namely conservatives, because Trump came in.
00:13:58.000 But even anti-war people are getting banned.
00:14:01.000 So, if Elon Musk tries to fix it, it's a good thing.
00:14:04.000 If he unbans people, it's a good thing.
00:14:05.000 This could be... You guys ever hear that stuff?
00:14:08.000 Is that Pocus?
00:14:09.000 He's a terrorist.
00:14:11.000 Oh no, baby.
00:14:12.000 Have you guys ever... He's outside the door right there.
00:14:14.000 Have you ever heard the stories about how, like, someone on their deathbed all of a sudden will become, like, lucid and energized, and they'll sit up and start talking and be totally normal, and you'll be like, oh, they're getting healthy, and then they just die.
00:14:26.000 You guys ever hear that?
00:14:28.000 Like, people who are, like, dying and, like, laying on their bed will just one day get up and be like, I'm feeling better, I'd like to see my family.
00:14:34.000 And then the family will come in, they'll all talk and laugh, and then all of a sudden the person will just, like, like, croak.
00:14:39.000 Like, it's the body mustering up the last bit of energy to make your final, you know, finish your business and say your goodbyes.
00:14:47.000 Maybe this is what's happening.
00:14:48.000 Elon Musk comes in.
00:14:49.000 We're all like, yay, we feel invigorated again, like it's 2015, 2016, and then the replies flip over and the words start getting jumbled, the tweets stop appearing, and then it's gone.
00:15:00.000 But it definitely doesn't feel like 2016.
00:15:03.000 And then the microchip goes inside of your head without you even noticing it, without your consent, and then bada bing, bada boom, you're connected to the new WeChat of the United States.
00:15:11.000 Without your consent.
00:15:12.000 All right, let's talk about whiny celebrities.
00:15:14.000 We have Rolling Stone.
00:15:15.000 Jack White quits Twitter, calls Elon Musk's Trump reinstatement an a-hole move.
00:15:20.000 Trent Reznor also quit.
00:15:23.000 Who cares, dude?
00:15:23.000 You guys aren't part of the conversation.
00:15:25.000 You're welcome to be part of the conversation.
00:15:27.000 You weren't part of the conversation, so you cried and left.
00:15:29.000 So what?
00:15:29.000 You weren't tweeting anyway.
00:15:31.000 It's like that meme that's like the big thumbs up.
00:15:33.000 Yeah, right.
00:15:34.000 We're going to continue with our conversation now.
00:15:36.000 Thank you.
00:15:37.000 He said, this is straight up you trying to help a fascist have a platform so you can eventually get your tax breaks.
00:15:43.000 Why are these people so dumb?
00:15:47.000 Yeah, Elon paid the most taxes of any American citizen ever in history.
00:15:53.000 So why?
00:15:55.000 And who is he accusing of being a fascist?
00:15:57.000 Yeah, who exactly?
00:15:59.000 Trump?
00:16:01.000 Trump hasn't tweeted.
00:16:02.000 It's been three days.
00:16:02.000 The world hasn't ended.
00:16:03.000 Like, come on.
00:16:05.000 People take themselves way too seriously.
00:16:06.000 They're claiming that Elon's forcing them to follow Trump.
00:16:12.000 That's ridiculous.
00:16:13.000 It's not true.
00:16:13.000 Yeah.
00:16:14.000 That's worse than, like, QAnon stuff.
00:16:16.000 They're tweeting, like, not only was Trump reinstated, but I found that I was following him, and I've never followed him, and I have to unfollow him.
00:16:22.000 And it's like, dude, you followed him.
00:16:23.000 Come on.
00:16:24.000 Yeah, we know you did.
00:16:26.000 Who's this guy?
00:16:27.000 Jack White.
00:16:27.000 The White Stripes.
00:16:30.000 Seven Nation Army?
00:16:31.000 Anything?
00:16:31.000 No.
00:16:32.000 Ring a bell?
00:16:32.000 Okay, cool, cool.
00:16:34.000 Have you noticed that all of these cheugy people use a-hole as their number one insult?
00:16:39.000 A-hole.
00:16:40.000 It's really cringey.
00:16:41.000 I don't know why.
00:16:42.000 I don't know why that word is so particular.
00:16:43.000 And the way they curse, it just feels like a 13-year-old who hasn't cursed before who's like trying to get away with it.
00:16:49.000 I don't know.
00:16:50.000 Trent Reznor left too, and I'm just like, and?
00:16:54.000 Yeah, not very relevant right now.
00:16:56.000 No, I look.
00:16:57.000 You will not be missed.
00:16:58.000 My favorite was CBS News saying that they're going to be leaving temporarily because of security concerns and then coming back and then still everyone saying, we don't care.
00:17:07.000 It's like, it's okay.
00:17:09.000 Especially if you announce it, you won't have the willpower not to come back.
00:17:12.000 Yeah, then they came back.
00:17:14.000 They came right back just a few days afterwards, a few hours afterwards, and it's like, oh yeah, we're still looking, we're monitoring the situation.
00:17:21.000 Like, what?
00:17:22.000 The thing is, none of these people can really leave, and that's why it's not the last kind of death croak of Twitter, because nothing on the planet has been able to replicate the network effects that Twitter has.
00:17:34.000 True.
00:17:34.000 I would be astonished.
00:17:36.000 It's just too valuable to too many major stakeholders in the system.
00:17:41.000 Just recently you had Elon, who by the way, I think he is maybe the first major captain of industry to mock the ADL since the ADL's inception.
00:17:53.000 That's a first!
00:17:54.000 It's a major thing.
00:17:55.000 It's incredible.
00:17:56.000 And he continued it.
00:17:57.000 So Israel did a tweet and Elon kind of mocked them too, but in a playful way saying, look, you know, we need more people to tweet.
00:18:10.000 We need more countries to tweet.
00:18:12.000 And so if countries are tweeting, you know, it's just, it's too valuable.
00:18:17.000 It's not like there's this other thing called, what is it called, like, Triceratops?
00:18:23.000 There's some stupid, like, left-wing version of Twitter that they're trying to do.
00:18:28.000 Mastodon.
00:18:28.000 Mastodon.
00:18:29.000 You can't do that because of the network effects.
00:18:32.000 No, no, no.
00:18:33.000 Actually, we're learning that Mastodon is quite based, actually.
00:18:37.000 They're officially telling people, if you have a problem, just block them.
00:18:42.000 We're not going to ban them.
00:18:44.000 And they're saying, I saw this woman, she said that she got suspended on Mastodon because she was calling out white people.
00:18:50.000 And they were like, yeah, it's racism, you're not allowed.
00:18:53.000 Was that an actual thing or just like an algorithmic mistake?
00:18:56.000 No, no, someone did it.
00:18:58.000 And they said that her posting about this was like, she got suspended.
00:19:01.000 But here's what people don't realize.
00:19:03.000 Mastodon, it's, they're different servers.
00:19:07.000 Okay, so it's the Fediverse.
00:19:09.000 Mastodon is like, I guess, one node.
00:19:10.000 I'm not entirely sure how it works.
00:19:12.000 I'm sure people in chat probably know better.
00:19:13.000 But basically, when you sign up for it, you pick a server hosted by who knows.
00:19:18.000 And so they warn you.
00:19:19.000 This is some random guy's basement.
00:19:21.000 You don't know what he's doing.
00:19:23.000 When you sign up for this to get off Twitter, he's got all your data, your password, he's got everything.
00:19:27.000 And so, people are signing up, and they're thinking, this is gonna be better now.
00:19:30.000 And there's a mod, there's a viral tweet where he's like, yo, just block people.
00:19:34.000 We're not banning them, just block them.
00:19:37.000 Even on Twitter we weren't doing that.
00:19:38.000 Twitter was banning everybody.
00:19:39.000 Fair enough.
00:19:43.000 I think the point stands.
00:19:44.000 Twitter is something, it's unique.
00:19:47.000 Its function as a global public square is unique and it has network effects that are also unique.
00:19:55.000 And for that reason, a lot of stakeholders are going to do everything they can to prevent Elon from going the direction he seems to be going.
00:20:02.000 And he has a very difficult position.
00:20:04.000 He needs to exercise a lot of caution.
00:20:08.000 He needs to approach things with finesse.
00:20:10.000 And I would say so far, he's done about as well as we could reasonably expect him to do, given what the stakes are and given what he's up against.
00:20:18.000 Yeah, there's a couple hiccups that he deserves to be criticized on legitimately, but you do make a good point, because where else could you see the president of the ADL go after Elon Musk and Elon Musk respond saying, hey, stop defaming me?
00:20:33.000 Where else could you see Kanye West say, shalom, right?
00:20:38.000 This hive mind, this online reality is amazing, and the people quitting it, there's a reason they're coming back.
00:20:45.000 All these people, all these actors, hey, I'm quitting Twitter.
00:20:48.000 Hey, I'm still quitting Twitter, but I want to make one more post and another post.
00:20:51.000 And this is again something very similar to what we saw with Neil Young with Spotify.
00:20:56.000 This is the same thing we saw with all these celebrities saying that they're going to be moving to Canada.
00:21:00.000 A lot of these people are all talk.
00:21:02.000 They're all bark, no bite.
00:21:03.000 And we have to understand, these threats are empty, and they need to be called out as ridiculous, because Twitter is where it's at right now.
00:21:09.000 It's fun, it's entertaining, and it's where I'm at.
00:21:11.000 It's the old St.
00:21:12.000 Augustine, Lord, make me chaste, but not yet.
00:21:15.000 Exactly.
00:21:16.000 This is amazing.
00:21:17.000 When Kanye West gets reinstated, people are starting to feel like, I don't have to worry about cancel culture anymore.
00:21:25.000 I don't have to walk on eggshells.
00:21:26.000 So what does Kanye tweet?
00:21:27.000 Shalom.
00:21:28.000 Hilarious.
00:21:29.000 The ADL, some guy, he tweeted, he's openly mocking us now.
00:21:33.000 It's like, uh-huh.
00:21:35.000 So what?
00:21:36.000 Like, the problem is these people were overly sensitive.
00:21:40.000 I don't think Kanye West is an anti-Semite.
00:21:42.000 I think he's probably got some wacky views.
00:21:44.000 I think he said some silly things.
00:21:46.000 But I think if you sat down with him and talked to him, you'd understand him, right?
00:21:50.000 He's not like this caricature of a guy marching around with a hood and a tiki torch or anything like that.
00:21:54.000 But these people want to say that Libs of TikTok, for instance, by simply tweeting out, I'm sorry, retweeting videos posted by people publicly, she's murdering them.
00:22:08.000 So there's this viral tweet, I comment on it, where they're like, you know, Libs of TikTok has murdered hundreds of LGBT people by sharing these videos.
00:22:17.000 And it's like, bro, the videos are public.
00:22:19.000 Libs of TikTok isn't doing anything!
00:22:20.000 If that's their standard, and people have to abide by those kind of psychotic individuals, they're scared to speak out.
00:22:28.000 Now, Elon comes in and says, have fun guys, and what are we seeing?
00:22:32.000 Spicy memes, jokes, trolling, and fun.
00:22:36.000 Yeah, comedy.
00:22:37.000 And you know what, Jack White?
00:22:38.000 Well he doesn't like fun.
00:22:40.000 He says fun is bad.
00:22:41.000 So he has to leave.
00:22:42.000 Okay, bye bye.
00:22:43.000 Go play your music dude, I don't care.
00:22:45.000 We'll have fun and post spicy memes.
00:22:48.000 Bring back the meme wars.
00:22:52.000 Yeah, it's sad and pathetic.
00:22:55.000 But yeah, on the whole thing about Libs of TikTok killing people, they have this new term that a lot of these kind of midwit mediocrities with master's degrees are talking about, called stochastic terrorism.
00:23:08.000 That's right.
00:23:09.000 And it's really such a joke concept, but it effectively means that you can't criticize anyone because of the possibility that the criticism would be out there in the ether, and some crazy person might attack some kind of affiliated group, and then of course you're responsible.
00:23:28.000 So it's the latest of many censorship predicates.
00:23:33.000 And then the woman behind Libs of TikTok, what's her name?
00:23:37.000 Chaya?
00:23:38.000 Is that her name?
00:23:39.000 She puts stochastic terrorists in her bio and now I'm seeing these leftist journalists be like, she's openly bragging about doing it!
00:23:45.000 And I'm like, are you kidding me?
00:23:47.000 Do you really not understand someone being facetious, making fun of you for calling them that?
00:23:53.000 Or are you intentionally just trying to lie?
00:23:55.000 You know what, man?
00:23:56.000 I'm glad Elon bought Twitter because he took over the space and it is nuts to see this.
00:24:01.000 Like Luke mentioned, the president of the ADL is like, I can't believe Elon's doing this.
00:24:06.000 When he had a meeting with us, he promised us.
00:24:08.000 And then Elon says, stop defaming me.
00:24:10.000 Like, that's openly mocking them.
00:24:13.000 But I guess Elon Musk is the definition of F.U.
00:24:17.000 money.
00:24:18.000 Just outright.
00:24:19.000 We'll see how far it goes.
00:24:20.000 But so far, I think he deserves credit because there's so many people who you think would have F.U.
00:24:26.000 money, but they never use it in that fashion.
00:24:30.000 And I think Elon is setting an excellent example, stepping into the actual arena, playing for keeps, and seeing where things land.
00:24:41.000 So, give him major credit and it's only just begun, so we'll see how it unfolds.
00:24:47.000 I think Elon actually tweeted at Trent Reznor, too, or tweeted about him or something.
00:24:52.000 I don't know.
00:24:52.000 Jack White posted on Instagram some long diatribe that's not worth reading.
00:24:56.000 That's not free speech or what the poll decided or whatever nonsense you're claiming to be.
00:25:00.000 This is straight up you trying to help a fascist have a platform so you can eventually get your tax breaks.
00:25:04.000 Like, I don't think this guy knows anything about tax law, you know what I mean?
00:25:08.000 No, I mean it's the typical sort of like maybe 105 IQ, you know, burned out idiot kind of pseudo clever conspiracy theory.
00:25:22.000 It's always like some dumb financial incentive that doesn't even make sense if you actually know how these things work.
00:25:30.000 There's a certain kind of framework for this particular type of accusation.
00:25:35.000 Um, and I would peg him around 105.
00:25:38.000 Well, let's do this.
00:25:38.000 That's high for me.
00:25:39.000 I'm gonna, I'm gonna, I'm gonna call out Elon Musk for hypocrisy.
00:25:44.000 We have this tweet from Shoe0nHead.
00:25:46.000 She said, Elon Musk confirms he won't reinstate Alex Jones. However you feel about Alex Jones
00:25:51.000 aside, am I wrong here?
00:25:53.000 Like, not that there was much, uh, there was much doubt before, but Elon just completely showed his ass.
00:25:59.000 Elon said, my first born child died in my arms.
00:26:02.000 I felt his last heartbeat.
00:26:04.000 I have no mercy for anyone who would use the deaths of children for gain, politics, or fame.
00:26:10.000 Now that's interesting.
00:26:11.000 Alex Jones was not banned from Twitter for anything related to any kids.
00:26:16.000 Alex Jones was banned from Twitter because he insulted Oliver Darcy.
00:26:22.000 Elon, is insulting a reporter a bannable offense?
00:26:26.000 Okay, ban yourself.
00:26:27.000 Uh, I'll wait.
00:26:29.000 Is Elon Musk gonna ban himself?
00:26:30.000 Is he gonna shut up?
00:26:31.000 Okay, look, I can respect all of the good things he's done, the people he's unbanned, and I will accept the win.
00:26:36.000 What I will not accept is... I will say it this way.
00:26:40.000 It is not victory for the people that we are living beneath the whims of a billionaire.
00:26:44.000 What is a victory for the people is a clear set of rules, policies, and procedures so we can all fairly understand the rules of the platform.
00:26:53.000 I understand that.
00:26:53.000 Elon, it's your platform.
00:26:54.000 What are the rules?
00:26:55.000 If you come out and tell me the rule is free speech, I say, you got it, boss.
00:26:59.000 If you come out and say the rule is, you can't say this, that, or otherwise, I'll say, okay, well, that's dumb, but you got it, boss.
00:27:04.000 If you come out and tell me the rules are A, but then, oh, by the way, I have special rules for other people.
00:27:10.000 Well, then your rules are garbage and completely meaningless.
00:27:13.000 Now, what's really happening is you're unbanning people you like.
00:27:17.000 Well, that's a very interesting point.
00:27:20.000 But what do you say to this: what if, in practice, in terms of how things cash out,
00:27:28.000 it actually maximizes free speech on Twitter to have an arbitrary system rather than
00:27:36.000 a system that follows prescribed rules?
00:27:39.000 Because you'd have to think, what kind of prescribed rules can he get away with?
00:27:43.000 He could say what he said before, which is that the law of the land in terms of speech will govern speech policy on Twitter, which in the U.S.
00:27:51.000 would mean First Amendment.
00:27:53.000 I don't know if that's practical given the implications.
00:27:56.000 I understand that.
00:27:57.000 So let's say, let's ask where the line is.
00:28:00.000 Insulting a reporter, is that a bannable offense?
00:28:02.000 Is that, is that, is that in need of flexibility?
00:28:06.000 I think they would all need flexibility, but I think it's easier to think in terms of who would be let back on versus kind of neutral principles that are kind of applied retroactively.
00:28:20.000 In practice, he can get away with more if he pursues the arbitrary approach.
00:28:24.000 Of course, of course, of course.
00:28:25.000 In terms of maximizing free speech on Twitter.
00:28:28.000 And he's giving himself a legal argument here because he's outright saying that for this personal reason he will not be reinstating Alex Jones.
00:28:34.000 I will stress.
00:28:35.000 I absolutely feel and sympathize with Elon over this feeling of losing his son.
00:28:43.000 I mean, it's a horrifying thing to have to experience and I can respect and understand that.
00:28:47.000 I just don't know what it has to do with Alex Jones insulting a reporter.
00:28:50.000 Now, if we're going to argue there needs to be some flexibility, sure, but some things are within the bounds of the Overton window and fine.
00:28:56.000 Insulting a reporter?
00:28:57.000 You don't get banned for it.
00:28:58.000 Now, if you're arguing that there's harassment, and what constitutes harassment, okay, now we're dealing with something more difficult.
00:29:03.000 Well, I'm saying more that, at this stage at least, in practice, his assessment—and he could very well be right on this—is that he can't get away with allowing Alex Jones on.
00:29:13.000 And he is probably imprudent, and I think he probably stepped over the line in attacking Jones the way he did, but the reality is probably he couldn't get Jones on.
00:29:23.000 And that's the cost of doing all the other stuff that he's done.
00:29:26.000 Perhaps.
00:29:27.000 I understand that, but I'm not going to apologize for Elon.
00:29:30.000 Oh, of course not.
00:29:30.000 I will accept the victories, and we're seeing tremendous victories.
00:29:34.000 It's funny, it's fun, we're laughing, there's jokes, there's memes, and all Elon had to do was not address the issue outright.
00:29:41.000 Probably the safest thing for him to do is, like, he's not addressed Carl Benjamin.
00:29:45.000 We all want him unbanned.
00:29:46.000 He's not addressed Milo.
00:29:47.000 But he did specifically address Alex, then said he's not going to unban him because he did a thing six years prior to his Twitter ban that offends Elon.
00:29:57.000 And that right there, I get it.
00:29:59.000 Maybe he's gonna have a legal argument in the future that clearly shows I will ban whoever I want for whatever reason.
00:30:04.000 There's no rules whatsoever.
00:30:05.000 And by doing so, if he does ban you, you can't sue him for breach of contract because he's clearly operating on a whim.
00:30:12.000 I'm happy he's at least telling us where his flawed decision is coming from, because it's an illogical decision that shows you that there's no pathway to redemption, and he's using emotional trauma in order to justify this larger banning for something that Alex Jones has already apologized for, something that Alex Jones is being fined 1.5 billion dollars for, something that of course cost Alex Jones almost everything. But at the end of the day here, if we're going to be punishing people for using children for their own political gain or fame?
00:30:44.000 My original response to this was, hey Elon Musk, have you heard of Barack Obama?
00:30:49.000 The guy who dropped a bomb every 20 minutes for 8 years?
00:30:53.000 The guy who extrajudicially assassinated a 16-year-old American citizen?
00:31:00.000 If we're going to be punishing people for hurting children, we might as well ban the President of the United States!
00:31:04.000 Let's not even get into Big Pharma!
00:31:06.000 We don't even have to go into Big Pharma!
00:31:07.000 Hold on there a minute.
00:31:09.000 Joe Biden out there groping and sniffing kids right now!
00:31:12.000 You gonna ban him?
00:31:13.000 That's a lot different than dropping a bomb on small children.
00:31:15.000 I'm saying Obama's not the president anymore.
00:31:15.000 No, no, no, I get it.
00:31:17.000 So we have that argument.
00:31:19.000 So ban him, obviously.
00:31:20.000 And Joe Biden, who's currently the president, is on camera groping and sniffing children.
00:31:24.000 So how about we have a standard?
00:31:26.000 Oh, that's the problem.
00:31:27.000 There's no standard.
00:31:28.000 Do you think any off-platform behavior warrants a ban?
00:31:31.000 Well, this is what Jack Conte of, what's it, Patreon?
00:31:31.000 Nope.
00:31:35.000 Argued specifically.
00:31:35.000 Yeah.
00:31:36.000 He created this new term, off-platform behavior.
00:31:39.000 Off-platform.
00:31:40.000 Yeah, yeah, off-platform behavior, which is absolutely ridiculous.
00:31:44.000 Because, again, you're going to arbitrarily say, you did this 10 years ago or 20 years ago.
00:31:49.000 I don't like that you did this.
00:31:50.000 It's cancel culture.
00:31:51.000 It's cancel culture, but again, My personal preference, which I hope ultimately he'll be able to achieve, is the First Amendment standard.
00:32:00.000 But he's a lot more practical than I think people realize.
00:32:04.000 And the First Amendment standard, it sounds like something very easily publicly defensible when you put it that way.
00:32:11.000 But then you say, okay, First Amendment standard means all legal speech.
00:32:15.000 That means that probably the most banned people on the Internet, like Andrew Anglin has to go on Twitter. I'm for that
00:32:23.000 because I'm for the First Amendment standard, but as a matter of practical reality, that's a hell of a
00:32:28.000 lot of incoming that Elon would have right up front. And so what kind of principles can you pick
00:32:36.000 that would cash out better, at least in this initial stage, than what?
00:32:40.000 Then don't lie to my face.
00:32:41.000 Then don't lie to me.
00:32:42.000 Don't say you believe in free speech when you don't.
00:32:43.000 Because if the assumption is that, perhaps, he's playing a game of 4D chess, 3D chess, and then he should not have addressed the issue at all.
00:32:52.000 Because if your argument is he has to to play the game, that means he's virtue signaling.
00:32:57.000 I don't care for virtue signaling, I don't care for being lied to, and I don't care for petty—I shouldn't say it that way—but I don't care for personalized arguments as to why certain people are below the standards.
00:33:11.000 I agree.
00:33:12.000 I think he shouldn't have gone there and said, oh, you know, use the kid thing.
00:33:18.000 That was unfortunate and crossed the line.
00:33:20.000 And he clearly doesn't know anything about it.
00:33:23.000 Sorry to interrupt.
00:33:24.000 No, I'm totally with you there.
00:33:24.000 Right.
00:33:24.000 No, I agree.
00:33:26.000 I want the First Amendment standard.
00:33:28.000 I'm just trying to put myself in Elon's shoes.
00:33:31.000 And I'm thinking, as much as we want a kind of neutral principled standard such that we can say, well, if you do this, what about this, which is, you know, how fairness operates.
00:33:41.000 But in terms of the result, the kind of consequentialist view of it, is that he can actually probably maximize free speech more if he does virtue signal a little bit, at least in these early stages.
00:33:55.000 There was something infuriating about Jack Dorsey and Vijaya Gadde lying to your face when you were like, hey, why was this person banned and why wasn't this person?
00:34:06.000 When you'd say something like, hey, look, here's an Antifa account advocating for instructing violence and they go, oh well it's a mistake.
00:34:14.000 And then they still won't take it down.
00:34:16.000 When they tell you, we're working on it, there will be a path to redemption,
00:34:19.000 I hear what you're saying, we're gonna try and fix it.
00:34:22.000 It's frustrating because you know they're lying to you.
00:34:25.000 It's even worse when it's a billionaire who just says, too bad.
00:34:29.000 Like, it's one thing to be like, I can hold on to that 1% hope that Jack Dorsey means it
00:34:36.000 when he says, we're gonna find a way to get people back on the platform,
00:34:38.000 we gotta figure out how to do it.
00:34:39.000 It's another thing when Elon Musk is like, I bought it, it's mine, too bad, there's no rules, I can do whatever I want.
00:34:45.000 We can't trust him on this.
00:34:46.000 What makes you think we can't trust him on other issues as well?
00:34:49.000 This brings a lot of doubt and speculation, because he's acting on emotion rather than morals and virtues.
00:34:54.000 Rather than saying, hey, I believe in free speech.
00:34:56.000 That means speech for people I despise.
00:34:59.000 He says, no, I don't like him.
00:35:00.000 But Elon, have you heard of a thing called Epstein's Island?
00:35:03.000 There's a former president who used to attend there: Bill Clinton.
00:35:06.000 He's also on your platform right now.
00:35:08.000 If you care about children and children being hurt, we should be going after him, but he's not.
00:35:12.000 And having this kind of made-up scenario where you say, he's okay, but he's not good, based off your own emotional trauma, is not something that gives me a lot of hope in this platform moving forward.
00:35:23.000 So we all agree that the First Amendment should be the ultimate standard?
00:35:26.000 Free speech?
00:35:27.000 I disagree.
00:35:27.000 No.
00:35:28.000 You disagree?
00:35:29.000 I think even higher principles.
00:35:30.000 For Twitter, I disagree.
00:35:31.000 And on one issue, doxing.
00:35:33.000 Okay. There's probably, uh, the First Amendment doesn't cover criminal acts. There's, there's
00:35:38.000 challenges to whether or not something can be deemed, uh, criminal if it's speech. So.
00:35:42.000 So First Amendment minus doxing.
00:35:45.000 Not necessarily. So listen, uh, the first, some people are free speech absolutists.
00:35:50.000 They believe that words can never be criminal no matter what you said.
00:35:53.000 That means literally, they think, incitement to violence and instruction on how to do horrible things.
00:35:57.000 And there's an interesting point there.
00:35:59.000 If we argue that there is criminal speech in any capacity, we're arguing Congress can pass a statutory law criminalizing speech.
00:36:07.000 That makes no sense, right?
00:36:09.000 So the Supreme Court upholds basically that if you incite violence, that is not protected by the First Amendment.
00:36:16.000 And my question is, I understand that.
00:36:18.000 I understand the arguments.
00:36:19.000 But how does that make sense?
00:36:22.000 Congress isn't making the law.
00:36:24.000 The Supreme Court just said, as we interpret what the First Amendment is supposed to mean, we've carved out an exception.
00:36:29.000 The Founding Fathers didn't carve out an exception.
00:36:31.000 So there's an interesting argument there because if we do agree that incitement to violence is criminal and not free speech, what happens if Congress or the Supreme Court, what happens if the Supreme Court, for instance, rules that actually hate speech is incitement because of stochastic terrorism?
00:36:49.000 Are we going to continue to allow the courts to decide that there are more, larger and larger limits to speech?
00:36:54.000 So I don't know that, ultimately... So you're saying the First Amendment standard is not good enough because it's subject to future kind of modifications from the Supreme Court's sort of interpretive development?
00:37:08.000 Because the First Amendment could be argued to mean you can incite to violence.
00:37:13.000 And so there's limits beyond.
00:37:15.000 So my point is this.
00:37:17.000 In the future, the Supreme Court may rule that doxing someone is not free expression.
00:37:22.000 It's actually an attack against or something, it's a violation of someone's privacy and thus criminal.
00:37:29.000 It would be interesting, and this may be a case that has to be adjudicated actually, if Congress tries passing a law saying posting someone's private home details or private information, phone number, contact, etc.,
00:37:46.000 without their permission constitutes a crime.
00:37:49.000 The Supreme Court's going to have to determine whether or not that violates free speech, because posting someone's address doesn't really express your political opinion or views.
00:37:56.000 And that's an argument some people make about what free speech is.
00:37:59.000 Anyway, I don't want to make it overly complicated.
00:38:02.000 The First Amendment is an easy thing to say, but we really do need bullet point breakdown of what is okay and what isn't.
00:38:08.000 Gab, for instance, I believe bans doxing.
00:38:10.000 Okay.
00:38:11.000 And that's free speech.
00:38:12.000 You could walk around with a big old sign with someone's address on it, but on Gab you can't do that.
00:38:16.000 I agree.
00:38:16.000 I don't think you should be allowed to post someone's private details.
00:38:19.000 I don't think there should be permanent bans for these things, however, because, you know, I argued this to Jack Dorsey, there are people who commit murder and they get out in 25 years.
00:38:27.000 Sure.
00:38:28.000 There's a guy who, you know, posted a nasty meme and you've permanently removed his ability to speak in the public square.
00:38:33.000 There's no redemption according to the theology of cancellation.
00:38:40.000 I just want to point out the Clinton Global Initiative is on Twitter.
00:38:43.000 Yeah, there's a lot of war criminals and monsters.
00:38:43.000 You know?
00:38:46.000 There's, you know, terrorist organizations that are on there openly using the platform without any kind of problems when Twitter, a couple years ago, was looking at, of course, right-wing talking points that they needed to censor and take down.
00:38:58.000 It's crazy, because this is a very important issue, and we can't underplay it, because yes, I mean, we got to celebrate our victories.
00:39:05.000 Elon did incredible things on unbanning a lot of important people that were punished for their political speech, but if you're going to continue the punishment based on your own kind of made-up standards, that's something that doesn't give me hope.
00:39:19.000 That's something that I think a lot of people should be skeptical of, and I think that's something that a lot of people should be criticizing him on, because obviously it doesn't stand with any kind of virtue.
00:39:27.000 Let's pull up this tweet here.
00:39:29.000 Now I'm not sure if this is confirmed or not.
00:39:31.000 I don't know.
00:39:32.000 But you guys have probably seen it all over the place.
00:39:34.000 This is from Stuxx on Mastodon.
00:39:36.000 He says, What's it with people reporting every single person they don't like?
00:39:41.000 Please stop with that.
00:39:42.000 This is not Twitter.
00:39:43.000 Please use features like mute or block if you don't like people, but stop reporting, otherwise I'll start banning people who keep reporting for nothing.
00:39:50.000 I'm trying to keep things running with so many new people, and it's such a waste of time to hear whatever you don't like.
00:39:56.000 Otherwise, go waste Elon's time, not mine.
00:39:59.000 Elon Musk tweeted, please hall monitors, go on someone else's platform.
00:40:05.000 It was Nate Silver who said that he thinks Mastodon is a honeypot for the hall monitor types.
00:40:10.000 All of these people on Twitter who report everything and won't shut up
00:40:14.000 are leaving and going to Mastodon, and they're all Spider-Man-meme pointing at each other, and we're all having a good time.
00:40:20.000 It's like there's a party going on and they left.
00:40:22.000 So, you know, I'll take it.
00:40:24.000 How many employees have been slashed so far at Twitter?
00:40:27.000 Are there enough to the point where like they don't have as many moderators?
00:40:33.000 No moderators.
00:40:34.000 None?
00:40:35.000 Not a lot?
00:40:35.000 I'm pretty sure their moderation team's gone.
00:40:37.000 They've got to have some people.
00:40:40.000 Because they've now streamlined the process for reporting child exploitation materials, which I have been waiting for them to do for so long because they made it unnecessarily difficult.
00:40:51.000 You had to go through on the desktop version and go through a longer process to do that.
00:40:55.000 So they streamlined that process.
00:40:57.000 They have to have a good moderation team to keep up with that, and I've seen reports that
00:41:03.000 they're doing a crackdown on child exploitation on the platform. Yeah, he said that's his number one
00:41:08.000 priority.
00:41:09.000 But you can't do that without moderators.
00:41:11.000 That's true, good point.
00:41:13.000 I mean, before, the moderators were just acting like demagogues about, you know... If you were having a conversation with someone and you weren't being academic enough in your language, then you would get banned, pretty much.
00:41:29.000 Right, that's how it is.
00:41:30.000 You're just at the mercy of any random individual that comes across this.
00:41:35.000 Let me say real quick.
00:41:37.000 Since the beginning of the election month or whatever, like October, and then getting into November, all of my videos pertaining to election issues have been demonetized for fake reasons. And it costs a lot of money, it really does. You know, I cut down the amount of videos I do. I went from, what was I doing, like six to three, and
00:42:02.000 having one, it's bad.
00:42:04.000 It's a lot of money.
00:42:05.000 And what happens is I'll get the yellow dollar sign and then I'll request review, which takes a day.
00:42:11.000 You get all the views in your day, then you get no money.
00:42:15.000 If it is confirmed, like the next day it'll say, confirmed demonetized by manual review, and it will say, harmful or dangerous activity.
00:42:23.000 And it's me reading a poll.
00:42:25.000 And being like, the polls are in, and they say this, that, and then I gotta call Google, and then they're like, whoops, that was an accident, let's fix it for you.
00:42:32.000 And I'm like, who are you employing, who's lying on all of, they're clearly leftists.
00:42:38.000 Well, it happens to every one of my videos, but I have no one to call.
00:42:41.000 So, like, you saw my stats.
00:42:43.000 You saw my income.
00:42:44.000 It's absolutely insane what's happening on my YouTube channel.
00:42:47.000 But to answer your question, there's also a lot of third-party companies, usually a lot of international workers that get hired to do a lot of this moderation.
00:42:55.000 Some of the moderation is done in-house, but a lot of it is done through, of course, other private companies in Africa, in Asia, where, of course, the labor there is a lot cheaper.
00:43:04.000 And this has been documented many times, even with Project Veritas.
00:43:07.000 Project Veritas even released an exposé talking about how Twitter employees were paid to view everything, including private messages, people's posts, how Twitter engineers were there to, of course, implement shadow banning.
00:43:19.000 This is Project Veritas talking to Twitter engineers that were bragging about having Trump's private DM messages and were threatening to release them.
00:43:29.000 Again, when we look at that content moderation, when we look at the destruction of speech, we have so much power held by so few individuals that absolutely use it and abuse it for the worst sinister political purposes.
00:43:40.000 And if you dare to speak outside of the established narratives, you're going to get punished.
00:43:45.000 You're going to get screwed over.
00:43:46.000 And that's exactly what's happening to my YouTube channel in such an extensive way.
00:43:49.000 There's another dimension of demonetization that is very relevant to Elon's predicament with Twitter, and that is the latest scam of brand safety, which is basically like a mafia shakedown.
00:44:04.000 Because in some cases they have third world imports, in some cases they just have regular people.
00:44:12.000 But What they do is they basically tell these advertising agencies, they say, you know, your advertisement with this site is really inappropriate for your brand.
00:44:25.000 I think you should reconsider.
00:44:26.000 And if you don't, it would be a shame if something happened to you.
00:44:30.000 And they have a whole infrastructure in place to make good on their threat.
00:44:34.000 And so it really is.
00:44:36.000 It's a typical mob shakedown.
00:44:38.000 The basic framework hasn't changed, but they sell it as brand safety, meaning your brand is in danger if you don't do exactly what we say and stop advertising with this website that we don't like and we object to, no matter how much money it's making you and no matter how much the audience actually likes the content on that website.
00:44:58.000 And it happens on a large scale.
00:45:00.000 That's exactly the type of shakedown that's happening on Twitter now.
00:45:04.000 And I bet you that our good friend Jonathan Greenblatt of the ADL is hard at work making sure that these potential advertisers and current advertisers are sufficiently intimidated.
00:45:17.000 But it happens on a smaller scale too, to a wide range of sites that distribute content that the regime finds objectionable.
00:45:26.000 Including yours?
00:45:26.000 Including mine in a very big way. I have one stalker woman who's absolutely obsessed with me.
00:45:32.000 Every time we get some kind of new ad arrangement, she has a tweet thread about it.
00:45:39.000 It's really pathetic.
00:45:41.000 Is she a blue check?
00:45:42.000 I don't know about that.
00:45:44.000 I don't know what her caste is.
00:45:47.000 I don't want to give her attention by saying her name, but we can talk about her.
00:45:53.000 But this woman, and I think it's the same person, went after a weather network.
00:46:02.000 There's a cable channel that does nothing but weather, and this leftist woman started, like, tweeting about it. She has a lot of followers, her friends in media write up her stories, and then she's actually getting advertisers to pull off a weather reporting station. And it's like, the only thing on the screen is, it says, like, weather, it's, like, raining in Texas. But the sad thing is, the sad thing
00:46:28.000 is how easily so many people cave because usually their liaison, their point of contact at
00:46:33.000 these companies is like some 20 something woman who is ideologically aligned. And even
00:46:39.000 if not, all these unsophisticated people need to hear is the phrase conspiracy theory or, you know,
00:46:46.000 Trump or something like that.
00:46:47.000 And automatically it's like, okay, bad, danger. You know, that's all I need to hear.
00:46:54.000 Or even worse, an older woman without a family that is there to complain as well.
00:46:57.000 What could be worse?
00:47:01.000 So this intimidation campaign is the same thing
00:47:09.000 that happened with the YouTube adpocalypse. Is that what you're saying?
00:47:13.000 That was the Wall Street Journal.
00:47:16.000 They're saying, oh, the brands who are advertising mid-rolls on YouTube can't rely on whether the YouTube channels they're advertised next to reflect their brand values.
00:47:28.000 And I always thought that was ridiculous on its face because YouTube channels, including this one, are brands, they're companies in and of themselves, and they deserve to, you know, have advertisers that align with their values, right?
00:47:41.000 But let's not forget, at the end of the day, advertisers have a choice to not advertise with certain creators.
00:47:47.000 You could go, as an advertiser, and you could go into the Google settings and say, I don't like Alex Jones.
00:47:52.000 I don't like my look.
00:47:53.000 You can choose those.
00:47:54.000 You could choose who you advertise with and not.
00:47:56.000 And as a content creator, I could say, hey, I don't want any advertisements for the U.S.
00:48:00.000 military.
00:48:01.000 I don't want McDonald's advertisements.
00:48:03.000 I don't want these kind of politicians and these kind of ads associated with my brand.
00:48:07.000 And I could do that.
00:48:08.000 Just like Mastodon is complaining right now.
00:48:10.000 Yes, you can.
00:48:11.000 I used to be able to do that.
00:48:12.000 And I used to say, I don't want any McDonald's ads.
00:48:14.000 I don't want any... You still do that?
00:48:18.000 Don't you still do this on YouTube, on Google? On Google, yeah. On your website, you mean?
00:48:25.000 Uh, no, no, no, on Google. There was a setting. I remember doing this for your
00:48:28.000 YouTube channel. I remember doing this a few years ago, saying I don't want to
00:48:31.000 advertise with, I don't want these advertisers on my YouTube channel. I don't know how to do that.
00:48:35.000 I, I could show you. Uh... At the end of the day... That's only fair, right?
00:48:39.000 But it's fair, exactly.
00:48:41.000 But we don't hear about this.
00:48:42.000 A lot of people don't know that advertisers have a choice.
00:48:44.000 And if they find someone despicable, or they don't want their brand associated with someone, they don't have to have it.
00:48:50.000 They already have the infrastructure there.
00:48:51.000 That's not why they're pulling ads.
00:48:52.000 They're pulling ads because the far left marches around with bricks and conservatives sit in their La-Z-Boys, complaining.
00:48:59.000 That's it.
00:49:00.000 So, until law enforcement, the solution, and it's why they wanted to fund the police, partly why, law enforcement needs to stop these people, and we need to develop some kind of system, and it's tough, I know it is, the easiest example is storefronts in Berkeley will put up all the leftist signs in their window because they know if someone throws a brick through their window, the cops can't do anything about it.
00:49:21.000 Like, your car can get stolen.
00:49:23.000 Good luck!
00:49:24.000 The cop's gonna be like, well, we'll write down the license plate, and if we see it, we'll let you know.
00:49:30.000 But if you've got something like a bike, your bike's gone.
00:49:33.000 If you've got something like a moped, your moped is gone.
00:49:37.000 It gets stolen, the cops show up and say, what do you want us to do about it?
00:49:39.000 So if you piss off the far left, who are psychotic, violent individuals, and they start harassing your neighbor, the cops are going to be like, what do you want us to do about it?
00:49:48.000 So what do they do?
00:49:49.000 They say, look, Dave Rubin is not going to march.
00:49:52.000 This is the example I love giving.
00:49:54.000 Dave Rubin is never going to march to Twitter HQ to complain about the censorship.
00:49:57.000 He's never going to march with a bunch of classical liberals carrying torches to YouTube HQ to say no more censorship.
00:50:03.000 He's going to go and he's going to complain about it, and that's all he's going to do.
00:50:06.000 Antifa will literally show up with crowbars and beat the crap out of people.
00:50:10.000 That terrifies them, so they say, I know who to avoid and who to cater to.
00:50:15.000 So these big brands, when they hear that far leftists are attacking them, they immediately say, guys, red alert, do whatever they say, and they'll go away.
00:50:23.000 That's it.
00:50:23.000 Yeah, it's going to be interesting to see how Elon Musk kind of navigates this very changing media advertising landscape, because right now, just a couple of minutes ago, he's promising Twitter being a good video platform that's going to offer, quote, according to Elon Musk, higher compensation for creators than YouTube.
00:50:42.000 So this is something that Elon Musk just tweeted a couple moments ago, responding to Mr. Beast.
00:50:47.000 How is he going to be doing that, specifically with so many advertisers boycotting him?
00:50:52.000 Will there be other advertisers?
00:50:53.000 And also, by and large, advertisers are pulling back, naturally, not because of cancel culture, but because of the way that the economy is just being absolutely screwed over right now.
00:51:03.000 Just how poorly it's doing right now, compared to everything else.
00:51:06.000 I think Sargon got unbanned.
00:51:08.000 I think Rekieta and Sargon got unbanned during the show.
00:51:11.000 Carl Benjamin is back!
00:51:13.000 Wow, dude, it's been so long!
00:51:16.000 Wow, that's amazing.
00:51:17.000 This is what people in the chat are saying right now.
00:51:19.000 We got a super chat.
00:51:20.000 Someone just said it.
00:51:20.000 I had to announce it.
00:51:21.000 Carl's awesome.
00:51:22.000 He's a good friend.
00:51:23.000 He helped me actually get all this stuff rolling.
00:51:26.000 When I started doing YouTube full-time, he hit me up and asked me if I would do a guest spot on his channel.
00:51:30.000 He had, like, 300,000 subscribers.
00:51:32.000 I had, like, 40.
00:51:34.000 And I was like, yeah, for sure, man.
00:51:35.000 I made a video.
00:51:36.000 Ended up getting hundreds of thousands of hits.
00:51:38.000 All of a sudden, I gained a whole bunch of subscribers, and it helped get the ball rolling.
00:51:41.000 So, very grateful to Carl Benjamin.
00:51:43.000 He's a good dude.
00:51:44.000 He hosts Lotus Eaters Podcast.
00:51:46.000 Glad to see that he's been restored, finally, after all these years to Twitter.
00:51:51.000 Oh, man.
00:51:53.000 This'll be fun.
00:51:54.000 Apparently, Rekieta Law was unbanned, like, literally during the show, so... Yeah, no, just right now, people are saying, uh, that, uh, that they're back.
00:52:01.000 Great.
00:52:01.000 This is amazing.
00:52:02.000 So, uh, look, I'll, I'll... Elon, man.
00:52:02.000 Yeah.
00:52:04.000 Win's a win.
00:52:05.000 I'll take it.
00:52:06.000 I'm not, I'm not gonna ignore the problems I see with the Alex Jones stuff, but I will stress it again.
00:52:11.000 Hey, man, like, this is a huge victory, so take what you can get, I suppose, and, uh, and I'll roll with it.
00:52:18.000 I will say this now, too.
00:52:21.000 There's a tough question about, do we give Elon the $8?
00:52:27.000 I'm leaning towards yes, because I said if he freed the political prisoners, I would sign up, because I want the features, it's all good stuff, and I want to see Twitter succeed.
00:52:35.000 They deleted my ad campaign.
00:52:37.000 I tried advertising the SuperMAGA shirt, it's where Trump's going super saiyan, and they said it was political, you can't do it.
00:52:42.000 I said, okay.
00:52:43.000 So I tried doing the rooster shirt, and the ad ran for like four hours, and then they took it down saying it was inappropriate.
00:52:50.000 Elon, I'm trying to give you money, man.
00:52:51.000 Like, what's going on here?
00:52:52.000 We nerd jokes.
00:52:54.000 It's a rooster.
00:52:55.000 It says stand your ground.
00:52:55.000 I thought it was cool.
00:52:56.000 I'm like, okay, I'll try this, but anyway.
00:52:59.000 I think the Alex Jones thing is unfortunate, and I think Elon's wrong, for everything we described.
00:53:07.000 But I'm gonna take this win, and we gotta see Twitter make it.
00:53:10.000 So I think it's time to sign up.
00:53:13.000 We already got Timcast News signed up.
00:53:15.000 We'll get our other accounts, we'll get Pop Culture Crisis verified, we'll get all our business accounts verified.
00:53:21.000 I think unbanning Sargon, for me, is kind of like, oh wow.
00:53:24.000 He's a personal friend.
00:53:25.000 That says a lot.
00:53:27.000 We also now know exactly why Alex Jones isn't coming back.
00:53:31.000 At least Elon's clarified his reasons for not having him on the platform, whether they're legit or not.
00:53:35.000 I'm not so sure he doesn't have ulterior motives.
00:53:38.000 And he's just saying that he has this motive that's close to his heart.
00:53:40.000 Like 4D chess, you mean?
00:53:41.000 Yeah, I don't know.
00:53:42.000 What would be the game there?
00:53:44.000 I have no idea.
00:53:44.000 He gives up the Alex Jones pawn, but what other pawn does he get?
00:53:48.000 Who's more powerful than Elon Musk that can threaten him financially?
00:53:54.000 He has a very interesting conflict with Bill Gates, which I think is fascinating to see unfold.
00:53:59.000 And Tim, you bring up a good question.
00:54:02.000 It does kind of bring down a different kind of paradigm here.
00:54:06.000 Do you support him for doing all this good when he did this one thing that's bad?
00:54:11.000 It's a tough call for a lot of people, but I'm kind of leaning on, like, hey, he freed a lot of political prisoners.
00:54:17.000 He's given them a lot of voices, and for me, I think that's worth eight bucks.
00:54:21.000 It's a girlfriend, probably.
00:54:23.000 Elon Musk has some girlfriend on the side who's, like, telling him you can't unban Alex Jones.
00:54:28.000 That must be it.
00:54:29.000 That's more likely than anything.
00:54:33.000 I don't think so.
00:54:34.000 Women have powerful persuasion.
00:54:37.000 I'm trying to get the tweets to load on Sargon's.
00:54:39.000 Over betas.
00:54:42.000 I don't see any tweets on Sargon's account.
00:54:44.000 I think people are saying in the comment section that he's still sleeping.
00:54:48.000 No, I mean like his old tweets aren't back either.
00:54:51.000 It might just be a glitch, though.
00:54:52.000 But I can see his account.
00:54:52.000 Yeah, because it's like whenever they unban you, it slowly comes back.
00:54:55.000 Rekieta only has a couple thousand followers, but they're slowly coming back.
00:54:59.000 A lot of people were saying that when Trump was back, people weren't allowed to follow him, but that usually happened when people were reinstated.
00:55:06.000 It took a while until all their followers and everything kind of came back to normal.
00:55:10.000 I wonder if he's going to unban people who evaded suspension and made new accounts, like, over and over again.
00:55:16.000 It's been like five years, I think, since Sargon got banned, right?
00:55:19.000 Yeah, it's like one of the first.
00:55:20.000 And he never made another account.
00:55:22.000 Don't think so.
00:55:25.000 Well, that's a gray zone there.
00:55:28.000 Yeah, I'll just leave it at that.
00:55:29.000 They were impersonating him.
00:55:31.000 Let's jump to this next tweet because, my friends, we're gonna have fun.
00:55:36.000 Lauren Chen tweeted, Twitter before Elon versus Twitter after Elon.
00:55:40.000 The before picture, it's all women.
00:55:43.000 The after picture, it's all men.
00:55:44.000 Yeah, but there's like a woman right there, and there's like a woman right there.
00:55:47.000 That might be a woman right there.
00:55:48.000 I'm not sure, but it looks like two.
00:55:50.000 Over here, I think there's like six guys.
00:55:51.000 There's like one, two, three.
00:55:54.000 Four, five, six.
00:55:54.000 Lots of soy, boys.
00:55:57.000 Is that six?
00:55:58.000 One, two, three, four, five, six.
00:56:00.000 The soy is palpable.
00:56:01.000 Seven?
00:56:02.000 You can smell and taste the soy.
00:56:04.000 Just imagine the stench.
00:56:06.000 Okay, so here's the thing.
00:56:08.000 I tweeted, Elon didn't fire women.
00:56:11.000 He asked who wanted to work hard.
00:56:13.000 LMFAO.
00:56:14.000 Because he put out this email where he was like, everybody, we're all going to be working really, really hard.
00:56:19.000 If you don't want to stay, then take a three month severance and get out.
00:56:24.000 And then these are the guys who stayed late with Elon.
00:56:27.000 However, there was a fact check.
00:56:30.000 Oh, the fact check, I guess, has been removed.
00:56:31.000 The birdwatch thing got taken down.
00:56:33.000 And it said that it's not a before and after.
00:56:36.000 It's just the comms team versus the engineering team, which also made me laugh because, once again, it shows a very clear difference between males and females on the issue.
00:56:45.000 The women are subject-oriented.
00:56:46.000 The males are object-oriented.
00:56:49.000 But the truth is, it is.
00:56:51.000 This image of all these women is from a tweet where a woman at Twitter says, I'm leaving on November 4th.
00:56:58.000 I'm quitting.
00:57:00.000 So it quite literally is a woman being like, it's been fun.
00:57:03.000 I quit.
00:57:03.000 Here's a picture of my team.
00:57:05.000 And then the picture of all the guys being like, we're staying until 2am to do hard work.
00:57:09.000 Yeah, my response to this tweet was, the women on the left better learn to code automatically.
00:57:16.000 And you do see a big, clear kind of difference.
00:57:19.000 And obviously, there are different teams.
00:57:20.000 And to answer your question, Mary, specifically, the misinformation team is gone, but the moderation team is still there.
00:57:27.000 So there are a lot of people who have different kinds of political belief systems that are usually aligned with their genders
00:57:35.000 and their kinds of relationships.
00:57:38.000 We saw during these latest midterms that one of the biggest voting blocs that voted for the Democrats was women that didn't have a partner.
00:57:47.000 And those were people that came out and voted more than they previously have before.
00:57:52.000 So clearly there is a big kind of political shift in difference between these two pictures as well, and I think it's pretty clear to see the difference demonstrated.
00:58:01.000 You can't leave these chicks to their own devices, clearly.
00:58:05.000 Leads to disaster every time.
00:58:07.000 Oh man.
00:58:10.000 I bet they feel aggrieved right now.
00:58:11.000 Agreed.
00:58:13.000 I did a segment on this, I was talking about how 56% of women prefer to be in the workforce, according to Gallup.
00:58:21.000 56%.
00:58:21.000 That's actually not that many.
00:58:23.000 Yeah, exactly.
00:58:24.000 That was like an all-time high, 56%, and I'm like, you're telling me that half of women don't want to be at work?
00:58:28.000 Plus, the women who report that they want to be in the workforce, but actually don't, That's probably a huge percentage as well.
00:58:37.000 Socially driven, subject oriented, etc.
00:58:40.000 There's going to be a lot of women who are like, I, is it Boga's yelling again?
00:58:44.000 He's still yelling.
00:58:44.000 Yeah.
00:58:45.000 I feel bad for him.
00:58:45.000 The cat is at the studio door yelling because he wants to come in.
00:58:48.000 Should I let him in?
00:58:49.000 No.
00:58:49.000 Absolutely not.
00:58:50.000 Why?
00:58:51.000 Because the son of a gun woke me up at seven o'clock in the morning.
00:58:53.000 I'm pissed at him.
00:58:54.000 And he is, he is, it's an aged terrorist cat.
00:58:56.000 He was just hungry.
00:58:57.000 What are you talking about?
00:58:58.000 He's a cat!
00:58:59.000 Cat AIDS.
00:58:59.000 What?
00:59:00.000 Cat AIDS and toxoplasmosis are real, and I just don't get along with that cat.
00:59:05.000 I have toxo.
00:59:06.000 Bogus is super nice.
00:59:07.000 No, he's not.
00:59:08.000 Everybody loves him.
00:59:08.000 Absolutely not.
00:59:09.000 We all love him.
00:59:10.000 Absolutely not.
00:59:10.000 Yeah.
00:59:11.000 Anyway, about these women here.
00:59:12.000 What were we talking about?
00:59:14.000 Oh yeah, I was wondering also- Cat ladies, thank you.
00:59:17.000 Toxoplasmosis, a real thing.
00:59:19.000 I want to know how many mothers report wanting to stay in the workforce.
00:59:24.000 It's probably an even lower number.
00:59:26.000 Once they've started working.
00:59:27.000 I don't know, it's funny because we did a segment on this, and the Young Turks did a segment mocking me, saying like, you know, I said, of the women that probably don't want- I can't remember exactly what I said, but I said there are probably some women somewhere who say they want to work but don't.
00:59:42.000 And I was like, very vague with it, and I was like, they're gonna be unhappy if they don't have families.
00:59:46.000 And then Young Turks did this huge segment where they were like, Tim Pool thinks women want to be stay-at-home wives or whatever, and I'm like, yo, Gallup says they do!
00:59:52.000 Like, does half of women do?
00:59:54.000 But the reason I bring that up is because before COVID, the amount of men that were stay-at-home dads, 1 to 5%.
01:00:01.000 That's ridiculously low relative to women.
01:00:05.000 So guys want to be out in the workforce.
01:00:07.000 Women prefer to be at home, working at home, because it's... This is the craziest thing to me.
01:00:13.000 This idea that, like, taking care of a house and kids isn't work.
01:00:18.000 All of a sudden now it's like, do you want to be at work or stay at home?
01:00:20.000 It's like, well, they're both work, you know what I mean?
01:00:22.000 Because it's only work if it benefits some third party conglomerate, basically.
01:00:28.000 Right, yeah.
01:00:28.000 Just a quick observation about this picture now that I'm looking at it.
01:00:33.000 It really, it represents a different dimension, according to which Musk is challenging the system.
01:00:39.000 Because there's the free speech dimension we're talking about, but also, like, when you really think about how major corporations are structured, it's actually borderline illegal to have a corporation that's set up to maximize efficiency and output.
01:00:56.000 In California?
01:00:57.000 It's borderline illegal even at a national level.
01:01:00.000 It is illegal in California now.
01:01:02.000 It's illegal.
01:01:03.000 But it goes to show, on one hand, you can look at the left side of the picture and say, probably useless.
01:01:08.000 It's maybe, you know... It's the project manager massacre, you could call it.
01:01:16.000 But on the other hand, there are so many useless people that companies are somewhat obliged to hire in order to shield themselves from a whole range of legal liabilities that have been sort of built into the nexus between the corporate economic structure and the legal structure.
01:01:38.000 And so there's this weird sort of phantom utility to having all of this dead weight in a
01:01:46.000 company simply as legal protection and political protection from precisely these third-party
01:01:55.000 NGO ADL-type rackets that exist to shake down your company.
01:02:04.000 And a lot of people, a lot of these people don't work.
01:02:06.000 We saw it from the TikTok videos.
01:02:07.000 They're like, oh yeah, we're going for our morning mimosas and then we're going to support the African-American owned business, the vegan restaurant downstairs that caters for us.
01:02:17.000 All the caterers are supposed to be there for us, but the office is empty.
01:02:21.000 Wonder why?
01:02:22.000 You wanna pull this up?
01:02:23.000 Yeah, yeah.
01:02:24.000 So this is, uh... I don't wanna play too loud.
01:02:27.000 A tale of two different realities.
01:02:30.000 Check out this video.
01:02:30.000 Welcome to a day in my life as a Twitter employee.
01:02:35.000 Two guys on the right.
01:02:36.000 I don't know what they're doing.
01:02:37.000 They're covered in... It's an oil rig.
01:02:39.000 It's an oil rig.
01:02:39.000 They're just gonna add pipe or take pipe away as it goes down.
01:02:42.000 Oh, that's what they're doing.
01:02:43.000 That's what the guys do.
01:02:44.000 Looks like they're really good at it.
01:02:45.000 Look at that.
01:02:45.000 They're covered in filth of some sort.
01:02:47.000 I'm going to my little work pod.
01:02:50.000 And on the left, it's a woman who's talking about going to her little booth and getting smoothies and eating her charcuterie boards and having her wines and living a yucky lifestyle and having her little meditation booth.
01:03:06.000 The world on the left is only possible because of the world on the right.
01:03:09.000 Foosball!
01:03:10.000 Yay!
01:03:11.000 We get to play foosball!
01:03:14.000 Oh, now she can get her safe space.
01:03:15.000 Look at that.
01:03:16.000 This is my farting room.
01:03:17.000 I go in there and I... Yoga mats.
01:03:21.000 Oh, yeah, because I yoga.
01:03:24.000 Yo, it's so disgusting seeing what these companies are.
01:03:30.000 And I've seen them firsthand.
01:03:31.000 I've been to Google HQ.
01:03:33.000 I've been to their San Francisco and New York offices, their LA offices.
01:03:37.000 It is absolutely insane.
01:03:40.000 It is daycare.
01:03:42.000 I can't even begin to describe it.
01:03:44.000 Having come from a life where I've had, look at that, wine dispenser, a red wine dispenser for a plastic cup while these guys work in the oil rig.
01:03:55.000 On the roof, wow.
01:03:57.000 Can you get better?
01:03:58.000 What can we do to get all these people fired from their jobs?
01:04:02.000 I think Elon just did it.
01:04:02.000 Get them pregnant.
01:04:04.000 Get them pregnant.
01:04:05.000 No, they're just gonna go get abortions.
01:04:07.000 Paid for by the company.
01:04:09.000 Paid for by Elon Musk.
01:04:10.000 Elon Musk is allowing employees to get paid abortions and is paying for their travel.
01:04:16.000 So Elon Musk is a part of this.
01:04:18.000 He's funding it?
01:04:19.000 Yes, Elon Musk is funding his employees to go out of state to get abortions.
01:04:23.000 That's amazing.
01:04:23.000 So again, I mean it's a corporate... Fact check on that?
01:04:26.000 Yeah, go fact check it right now!
01:04:28.000 It's Tesla, right?
01:04:29.000 Yeah, Tesla specifically, but this is probably also going to be the same policy in California because in California it's probably mandated by law.
01:04:38.000 Women's ability to create and sustain life definitely makes them weaker in the workforce and you either have policies like that or you're gonna hire women less.
01:04:51.000 You wonder also what function these project manager types are really serving.
01:04:57.000 There's the sort of legal shield, and there's also the morale question.
01:05:01.000 You know, maybe they're just kind of giving those guys on the right side of the picture something to look at after a hard four-hour coding session.
01:05:11.000 Oh no, hold on, hold on.
01:05:12.000 That was inappropriate, Darren.
01:05:14.000 Four-hour coding session?
01:05:15.000 These people don't code.
01:05:17.000 And four hours is way too much time for them.
01:05:20.000 These are like... They might break a nail.
01:05:22.000 No, no, I'm talking about the right side of the picture with all of the engineer guys.
01:05:28.000 Maybe the project managers are there for morale.
01:05:31.000 They're like the pinups.
01:05:32.000 That's so true.
01:05:33.000 Yeah, you know that... Because they don't have wives to come home to.
01:05:36.000 And those guys on the right, they wouldn't be able to do all that hard piping work if it wasn't for those Twitter employees.
01:05:42.000 You see, the project managers go there for the smoothies, but the project managers are the smoothies for the coding team.
01:05:53.000 That's so actually true.
01:05:55.000 The way I was describing it earlier is that the most important role, in my opinion, is the stay-at-home mom, or stay-at-home dad if the dad's doing it, but the person who's actually protecting and taking care of the family.
01:06:07.000 Because if a guy were to go out and fight a bear with his own hands and take it down, the question is, for what purpose?
01:06:15.000 If he has no family, he has no one to protect, so what's the point of fighting the bear?
01:06:18.000 If he's able to just take care of himself, he can eat badger, rabbit, and whatever else.
01:06:23.000 He wouldn't be able to actually eat the bear.
01:06:25.000 So the great conquests of the man who's going out and working hard is only for the family, and if there's no one there to actually help protect his family, then what's the point?
01:06:33.000 Raising the family is the most important job you could have in our society, or who else is going to be raising them?
01:06:39.000 But they make it disrespectful.
01:06:41.000 So like, when I say something like, half of women would prefer to be at home, the left gets offended by it.
01:06:47.000 How dare you?
01:06:48.000 We would be better served working $50,000 a year jobs for some multi-millionaire in his factory.
01:06:56.000 Okay, okay, I guess.
01:06:57.000 I don't think so.
01:06:58.000 I think you'd be better served raising kids and having a family and teaching human beings to exist.
01:07:04.000 But that's not what the state and central controllers want.
01:07:06.000 They want a bigger tax base.
01:07:08.000 They want a bigger base of sheep.
01:07:09.000 They want people that don't get to raise their children, so the indoctrination centers get to raise them.
01:07:15.000 So the big social tech media platforms get to raise them.
01:07:18.000 So the televisions, the boob tubes, get to raise them.
01:07:21.000 And when, of course, you have children raised by the state, you have Better sheep.
01:07:25.000 And that's, of course, what they're also looking for, in my opinion.
01:07:27.000 Better that you don't have children at all, though.
01:07:30.000 You're just the last of your line of progeny.
01:07:33.000 I disagree, especially with the upcoming population collapse that's going to come and wreak havoc on civilization, which Elon Musk also talks about as well.
01:07:41.000 No, I don't think that.
01:07:42.000 I'm just saying the corporations— I think the people should be having children.
01:07:44.000 No, I do, too.
01:07:45.000 I'm saying the corporations would rather you not have children at all than indoctrinate your children.
01:07:51.000 Less trouble for them.
01:07:52.000 I think I'm going to take a class on how to build an electric motor.
01:07:56.000 Just because if there is a population collapse.
01:07:59.000 You know, I'll stop and I'll rephrase it.
01:08:02.000 Population collapse is a terrifying thing.
01:08:04.000 If you think about it.
01:08:07.000 The only reason I have a cell phone right now that can pull up videos and do all this crazy stuff is because there's tens of thousands or hundreds of thousands of people who know how to do each and every one of these little things to make it, right?
01:08:19.000 There's a person who mines the raw materials, the rare earths, the metals.
01:08:23.000 There's a company that makes the glass.
01:08:25.000 It's not like Apple makes it or Android makes it.
01:08:28.000 Then there's a company that makes the chips.
01:08:29.000 There's a company that makes the cameras.
01:08:31.000 It all comes together into this device.
01:08:33.000 If the glass company ceases to exist, touchscreens are gone.
01:08:38.000 Like what do you get?
01:08:38.000 Plastic screen maybe with buttons.
01:08:40.000 If the rare earths are gone, well now we gotta source them from somewhere else, we can't get that.
01:08:44.000 If we lose access to these specialty positions, I'll tell you, if there's a population collapse, we will lose more than people realize because it is only because of the massive population that we're able to have such highly refined and powerful tools.
01:08:59.000 Specialties, as there's more and more people, there's finer and finer specialties.
01:09:04.000 You go back 500 years.
01:09:05.000 Actually, you go back thousands of years, it was possible to, as a single person, to know everything humans knew.
01:09:12.000 Just being one person, like, because there was so little knowledge.
01:09:15.000 Now there's so much knowledge, there's no way one person could actually manufacture a cell phone by themselves, let alone a toaster.
01:09:23.000 That famous TED talk where a guy says, I tried to make a toaster from scratch.
01:09:27.000 This is what it'll be like if population collapse happens.
01:09:29.000 It'll be a hundred years from now.
01:09:31.000 You'll have a kid, and you'll be telling a story about how we used to have small devices.
01:09:35.000 They were like cards.
01:09:37.000 And you'd touch it, and you could make images appear.
01:09:40.000 You could control people on the screen.
01:09:43.000 You could move around.
01:09:44.000 You knew where it was raining around the world.
01:09:46.000 You could talk to anyone around the planet.
01:09:48.000 Those kids are going to be like, that's magic.
01:09:50.000 You're gonna be an old person, like, I swear, we had this thing, and you just, look!
01:09:54.000 And there'll be, like, relics of it, and the kids are like, how do you even use it?
01:09:57.000 How does it even turn on?
01:09:58.000 You're like, well, the network's gone, the technology's gone, we lost it all.
01:10:02.000 Yeah, forget about that.
01:10:02.000 And to them, they'll hear the story, and then they'll tell their kids, magic.
01:10:07.000 And then in 50 years, 60 years, the kids are gonna be like, they used to believe in magic.
01:10:12.000 So dumb.
01:10:12.000 Yeah, I know.
01:10:13.000 Forget about the luxuries.
01:10:14.000 The medical implications are also going to be very vast, because there's not going to be enough young people to take care of the old people.
01:10:20.000 There's going to be so many horrible effects, especially when it comes to the larger economic system, which is going to collapse, especially with the lack of people, since, of course, economies grow with people.
01:10:31.000 Less people, less growth, bigger collapse.
01:10:34.000 Japan is a good test for all of this because, first of all, Japan has one of the lowest,
01:10:38.000 if not the lowest, fertility rates.
01:10:40.000 All the places with the lowest fertility rates seem to be the most technologically advanced.
01:10:46.000 But then, other than, you know, the West and European countries that seem to have attempted
01:10:52.000 to solve this problem by importing massive amount of people from the developing world,
01:10:58.000 Japan has decided to develop robots to replace them, which may in the end turn out to have
01:11:05.000 been a better choice.
01:11:06.000 We just don't know.
01:11:08.000 I don't think so.
01:11:10.000 Well, China is also doing something very similar.
01:11:12.000 They have very weird robots that we can't talk about on the show that I brought up before that everyone here knows about.
01:11:18.000 But China also is facing a huge population of machines.
01:11:21.000 There are these Boston Dynamics robots.
01:11:24.000 They're terrifying, and I have it on good authority that actually there's an ulterior motive there, is that actually the Boston Dynamics robots are secretly programmed for a future in which it is a state law that everybody has to watch Lex Fridman videos.
01:11:44.000 Oh no!
01:11:47.000 If you don't watch the Lex Fridman videos, the Boston Dynamic Robots will come and make sure that you do.
01:11:52.000 I got so scared for a second.
01:11:53.000 You have to watch at least three per month to be in good standing.
01:11:58.000 Don't make me do it.
01:11:59.000 What happens is, the Boston Dynamic Robot shows up at your house and then a screen comes out of its head and Lex Fridman... Have you seen this?
01:12:06.000 And it just starts talking.
01:12:10.000 You're being chased as the dogs are running after you with screens on their back, and you're like, no!
01:12:15.000 And they pin you down, and it just plays the part.
01:12:17.000 I don't want to listen!
01:12:19.000 Meanwhile, Lex is just talking to you, quietly.
01:12:22.000 Well, I don't, I was like, you guys don't like Lex Fridman's show, or what?
01:12:25.000 I think he's fine.
01:12:26.000 I think it's fascinating.
01:12:27.000 Nothing for or against, but I think it's...
01:12:30.000 It's safe to say, it's fair to say that this is the most algorithmically driven phenomenon on the entire internet.
01:12:38.000 I hear so many stories of saying, I went to bed watching this on YouTube and I woke up and Lex Fridman was on it.
01:12:44.000 It's like all paths lead to Fridman.
01:12:47.000 And it's a very interesting thing.
01:12:49.000 You wonder sometimes.
01:12:51.000 how these phenomena occur. Maybe he's just such a great charismatic interviewer that it's just
01:12:59.000 manifestly obvious why he would, you know, be elevated to such a station with so many
01:13:06.000 interesting guests. Maybe there are other factors, but I definitely think it's,
01:13:11.000 you know, I really think the Boston Dynamic robots, they're made to enforce it.
01:13:18.000 Maybe there's going to be a SCOTUS decision, just like the Obamacare, they can force you to buy health insurance, they can force you to watch the Fridman videos.
01:13:26.000 There are all sorts of possibilities in the future.
01:13:28.000 I always get recommended to him all the time, and I'm like, no, I just don't want to.
01:13:32.000 Like, all the time, nonstop.
01:13:35.000 Did they fall asleep watching Lex Fridman, or did they end up on Lex Fridman?
01:13:37.000 That's also a good question to think about.
01:13:39.000 Well, there you go.
01:13:41.000 No disrespect, Lex.
01:13:42.000 It's all good.
01:13:43.000 He's just spreading love.
01:13:45.000 But it's nothing against him, but it is weird.
01:13:47.000 It is the most algorithmically driven thing on the entire internet.
01:13:51.000 Remember that story about that chick who lived in the van?
01:13:54.000 She put up two YouTube videos and then got millions of subscribers?
01:14:00.000 It exposes that there is absolutely an algorithm driving everything and controlling what you see.
01:14:05.000 And my conspiracy theory is that the van life trend on YouTube was to try and convince millennials to not buy stuff.
01:14:12.000 And to live in a pod.
01:14:14.000 Right.
01:14:14.000 To live in a pod and eat the bugs.
01:14:15.000 Yep, eat the bugs and not own anything.
01:14:18.000 Because people can't own homes now.
01:14:21.000 Their parents are telling them, hey, you should save up money and own a home.
01:14:24.000 That's not possible for the average person nowadays with the salaries that they're having, with how much the price of real estate has increased.
01:14:30.000 That's impossible.
01:14:31.000 But you could have a van by the river, right?
01:14:35.000 Or you could have some cryptocurrencies.
01:14:37.000 You could have some tether.
01:14:38.000 Right, Darren?
01:14:41.000 Buy some tether, watch some Lex, buy some tether, you know?
01:14:46.000 Get into a little geometric unity theory.
01:14:49.000 There you go.
01:14:51.000 Do we have time to talk about your article?
01:14:56.000 Like Tim said, there's been a lot of talk about Tether for a long time, but this piece that's up on my news site revolver.news is generating a lot of buzz in the crypto world and otherwise, and I'll just lay out the basic data points and people can kind of decide for themselves.
01:15:15.000 It's a very weird story.
01:15:16.000 It's the third largest cryptocurrency in existence.
01:15:20.000 It's a stablecoin, meaning that its value is not mined in the way that Bitcoin is or Ethereum, that its value allegedly comes from U.S.
01:15:29.000 dollar reserves backing the tether.
01:15:32.000 It's never been fully or properly audited in its entire existence, which is weird given that its whole value is based on the fact that it allegedly has these reserves.
01:15:43.000 And then if you look at the cast of characters behind Tether, one of them is a washed-up child Disney actor called Brock Pierce, who was involved in all sorts of things.
01:15:56.000 A weird sort of underage sex scandal that he got embroiled in.
01:16:00.000 He was apprehended by Interpol in Spain.
01:16:03.000 There is allegedly child pornography and all this stuff involved in the apprehension.
01:16:08.000 He was not arrested, weirdly.
01:16:11.000 And then he turns up as the founder of this major cryptocurrency, which is a stablecoin, which has never been audited, which defies the U.S.
01:16:23.000 Treasury in various respects, and which just happens to be the official cryptocurrency of several U.S.-backed rebel groups geopolitically, including the Rohingya rebel groups in Myanmar.
01:16:38.000 And so the thesis adduced in this piece suggests that there could be an historical antecedent to the function that Tether may fulfill, and that is the BCCI bank, which is this bank set up by the CIA to facilitate all kinds of money laundering operations and so forth.
01:17:02.000 But the cast of characters is really remarkable here.
01:17:06.000 Of course, there is a Jeffrey Epstein connection, and so I encourage everyone to go and look at it.
01:17:11.000 And for crypto enthusiasts and experts, I welcome feedback.
01:17:16.000 Are we barking up the wrong tree or not?
01:17:19.000 But so far, I've received extremely positive feedback from the crypto community in terms of Tether being a highly questionable proposition.
01:17:28.000 The issue with so many cryptocurrencies, they're clearly scams.
01:17:31.000 And there's that Alameda woman, Bankman-Fried's girlfriend, who I think said something like, crypto is just scams or something.
01:17:39.000 She's like, it's all scams and something else or whatever, and like drugs or something.
01:17:43.000 Crypto is, I think, mostly scams, but that's not to disrespect crypto itself.
01:17:50.000 Everything's pretty much a scam these days.
01:17:52.000 What isn't a scam?
01:17:53.000 The issue is that there are people who are using crypto to scam, but there are absolutely amazing and legitimate cryptocurrencies and the technology itself is incredible.
01:18:02.000 So, we have this story from Bloomberg.
01:18:04.000 U.S.
01:18:04.000 prosecutors opened probe of FTX months before its collapse.
01:18:09.000 Sweeping inquiry examined crypto exchanges with offshore reach.
01:18:12.000 Manhattan prosecutors recently changed tack as FTX unraveled.
01:18:16.000 So they knew about this.
01:18:18.000 Yeah, how can they not?
01:18:19.000 This company came out of nowhere and had all the huge institutional money.
01:18:23.000 Everyone was asking the question, where did they get all this money?
01:18:26.000 And the corporate media, in response to this, said, he's the new J.P. Morgan!
01:18:31.000 He's a genius!
01:18:32.000 And no one had the receipts.
01:18:33.000 No one knew what was going on here.
01:18:35.000 And SBF also gave hundreds of thousands of dollars to the House committee members that are investigating him.
01:18:41.000 I don't see this going anywhere except for a potential false flag being used here in order to bring in more regulations, bring in more control of what essentially would be a decentralized cryptocurrency, but now is going to be hyper focused to be a central bank digital currency.
01:18:56.000 That, of course, is going to be a part of the Great Reset and Build Back Better agenda, which they've been calling for for a very long time.
01:19:02.000 And this is the false flag that they're going to use... I don't know, though.
01:19:06.000 I feel like this stuff is shaking confidence in crypto.
01:19:09.000 Absolutely.
01:19:10.000 Well, if they want people to adopt a reserve crypto, this is not the way to do it.
01:19:13.000 A federal reserve cryptocurrency, a digital dollar, is essentially what they want.
01:19:17.000 They don't want everyone just on Bitcoin.
01:19:20.000 They don't want everyone on decentralized platforms.
01:19:22.000 They want everyone on their platforms so they could say, hey, cryptocurrency was bad because it was reckless and they had scams.
01:19:28.000 We have a bigger scam here for you, the US dollar, that's going to be getting the good principles from it, and we're not going to be doing the bad principles, even though we really have all the bad ones.
01:19:37.000 Make every transaction you ever have publicly available.
01:19:41.000 Exactly.
01:19:41.000 Yeah, people would not like that.
01:19:43.000 Track, trace everything, automatically take money out of accounts, like a social credit score system like they have in China.
01:19:50.000 So this is essentially the endgame here, and central bank digital currencies are something that's being rolled out with the New York Federal Reserve just a couple days ago.
01:19:57.000 And we're seeing this with the G20 just announcing that they're going to have an international health passport that they're going to implement all over the world.
01:20:05.000 So when we look at, you know, the scams in our society, there's a lot of them.
01:20:08.000 FTX is just scratching the surface to all the bigger scams out there.
01:20:12.000 And just including perhaps Tether, and just as an addendum, Alameda, the hedge fund set
01:22:22.000 up by Sam Bankman-Fried that's associated with the FTX scandal, they were the major
01:20:29.000 backer of Tether.
01:20:30.000 In fact, they were one of the two major purchasers of Tether on their exchange.
01:20:36.000 So there are interesting connections there as well.
01:20:39.000 Looks like FTX was a kind of money laundering scheme to filter through the sort of Democrat machine and the Clinton overworld.
01:20:48.000 And Tether may be a kind of BCCI, which is used for sort of CIA—the new Iran-Contra type operations globally.
01:20:59.000 Yeah, that was an interesting saga, the BCCI scandal.
01:21:03.000 Right.
01:21:05.000 Can you expand on that just a little bit?
01:21:06.000 Because I think that's an important history lesson that we should kind of revisit, especially when it comes to understanding the larger kind of— Right.
01:21:13.000 the larger intelligence agencies and their involvement with the big banks and how they
01:21:17.000 actually work together to screw you over.
01:21:20.000 Right.
01:21:21.000 The reason these things are important is that the government does not reinvent its own playbook
01:21:26.000 very often at all.
01:21:28.000 And if there's an historical antecedent that serves an important function, you can pretty
01:21:33.000 much bet that it still exists in some modified version.
01:21:38.000 And so the fact that the BCCI existed in the 70s and 80s, which is a bank set up by a Pakistani allegedly, but it was probably set up by an intelligence agency.
01:21:49.000 And it, you know, major scam, it screwed over many of its depositors, it had a lot of shady figures, drug cartels, arms dealers, all kinds of shady folks working through it, operating through it.
01:22:04.000 And then the question was, why was this manifest, obvious scam allowed to function for so long, uninterrupted?
01:22:14.000 Well, it's because the CIA was in on it and the primary beneficiary of it.
01:22:19.000 And so that gets to the sort of crypto thing.
01:22:22.000 It's like some scams are allowed to exist because they're the scam of the most valuable player, so to speak.
01:22:32.000 And the kind of darkly ironic twist that may in fact end up being the saving grace of Tether, and crypto more broadly, is that there are so many major scams wrapped up in it that it's essentially too big to fail, at least on a medium timeline.
01:22:49.000 Me and Tim were talking about this earlier today.
01:22:51.000 If there was a way to destroy cryptocurrencies and Bitcoin and decentralized currencies, what would they be doing differently than what they're doing right now?
01:23:00.000 And I think it also provided a huge potential, and it still might in many aspects, of allowing people to have a lot of freedom.
01:23:06.000 Barack Obama called Bitcoin the ability of human beings to have Swiss bank accounts inside of their own pockets.
01:23:12.000 I think that that's a power that threatens a lot of people, and I think what we're seeing with FTX, what we're seeing with SBF is a deliberate destruction of that power.
01:23:20.000 Right.
01:23:21.000 But you don't really have that power if you're operating through an exchange on an exchange, right?
01:23:26.000 Absolutely.
01:23:27.000 Most exchanges are also honeypots as well.
01:23:29.000 Exactly.
01:23:30.000 So like, there are two different things.
01:23:31.000 There's the, I guess, and I don't mean this derisively, there's the nerd money kind of function of Bitcoin, which is like, you have your own keys, you do all this, but it's a pain in the ass and people don't really want to do it.
01:23:43.000 It's not grandma friendly.
01:23:45.000 Exactly.
01:23:45.000 But the scalable version of Bitcoin is replete with all of these scams, and again, maybe it's not going to fall down precisely because the scams are too valuable, in the specific case of Tether potentially being the new BCCI.
01:24:04.000 It's literally the official cryptocurrency of the Rohingya rebel group in Myanmar which is bizarre that a rebel group would have an official cryptocurrency but that's the case.
01:24:15.000 It's beloved by the Syrian, the Sunni moderates in Syria that John McCain loves so much.
01:24:23.000 It's beloved by cartels, which of course the US intelligence agencies don't have any relationship with.
01:24:29.000 Yeah, it's not like they helped them get their start in Mexico with that secret police unit that they were training down there.
01:24:35.000 They had nothing to do with that.
01:24:36.000 Well, come on.
01:24:36.000 Right.
01:24:37.000 You guys are saying things a little over the top.
01:24:39.000 That's a little out there.
01:24:40.000 It's not like Barack Obama.
01:24:41.000 It's not like Obama was giving weapons to the cartels.
01:24:43.000 Exactly.
01:24:44.000 It's not like we had the Iran-Contra scandal that was laundering weapons and heroin and cocaine into the country and guns and weapons, you know?
01:24:51.000 It's funny because you ever see that meme where it's like, Okay, so look, I know the CIA was doing bad stuff in the 50s, the 60s, the 70s, the 80s, the 90s, the 2000s, the 2010s, but nothing's changed.
01:25:01.000 They've never been held accountable.
01:25:03.000 Right.
01:25:04.000 So they're definitely changed.
01:25:06.000 They're definitely not doing that now.
01:25:08.000 Yep.
01:25:09.000 That's it, that's it.
01:25:10.000 New management.
01:25:11.000 Yes.
01:25:12.000 And you know, the way this country is going, there's that viral video of the dude in Arizona with the dreads and he's like yelling about box number three in Arizona and all that stuff.
01:25:23.000 Oh, that's a great video.
01:25:24.000 And I'm just like, I'm watching that video and I'm thinking, the average person in this country does not like what the intelligence agencies are doing.
01:25:32.000 So who are they serving but themselves?
01:25:34.000 Now, maybe these people have the idea, like, well, they don't know what's good for them.
01:25:38.000 Okay, well, dude, like, that defies the core of what this country is supposed to be.
01:25:41.000 It's supposed to be dangerous freedom, not peaceful slavery.
01:25:45.000 Just you, as an intelligence officer or whatever, deciding, you know what's best for us!
01:25:49.000 You don't!
01:25:50.000 And clearly, all of this luxury has been bad for us.
01:25:54.000 It's created a whole generation of gluttonous morons.
01:25:57.000 So maybe, maybe we need things to be a little bit less luxurious, and people need to go out and chop some lumber to heat their homes.
01:26:03.000 Maybe all of this luxury is making weak people, which makes everything worse for everybody, and all we get is a bunch of whiny complainers, and, um, what do the young people call them?
01:26:12.000 Hall monitors.
01:26:13.000 Hall monitors, yeah.
01:26:13.000 I said a nation of hall monitors.
01:26:15.000 Yeah, there's only so much of an economy that you can make being a hall monitor, but that's really it.
01:26:19.000 Those guys who are working that oil rig we watched in that video a moment ago, those guys are doing hard work.
01:26:24.000 The lady at Twitter?
01:26:25.000 Hall monitor.
01:26:26.000 Vice Media?
01:26:27.000 Hall monitors.
01:26:28.000 BuzzFeed?
01:26:28.000 Hall monitors.
01:26:29.000 NBC News?
01:26:30.000 Hall monitors.
01:26:30.000 That's all they do.
01:26:31.000 They walk around complaining, contributing nothing.
01:26:34.000 Yeah, we gotta do something about that.
01:26:35.000 They're bogging us down.
01:26:37.000 I want to post jokes.
01:26:39.000 I know.
01:26:40.000 I want to post memes.
01:26:41.000 Yeah.
01:26:41.000 Without being, you know, afraid of getting censored.
01:26:44.000 Or express political ideas.
01:26:46.000 Or debate political ideas that might be a little edgy or controversial.
01:26:50.000 I want to have the ability to talk through bad ideas.
01:26:53.000 You know?
01:26:54.000 And we still, let's be honest here, we're still testing the waters with Twitter.
01:26:57.000 We still don't have that fully.
01:26:59.000 And that's a shame because that's what we had before.
01:27:02.000 And then we were led astray by the centralization of it all.
01:27:05.000 And now the solution's going to be more centralization, which is absolutely absurd.
01:27:09.000 You mean with Elon, or what?
01:27:11.000 More centralization as far as what's happening with cryptocurrencies, what's happening with social media platforms being in the hands of less and less people, and media outlets being in the control of less and less people as well.
01:27:23.000 Everything's being centralized.
01:27:25.000 It's weird, like, centralization minus censorship really was sort of the sweet spot, because
01:27:32.000 centralization allowed speech at scale. So that brief period in the internet when there were
01:27:39.000 centralized platforms, major platforms, but before the speech crackdown post-2016 was really kind of
01:27:50.000 the sweet spot.
01:27:51.000 Maybe it just can't sustain it.
01:27:53.000 Maybe the regime is simply incompatible with that level of free speech at scale.
01:27:59.000 Maybe no regime is compatible, but certainly ours isn't, given how ridiculous and ultimately untenable it is.
01:28:08.000 And so when they say that, you know, Elon Musk is a national security threat, free speech is a national security threat, I think that's true in quite a literal sense. If people are allowed to speak freely on the whole host of things that I'm not even able to mention now, I don't think the regime can really survive it.
01:28:31.000 And so it's an existential issue from their point of view.
01:28:34.000 And that kind of illustrates the stakes associated with what Elon Musk is doing.
01:28:42.000 You know what really grinds my gears?
01:28:44.000 I hear this all the time.
01:28:45.000 People will tweet something like, tell me one time in history when the people censoring speech were the good guys?
01:28:50.000 Because it's just, like, World War II?
01:28:53.000 The United States?
01:28:54.000 Were we the bad guys?
01:28:54.000 I don't know.
01:28:55.000 Like, the U.S.
01:28:56.000 had an office of censorship.
01:28:58.000 But it's like, people just assume outright that there's never a reason for controlling information.
01:29:03.000 Of course there is, like, in war and things like that.
01:29:06.000 To your point, the intelligence agencies probably do think that, you know, we're in a constant state of conflict and we need to be able to control information if we're going to win.
01:29:14.000 But that conflicts with what this country is supposed to be.
01:29:17.000 People having a right to choose.
01:29:18.000 If they can't be informed, they can't choose.
01:29:20.000 So this country is clearly already lost, if that's the case.
01:29:24.000 But I will point out, in World War II, with the Office of Censorship: loose lips sink ships.
01:29:30.000 And this was it.
01:29:31.000 There were censors who would stop things from appearing in newspapers and the radio.
01:29:35.000 Censorship all over the United States.
01:29:36.000 The Federal Office of Censorship.
01:29:39.000 Because we're at war.
01:29:39.000 Well, you know, there's a logical approach to this, specifically when it comes to saying, hey, there's all these troops moving in this direction going here.
01:29:47.000 Obviously, you can't allow that kind of information during warfare because it's going to give the enemy the upper hand.
01:29:53.000 It's not even... But treason doesn't fall under free speech.
01:29:56.000 No, no, no, but it's not that.
01:29:58.000 It's that someone might be like, I work at a steel mill and they just, some crazy thing happened where the alarm went off today.
01:30:04.000 That's the kind of thing they don't... Loose lips sink ships wasn't just about telling people where our military was, some people didn't know that.
01:30:10.000 It was about giving up information on what you were doing to people who shouldn't know about it.
01:30:14.000 They wanted everyone to shut up.
01:30:16.000 It's a fine line, too, because where do you draw the boundaries?
01:30:19.000 Because in the United States, we did have the Japanese internment camps.
01:30:23.000 We did have the bombing of Dresden.
01:30:25.000 We do have a lot of policies that we could definitely criticize the United States for, which we should have criticized, which criticism could have prevented, but that didn't happen because of that censorship effort as well.
01:30:34.000 So again, where is that fine line?
01:30:36.000 I think that's impossible.
01:30:37.000 I don't know.
01:30:37.000 Dan, do you have an answer?
01:30:39.000 Well, I think censorship and information control are necessary to any type of regime.
01:30:47.000 I think the important issue is, if the noble lie is necessary to some degree, as you know, the famous thing from the Republic, the problem is that we've become the ignoble lie.
01:31:04.000 And it's less that, oh, we want a total free for all free speech.
01:31:09.000 It's that fundamentally what America has come to represent, what I call the globalist American empire, is really the wokeness, or whatever you want to call it, really is basically the de facto official religion, the de facto official ideology of the United States.
01:31:29.000 And once it's seeped that deeply into the marrow of the body politic, then anything that's set up to sustain the security of it ultimately just reinforces that ideology.
01:31:43.000 That's the problem that we're in.
01:31:44.000 And that's not, per se, a problem of censorship as such; it's a problem of what the regime has become in its most fundamental sense.
01:31:57.000 It is a non-theistic religion.
01:32:00.000 It's crazy.
01:32:00.000 I mean, it's becoming more and more clear every day.
01:32:02.000 Peter Boghossian was talking about it years ago, and now we can see it clearer than ever.
01:32:08.000 All right, we're gonna go to Super Chats!
01:32:10.000 If you haven't already, would you kindly smash that like button?
01:32:12.000 Subscribe to this channel, share the show with your friends, become a member at TimCast.com.
01:32:16.000 We're gonna have a members-only show coming up around 11 p.m., and I think we're gonna talk about some fashion company with handbags or something?
01:32:23.000 Is that what it was?
01:32:24.000 Yeah, it was Balenciaga.
01:32:26.000 Controversial topic.
01:32:26.000 There you go.
01:32:28.000 Controversial topic, to say the least.
01:32:30.000 And you can also do one more thing.
01:32:32.000 Follow Sargon of Akkad on Twitter, because he's back after, like, five years.
01:32:38.000 And it's Sargon underscore of underscore Akkad, A-K-K-A-D.
01:32:43.000 Or just go to my Twitter, at Timcast, and I tweeted out that he's back, and you can follow him there.
01:32:46.000 He's probably sleeping, because he's in the UK, but I'm really excited.
01:32:49.000 Jordan Peterson tweeted something like, surprised to see you here Sargon, so we're all excited that Carl Benjamin's back on Twitter, so give him a follow.
01:33:04.000 Alright, let's see what we got!
01:33:06.000 Kay says, please get the Critical Drinker on both Timcast IRL as well as Pop Culture Crisis.
01:33:11.000 He's the best movie critic and fighting the message from the woke entertainment corporations.
01:33:16.000 Sounds good.
01:33:18.000 Oh, this is funny.
01:33:19.000 Sargon got unbanned right before the show started.
01:33:22.000 Neo Unrealist says, Sargon Avocado is unbanned from Twitter right before the show started.
01:33:26.000 Yeah, it was like 30 minutes before.
01:33:27.000 Man!
01:33:28.000 A lot of people were tweeting about him and a lot of other people.
01:33:31.000 I mean, for him, it's been five years.
01:33:31.000 As well.
01:33:33.000 It's like watching, you know, the corpse come out of the ground, like, when you're like, he's back!
01:33:38.000 Yeah!
01:33:39.000 He's gonna dust off the phone.
01:33:40.000 Yeah, right.
01:33:42.000 I mean, he's been doing his thing, but it'll be cool to see him back in the conversation.
01:33:42.000 Yep.
01:33:47.000 Yeah, I just watched a part of his Lotus Eaters program.
01:33:49.000 He did a thing about millennials and Gen Z stuff, and it was like the old Sargon, where he was just talking to and sitting down and chatting, which was cool to see.
01:33:55.000 So it's funny he's back right now.
01:33:56.000 It's great.
01:33:57.000 Oh yeah, all of our Super Chats are Sargon Unbanned!
01:33:59.000 Sargon Unbanned!
01:34:01.000 Wow!
01:34:02.000 William Nichols says, you're shadowbanned.
01:34:04.000 I recorded it for proof.
01:34:05.000 This show is.
01:34:05.000 That's right!
01:34:07.000 And it is insane that despite the fact that we are shadowbanned, you guys still watch.
01:34:13.000 And it's weird.
01:34:15.000 I mean, we're in a weird place.
01:34:17.000 This show should be trending every night based on how many views we get.
01:34:20.000 It never does.
01:34:22.000 And maybe not every single show.
01:34:23.000 I'm being a little hyperbolic, but it never does.
01:34:28.000 All right.
01:34:30.000 Mind of Madman says, no notification, and I believe in clank and beanie supremacy.
01:34:35.000 Well, all right.
01:34:36.000 No notification.
01:34:37.000 How about that?
01:34:38.000 Shout out clankers.
01:34:40.000 Yep.
01:34:41.000 Rob says, Tim and Co, I'm setting up a Ligma Johnson candle e-store.
01:34:45.000 Thank you for the inspiration.
01:34:46.000 I know the thing you've inspired me to do, and I look forward to your order.
01:34:51.000 You know what we're thinking of doing is, uh, I tweeted out a piece of land in West Virginia, it's like 180 acres, and I was like, this could be the Ligma Johnson Woodland Preservation.
01:34:59.000 We could actually, like, allocate plots of ownership, you know, and you get a little card.
01:35:05.000 Fake Ligma Johnson lords?
01:35:07.000 Yeah.
01:35:08.000 It's interesting because like the price per square foot's actually not that expensive.
01:35:12.000 And then you could, you're not really a lord, we wouldn't do that, but you'd have like, you know, we could create a public park and then it would be owned by the people who own the square footage, and that could be a lot of people.
01:35:22.000 You're an official ligma.
01:35:23.000 Yeah.
01:35:25.000 Yeah, you can be a... I don't know, what kind of title do we give?
01:35:29.000 Can't you still buy stars?
01:35:30.000 Ligma Richard?
01:35:31.000 You could name a couple of stars, Ligma Johnson.
01:35:34.000 Who has the right to the stars?
01:35:36.000 That's a good point, though.
01:35:36.000 You could name stars.
01:35:37.000 You'll find out eventually, I don't know.
01:35:38.000 What if we just, like, all of these programs, like, go to a star-buying company and then just buy as many stars as possible and name them all Ligma Johnson?
01:35:46.000 And they'll start giving them numbers like Ligma Johnson, 9C31.
01:35:48.000 No, Ligma Dixon.
01:35:49.000 Ligma Richards.
01:35:52.000 Other ones will be Suggma.
01:35:54.000 What else?
01:35:57.000 Suggma?
01:35:58.000 I can't think of other ones.
01:36:00.000 Tugma?
01:36:01.000 Leave it in the chat, y'all.
01:36:03.000 I know they know.
01:36:04.000 They are, and they definitely have good answers.
01:36:07.000 Okay, thanks, guys.
01:36:09.000 All right, DDMegaDudu says, Hey Tim and crew, I would like to know if Luke's nuclear bazongas implants are still up for grabs.
01:36:17.000 Or have they popped?
01:36:19.000 I'm asking you and wondering if she followed through.
01:36:22.000 Yeah, I walk in, and it's on the floor, and it's leaking all over the place.
01:36:27.000 I did mention it, just for the record, but I didn't actually ask, because they're damaged goods now.
01:36:33.000 You wanted them first?
01:36:34.000 He wanted them!
01:36:36.000 Not me.
01:36:36.000 You were asking about them.
01:36:37.000 We are going to be giving out those, as well as the post-its from Milo I mentioned.
01:36:43.000 We're bogged up because of the holiday, so a lot of people are already heading out, getting ready for Thanksgiving, some people have to travel.
01:36:50.000 I'm not even sure if we're gonna be able to do our Wednesday show, because everyone's traveling to Thanksgiving dinner, nobody wants to drive on Thanksgiving morning, so... We'll just have to figure it out, but that also means that, like, no one's gonna be here, so this is just one of those weeks where very little ends up happening.
01:37:06.000 You gotta get the chicken in here.
01:37:08.000 I'm not bringing a chicken in.
01:37:09.000 Everyone keeps trying to get me to bring a chicken in here.
01:37:10.000 I'm like, just take a dump on the floor.
01:37:12.000 Look, I love the chickens, clearly.
01:37:14.000 Roberto Jr.
01:37:14.000 is a man's man, you know?
01:37:16.000 He's tough.
01:37:17.000 But he will take a dump on the floor.
01:37:19.000 How are you going to get that out of the carpet?
01:37:20.000 Get a steam cleaner in here?
01:37:22.000 It's fine.
01:37:23.000 It's worth it.
01:37:25.000 Mary, will you be here?
01:37:27.000 I say do the show.
01:37:27.000 I need something to watch as I'm driving up.
01:37:29.000 Wait, for Wednesday or for episode 666?
01:37:31.000 Wednesday.
01:37:33.000 Which would be episode 666.
01:37:34.000 No, no, no.
01:37:35.000 If there is a Wednesday.
01:37:36.000 If we do the show on Wednesday, Monday will be 666.
01:37:39.000 If we don't, then Tuesday will be 666.
01:37:41.000 I think you should leave Monday for 666.
01:37:43.000 Why Monday?
01:37:44.000 Why?
01:37:45.000 It's better than doing it the day before.
01:37:46.000 You just don't want to work.
01:37:47.000 See?
01:37:47.000 Women don't want to work.
01:37:48.000 No, no, no.
01:37:48.000 She's saying we should do the Wednesday show, but the issue is I don't know if we have anybody here.
01:37:54.000 Yeah.
01:37:55.000 I mean, I don't like taking shows off, but Ian and Luke are going to be gone.
01:37:59.000 We can just have you monologue like the old days.
01:38:04.000 We can just turn the camera on and eat turkey and you can watch us eat turkey.
01:38:08.000 Bring on a member of the members area.
01:38:09.000 Well, I mean, I think Mary, you're here.
01:38:12.000 Yeah.
01:38:12.000 Hannah Montana, she's also here.
01:38:14.000 We can grab a couple people, we'll figure something out.
01:38:17.000 But if we don't do the show, that means Tuesday is 666, and we have an awesome guest on Tuesday, which I'm excited about having for 666.
01:38:22.000 That'd be worth it.
01:38:23.000 I think save it.
01:38:24.000 But that means, like, I don't like not working for three days, because it'd make me lose my mind.
01:38:29.000 True.
01:38:30.000 I'm going to be sitting there on Wednesday, like, shaking, like, what's happening?
01:38:33.000 Need to talk about stuff.
01:38:35.000 And I'll start talking to the cat.
01:38:37.000 I'll be like, Boko, let me explain to you what's going on.
01:38:39.000 You already do that.
01:38:39.000 You can have him.
01:38:40.000 No, no, like I'll start monologuing to him, like I need to tell someone.
01:38:42.000 I'll look into his eyes and it'll travel into hell and then all the demons will be watching.
01:38:47.000 Alright, let's read some more.
01:38:49.000 Alright, Bobby says, Elon's remarks on Alex Jones are inexcusable.
01:38:52.000 Let's stop acting like Elon fanboys.
01:38:55.000 You know, there's the question there, right?
01:38:59.000 Take the win.
01:39:00.000 I agree with you on Alex Jones.
01:39:01.000 I think it's BS.
01:39:03.000 And Darren said the same thing.
01:39:04.000 He should not have gone there.
01:39:05.000 He should not have said it.
01:39:06.000 What are you gonna do?
01:39:07.000 Are we gonna be like, well, let's forego the entire victory we have on all of these things.
01:39:11.000 Sargon being back on Twitter, of all people, you know?
01:39:13.000 Nah, I'll take the win.
01:39:13.000 I'll take the win.
01:39:15.000 I will let Elon know I find his statements incorrect, morally wrong, and objectionable.
01:39:23.000 And then what are you gonna do?
01:39:24.000 You know?
01:39:25.000 We're better off.
01:39:26.000 I mean, he's accusing Alex Jones of weaponizing the death and trauma of children and their families, yet he himself is using his own trauma of the death of his child as a bludgeon.
01:39:39.000 Well, I guess—you know, I understand what he's saying, right?
01:39:42.000 He's like, Alex Jones exploited this, he suffered it.
01:39:45.000 You know what I mean?
01:39:47.000 But either way, it's just—it's an emotional thing.
01:39:50.000 I just don't agree.
01:39:51.000 Raymond G. Stanley Jr.
01:39:52.000 says, tell me you're from the city without telling me you're from the city.
01:39:56.000 Tim, a man can kill a bear by jamming his arm down a bear's throat.
01:40:00.000 I didn't mean that literally, Raymond.
01:40:02.000 I was joking.
01:40:03.000 I made a point early where I was like, if a guy's gonna go fight a bear and kill him with his own hands by like punching him in the throat and holding his arm causing the bear to choke and then killing it, I didn't literally mean someone would do that.
01:40:15.000 My point was that a man is not going to defeat a bear with his bare hands.
01:40:19.000 Like, there are some stories of it happening, but it's never like a grizzly or anything.
01:40:23.000 No, it's always a black bear.
01:40:24.000 I'm talking about, the point I was trying to make is, the grand story of the man, of David versus Goliath, is pointless if there's no family.
01:40:31.000 Like, a dude does not need to kill the emperor stag of the forest that weighs hundreds of, you know, thousands of pounds or whatever, because what is he going to do with it?
01:40:40.000 He's going to be like, well, it's dead.
01:40:41.000 No, the conquest is always in support or saving of someone else.
01:40:47.000 He might be revitalized to code for another few hours if there's an attractive woman working in the HR department.
01:40:56.000 I think that's true, actually.
01:40:58.000 I'd be willing to bet that.
01:41:02.000 I think this is fairly obvious, okay, that if you take two guys and ask them to, you know,
01:41:07.000 work out and we're going to track how many reps they can do until they have to stop,
01:41:11.000 I bet if you took two guys and put them in a gym and said, do as many lifts as you can until your
01:41:17.000 arms are too sore to move, if you then brought a woman in, a very attractive one, they'd probably
01:41:21.000 start lifting again.
01:41:22.000 They'd like, they'd find, but for real though.
01:41:24.000 A second wind.
01:41:25.000 And it's not a statement on like magic or anything, it's a statement on just what human beings are.
01:41:30.000 The guy's gonna be like, I'm gonna do it, you know, like give him a reason.
01:41:33.000 Give him a physical reason.
01:41:36.000 More importantly, I've read those stories about women who have lifted cars off of their kids.
01:41:40.000 Oh yeah, yeah, yeah.
01:41:41.000 Like, we know that humans do things for other humans.
01:41:43.000 This is the way, that's cool actually.
01:41:46.000 Reading the story, like a kid got hit by a car, and the mom, she's like 5'5", she like lifts the car up, and like tears her muscles, but she doesn't care.
01:41:52.000 She's like, I will save you!
01:41:53.000 Yeah, she like compressed like vertebrae in her back, right?
01:41:56.000 Wow, that's cool.
01:41:58.000 I remember watching a story about a guy who got crushed by a boulder, and then he lifted the boulder off of him, and it was like, it was like 700, I don't know how heavy it was, but doing so tore his muscles.
01:42:09.000 He like put so much strain into doing it, and they were saying like, your muscles actually have five times more lifting capacity, but they destroy themselves in doing it.
01:42:19.000 So that's why they're, like, limited.
01:42:20.000 So, you know, we're actually pretty strong.
01:42:23.000 Stronger than we know.
01:42:25.000 Everybody was super chatting.
01:42:26.000 Carl is back on Twitter.
01:42:28.000 I'm glad we didn't notice because we made a lot of money off people trying to tell us.
01:42:32.000 But thank you for the super chats.
01:42:33.000 I appreciate it.
01:42:34.000 I'm stoked to see Carl on, man.
01:42:35.000 I'm really excited for that.
01:42:37.000 He's amazing.
01:42:37.000 It's great.
01:42:38.000 Lotus Eaters podcast is fantastic.
01:42:39.000 Yeah, it is.
01:42:41.000 Biko says, is Timcast interested in hiring pop culture writers?
01:42:45.000 If so, what would be the best way to apply?
01:42:47.000 I don't know and I don't know.
01:42:49.000 I guess Mary's in charge of that.
01:42:51.000 Yeah, I'm in charge.
01:42:52.000 You should DM me on Twitter.
01:42:54.000 There you go.
01:42:55.000 Yeah, you know, she's on pop culture crisis.
01:42:58.000 That's their purview.
01:43:02.000 All right.
01:43:02.000 AK Storm says, Tim, please ask Mary about the BDSM teddy bears marketed to children she covered on Pop Culture Crisis today.
01:43:09.000 Truly disturbing.
01:43:10.000 We'll talk about that.
01:43:11.000 Yeah, we'll talk about the after show.
01:43:11.000 That's for the after show.
01:43:12.000 We're gonna go off.
01:43:14.000 Yeah, we'll make it.
01:43:15.000 We'll get right into it.
01:43:16.000 That'll be really, really good.
01:43:18.000 Alright, Pinochet's Helicopter Tour says, well Tim, Google really didn't like your 4 o'clock video, and I can confirm it.
01:43:25.000 I saw that and I was like, uh oh, what does that mean?
01:43:27.000 Well, the good news is, the video I put up at 4pm is that the Arizona Assistant AG is refusing to certify the election.
01:43:35.000 This is true, it's in the news, it's happening.
01:43:37.000 It is, uh, it's doing really well, people are able to get it, but it is demonetized, so, uh, there you go.
01:43:43.000 Which means I'll make, probably, I don't know, 20 bucks off of that video for the day.
01:43:52.000 So, you know, it is what it is.
01:43:53.000 I'll take 20 bucks, you know what I mean?
01:43:55.000 But that's like erasing the revenue off of it, you know?
01:43:59.000 You're making 20 bucks?
01:44:01.000 I want 20 bucks!
01:44:02.000 I'm not even getting that!
01:44:03.000 This is why we shifted focus to doing TimCast memberships for the company, because you've got activists trying to take your ads away, and because YouTube's trying to take your ads away.
01:44:14.000 So I also, this is a thing too, I've picked up, I started doing Established Titles as a sponsorship on the TimCast channel, which I normally don't do ad reads on, because they're demonetizing everything.
01:44:26.000 So I'm like, okay, let's play that game, demonetize, shadowban, whatever. People who wanna watch my content are gonna watch it; I'll sell my own ad if you're not going to run ads against it.
01:44:35.000 And so that makes up the difference.
01:44:36.000 So we will always find a way.
01:44:39.000 We're not going to let censorship get us down.
01:44:42.000 The One Freeman says, odd how all tech platforms laying off thousands of people in unison.
01:44:47.000 Elon said Twitter was hemorrhaging $4 million a day.
01:44:50.000 What was keeping them afloat until Elon burst the bubble?
01:44:54.000 Was someone subsidizing them in exchange for censorship?
01:44:57.000 No, it's that a bunch of advertisers pulled off the platform when Elon moved in because they're biased lunatics.
01:45:04.000 And so then they start losing money.
01:45:07.000 However, I do think something interesting is happening in terms of all these layoffs because we're seeing a bunch of ad buyers say that they're cutting down on sponsorships and marketing firms are saying there's less money to go around.
01:45:20.000 So the reporting that I heard is that next year there's going to be a major economic downturn.
01:45:26.000 Where'd you hear that?
01:45:27.000 I just scuttlebutt on Twitter.
01:45:30.000 I saw that really funny Twitter account.
01:45:32.000 It was InverseKramer.
01:45:34.000 You ever see that one?
01:45:35.000 Oh yeah, I have seen that.
01:45:36.000 Whatever Jim Cramer says, do the opposite.
01:45:38.000 Yeah, yeah, yeah.
01:45:38.000 And then they show everything he was wrong about.
01:45:41.000 And they're like, if you do the opposite of what he says, you get rich.
01:45:43.000 It's like inverse Cramer is like George Costanza when he decided to do everything opposite.
01:45:49.000 Fenix Ammunition says, we've sent several appeals to Twitter to be reinstated and so far nothing.
01:45:53.000 We never actually broke a rule in the first place.
01:45:55.000 Elon, please help.
01:45:57.000 Y'all guys should tweet about Fenix Ammunition because they were unjustly banned from Twitter.
01:46:03.000 Were they?
01:46:04.000 Yeah.
01:46:04.000 They're cool guys.
01:46:06.000 Yeah.
01:46:06.000 They do good stuff.
01:46:08.000 They're the ones who did the website where, when you wanted to buy ammo, it asked you if you voted for Joe Biden, and if you put yes, it kicks you out of the website.
01:46:15.000 I think it sent you to, like... his gun control page or something.
01:46:20.000 He's like, we don't need your money.
01:46:22.000 That's a Phoenix with an F-E-N-I-X.
01:46:24.000 Yeah, F-E-N-I-X.
01:46:25.000 Good guys.
01:46:25.000 FireBurnsPeople says, went to start watching TimCast IRL tonight and it was not in my subscription tab or on the channel page. Found it on my home page under the live tab.
01:46:34.000 Isn't that something?
01:46:37.000 Ah.
01:46:37.000 Isn't that something, huh?
01:46:39.000 The same thing happened to me.
01:46:40.000 I was going on my phone to the channel and I couldn't find it.
01:46:44.000 That happens a lot.
01:46:46.000 Man, I gotta tell you, if you guys didn't like watching the show, we would have been annihilated by YouTube censorship a long time ago.
01:46:54.000 If we were like Lex Fridman, as Darren described it, we're the inverse of Lex Fridman.
01:47:00.000 Like, the way he describes it is people are just stumbling.
01:47:03.000 Maybe that's what YouTube's doing.
01:47:04.000 They're like, get everybody to watch.
01:47:05.000 Everybody who watches TimCast IRL, go watch Lex Fridman instead.
01:47:09.000 So we're getting punished in the algorithm, but people are like, YouTube, stop.
01:47:11.000 I want to watch this show.
01:47:13.000 And then meanwhile, people are waking up and they're watching Lex.
01:47:15.000 I don't know if that's actually true.
01:47:17.000 I got no beef with Lex.
01:47:18.000 It's not just speculation.
01:47:19.000 No, it's nothing against Lex, but there were conversations that our friend Susan of YouTube was having, and basically people were criticizing her for not doing a full-on ban of Ben Shapiro.
01:47:36.000 And her defense of that was saying that we've run a lot of studies on it, and they've shown that Ben Shapiro is actually a very effective stopping point.
01:47:47.000 That is to say, he serves a very important de-radicalization function.
01:47:53.000 And for that reason, I think there's a real utility to the Susan Wozniacki, whatever, WoJackie, Wojcinski, WoJack Susan.
01:48:06.000 But there's a real utility to the censors to say that, look, these people are an off-ramp.
01:48:13.000 They're de-radicalizers.
01:48:15.000 They'll go and they'll watch Lex Fridman.
01:48:18.000 And in many cases, and again, nothing against these people personally, but there is a kind of fool's trade whereby a certain type of talking head will earn your trust by saying controversial things like, boys have penises and girls have vaginas.
01:48:36.000 And in exchange for that trust, will shove orthodoxies on everything else down your throat.
01:48:42.000 And that is a very important de-radicalization effort, and there's a whole cluster of people who may fit that description.
01:48:49.000 And I think it's fair to say that the Lex phenomenon may be adjacent to all of that. Or Ben Shapiro, you know, telling people to
01:49:02.000 take medical procedures.
01:49:03.000 Also, Ben Shapiro gets a lot of his traffic from Facebook, overwhelmingly a lot.
01:49:07.000 Well, he has the Daily Wire. I've heard...
01:49:10.000 If you look at the number one shared kind of articles on Facebook, it's usually Daily Wire.
01:49:14.000 No, it's, again, Daily Wire is the Lex Fridman of Facebook.
01:49:18.000 And it's not accidental.
01:49:23.000 I've heard, I'm not going to say for a fact, but I've heard on good authority, Shapiro has a great relationship with Zuckerberg, and they've had the great relationship for a long time.
01:49:32.000 And again, there's a utility to having Ben Shapiro, who's very reliable on the key things, being basically as right-wing as you can be, and not punished by the algorithms.
01:49:45.000 There's utility to that in terms of how the larger conversation is controlled.
01:49:48.000 Well, speaking of that, we've got Chris Toast who says, Show is hidden from my feed on both Android and desktop.
01:49:55.000 Oh, that's amazing.
01:49:55.000 Then we have, uh, what is this right here?
01:49:58.000 Sergeant Wolf says, Hey Tim and crew, I figured I'd chat and let you know that the show is not in my feed, not on my home screen or subscribed uploaded feed.
01:50:06.000 Had to go to the channel and find it there.
01:50:09.000 Really interesting.
01:50:10.000 We're getting more messages than normal, so share the URL on every single platform.
01:50:15.000 Twitter, how about that?
01:50:16.000 Post it to Twitter where you're allowed to make jokes again.
01:50:20.000 And hopefully, the hope I have is that if enough people who watch this video share the URL every time it goes live, no amount of censorship will stop a natural phenomenon of people saying, check this show out.
01:50:32.000 Because clearly they're trying to stop us.
01:50:35.000 You know, part of me is like, oh, you know what, if we finally get banned, I can go take my van down by the river and just go fishing.
01:50:40.000 Not have to worry about it.
01:50:42.000 But, you know.
01:50:43.000 For the time being, we're here to fight the culture war, and it looks like we're winning.
01:50:46.000 So, there's gonna be the death rattle, there's gonna be the panic attacks, YouTube's gonna lose its mind.
01:50:52.000 They're not happy about it.
01:50:53.000 The reason why, say, Elon Musk buys Twitter,
01:50:57.000 The reason why these changes are happening is because we have not stopped pushing back and demanding free speech, calling out the lies in the machine, and they wish we would just roll over.
01:51:06.000 So.
01:51:08.000 It's a little thing we can do.
01:51:09.000 But I say the same thing of Steven Crowder.
01:51:11.000 You know, they keep giving him strikes on YouTube, watch his show on Rumble, watch Lotus Eaters, watch The Quartering, and then just share all the content.
01:51:20.000 It's the most powerful thing you can do.
01:51:21.000 They can't censor it if everyone just keeps sharing.
01:51:23.000 It's like whack-a-mole.
01:51:24.000 They can't do anything about it.
01:51:25.000 I think that's one of the reasons I still exist.
01:51:27.000 You know?
01:51:28.000 Dealing with all the crap.
01:51:30.000 I think it's viral marketing.
01:51:31.000 We don't even do any regular marketing.
01:51:33.000 It's you guys sharing the videos.
01:51:35.000 Yeah, YouTube's not recommending us.
01:51:38.000 YouTube recommends our videos to only people who are already in that bubble.
01:51:43.000 People still tell me they get unsubscribed from the channel when they never unsubscribed.
01:51:46.000 So, again, lots of crazy things happening here.
01:51:50.000 The Ass says, hey Lucas, I bought a shirt from your website, never got a return email.
01:51:54.000 I messaged you via Instagram, so help me out.
01:51:56.000 Also love the work that y'all do.
01:51:58.000 Yeah, you should be able to message the company that you bought it from and they'll send you a tracking number.
01:52:02.000 And if you have any other problems, sometimes, you know, rarely this happens, I get a message from someone saying that the graphics weren't that good, and you can get a new t-shirt right away after emailing the company that, of course, did all the processing.
01:52:14.000 So if you have any problems with quality or shipping, reach out to either Teespring or another company we work with and they usually solve all your problems right away.
01:52:23.000 Ian Kinney says Kanye was hanging out with Milo Yiannopoulos, said he's running for president in 2024, and Milo will be his campaign manager.
01:52:31.000 I saw that.
01:52:31.000 Is that true?
01:52:32.000 I saw the video.
01:52:33.000 There's an actual video of a random guy filming outside, and Kanye's like, yay, he's like, hey, come on inside!
01:52:40.000 He's like, Milo's there, and that's what Kanye said.
01:52:45.000 He's running.
01:52:46.000 Yeah, yay 2024.
01:52:47.000 All right.
01:52:52.000 Okay, let's see.
01:52:53.000 What is this one?
01:52:57.000 Tackty Platty says, Tim, Mary is a far better co-host than Ian for this show as well.
01:53:03.000 Not cringe half the time and doesn't say upsetting things.
01:53:06.000 Please have her on more.
01:53:07.000 Upsetting things?
01:53:07.000 What does Ian say that upsets you so much?
01:53:10.000 Hey, Mary is great, but so is Ian.
01:53:12.000 Ian, I think, adds another component to the show that is very rare and is awesome and I think is needed.
01:53:18.000 Like pissing people off, upsetting them.
01:53:20.000 Well, I think people need to do that.
01:53:23.000 No, I get it.
01:53:25.000 It shouldn't be a circle of jerks patting each other on the back.
01:53:31.000 Yeah, we're not here for that.
01:53:34.000 We want to hear opinions that are not ours, and Ian provides that, and he allows us to have conversations.
01:53:39.000 I can disagree with you more if that's what you want.
01:53:41.000 Yeah, sure.
01:53:42.000 No, but he's being real.
01:53:43.000 He's being, you know, I believe he's being genuine.
01:53:46.000 There's several instances where Ian has brought up very bad points, and also where he's brought up very good points.
01:53:51.000 When we were talking about the lockdowns and stuff, and why we didn't think the government should be able to lock things down, he said, what about an airborne Ebola?
01:53:57.000 Like, where's the line?
01:53:59.000 And I was like, okay, like, let's entertain, like, let's talk about that, because we're all very much opposed to government lockdowns, but then, and then we sort those things through.
01:54:06.000 If we all just agree with each other, your ideas aren't strengthened.
01:54:10.000 Like, you actually need someone to contradict, even if it's not always a good argument.
01:54:15.000 But the other thing too is, well, I guess that's it.
01:54:19.000 Simply put, right?
01:54:20.000 You need some kind of challenge.
01:54:23.000 I'll add one more thing to it.
01:54:25.000 Some people comment, they're like, Ian doesn't know about this stuff, why is he on the show?
01:54:28.000 And it's like, that was always the reason why I asked Ian to be on the show.
01:54:32.000 I was like, we need someone who's gonna be like, what is that?
01:54:35.000 I don't know what that is.
01:54:36.000 So to give us an opportunity to explain it, because when we started this show, I assumed most people would not have the deep political knowledge that we do.
01:54:47.000 And we don't know everything either.
01:54:48.000 But the point was to have someone who's going to be like, what do you mean Joe Biden did that?
01:54:52.000 Oh, let me explain.
01:54:54.000 Because the assumption is if Ian, who's not overtly political, doesn't know, there's a lot of people at home who don't, and we want to make it more accessible.
01:55:00.000 But again, Ian and those semantic arguments.
01:55:02.000 Ian, come on, buddy.
01:55:04.000 Semantic arguments, they don't go anywhere.
01:55:06.000 You guys are both great.
01:55:07.000 We did a live show, and then, you know, we had, like, Ian was there, we were jamming, and then someone asked something about Ian, and then I said, you know, Ian comes up with really great points often, we often disagree, but those semantic arguments, and then everyone in the crowd started clapping and cheering, and Ian was laughing.
01:55:24.000 I probably disagree with Ian about 99% of things.
01:55:28.000 Yeah, but everyone in the chat knows Ian rolls 20s.
01:55:31.000 It happens.
01:55:32.000 Yeah, he's pro-death penalty and he's pro-choice.
01:55:35.000 Yeah, wild.
01:55:36.000 And pro-acid.
01:55:37.000 It's like a weird combination of things, you know?
01:55:39.000 It's wild.
01:55:39.000 Like typically liberals, like I'm not saying Ian's a liberal, but liberals are like anti-death penalty, pro-choice, but Ian's pro-choice and pro-death penalty.
01:55:47.000 At least he's consistent there.
01:55:49.000 Yeah, right?
01:55:49.000 You know, it's all right, man.
01:55:53.000 Yeah.
01:55:53.000 All right.
01:55:53.000 Let's see.
01:55:54.000 Fair Frozen says, the men are covered in hydraulic mud, a liquid prepared with soil, water, and glycol and other aggregates used to inject into the drill.
01:56:03.000 It acts as a barrier between gas escaping from the well tap.
01:56:06.000 Oh, that's right.
01:56:06.000 I remember that.
01:56:07.000 Interesting.
01:56:07.000 Yeah.
01:56:08.000 Yeah.
01:56:10.000 All right.
01:56:11.000 Noah Zork says, are there non-farting rooms?
01:56:15.000 We should set up a fart booth in the basement.
01:56:18.000 And it's just like, if you have to fart, you got to go in the booth.
01:56:20.000 That'd be hilarious.
01:56:21.000 And it's just got, like, an air freshener, like, fan going.
01:56:25.000 We'll call it the fart booth.
01:56:26.000 Most air fresheners are scams.
01:56:29.000 What?
01:56:29.000 What does that mean?
01:56:30.000 Do they give you cancer or something?
01:56:31.000 They, like, mess up your endocrine system.
01:56:34.000 Yeah, I've heard that before.
01:56:36.000 They're horrible.
01:56:36.000 Every time I see it, just trash right away.
01:56:39.000 Especially those little trees.
01:56:40.000 I think those are the ones that come out.
01:56:41.000 Especially, like, Febreze and all these other stuff.
01:56:44.000 Horrible for you.
01:56:45.000 Absolutely destroys, like, your physical body.
01:56:49.000 So, what is this?
01:56:51.000 Takfuchi says Apple deleted all their tweets and pulled ads.
01:56:56.000 Head of the App Store deleted their account.
01:56:57.000 They might pull the Twitter app.
01:56:59.000 This as Elon finally removes exploitation from the platform.
01:57:04.000 Yup.
01:57:05.000 Did Apple really remove all their tweets?
01:57:07.000 Before Steve Jobs died, I know that Apple was good about preventing child exploitation on the App Store.
01:57:16.000 But then ever since then, it's been less enforced.
01:57:22.000 Hermes Bird says this guest sounds exactly like DeSantis.
01:57:25.000 It's uncanny.
01:57:26.000 Do people tell you that, Darren?
01:57:28.000 That you sound like Ron DeSantis?
01:57:30.000 I've never heard that before.
01:57:31.000 When I saw that, I was like, yeah, a little bit.
01:57:34.000 Like, I can hear something.
01:57:36.000 Aren't you in Florida right now?
01:57:38.000 Miami.
01:57:38.000 Oh, yeah.
01:57:39.000 We were talking about it before the show.
01:57:40.000 That's correct.
01:57:40.000 I think they just mean I'm very tired.
01:57:43.000 Ah, yes.
01:57:43.000 That explains it.
01:57:46.000 People are mentioning Balenciaga?
01:57:48.000 Is that what it's called?
01:57:49.000 Balenciaga.
01:57:50.000 Balenciaga!
01:57:51.000 There you go, there you go.
01:57:52.000 Is it C or G?
01:57:54.000 What?
01:57:54.000 I think in Italian the C is a C sound, right?
01:57:56.000 It's not an Italian brand.
01:57:58.000 It was founded by a Spanish man and it's headquartered in France.
01:58:03.000 Oh!
01:58:04.000 So if it's French then it's probably just B. It's not French though, it's Spanish.
01:58:10.000 Have you ever seen that viral video about the French language where it's like a ta ta ta ta ta ta ta ta ta ta ta
01:58:15.000 ta ta?
01:58:16.000 Yeah, because they were like all these words are the same thing. It's like ta, ta, and they all mean something
01:58:21.000 different. It's like, okay. I guess it's just a tongue twister though, you know what I mean? Like, we have those in
01:58:26.000 English. We do. We have the same. Yeah.
01:58:28.000 Rhino, uh, Rhino Batha says... Bota, Bota.
01:58:33.000 Bota, is that what it says?
01:58:34.000 Yeah, it's an Afrikaans name.
01:58:35.000 Oh, there you go.
01:58:36.000 He says, good morning from South Africa, where I always find your shows in my feed.
01:58:39.000 Hey!
01:58:40.000 That's cool.
01:58:41.000 Interesting.
01:58:41.000 What's up, Randit?
01:58:43.000 Jason Lippert says, I'm in Canada, have no issues finding show.
01:58:46.000 Do you need a tissue there?
01:58:47.000 Do you speak Afrikaans?
01:58:49.000 I do, for the most part, but my parents will say I don't, but... Oh, really?
01:58:54.000 Yeah.
01:58:54.000 What is it like Dutch and English on them?
01:58:56.000 It's basically Dutch.
01:58:57.000 It's like more like old Dutch because we didn't get influence from like the Spanish taking over Holland and then, you know, creating the masculine feminine in Dutch now.
01:59:05.000 But essentially it's like old Dutch.
01:59:06.000 Yeah.
01:59:07.000 Little known.
01:59:08.000 Well, how about that?
01:59:10.000 All right.
01:59:11.000 Nate B says, how do you know you aren't a stopping point, Tim?
01:59:15.000 I think we are.
01:59:15.000 I agree.
01:59:16.000 I've talked about this too.
01:59:18.000 I've been saying for a while that the reason YouTube probably tolerates us more than other shows is because they view us as a de-radicalization force or something.
01:59:28.000 Or the way I put it is, YouTube wants to ban the right.
01:59:33.000 But they know that if they do, the right will go somewhere else.
01:59:36.000 So they need to allow certain channels to stay on the platform, so that if they ban half of them, the users will stay.
01:59:44.000 And then they force them into this particular ideological bubble.
01:59:48.000 I've long talked about that.
01:59:50.000 It's not going to change my opinions on things, I guess.
01:59:52.000 But I will say this.
01:59:54.000 Smash that like button, subscribe to this channel, share this show with your friends.
01:59:57.000 Become a member at TimCast.com because we're going to talk about this creepy story with, uh, was it Balenciaga?
02:00:03.000 Yeah, Balenciaga has BDSM teddy bears in their new ad campaign with children.
02:00:11.000 And court documents pertaining to child exploitation.
02:00:13.000 Really weird.
02:00:14.000 We'll talk about that over at TimCast.com.
02:00:16.000 So become a member.
02:00:18.000 We usually start recording right after we wrap, and then we upload as soon as we're done.
02:00:22.000 You can follow the show at TimCast IRL.
02:00:25.000 You can follow me at TimCast.
02:00:27.000 Smash the like button.
02:00:28.000 And Darren, do you want to shout anything out?
02:00:30.000 Revolver.news, check it out.
02:00:32.000 Very, very hot.
02:00:34.000 Tether piece, the next FTX, but bigger.
02:00:37.000 A lot of people are talking about it, so check it out.
02:00:41.000 Are you on Twitter?
02:00:42.000 I am on Twitter, at Darren J. Beattie.
02:00:46.000 You can find pictures of me on Instagram at Mary Archived.
02:00:51.000 You can read my inane thoughts on Twitter also at Mary Archived.
02:00:56.000 And you can subscribe to Pop Culture Crisis on YouTube if you're interested in us talking about entertainment, celebrities, movies, etc.
02:01:06.000 over there.
02:01:07.000 Not political, but more fun.
02:01:10.000 So go subscribe over there.
02:01:12.000 When you super chat on Pop Culture Crisis, money guns fire money into the air.
02:01:16.000 Yeah, it doesn't faze me anymore.
02:01:17.000 I just barely even notice it, but it scares the guests a lot.
02:01:22.000 That's cool, that's great.
02:01:23.000 She's shell-shocked.
02:01:25.000 You'll be okay.
02:01:26.000 I'll never be the same.
02:01:27.000 This is awesome.
02:01:27.000 Thank you so much for coming.
02:01:28.000 I always appreciate when you're on.
02:01:30.000 My YouTube channel is youtube.com forward slash WeAreChange.
02:01:33.000 I'm getting absolutely screwed over there, but I do work very hard.
02:01:36.000 I just got a new video out there about all the craziness in the world.
02:01:40.000 WeAreChange on YouTube and Elon Musk is promising a video platform on Twitter.
02:01:46.000 LukeWeAreChange on Twitter.
02:01:49.000 We need a tissue first.
02:01:50.000 We will see you all over at TimCast.com.