Timcast IRL - Tim Pool - November 30, 2022


Timcast IRL - Trump REFUSES To Denounce Fuentes, Tim Has New Details About Ye Show w/ Michael Malice


Episode Stats

Length

2 hours and 2 minutes

Words per Minute

214.2

Word Count

26,152

Sentence Count

2,105

Misogynist Sentences

24

Hate Speech Sentences

33


Summary

On this episode of Timcast IRL, Tim and the crew break down the abrupt exit of Ye, Fuentes, and Milo from the previous night's show, the stories about Trump refusing to denounce Fuentes, the Balenciaga ad controversy, censorship, and Elon Musk. Guest: Michael Malice.


Transcript

00:00:00.000 You know, to be completely honest, uh, what is this?
00:00:24.000 Luke's already playing audio.
00:00:25.000 All right, to be completely honest, I don't like being the subject of the news story.
00:00:29.000 As much as people try to claim that, they're like, oh, he was trying to get attention or whatever.
00:00:33.000 You know, we wanted to have, we had an opportunity, we thought we were lucky to have several people on the show who were in the news, because even right now, Mitch McConnell's coming out, you've got Piers Morgan coming out, stories about Donald Trump refusing to denounce Nick Fuentes because he doesn't want to alienate voters.
00:00:48.000 This is the dominant story.
00:00:49.000 It's unfortunate we weren't able to actually talk about the news, but we have a lot of details to go through.
00:00:53.000 Stuff I talked about in the morning, new details that have emerged now, that they had a private plane ready for them after they left the show, which, it's entirely possible they were able to get a private plane very, very quickly, but it also seems very, very rare and unlikely.
00:01:11.000 But again, I don't want to accuse them of anything.
00:01:13.000 I think it's possible that they had this planned.
00:01:16.000 They've been adamant they didn't plan this.
00:01:18.000 And I think it worked out very well for somebody who wanted revenge on Trump, considering the news cycle.
00:01:22.000 So we'll talk about that, plus what's been going on outside of that.
00:01:25.000 And of course, we'll talk about censorship and Elon Musk.
00:01:27.000 Before we get started, Head over to TimCast.com, become a member to support our work.
00:01:31.000 We will have a members-only uncensored show for you tonight.
00:01:33.000 We didn't have one yesterday because, you know, we had Ye on and he left.
00:01:37.000 And, uh, I'll say it again.
00:01:38.000 Dude doesn't owe me anything.
00:01:39.000 If he wants to leave, so be it.
00:01:40.000 Nobody has to stick around on this show.
00:01:41.000 They can always bail on me.
00:01:42.000 That's the way it goes.
00:01:44.000 Dude's a powerful billionaire, or former billionaire, whatever you want to call him.
00:01:47.000 The last thing he needs is to sit here and have me talk about him or whatever.
00:01:50.000 So, smash that like button, subscribe to this channel, share the show with your friends.
00:01:54.000 Joining us to apologize is our good friend.
00:01:58.000 I'll just throw it to you.
00:02:00.000 Tim, I'm sorry I left the show yesterday.
00:02:02.000 It was a complete apology, and I was very wrong, and I'll be back whenever you like.
00:02:08.000 Vote for me in 2024.
00:02:09.000 Okay.
00:02:10.000 How was that?
00:02:11.000 It was okay, I guess.
00:02:12.000 Heartfelt.
00:02:13.000 Well, I didn't want to be too, you know... Edgy?
00:02:15.000 No, because then I wouldn't want to steal roles from black actors.
00:02:18.000 Oh, yeah, yeah, yeah.
00:02:20.000 I don't want to get in the Hulapu situation.
00:02:21.000 Oh, right, right.
00:02:22.000 So you're Michael Malice.
00:02:23.000 I am Michael Malice.
00:02:24.000 Yes.
00:02:25.000 Shalom!
00:02:26.000 I'm here representing Zog.
00:02:28.000 Okay.
00:02:28.000 What do you do?
00:02:30.000 Thanks for coming, man.
00:02:31.000 I think most people know who you are.
00:02:32.000 Oh, okay.
00:02:36.000 What's your biggest claim to fame?
00:02:38.000 I am the organizer of the Anarchist Handbook.
00:02:40.000 You guys can get the hardcovers at anarchisthandbook.bigcartel.com.
00:02:45.000 And I was on a billboard because of you guys.
00:02:48.000 That's right.
00:02:48.000 In Times Square.
00:02:49.000 And my next book... Oh, this is the big news before we get into this stuff.
00:02:52.000 I finished my next book, The White Pill, yesterday.
00:02:56.000 I'm uploading it to Amazon Thursday, I think.
00:02:59.000 And we're going to launch it live on Timcast and see if we can get a book to number one on Amazon live and make internet history.
00:03:07.000 Are you cool?
00:03:08.000 You're self-publishing?
00:03:09.000 Yeah.
00:03:09.000 That's awesome.
00:03:11.000 Make Michael Malice the number one Amazon seller.
00:03:14.000 I hit number three.
00:03:15.000 This hit number three, so hopefully we can hit number one.
00:03:18.000 Very cool, man.
00:03:18.000 Well, how do you describe yourself for those who aren't familiar with you?
00:03:21.000 I am an anarchist without adjectives.
00:03:24.000 I have a podcast called You're Welcome, Twitter douche, and lover of all peoples.
00:03:32.000 Well, okay.
00:03:33.000 We got Luke hanging out.
00:03:33.000 Right on.
00:03:34.000 I think I'm supposed to say, shalom, assalamu alaikum, right?
00:03:37.000 I think that's what I'm supposed to say there.
00:03:39.000 Hi there, my name is Luke Rudkowski here of WeAreChange.org, and today I have a t-shirt of Klaus Schwab.
00:03:45.000 I thought I was supposed to be the Jewish one.
00:03:50.000 Of Klaus Schwab saying that you will get no presents and you will be happy, which you could exclusively get on TheBestPoliticalShirts.com.
00:03:58.000 Because you guys do that, that's one of the main reasons why I am here.
00:04:02.000 Thank you again so much for supporting me, TheBestPoliticalShirts.com.
00:04:05.000 Ian, I'm sorry you're perturbed.
00:04:06.000 Your crystals have been disturbed.
00:04:08.000 I told Nick, I was like, hey man, they have LSD on them.
00:04:11.000 And then he was like joking, and then like five minutes later he was like, Really?
00:04:16.000 I was like, yeah.
00:04:18.000 But then Milo came down and he just like got high on LSD.
00:04:22.000 Milo?
00:04:23.000 You like LSD, Milo?
00:04:25.000 No, I think your brain is broken.
00:04:26.000 I like those guys.
00:04:27.000 That was a chaotic situation last night.
00:04:28.000 I hope that we get a chance to talk to them in the future and kind of figure out, you know, like I was saying last night, they're like, Ian, would you have them back on?
00:04:34.000 And I'm like, yeah, I'm the kind of guy where they're like, why would you interview the devil, Crossland?
00:04:38.000 And I'm like, because people keep saying he's evil.
00:04:39.000 I want to know why.
00:04:40.000 He's also... the devil's gonna be pretty... Well, let's talk about this.
00:04:43.000 This is actually a good thing to get into.
00:04:45.000 Like, hosting them, what people were saying, your thoughts and everything.
00:04:48.000 So, we'll get into it.
00:04:48.000 And before we do, I want to point out, yes, you're welcome.
00:04:51.000 Michael Malice Show.
00:04:52.000 I don't know, YouTube?
00:04:53.000 And do you run it through any portals on your website or anything at this point?
00:04:56.000 No, just it's on YouTube.
00:04:57.000 It's on Spotify.
00:04:58.000 It's on Rumble.
00:04:59.000 It's on Odysee.
00:05:00.000 I was on an episode.
00:05:01.000 If you guys haven't seen it, it was excellent.
00:05:02.000 Michael's a great interviewer.
00:05:03.000 It was a lot of fun.
00:05:04.000 Very fun.
00:05:04.000 And I haven't seen you since then, so.
00:05:05.000 That's true.
00:05:06.000 Good to see you, buddy.
00:05:06.000 You look good.
00:05:07.000 Thank you.
00:05:08.000 We got Search hanging out.
00:05:09.000 Hey, low energy Serge Duprio.
00:05:11.000 What's up, everybody?
00:05:12.000 How you doing, YouTube?
00:05:14.000 Take it away, Tim.
00:05:16.000 The worst by far is Serge.
00:05:18.000 Very weak, very low energy.
00:05:21.000 I do want to mention one quick thing, too.
00:05:22.000 I had a phone call with Google today.
00:05:24.000 They're panicking over Section 230 reform.
00:05:27.000 The Supreme Court has agreed to take up a case pertaining to recommendations.
00:05:31.000 And so YouTube is now actively lobbying prominent creators, I suppose.
00:05:36.000 I got an email and it asked me to sign up for a certain date to talk to their head of policy.
00:05:40.000 He was very nice, but I was personally offended at the things that he was saying.
00:05:46.000 So I'm not trying to be mean to the guy.
00:05:47.000 We're going to talk again probably tomorrow.
00:05:49.000 But it seemed like they were trying to lobby me to agree that YouTube should have the right to be politically biased and be immune from defamation, which I absolutely do not agree with.
00:05:59.000 So we'll talk about that too, but we got to get into this stuff.
00:06:01.000 So here's the first story.
00:06:03.000 Many of you may have seen what happened last night on this show, and I have concerns that we had Ye, Fuentes, and Milo on the show.
00:06:12.000 A lot of people messaged me beforehand saying, why are you having them on?
00:06:16.000 They're using you, they have an agenda, and this is funny.
00:06:19.000 Who's they?
00:06:19.000 Who's they?
00:06:20.000 No, no, no, but listen, listen.
00:06:21.000 They say, they say, uh, they as in general people messaging me, they're using you, Tim.
00:06:26.000 They're using you.
00:06:27.000 And I said, for all of them, why do you think people come on this show?
00:06:31.000 Do you think they're coming on because they love me and they want to hold my hand and smile and look at my face?
00:06:35.000 Or do you think they're trying to promote a book?
00:06:37.000 Do you think they're trying to get a message out?
00:06:39.000 They're trying to promote their Twitter accounts.
00:06:40.000 They have ideas they want to share.
00:06:41.000 Of course, everyone's using everybody.
00:06:43.000 I have guests on so we can make an interesting show.
00:06:45.000 It's our business.
00:06:46.000 They come on because it's an opportunity for them to, to share or sell or whatever.
00:06:49.000 I thought we were friends.
00:06:51.000 Think again.
00:06:52.000 We're only having you on because you make us laugh.
00:06:55.000 Okay.
00:06:55.000 I guess I'm a clown.
00:06:58.000 So here's what I want to say because right now there's a lot of stories, you know, Trump is being told that he's got to denounce these guys.
00:07:07.000 There was a story that popped up on Fox 5.
00:07:10.000 Kanye West spotted in Frederick after storming off of podcast.
00:07:13.000 Here's the potential scenario to be fair.
00:07:17.000 The scenario is they abruptly left the show.
00:07:20.000 They were on it.
00:07:21.000 Ye didn't like that I was not agreeing with him or that I was pushing back in any capacity.
00:07:26.000 Got up and stormed off.
00:07:27.000 Milo and Nick, working for Ye, wouldn't stick around and left.
00:07:31.000 Immediately, they called a charter company to schedule a private plane, and they were very, very lucky that a hot crew, on the ground and ready to go, was available nearby and was able to dispatch a plane to Frederick that they could get into within a couple hours' notice.
00:07:47.000 Entirely possible.
00:07:48.000 My personal opinion?
00:07:50.000 That sounds really, really crazy if that's the case.
00:07:53.000 So when they stormed off the show, my first thought was that, like, okay, he stormed off the show.
00:08:00.000 Then we talked about it and I said it was the perfect thing to do if you want to generate press.
00:08:04.000 There's a story going around that Donald Trump met with Nick Fuentes, an anti-Semite white supremacist.
00:08:09.000 That was all last week.
00:08:11.000 First day, first thing this week.
00:08:13.000 We have this show, and I think it was the first thing Nick said on the show was that something about, isn't it really them?
00:08:19.000 And then immediately Ye gets up and walks out.
00:08:21.000 Downstairs, smile on his face, eating cookies.
00:08:24.000 Apparently, one of our guys here said that when they were leaving, he said something like, I came, did the show, got what I needed, now I'm done.
00:08:31.000 I'm leaving.
00:08:32.000 And so that made me wonder about what was the goal.
00:08:35.000 It's possible, maybe I'm just thinking too much into it, that Milo wanted revenge on Trump.
00:08:40.000 Milo was quoted as saying he invited Nick because he knew Trump would mishandle it in the press.
00:08:45.000 He wanted to make Trump's life miserable.
00:08:47.000 Trump is mishandling it in the press.
00:08:49.000 The first thing out of, uh, I believe it was the first thing out of Nick's mouth was, but isn't it them?
00:08:55.000 And then Ye leaves right around the half an hour mark when we're at 100,000 concurrent viewers.
00:09:00.000 And that said to me, they had to have planned this.
00:09:02.000 Then I saw the video of them going to the Frederick airport, boarding what appears to be a super-mid private jet right after the show, within a few hours.
00:09:10.000 And I was like, how did they charter a private jet that fast?
00:09:14.000 It is entirely possible they did.
00:09:16.000 I know where the jet originated, we did some sleuthing to figure it out.
00:09:19.000 What happened was, this is what I'm told, and the experts in private aviation say it is entirely possible: a plane about a half an hour's flight time away flew to Frederick, landed for about an hour, and then departed with them.
00:09:40.000 It is entirely possible that they happened upon what's called a hot crew, that's what they said, meaning there were people on the ground working, ready to take off, but for no reason.
00:09:50.000 The plane was not being used.
00:09:52.000 Ye is a very powerful, wealthy individual.
00:09:54.000 Perhaps he knows somebody and someone has his back.
00:09:57.000 It's interesting, considering everything he's been saying about how they're trying to arrest him and shut him down and silence him and censor him, but apparently he was able to get, within two hours' notice, a super-mid private jet to fly from here to Los Angeles with an active crew.
00:10:12.000 I don't know what happened, but to me it sounds staged.
00:10:15.000 It sounds like they knew in advance they'd be leaving, at least Ye may have.
00:10:19.000 That's not the vibe I got.
00:10:20.000 I got that they seemed very genuine when they were here.
00:10:23.000 I know Milo.
00:10:24.000 I mean, he's a zany dude, but he's looking for redemption.
00:10:29.000 That's his path right now.
00:10:31.000 And Kanye's grasping at straws to try and find someone to help him.
00:10:35.000 And Nick was so kind the entire time he was here.
00:10:38.000 Like, I've heard stuff about him, I've seen him say stuff online out of context that's racist, but when you see him eye-to-eye, like, the guy is looking for friends.
00:10:45.000 Like, he doesn't probably, he's lived in an environment where he didn't have a lot of friends.
00:10:49.000 So, when you, what happened was, you and Kanye were going back and forth and kind of interrupting each other.
00:10:53.000 You mentioned at one point, Tim, you were holding your finger up, indicating you want to jump in.
00:10:56.000 Before the show.
00:10:57.000 And he was like, oh, and he's like, you want to say something, go for it.
00:10:59.000 Before the show.
00:11:00.000 Yeah, and even during the show.
00:11:01.000 So, as soon as the show started, his demeanor changed.
00:11:03.000 So you guys kind of had an agreement, like, we're gonna kind of talk over each other, we're gonna flow, but then when Nick chimed in and you interrupted Nick, Kanye's face dropped, and then he got out.
00:11:12.000 He was like, that's, you crossed the line.
00:11:13.000 When you interrupt me, that's okay, you interrupt Nick, I'm out.
00:11:15.000 Before the show, when we're doing pre-show and setting up, it was so different.
00:11:19.000 Kanye, uh, Ye, I'm sorry, said a few things, I pushed back, and he just had a smile on his face and was nodding along like, whatever.
00:11:26.000 The show comes on, and all of a sudden he's like, we went to Trump's dinner.
00:11:30.000 I was invited first.
00:11:31.000 I called Milo.
00:11:32.000 So anyway, let's talk about antisemitism.
00:11:34.000 And I'm like, whoa, like we were just talking before the show.
00:11:37.000 You know what happened?
00:11:38.000 The first article.
00:11:39.000 No, he knew what the article was.
00:11:41.000 We had the article pulled up pre-show.
00:11:43.000 He mentioned Pence.
00:11:44.000 He asked Fuentes about Pence and what Trump.
00:11:46.000 I said, we'll use this story to launch into the dinner and the controversy caused by it, and I'll ask you about how this dinner came to be, what do you think about it, what happened with it, and then, what is Ye 24?
00:11:57.000 And I was like, who am I kidding?
00:11:59.000 We're gonna get into the anti-Semitism stuff.
00:12:00.000 I'm gonna push back.
00:12:01.000 Here's how it'll play out.
00:12:02.000 If you guys want to say something, I'm gonna put my hand up and let you know.
00:12:06.000 Talk about this.
00:12:07.000 And they're like, of course, no problem, absolutely fine.
00:12:10.000 And then as soon as the show started, all of a sudden it was like, oh, how dare you bring up this article?
00:12:13.000 Oh my!
00:12:14.000 Here's what I don't understand about that dinner that doesn't make sense to me.
00:12:17.000 You do not get a plus three to meet with the president.
00:12:20.000 I don't care who you are.
00:12:21.000 That is not... Piers Morgan had a piece today that he, you know, he won The Apprentice.
00:12:26.000 He was, I think, the first winner of The Celebrity Apprentice.
00:12:28.000 He was interviewing Trump at Mar-a-Lago.
00:12:30.000 The Secret Service, you know, basically gives you an anal swab if you're going to meet a former president, especially if you're dealing at Mar-a-Lago, which not that long ago was raided because there were concerns about things that are classified or whatever, top secret, so on and so forth.
00:12:44.000 I don't understand how, if you're running for president and this is the scenario in your home, you're going to have dinner with someone, anyone, and be like, oh yeah, you know, bring your buddies and not Be paranoid even just for security reasons.
00:12:57.000 Who are these people?
00:12:58.000 Are they spies?
00:12:59.000 Are they so on and so forth?
00:12:59.000 I don't understand that.
00:13:01.000 We struggle with this.
00:13:03.000 Here, when we're bringing people in, we have to send this big, long-winded email about who's allowed here, what this means, vetted by you, stuff like that.
00:13:13.000 Now, I wouldn't say it's like we're that strict where we... But you're not the president!
00:13:16.000 Exactly.
00:13:17.000 You're not a former president.
00:13:18.000 It absolutely beggars belief.
00:13:20.000 And the other thing is, like, Cenk Uygur had a tweet about, like, this isn't a surprise, this is who Trump hangs out with, white supremacists.
00:13:26.000 It's like, when someone's the president, you know exactly who they're hanging out with.
00:13:29.000 You know every minute of their day, what they're having for dinner and so on and so forth.
00:13:32.000 It's not a big secret.
00:13:34.000 Their schedule is very public and very known.
00:13:37.000 And it's a very big deal.
00:13:38.000 This is the President of the United States, in this case, the former President of the United States.
00:13:41.000 So what Trump is saying does not cut ice with me.
00:13:46.000 And what else is crazier to me, in all seriousness, his daughter converted to Judaism, right?
00:13:50.000 So this is something that obviously—and I think one of his other kids married a Jewish woman.
00:13:55.000 So he—the joke is, like, what's the difference between a New York Times reader and Donald Trump?
00:13:59.000 Donald Trump's grandchildren are going to be Jewish?
00:14:01.000 It's like, if he has this in his family, for him to be like, well, oops, I didn't know, that's just confusing.
00:14:09.000 You think he just trusted Kanye and was like, if you want to bring whoever?
00:14:12.000 But why would you trust Kanye?
00:14:15.000 I think, first of all, he has this boner for celebrities that is really demented.
00:14:19.000 Because that explains why he's endorsing Dr. Oz, who believed Jussie Smollett, who was for trans kids' surgeries, who had the whole laundry list of very lefty ideas.
00:14:29.000 Herschel Walker, who's just a football star, for Senate.
00:14:32.000 The love affair Donald Trump has for people who are blue checks is demented.
00:14:38.000 I think Ian's right about one thing, that this guy's looking for friends.
00:14:43.000 I can't speak to the things that any of them have said in the past. What I should say is, obviously they've said deplorable, detestable things. But what I genuinely believe is, we had Milo on this show a couple weeks ago, and he talked about supporting Trump, he talked about vengeance. He wasn't talking about this stuff, this interest in Fuentes, and now all of a sudden this is like a particular component of his career or his personality. What I think happens is, when you cancel people, they go in the only direction they can.
Wait, I'm going to disagree, because I have receipts from Milo.
Because when this all came out, I wrote a book about this.
There's a chapter about Milo in my book, and, you know, I think Milo is very, very charming.
00:15:24.000 He's very, very witty.
00:15:25.000 He's certainly very intelligent.
00:15:27.000 And of all the people who got cancelled, the reason he got cancelled, I think, was one of the more BS reasons.
00:15:32.000 He was the victim of childhood abuse.
00:15:35.000 He spoke out about it in a kind of tongue-in-cheek manner.
00:15:37.000 I don't begrudge anyone who suffered through something like that how they should deal with the situation.
00:15:42.000 And he was basically saying things like, okay, you know, this is something that I'm... He was kind of rationalizing it.
00:15:46.000 Like, I'm glad this happened to me at a young age, as opposed to being like, holy crap, you know, I got my innocence taken from me at a time when it shouldn't have been.
00:15:53.000 So I think the fact that he was just like, you know, get out, like everything's ruined for you because of something that had been done to him where he was innocent as a kid or teenager, I thought was really kind of over the top.
00:16:03.000 And clearly it was like Al Capone going to jail because of income taxes.
00:16:07.000 It really wasn't income taxes.
00:16:07.000 They're just looking for an excuse to get rid of him.
00:16:09.000 But he did an article a couple years ago with the Jewish Journal, because when this all came out, I'm like, I thought Milo was Jewish, because I remember this being a thing at the time when he pushed back about the Nazis about this.
00:16:18.000 I sent the – hold on, let me pull it up.
00:16:19.000 I got the DM here, and here's the article from the Jewish Journal.
00:16:22.000 This is from 2019.
00:16:25.000 And he was talking about this then, because this is with the Jewish Journal, and he had said, I'm quoting Milo here, let me get it out, he goes, he thinks the Jewish community, the Jewish lobby, would be well served to not blow a gasket every time someone throws out what may appear to be an anti-Semitic trope, and he says, this is quoting Milo, just like I don't like left-wing political correctness about women and blacks and Muslims, I don't like right-wing political correctness about Jews and Israel.
00:16:50.000 So, and he said, people claim that really stupid things are anti-semitic that are not really anti-semitic, or they make more of a fuss about it than they need to.
00:16:56.000 So at a certain point, he has been addressing this at least 2019, so three years ago.
00:17:01.000 What I mean to say is, if these guys were given an opportunity, if some, obviously it would never happen, but if you went to any one of these guys and said, we want you to be a brand ambassador for a big company that's making a lot of money, but you got to stay away from these subjects, they'd say yes.
00:17:15.000 Yeah, probably.
00:17:16.000 I think what happens is when people get love bombed, you'll get somebody who like starts
00:17:23.000 getting a bunch of tweets where they're like, hey, I like what you're saying, keep saying
00:17:26.000 more of it.
00:17:27.000 And then they're gonna start feeling good.
00:17:29.000 They're gonna be like, oh, I'm getting all this attention, building followers.
00:17:32.000 And then some people are just gonna say, these are people supporting me and I'm gonna side with them.
00:17:38.000 But there's an inverse to that.
00:17:40.000 When you ban someone and they have nowhere to go and they can't get redemption, they will go to whoever is willing to accept them.
00:17:47.000 And on top of just the love bombs, it's the financial incentive, because if people are getting paid because they have a following, now they're getting ad revenue or they're getting direct subscriptions, no matter what they're saying, if they're saying really cruel, evil things and they're getting paid for it, that's free speech, but also that can incentivize the continuation of the behavior.
00:18:05.000 And it's kind of tough to dig out, especially if you get cancelled, you're like, well, the only people that pay me are these people, and they only watch me because of this content, so let's do more of that.
00:18:13.000 When you get cancelled, when you get censored, you get sent off to the far corners of the internet where a lot of people get radicalized.
00:18:20.000 And people need to understand bad ideas need to be fought with good ideas.
00:18:24.000 And we don't have this battle of ideas.
00:18:26.000 We never had this battle of ideas, mainly because of centralized controllers, big tech intervening and saying, no, we don't like what you're saying.
00:18:33.000 We don't like your political stance.
00:18:34.000 Be gone.
00:18:35.000 And then those ideas are never routinely challenged, routinely questioned.
00:18:40.000 There's no pushback against them, and it's only in those places that they fester and grow into these kind of larger elements that, again, never see the light of public day.
00:18:50.000 And I think this is why we need to debate.
00:18:52.000 We need to argue.
00:18:53.000 We need to, of course, create steelman arguments and question and debate everything.
00:18:57.000 This is another takeaway I had from last night.
00:18:59.000 I love YouTube as a platform.
00:19:02.000 I've loved it since Steve Chen and Chad Hurley built the thing.
00:19:05.000 And the third guy, who I don't know the name of.
00:19:06.000 Sorry, dude.
00:19:10.000 Last night's conversation was not a conversation for YouTube in its current state.
00:19:13.000 If you guys want to fix up your terms of service into a more free speech oriented thing, it would be.
00:19:18.000 But when we have people on that there's a chance they might violate the terms of service of the platform, we've got to go to another platform.
00:19:23.000 Because we need to let those guys speak.
00:19:24.000 Those people need to speak.
00:19:25.000 And then let them... Because if Kanye went on for 30 minutes and then Nick responded for 20 minutes and they said all sorts of offensive things that YouTube would have stopped, we'd have a chance to rebut and talk about it and come to some sort of consensus that ideally we would all come out of better people from.
00:19:39.000 Michael's got his look on his face.
00:19:40.000 Well, I mean, I think the problem is whenever, like, let's look at, let's say something that's radioactive in a different way.
00:19:48.000 The recent ad campaign about Balenciaga, right?
00:19:51.000 So if you're going to have people talking about the pros and cons of Balenciaga, and the ads where they had those kids with the teddy bears in bondage, and I want to use words carefully, I can see how a corporation would be like, you know what, we're trying to have a certain image and a certain platform, and there's certain things that are going to be... I'm okay with that.
00:20:17.000 There was another story about Balenciaga.
00:20:19.000 They had done similar ad campaigns, and people are saying there's no way that slipped.
00:20:24.000 Like, there's no way they produced an ad and didn't know what they were producing.
00:20:27.000 I think there's absolutely a way, because I think we all forget how dumb suits are. So all it would take would be one art director, and for him to be like, look, it's edgy, the kids were not being shown in any kind of provocative state. They were just showing iconography, which is very different from, like, Mapplethorpe, who's in museums, where you're showing full frontal on little girls, which got funding.
00:20:55.000 Oh, yeah.
00:20:55.000 Oh, you don't know the whole Mapplethorpe thing?
00:20:57.000 This is another example of how the corporate press is just completely dishonest.
00:21:02.000 If you look at articles, Mapplethorpe had funding from the NEA, right?
00:21:07.000 The National Endowment for the Arts.
00:21:09.000 And he was this great photographer.
00:21:11.000 A lot of the photography he was taking was very, very transgressive.
00:21:13.000 It would be like two men, one has a whip somewhere, another one with the razor on his, you know what?
00:21:18.000 And then Jesse Helms, who was Senator at the time, says, this is not where our money should be going to.
00:21:24.000 And he was condemned as homophobic.
00:21:25.000 Well, many of these images were literally child pornography.
00:21:27.000 There's one of a little girl with her knee up, flashing her genitals to the camera.
00:21:31.000 But if you read any USA Today, Washington Post article, it really just sounds like you've got two guys kissing and Jesse Helms, the homophobe, has a problem with it.
00:21:38.000 Yeah.
00:21:39.000 I disagree. I don't think it's the suits missing it.
00:21:40.000 Let me finish. So in terms of Balenciaga, if you look at it, you could be like, oh, it's kids with teddy bears.
00:21:44.000 It's like, they're punks, you know, you got this badass thing. You're not gonna really read, oh, here's the printout of this lawsuit, here's what this book entails, so on and so forth.
00:21:51.000 I disagree. I don't think it's the suits missing it.
00:21:55.000 I think the messaging was clear.
00:21:57.000 There's a history of it.
00:21:58.000 And I think there's a reason Balenciaga deleted their Instagram, deleted their Twitter, because people were picking up on more and more and more signs.
00:22:05.000 There's no way that you could see that image and be like, yeah, totally fine, totally okay.
00:22:09.000 There's no way.
00:22:10.000 It was so blatant.
00:22:12.000 It was so in our face.
00:22:13.000 And there was multiple instances, multiple times where it's just extremely troublesome.
00:22:20.000 And for an executive to look at it and say, oh, teddy bears, BDSM dolls.
00:22:24.000 Yeah, yeah.
00:22:25.000 Babies laying down in precarious ways.
00:22:28.000 Oh yeah, the tape here or this case.
00:22:31.000 There's no way they missed it.
00:22:32.000 I think it was a deliberate sign to just brag about what they were doing.
00:22:37.000 When you look at the fashion industry, they have a long history of being... Let me just finish really quickly.
00:22:42.000 They have a long history of having a lot of Jeffrey Epstein types.
00:22:45.000 Jeffrey Epstein was all in the fashion industry.
00:22:48.000 And a lot of the other individuals that were on that client list are still in the fashion industry, were never held responsible for their crimes.
00:22:55.000 And whether it's running Victoria's Secret, sending out this messaging, this messaging is not something that is an accident, because they've been doing it for many years, trying to normalize children as models.
00:23:06.000 And a lot of the female models look like young boys.
00:23:10.000 I don't think that's an accident.
00:23:11.000 I think a lot of this is on purpose, because there's a lot of very dark, sinister people in that industry.
00:23:16.000 that want to gloat and highlight how bad they are, and they want to get away with it.
00:23:19.000 And this was this representation from what I saw.
00:23:22.000 I'm not disagreeing with anything you said.
00:23:23.000 What I'm saying is, it's not that they missed it.
00:23:25.000 It's that the suit was like, this isn't my job.
00:23:27.000 You have a photographer, that photographer has a brand name.
00:23:30.000 I'm just the guy, the money guy.
00:23:31.000 I'm going to let him do what he wants.
00:23:33.000 It's not my place to jump in and reject.
00:23:35.000 And basically, in that industry, these people who have brand names, like Balenciaga is a brand name, can basically have free rein to do whatever the heck they want.
00:23:42.000 They spend millions of dollars on these fashion shoots, going over every little detail.
00:23:47.000 I mean, the President of the United States literally decides what kind of tie he wants to set a particular mood, to have the particular background.
00:23:54.000 There's no way they're spending millions of dollars on these fashion shoots and not micromanaging every little small element of it.
00:23:59.000 You're not hearing what I'm saying.
00:24:00.000 What I'm saying is... I don't think it's the suits.
00:24:02.000 I think it was deliberately done, and I think the suits were complicit in it.
00:24:06.000 Okay, I think a lot of times in big corporations the responsibility is very much disjointed and one hand is not knowing what the other hand is doing.
00:24:15.000 And I think the reason, wait, hold on, the reason why the social media got pulled is because then someone realized, holy crap. Otherwise they would have left it up, because they're like, this has been going on for a long time.
00:24:25.000 We got to do an analysis to figure out when this started, how many of these iterations are there.
00:24:29.000 So you think people were like seeding this information into the ad campaign?
00:24:32.000 Oh, they definitely were.
00:24:33.000 It's not even a question.
00:24:34.000 It's everywhere in our entertainment, especially if you look at Disney, especially if you look at the subliminal, subconscious mind control, as I particularly call it, and the larger kind of degeneracy that is being pushed on individuals. Even the corporate media is talking about minor attracted persons. There's a huge agenda when it comes to trying to normalize this kind of debauchery, this kind of horrible behavior that a lot of powerful people inside of the government were committing, especially when they were going to that private island and doing absolutely horrible things to children that they kidnapped.
00:25:03.000 If you look at the cover of the DVD for The Little Mermaid, if you look at the scene in Aladdin where they're getting married, in the scene in Aladdin, the priest is excited.
00:25:12.000 Let me just put it that way.
00:25:13.000 If you look at the background of The Little Mermaid, one of the towers looks like a certain thing of a male anatomy.
00:25:19.000 My point is, a lot of times, these artists will put things that the suit will be oblivious to.
00:25:24.000 And that's one example where you're not… Wasn't there a movie, The Great Mouse Detective or something, where there was like an actual… they put like porn somewhere in it or something like that?
00:25:33.000 People chatted this to us before.
00:25:34.000 There's a bunch of adult content and wieners all throughout, you know, content for small children.
00:25:41.000 And I think when it comes to these particular cases, we know so little because a lot of this is done in secrecy.
00:25:48.000 We shouldn't be excusing it as an accident.
00:25:50.000 We should always, with the absence of evidence, when we see very sinister people who have a history of hurting small children, always think the worst case possible scenario and demand more evidence.
00:25:59.000 And I think excusing it as, oh, the suits just missed it—I'm sorry if I got your argument wrong.
00:26:08.000 I'm not saying the suits just missed it.
00:26:10.000 I said in very many cases how these people work is they will get things past the goalie because the people who are looking don't know what to look for and are oblivious to it.
00:26:19.000 And in fact, the fact that this has been going on—hold on, Ian.
00:26:23.000 No, you hold on.
00:26:24.000 Sorry, sorry.
00:26:24.000 The fact that this has been going on for so long and has only been picked up now speaks to the point that I'm making that they're very good at getting things on a subliminal level that people aren't going to register.
00:26:36.000 So if you have Balenciaga ads and you have literally tens of thousands of people seeing these ads and they're only getting picked up now, that just speaks to my point that a lot of times the money guy or the accountant or whoever is kind of signing off on this, they're not going to always be aware of what they're seeing.
00:26:51.000 I don't mind being compared to Ian.
00:26:53.000 I think Ian's great.
00:26:55.000 But at the same time, never try to blame something that could be construed as an accident when it could be malice.
00:27:09.000 Hanlon's razor is the opposite; it would suggest Malice is right.
00:27:13.000 Never attribute to malice that which can be explained by incompetence.
00:27:16.000 Yes, thank you.
00:27:17.000 You said that a lot better than I did.
00:27:18.000 But that would imply that Malice's argument is more likely true.
00:27:21.000 I would just say one point on this.
00:27:25.000 I agree a bit with you, Michael, and I'm not trying to argue with either of you.
00:27:30.000 I think someone intentionally did this and intentionally did it for a long time.
00:27:33.000 And there are people who are supposed to be gatekeepers.
00:27:33.000 No question.
00:27:36.000 If hundreds of thousands or millions of ad viewers didn't even notice this, It's possible that the gatekeepers of these companies don't notice this stuff either, but someone is intentionally doing it.
00:27:48.000 If you look at Hollywood history, there were a lot of gay references that were meant there for gay audiences, whereas the straight people watching this and the executives would be completely oblivious.
00:27:57.000 There's a college course about this where they go back and deconstruct these old movies.
00:28:00.000 The higher you go up in a lot of these bigger institutions, especially corporations, the more likelihood that there's going to be a sociopath in them.
00:28:07.000 Yes!
00:28:08.000 So, on that assessment, and in the absence of evidence, I'm going to assume the worst, mainly because I know what they're capable of, and I think we should, from a point of view, not excusing their behavior, but at the same time saying, hey, they have a long history here, let's get the evidence here, because they probably are doing some of the most awful, sinister, horrible things that you can't even fathom and imagine yourself.
00:28:28.000 You're talking to me.
00:28:29.000 Who are you perceiving as excusing this behavior?
00:28:32.000 I'm saying the suits that just were not aware of it.
00:28:32.000 No, no, no.
00:28:37.000 But that's not an excuse.
00:28:39.000 That's an explanation.
00:28:40.000 If you're signing off on something... But that allows them, but that gives them a way to say, oh, we just didn't know.
00:28:46.000 Yes, absolutely.
00:28:47.000 We didn't know.
00:28:48.000 We just didn't see this happening.
00:28:49.000 I don't want to even give them that.
00:28:51.000 But I think where you and I disagree, and let's take away from Balenciaga, is if you look at, let's suppose, the Senate, right, or politicians, a lot of these people in power are not very bright.
00:29:02.000 And a lot of times it's their staffs that are doing these malevolent things and are putting things over on the American people, things in terms of war.
00:29:10.000 And that's one of the reasons why I'm so hopeful, is because when you look at the people who tend to be in power, they're really often very unimpressive.
00:29:18.000 I think they use that as a way to cover their larger actions.
00:29:22.000 You think Biden knows what the hell's going on?
00:29:24.000 Absolutely not.
00:29:24.000 He's a puppet, and I think there's bigger controllers and there's bigger interests right behind him.
00:29:28.000 That's all I'm saying.
00:29:29.000 We're on the same page.
00:29:31.000 Okay.
00:29:31.000 Well, it sounds like you guys are both right.
00:29:32.000 It was both malice and incompetence.
00:29:34.000 At some level, people were maliciously putting this stuff in.
00:29:37.000 I say 100% malice, zero incompetence.
00:29:39.000 I did it.
00:29:39.000 I got away with it, and I'll do it again.
00:29:42.000 This is an interesting conversation.
00:29:43.000 I used to call them minor attracted people because I'm like, well, I want to be sensitive about this, this, whatever, delusion or whatever.
00:29:49.000 And now I'm like, okay, pedophile.
00:29:51.000 But the thing about pedophile is there's a big difference between a 20 year old and a 17 year old hooking up consensually and a 25 year old and a nine year old getting it on.
00:29:58.000 When the 9-year-old doesn't know what's happening.
00:29:59.000 But Ian, Ian, that's just... With all due respect... But they're both considered pedophiles.
00:30:03.000 No, they're not.
00:30:03.000 Well, I mean, if one's a consensual statutory rape... No, okay, listen, Ian.
00:30:08.000 There's something called Romeo and Juliet laws.
00:30:10.000 A 20-year-old... 15 years old, or 16 years old, wherever it's not legal in this state.
00:30:14.000 There's another word for that, I guess.
00:30:15.000 What, an ephebophile?
00:30:16.000 I guess, I don't know.
00:30:18.000 Whenever you see that word written on the internet, run for the hills.
00:30:21.000 Exactly.
00:30:21.000 I get it.
00:30:23.000 At that point, it's like, it doesn't matter.
00:30:24.000 My point is, if two young people are within three or four years of age, many states say that's protected, depending.
00:30:31.000 I think in some of the states that have these, like 13 is the limit or 14 is the limit.
00:30:36.000 So basically between 14 and 18 are shielded from these laws or whatever.
00:30:40.000 Or within three years of like 20.
00:30:42.000 So if you're like a 20 and a 17-year-old.
00:30:44.000 So like if a 25-year-old and a 17-year-old consensually were hooking up, It's a big difference than a 25-year-old and a 9-year-old.
00:30:50.000 No one's disagreeing with you, Ian.
00:30:52.000 But they're both called pedophiles.
00:30:53.000 That's the problem with the word.
00:30:54.000 It's so charged that if someone gets a statutory rape accusation and they're 25 and she's 17 or they're 23... 17 and 25 is not a pedophile.
00:31:01.000 Well, a lot of people, I think, consider them, because they're considered minors, they consider them pedophiles, and that's dangerous.
00:31:08.000 I agree that it's dangerous, but I think there was a New York Times article several years ago when they were looking at—I mean, this is going to be a darker episode than I expected.
00:31:17.000 Well, why don't we get back on subject?
00:31:18.000 Family-friendly language.
00:31:20.000 We were talking about yay and politics.
00:31:22.000 Yeah, let's get to the fun stuff, you know the truth.
00:31:24.000 There was a New York Times article where they were investigating child pornography.
00:31:28.000 And according to that piece, the FBI, I guess it was, didn't have enough staff to even handle the infants that were in these videos.
00:31:37.000 So the amount of very young stuff that's out there is profoundly disturbing.
00:31:44.000 And I was going to write a book about this.
00:31:46.000 I talked to my agent about it several years ago.
00:31:48.000 And he's like, do you really want to be doing this research?
00:31:50.000 And I'm like, yes.
00:31:50.000 And there's multiple instances of people coming to big platforms like Twitter a couple years ago and saying, hey, my photos from when I was underage are being leaked here.
00:32:00.000 Twitter saying, oh, yeah, yeah, we'll do something about it.
00:32:02.000 And then ignoring those people, screwing them over. And at the same time, now Twitter for the first time is saying, we're going to be addressing this problem, and now the Apple App Store is threatening to cancel them. That's absolutely crazy. This is the one. You got me triggered now. Okay, now I'm gonna go for Luke. Elon Musk, Eliza Bleu, if you're out there—I think she's been on the show. Yeah, this has been her big issue, getting CP off of Twitter, God bless her. This is something that is
00:32:28.000 Unambiguously a problem, something unambiguously horrific and evil, period, end of story.
00:32:32.000 And a lot of times people who were in, as you said, were in these images or videos would contact Twitter, and Twitter just shrugged.
00:32:39.000 Elon Musk took over and he's like, all right, this is going to be like priority one.
00:32:43.000 Like this is a complete non-starter.
00:32:45.000 This has got to go.
00:32:46.000 We can worry about racism, homophobia, transphobia, gender pronouns, whatever.
00:32:50.000 This is a problem.
00:32:51.000 Forbes, who is an agent of the devil, wrote a tweet and an article that says,
00:32:57.000 Elon Musk has tried to take on Twitter's child abuse nightmare,
00:33:02.000 but according to experts, has only made it worse.
00:33:04.000 And they tweeted it nine times.
00:33:06.000 I looked up how often Forbes had mentioned Twitter and children in the past.
00:33:12.000 They've literally never even used those two words together previously.
00:33:15.000 So now that Elon is trying to do something about it, Forbes has an issue with it, because they don't have an issue with it.
00:33:20.000 They have an issue with Elon Musk.
00:33:22.000 So this speaks to what Luke is saying, how many people there are in power who do not care about children at all, but who only care about power and who get off on having power over their people, including young children.
00:33:33.000 And they brag about it many times by doing photo shoots.
00:33:36.000 Let's jump to this story from Fox Business in light of what you're bringing up.
00:33:40.000 Musk is planning to release the Twitter files on free speech suppression.
00:33:43.000 The public deserves to know.
00:33:46.000 So, of course, I sat down with, you know, Rogan and Jack and Vijay, and I believe they outright lied to us.
00:33:52.000 The whole thing was just lies.
00:33:53.000 Hadn't they lied under oath, too?
00:33:55.000 I don't know about them specifically, but I'm willing to bet Twitter staff probably have lied under oath to Congress, or at the very least, they've lied to Congress.
00:34:02.000 Or even less than that, been wrong to Congress.
00:34:05.000 Great.
00:34:05.000 We'll hear what's going on.
00:34:08.000 I gotta say though, the Twitter files, I would bet a substantial amount of money that they have files on child exploitation where there's a manager saying like, hey, don't do anything because it'll draw attention to us and it'll be bad for the stock price.
00:34:21.000 Stuff like that.
00:34:22.000 Oh my god.
00:34:23.000 Would you agree they were probably behind the scenes?
00:34:26.000 I am praying with every fiber of my being that you're wrong.
00:34:31.000 That's something that's so purely evil, that to have that in writing, it's just so disturbing.
00:34:36.000 What do you expect of these people?
00:34:37.000 You're right, but still, it's just horrifying to even think about.
00:34:41.000 Why weren't they taking it down?
00:34:44.000 Yeah, that's a great question.
00:34:45.000 That's a very important question.
00:34:46.000 Elon snaps his finger.
00:34:47.000 That's unfair.
00:34:48.000 He's working very hard getting everybody saying, shut it down.
00:34:51.000 And it's disappearing.
00:34:53.000 Yeah, and these people at Twitter were taking advantage of the power and responsibility they had for their own personal benefit, for the personal benefit of the powers in charge, for the government, for the intelligence agencies, for the multinational corporations that are calling the shots there.
00:35:08.000 So they're capable of doing a lot of very underhanded evil things.
00:35:12.000 And I think Elon Musk, even a couple days ago, was tweeting, You know, it's far worse than I even expected it to be, and I bet there's probably so much evil, so much just nasty actions being committed by these people that a lot of people can't even imagine how bad it gets.
00:35:28.000 Let me ask you, Ian, because you worked for Mines and you actually saw a lot of the content that violated, I can't imagine, like, based on your experience, how bad do you think it is with Twitter?
00:35:36.000 People posting this stuff.
00:35:37.000 It's gotta be magnitudes worse, because there's magnitudes more people using it.
00:35:41.000 And I mean, like, times 10, times 10, times 10.
00:35:42.000 There's probably 10,000 times more people using it, so I'd imagine there's 10,000 times more porn.
00:35:47.000 There's probably even more than that, because it's a centralized focus of interaction.
00:35:51.000 You actually had to deal with moderation when it came to this kind of stuff.
00:35:53.000 Yeah, and I'm wondering if the FBI went to Twitter and was like, oh, that's child porn?
00:35:56.000 Leave it up.
00:35:57.000 It's a honeypot now.
00:35:58.000 Let's see, everybody that shares it and comments on it, those are the people we're going after.
00:36:01.000 Yeah, the federal government has run many child abuse websites, and the stories of what they were doing are absolutely shocking.
00:36:11.000 What about the bacha bazi boys?
00:36:12.000 Is that how it's pronounced in Afghanistan?
00:36:14.000 I don't know.
00:36:15.000 Oh, right, right, right.
00:36:15.000 Yeah, our allies in Afghanistan had a thing where they would bring out young boys and have their way with them, and basically the U.S.
00:36:22.000 looked—this is also in the New York Times—the U.S.
00:36:23.000 armed forces were told, they're on our team, look the other way, this isn't our problem.
00:36:27.000 And then the soldiers that spoke out against it were court-martialed and punished that were trying to stop it.
00:36:33.000 Your tax dollars were paying for the U.S.
00:36:37.000 military protecting and aiding small children being hurt in more ways than one.
00:36:42.000 I just wanted to come here and make some Jew jokes.
00:36:44.000 You are.
00:36:45.000 Hey, the Balenciaga stuff opens the door.
00:36:50.000 We can talk about, I mean, what's going on with Twitter, because it's not just obviously the dark, dark stuff, but we're probably going to see overt political bias.
00:36:57.000 We're probably going to see election interference outright, them saying... The Hunter Biden thing off the top of their head.
00:37:03.000 There's no question that was outright election interference.
00:37:05.000 And they're probably going to be saying in the private chats, this is going to help Biden win, or we have to stop Trump, things like that.
00:37:11.000 Yeah.
00:37:12.000 Yeah, there's no question that's... It's dark, but I've lightened it.
00:37:15.000 Yeah, no, no, no, that is, although Hunter Biden with the niece and whatever, that's a whole other situation.
00:37:15.000 How about that?
00:37:22.000 Here we go again!
00:37:23.000 No, but there was this Norm Macdonald meme, which I don't think he ever really said, where the quote ascribed to Norm, which is, I'm starting to think that the pedophile devil worshippers who run our government don't have our best interests at heart.
00:37:37.000 And there's a lot of truth to that.
00:37:39.000 Yes, absolutely.
00:37:40.000 I'm wondering what other, when they pull up these Twitter files that Elon's talking about, if they're going to find words that have been downright—accounts that have been downright— Can I make everyone even more depressed?
00:37:49.000 Hell yes!
00:37:49.000 Let's do it.
00:37:50.000 Let's go.
00:37:50.000 Please do.
00:37:51.000 So the thing that really upsets me is the virtual certainty that nothing will happen as a consequence of this.
00:37:58.000 And let me give you an example when this happened.
00:38:00.000 We all remember the Brett Kavanaugh hearings.
00:38:02.000 And whatever you think of Christine Blasey Ford, Julie Swetnick was an example where it absolutely, you know, she's saying that Kavanaugh was at these parties where there were trains run on her.
00:38:12.000 It's like, why are you going back to these parties?
00:38:13.000 Michael Avenatti was her lawyer.
00:38:15.000 People were tripping over themselves to put her on TV.
00:38:17.000 It turned out her story made no sense.
00:38:19.000 It didn't add up.
00:38:20.000 She never met Kavanaugh, so on and so forth.
00:38:22.000 Chuck Grassley, Charles Grassley, was just re-elected, who was, I think, head of the Judiciary Committee, ranking hardcore Republican.
00:38:28.000 He put out a press release earlier this year, how he wrote a letter to both the Department of Justice and the FBI, asking for follow-up on someone who lied to affect a Supreme Court nomination, and neither bothered replying to him.
00:38:42.000 And then he wrote them another letter, and this is his press release.
00:38:45.000 See?
00:38:45.000 I'm writing letters that are completely ignored.
00:38:48.000 Vote for me!
00:38:51.000 The fact that he's boasting that he can't even get someone on the phone from the Department of Justice or the FBI, about something that is central to our legal system, just speaks to how little appetite there is in Washington, among members of both parties, for any kind of repercussions for this. And another great example of this: it really drives me crazy when boomer conservatives think all pedophiles are Democrats, as if, you know, they're into kids but they're also just for socialized medicine.
00:39:19.000 The Speaker of the House, Dennis Hastert, was a predator on young boys, and he went to jail as a consequence for things that have to do with this issue.
00:39:26.000 The Democrats never bring it up.
00:39:28.000 They never use this as a cudgel.
00:39:29.000 They never say, why don't you bring back Dennis Hastert's money?
00:39:32.000 They're more interested in talking about, like, Marjorie Taylor Greene or Trump or George W. Bush, who's now a good guy.
00:39:37.000 So that, to me, is very, very disturbing.
00:39:41.000 I have not given up on politics, but I'm so disinterested in attempting to use that corrupt system to fix a world that was—we need to make politics—like, politics is a result of a healthy society, so let's build a healthy society.
00:39:53.000 Then there will be politics.
00:39:54.000 Luke, fix him!
00:39:55.000 Hey, I'm trying.
00:39:57.000 I'm trying to get him to exercise.
00:39:59.000 That's been—you said you wanted to do it yesterday.
00:40:02.000 I did exercise last night.
00:40:03.000 I was enraged after the show.
00:40:04.000 I think you need to read this book.
00:40:06.000 I would love to.
00:40:07.000 You can only help people so much.
00:40:09.000 You've got to open the door, they've got to walk through.
00:40:11.000 I want to mention, just as we're talking about this subject, and boy are we in it, I guess because of the Balenciaga stuff, we really needed to talk about this, but there is a positive for us who are challenging these things, trying to get these things taken down.
00:40:25.000 Grateful to Elon Musk for putting a stop to this. And it's that I was talking to a friend, and I said something like, semi-facetiously, like, oh yeah, like, you know, people believe that a cabal of powerful global elites are trafficking kids and doing weird things.
00:40:39.000 Pause.
00:40:40.000 Because now we know they are because of Epstein and Maxwell's conviction.
00:40:43.000 Hey, that's funny, right?
00:40:44.000 If you said that a few years ago, people would call you crazy, but something happened.
00:40:48.000 And now we know it happened.
00:40:50.000 They're doing it.
00:40:51.000 Maxwell locked up because of it.
00:40:54.000 And now we're still sitting here wondering, who were their clients?
00:40:57.000 So, not only do we know it happened, we know there are still people who have never been held to account.
00:41:01.000 She went down for providing, you know, a service for clients that weren't there.
00:41:07.000 That absolutely doesn't make sense.
00:41:08.000 And this wasn't the first time that there have been major government officials caught in these larger, horrible things that they were doing to children.
00:41:17.000 Dennis Hastert is one of them.
00:41:18.000 The former Prime Minister of the United Kingdom, Edward Heath, another major one.
00:41:22.000 No, no, no.
00:41:23.000 I looked him up.
00:41:24.000 Ed Heath, it's ambiguous, so let's give him some credit.
00:41:26.000 I mean, when he's hanging out with Jimmy Savile?
00:41:28.000 I didn't know he was hanging out with Jimmy Savile.
00:41:29.000 Yeah, I mean... Heath was... We don't have receipts on Heath.
00:41:33.000 Who's Heath, by the way?
00:41:34.000 He was the prime minister that Margaret Thatcher dethroned.
00:41:37.000 I talk about him in my next book.
00:41:38.000 Yeah, but again, there are many... Keynes?
00:41:41.000 Yes.
00:41:42.000 We could just keep going over how many times and how many instances there are of government using your tax dollars to facilitate some of the most atrocious, horrible acts on the face of the earth that people can't even think about.
00:41:54.000 I think it was in 1969, 1971, if you look up, like, French Philosopher's Declaration, a bunch of prominent French philosophers, including Sartre.
00:42:01.000 Early 70s, late 60s.
00:42:02.000 The 70s, where they all signed a letter saying that age of consent laws should be abolished.
00:42:06.000 Wild.
00:42:07.000 Wow.
00:42:07.000 Yeah, it's wild.
00:42:08.000 There's evil out there.
00:42:09.000 So please look it up, double-check it.
00:42:11.000 Okay, I want to talk about evil and cabals, because you're saying these conspiracy theories, people don't know if they're real or not.
00:42:15.000 Well, hold on, hold on.
00:42:16.000 This is very known.
00:42:17.000 Right, right, like, the Epstein stuff is no longer the realm of conspiracy.
00:42:22.000 Maxwell was convicted.
00:42:23.000 I want to shout out Phoenix499, who superchatted us, saying, all you need to know is that Maxwell was the first person to be convicted of trafficking kids to nobody.
00:42:32.000 Yeah.
00:42:33.000 So there's people out there.
00:42:35.000 These are evil people.
00:42:37.000 They've gotten away with it for now.
00:42:40.000 And one more, let's get a little personal.
00:42:42.000 I was on Rogan a couple years ago because a friend of mine, Matt, and he told me to use his name, he came out to me because he had been a victim of childhood sexual abuse.
00:42:49.000 And the reason I bring this up every so often is it's still one of those things where there's such a stigma to it that people are scared to talk about it.
00:42:56.000 Like if I found out that your mom, you know, was an alcoholic, I'd feel bad for you, Luke, but our friendship wouldn't really change if I found out, you know, something happened with you, Ian.
00:43:03.000 But this is the kind of thing where people are scared to say something because if you talk to them about it, they think you're going to think they're a wounded bird, you're going to think they're a freak, you're not going to be able to make certain kinds of jokes about them.
00:43:11.000 And as a result of this, this social stigma, they keep it silent, you know, for the rest of their lives, which really makes the victimization even worse.
00:43:18.000 And after I was on Rogan and I talked about this, four more of my friends came out to me.
00:43:22.000 So if I know five people, that means I definitely know more.
00:43:25.000 And that means this happens a lot more than people realize.
00:43:28.000 And until people start talking about it and normalizing coming out and accepting people who've had this done to them, they're going to keep getting away with it.
00:43:35.000 Because the reason these people get away with it is like, don't tell anyone.
00:43:38.000 And that you're a kid, you're not going to know any better.
00:43:40.000 So it's a very, very disturbing cycle that hopefully we're going to be able to break in the very near future.
00:43:47.000 You think it's just by normalizing sexuality, adult sexuality, in our society?
00:43:51.000 No.
00:43:54.000 People are afraid to talk about sex, and then they go and shame, watch porn, and abuse kids.
00:43:59.000 If they have a healthy conversation about it, maybe it will prevent that kind of behavior.
00:44:06.000 No, I can't.
00:44:07.000 I can't.
00:44:07.000 I'll give you this one.
00:44:09.000 See what I have to deal with?
00:44:10.000 I gotta deal with this every day.
00:44:11.000 We are the result of Puritans, and they were very anti-sex; like, puritanical has that adjective attached to it for a reason.
00:44:18.000 It's because it's like, no sex, no drugs.
00:44:19.000 This might be the most... you've gone full Ian.
00:44:22.000 This is the most...
00:44:27.000 I completely agree sexuality is okay.
00:44:33.000 If you want to do porn, I'll support you.
00:44:35.000 But I think this is something very different from people, you know, not talking about what kind of sex they have as opposed to like, Awful things were done to me as a kid, and they didn't use force, and on some level it was pleasurable, and I was confused because I was a child, and I wasn't physically hurt, they didn't punish me, I didn't know how to feel about the time, and now as an adult, I still don't know how to feel about it, and this has disturbed me all my life, and I'm turning to drugs or alcohol because I don't know how to deal with it.
00:44:59.000 For the record- In my opinion, Hunter Biden, perhaps.
00:45:02.000 Yeah.
00:45:04.000 That's a big can of worms that we could open up here in just a little bit.
00:45:08.000 But Ian, I do not support your adult history.
00:45:11.000 You haven't seen me work yet.
00:45:13.000 I do not.
00:45:13.000 I do not want to.
00:45:16.000 And there's a big difference between satanic evil people using sex as a way to gain energy and feel like they have power over actual genuine love and intimacy.
00:45:27.000 Teaching someone how to have sex.
00:45:28.000 Let's segue to censorship because I have a story from Lawfare: Supreme Court Grants Certiorari, I'm pronouncing that wrong probably, in Gonzalez v. Google and Twitter v. Taamneh, an overview.
00:45:40.000 I'm not sure these are the exact cases, I believe these are the cases in question, but I highlight these stories to tell you that I had a phone call with Google today.
00:45:48.000 It was scheduled in advance a few weeks ago.
00:45:50.000 I received an email, sent, I believe, to larger YouTube channels, warning about Section 230. This is the shield Big Tech uses to eliminate content they deem distasteful while being immune from any responsibility for hosting some of this content.
00:46:11.000 I believe they're on track to lose these protections in a very serious way.
00:46:15.000 So Google started doing this reach out.
00:46:18.000 Got an email and they said… Can we focus on that point because I think it's a big one in terms of the odds that they're going to lose.
00:46:22.000 I think the odds are very high they're going to lose, because their argument is, we're not editorial, we're just a platform.
00:46:29.000 Anyone can put anything out there, our hands are clean.
00:46:31.000 However, if I'm editorial, if I'm picking articles, I'm promoting people, then I am having a voice and therefore the protections don't apply to me.
00:46:38.000 Then I can be guilty of libel or whatever, slander, whatever it is.
00:46:41.000 And very clearly, all of these platforms, Instagram, Facebook, Twitter, and YouTube, are putting their finger on the scale, promoting some things over others, and making editorial decisions.
00:46:51.000 It's unambiguous.
00:46:52.000 So, they're panicking?
00:46:54.000 Yeah.
00:46:54.000 I got an email and it said, we want to make you aware about a Supreme Court taking on these cases, which will have a big impact on recommendations.
00:47:03.000 If the Supreme Court rules against us or our position, we may not be able to recommend your content anymore.
00:47:10.000 And you know what I thought when I saw that?
00:47:12.000 Ian should do porn.
00:47:14.000 Me too.
00:47:16.000 I thought maybe, that's a good one, but no, I left.
00:47:23.000 Are you kidding me?
00:47:24.000 That they would email this when we know for a fact they suppress our content.
00:47:27.000 And I say we know for a fact because let me tell you this.
00:47:30.000 I get a phone call from the guy.
00:47:32.000 We had a meeting.
00:47:33.000 I get the email.
00:47:34.000 It says, please schedule.
00:47:35.000 Here's my availability.
00:47:36.000 I don't want to be mean to the guy.
00:47:37.000 He called me and he was advocating for his position.
00:47:39.000 I respect that.
00:47:40.000 He wants to talk to me again tomorrow.
00:47:42.000 I scheduled the meeting for whatever date was available, and we had some emergency stuff going on today.
00:47:48.000 Obviously, we had a heck of a yesterday, so, you know, of course we get slammed with a bunch of stuff today.
00:47:53.000 You get it.
00:47:53.000 But I get a phone call, I'm in the car, and I'm like, well, I answer it anyway, and I'm like, hey, I'm in the car, and I'm like FaceTiming this dude.
00:47:58.000 And he basically was explaining that if the Supreme Court agrees with these people, that means YouTube... Look, Section 230 says you can't sue... I'll give you a simplified version.
00:48:12.000 You can't sue YouTube because of what Tim Pool says.
00:48:15.000 I am putting something out there.
00:48:17.000 YouTube did not say it.
00:48:18.000 However, it also says YouTube can't be considered a publisher of this content even if they moderate and have editorial control over the platform.
00:48:29.000 Now, how do you have it both ways?
00:48:31.000 You're a corporation.
00:48:32.000 You always have it both ways.
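Tim's simplified version tracks the two prongs of Section 230(c): (c)(1) says a platform isn't treated as the publisher or speaker of user-provided content, and (c)(2) shields good-faith moderation. The "have it both ways" asymmetry he describes can be sketched as a toy decision function; this is purely illustrative, not legal analysis, and the function and parameter names are invented:

```python
# Toy model of the Section 230 "both ways" point (illustrative only).
# The names below are invented for this sketch; this is not legal advice.
def platform_liable(authored_by_platform: bool, moderated: bool) -> bool:
    """Return True if the platform could be liable for a piece of content."""
    if authored_by_platform:
        # Section 230 never shields a platform's own speech.
        return True
    # (c)(1): the platform is not treated as the publisher of user content,
    # and (c)(2): moderating in good faith does not change that.
    # So whether `moderated` is True or False, the answer is the same.
    return False

# Hosting user speech: not liable, moderated or not -- the "both ways".
print(platform_liable(False, False), platform_liable(False, True))  # False False
```

The argument in the cases discussed above is that once a platform exercises editorial judgment, the second branch should no longer apply; the sketch just makes the current asymmetry concrete.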
00:48:33.000 Right.
00:48:34.000 So, uh, so they call me and they say, listen, it may be, and this is so hilarious.
00:48:39.000 The guy's like, look, we, we might not be able to issue recommendations anymore.
00:48:43.000 That means search because we use an algorithm.
00:48:45.000 So imagine someone searched for your name, and the videos wouldn't come up.
00:48:51.000 And I'm just sitting there like, oh, yeah, imagine it, huh?
00:48:53.000 Imagine it, huh?
00:48:54.000 I went off.
00:48:55.000 I'm sorry.
00:48:56.000 I snapped.
00:48:56.000 I said, we hit 800,000 views on our video.
00:49:00.000 No trending, no recommendations.
00:49:03.000 There was a period where Google removed this YouTube channel from all of their search.
00:49:09.000 You could type in the title of the video.
00:49:11.000 And I was like, and you're going to call me and tell me that I should be worried about this.
00:49:14.000 You understand that you have so much weight against me.
00:49:18.000 We succeed in spite of what you've done, that if YouTube strips you of your powers, we'll actually benefit from it.
00:49:24.000 Two points.
00:49:24.000 First, people at home don't realize that when Tim gets this angry, his beanie becomes a mushroom cloud.
00:49:29.000 So it's kind of really cool to watch.
00:49:30.000 And I'm the victim of this, too.
00:49:32.000 On Instagram, you can't search for my name.
00:49:33.000 You can't even tag me.
00:49:34.000 You have to spell out Michael Malice because I'm completely shadowbanned.
00:49:37.000 On Instagram, if you try to follow me, it gives you a warning telling you not to follow me because I spread quote misinformation.
00:49:43.000 You try to tag me, it says, are you sure you want to tag this guy?
00:49:46.000 You spread misinformation?
00:49:46.000 Really?
00:49:50.000 YouTube has been screwing me for my entire career.
00:49:53.000 What you described, Tim, has been happening throughout my entire career.
00:49:56.000 YouTube promotes authoritative sources.
00:49:59.000 CNN, MSNBC, fake news spreaders.
00:50:02.000 They don't spread independent media.
00:50:03.000 I'm not in the search.
00:50:05.000 And it's almost impossible to find Ian's pornos.
00:50:05.000 I'm not in the results.
00:50:08.000 I have looked a lot.
00:50:09.000 Look on the CrossMack channel.
00:50:10.000 I want to tell you more about what this guy was saying to me.
00:50:13.000 He said something to the effect of, imagine if we could no longer recommend you and you were no longer part of like, I'm paraphrasing, but something like the mainstream conversation or something.
00:50:24.000 And I'm just like, Guy, that's what you've already done.
00:50:27.000 We created a new channel, TimCastMusic.
00:50:29.000 We put a video on it.
00:50:31.000 It gets half a million views in a day.
00:50:33.000 Two days later, it is trending number 23.
00:50:37.000 I think number 23 on YouTube.
00:50:39.000 Surprise, surprise.
00:50:39.000 A brand new channel with a big video gets plays.
00:50:42.000 We're trending.
00:50:43.000 But IRL can get half a million views on an episode in two hours and never reach that mark.
00:50:48.000 Can I make a point?
00:50:49.000 My understanding is, from someone who has inside knowledge about this, that the trending section on YouTube is manual.
00:50:55.000 Right.
00:50:56.000 Of course.
00:50:58.000 It's not organic.
00:50:59.000 I think it's a mix, but I think it's largely editorial.
00:51:02.000 I think there's obviously a component, but my point to the guy was, I said, why is it that we know for a fact you can't Google search my videos? Like, Facebook comes up, but you can't even search my channel.
00:51:15.000 When you would search for TimCast or TimCast IRL, it would just show you playlists created by other people because they removed us from search.
00:51:21.000 And you're telling me that you want the ability to be free from all liability when you choose to promote political speech from, say, ISIS, but then you expect me to defend you when you put the weights on us when we call out these bad players.
00:51:36.000 I hope, and I said this to him, I hope that the Supreme Court rules against you and everything is forced to return to a reverse chronological feed, because I will do better from it.
00:51:46.000 Absolutely.
00:51:47.000 This speaks to what we said earlier about how the suits are often oblivious, because I bet you any money that he had the exact same
00:51:52.000 speech for everyone he was trying to call.
00:51:53.000 He had a list of like 50 people, and he had the exact same speech,
00:51:56.000 and was oblivious to who he was talking to.
00:51:57.000 And then the call got disconnected, I guess.
00:51:58.000 He- he- he- But that was us.
00:52:00.000 We- we- Oh.
00:52:01.000 We control the phones and the weather.
00:52:02.000 Didn't like that.
00:52:03.000 Maybe he got disconnected, but as I was talking, and I'm heated, but I was being polite.
00:52:09.000 It's not you personally, but dude, don't call me and think I'm going to side with you on this one.
00:52:14.000 I don't think he did his research.
00:52:15.000 I just love the idea that you're going to holler up Ketanji Brown Jackson and be like, come on.
00:52:19.000 And she's like, all right.
00:52:21.000 Like, I've got to rule in favor of them.
00:52:22.000 No, no, right?
00:52:23.000 But I also thought that it was kind of funny that Google thinks they can start calling creators and saying, support our immunity.
00:52:29.000 But they can.
00:52:30.000 I bet you a lot of these creators are completely synthetic and a function of their algorithm.
00:52:35.000 But I'm also willing to bet that when he calls leftists, they said the same thing I did.
00:52:40.000 LGBT creators have been complaining a long time that YouTube suppresses that because it's not advertiser-friendly.
00:52:45.000 So they're gonna call them and be like, defend our immunity.
00:52:47.000 They're gonna say, no.
00:52:48.000 Restore reverse chronological feed so that my video can be shown to my followers.
00:52:53.000 I bet you there's a lot of people who are happy to play ball.
00:52:56.000 Because if you look at any industry, the people at the top are often happy to lick the boot.
00:53:00.000 There's always some Ralphs willing to suck up to power and authority, but I think a lot of people are absolutely frustrated and pissed off at these algorithms controlling our society, which have been an absolute net negative, not just to the mental health of this country, but in the mental warfare that's being created out there.
00:53:22.000 What do you mean by algorithms?
00:53:24.000 Say it!
00:53:25.000 Name them!
00:53:27.000 I'm not going to bat an eye when the same huge organization that labeled me a conspiracy theorist now won't be able to editorialize.
00:53:35.000 I'm not going to be mad at that.
00:53:37.000 It's totally fine.
00:53:38.000 Let people see what they want to see.
00:53:40.000 It's that simple.
00:53:41.000 If they subscribe to something, if they want to see something, let them see it.
00:53:44.000 We're adults here.
00:53:45.000 Stop treating us like children.
00:53:46.000 I don't want to be recommended these stupid RV videos or these stupid truck life videos.
00:53:51.000 I don't want to be recommended some kind of psy-op. Stop with the psy-ops, stop with the nonsense.
00:53:55.000 I want to listen to what I want to listen to, and that's what the people demand, and that's what the people will get.
00:53:59.000 I'm on Twitter, and they have their algorithmic feed, and their home feed is what it's called, and they have the reverse chronological feed.
00:54:07.000 I do really well posting my content and getting shares and getting followers without any kind of weird algorithmic impedance.
00:54:14.000 So YouTube deserves to have that stripped away from them.
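The contrast being drawn here, an engagement-weighted algorithmic feed versus a reverse chronological one, can be sketched in a few lines of Python. This is a minimal illustration, not how any real platform ranks content; the scoring, field names, and demotion penalty are invented for the example:

```python
from datetime import datetime, timezone

# Invented sample data: three posts with a timestamp and an engagement count.
posts = [
    {"id": 1, "time": datetime(2022, 11, 30, 9, 0, tzinfo=timezone.utc), "engagement": 120},
    {"id": 2, "time": datetime(2022, 11, 30, 11, 0, tzinfo=timezone.utc), "engagement": 15},
    {"id": 3, "time": datetime(2022, 11, 30, 10, 0, tzinfo=timezone.utc), "engagement": 300},
]

def reverse_chronological(posts):
    """Newest first; no editorial weighting at all."""
    return sorted(posts, key=lambda p: p["time"], reverse=True)

def algorithmic(posts, demoted_ids=frozenset()):
    """Platform-chosen scoring; the demotion set stands in for manual suppression."""
    def score(p):
        penalty = 0.1 if p["id"] in demoted_ids else 1.0
        return p["engagement"] * penalty
    return sorted(posts, key=score, reverse=True)

print([p["id"] for p in reverse_chronological(posts)])   # [2, 3, 1]
print([p["id"] for p in algorithmic(posts)])             # [3, 1, 2]
print([p["id"] for p in algorithmic(posts, {3})])        # [1, 3, 2]
```

The point of the exchange above is that the second function hands whoever picks the score function, or populates the demotion set, control over what gets seen; the first removes that lever entirely.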
00:54:17.000 My Twitter following quadrupled, in new followers per day, once some switch got flipped 10 days ago.
00:54:26.000 And Ben Askren, who is now 0 for 3, my show episode with him is dropping next week.
00:54:30.000 When we recorded, he had more followers than me and was rubbing it in my face, and now I have 10,000 more than him.
00:54:35.000 So suck it, Askren.
00:54:38.000 Now I know how Jake Paul feels.
00:54:40.000 Point being, there was clearly some kind of algorithmic screwy on the back end that was hurting me before and is either neutral or helping me now.
00:54:48.000 Yeah, I experienced the same thing.
00:54:50.000 Probably you too, Tim.
00:54:51.000 Have you heard, there's like that internet meme law that says any online forum without moderation will become right... Something like that.
00:55:00.000 Any online forum without sufficient moderation will become right-wing.
00:55:04.000 Oh, okay.
00:55:05.000 Yeah.
00:55:05.000 I think it's more like it'll become radicalized.
00:55:07.000 That's not... I don't know who came up with that idea.
00:55:07.000 No, no, no.
00:55:10.000 It has to be right-wing.
00:55:11.000 I thought it was Conquest's Laws of Politics.
00:55:14.000 The second one is, any organization not explicitly right-wing will become left-wing.
00:55:19.000 So that might be an inversion of that.
00:55:20.000 Really?
00:55:21.000 Yeah.
00:55:21.000 I guess that's the assumption that the powers that be right now have been co-opted by some sort of leftist mentality, some sort of communist or socialist movement.
00:55:28.000 Well, but I, you know, so I don't know where that meme came from, but you take a look at like Reddit, pre-PSYOP control by, you know, big powerful PACs and organizations.
00:55:39.000 Yeah, the memes were all like- Well, that's where the Donald one came from.
00:55:43.000 Right, exactly.
00:55:44.000 And I think it was MIT Technology Review that said The_Donald and 4chan's Politically Incorrect board were the progenitors of almost all of the memes that were going viral.
00:55:52.000 They were funny, people liked them, the Pepes.
00:55:55.000 They'd end up on Twitter and then Katy Perry was sharing them.
00:55:57.000 No moderation.
00:55:59.000 It was just meritocratic.
00:56:00.000 The best memes rose to the top.
00:56:02.000 Everyone liked it, shared it.
00:56:03.000 That culture built.
00:56:04.000 And then all of a sudden they had what leads to, I guess, the dead internet theory.
00:56:08.000 This emergency panic session where moderation comes in like crazy.
00:56:13.000 Michael Tracey tweeted this.
00:56:14.000 Twitter's rules from 2015.
00:56:16.000 Before this big emergency meeting, it was basically, don't incite violence, but we will not intervene in any disputes between people.
00:56:22.000 Now it's, we take a political stance, hardcore, etc.
00:56:25.000 Yeah, YouTube in 2006, when I started, June 2006, on the CrossMack channel, go check it out.
00:56:30.000 It was all owned by YouTube.
00:56:33.000 Google had not bought the company yet.
00:56:34.000 That was like a year and a half later.
00:56:35.000 And they were just featuring whatever they wanted every day.
00:56:37.000 There'd be 10 featured videos every day.
00:56:39.000 And throughout the day, they'd drop down one every like two hours, you know, a new one would come up.
00:56:44.000 So you'd be slowly siphoned off the front page.
00:56:46.000 It was all idealistically motivated.
00:56:49.000 Whoever was controlling YouTube at that time was deciding.
00:56:51.000 Steve Grove, who was running politics, was like, I like your video, Ian.
00:56:54.000 I'm going to feature it in News & Politics.
00:56:56.000 So he liked it.
00:56:59.000 So both of our political agendas got pushed, which was re-elect Barack Obama.
00:57:02.000 But then at some point, Google bought the company.
00:57:04.000 I saw that's dangerous.
00:57:05.000 That's corporate conglomeration.
00:57:06.000 This could be really bad.
00:57:07.000 People started getting banned.
00:57:09.000 And then the politics people, new people probably came into the power and control.
00:57:13.000 So since the beginning it's been a platform and a public, like they are platform, they are deciding who gets to be seen since the very beginning.
00:57:20.000 I think Elon Musk has shown us a lot of what this is.
00:57:23.000 Big advertisers are scared because left-wing activists organize, and they organize predominantly on Twitter, these campaigns against them.
00:57:31.000 Now there's uncertainty, and they don't know what to do.
00:57:33.000 I think it's funny.
00:57:34.000 Advertisers announced that they were going to pull off their ads or reduce spending on Twitter due to uncertainty, and my personal view is the uncertainty likely is which side is going to attack us more, and we don't know who to side with anymore.
00:57:45.000 Same thing.
00:57:46.000 This is how Times Square for Pride became all rainbows, you know, 25-8.
00:57:50.000 I tweeted this before, that only corporate America can make sodomy and perversion seem downright boring.
00:57:56.000 But before gay marriage became universally accepted in corporate culture, they just kept their mouth shut.
00:58:01.000 Like, they'd have these little organizations, like, oh, you know, we have this little program, but now you buy a candy bar and it's, you know, it's a rainbow.
00:58:07.000 It's got a rainbow flag on it.
00:58:09.000 Every month is Skittles.
00:58:11.000 Except for their brands that operate in, like, the Middle East or whatever.
00:58:13.000 Those are the ones that never change.
00:58:14.000 Didn't they recently find a banned substance in Skittles that was absolutely horrible?
00:58:19.000 Adrenochrome?
00:58:21.000 No, no, I don't think that's where they hide their adrenochrome.
00:58:25.000 Do you think that we should advocate to ban corporate advertisement in social media?
00:58:29.000 No.
00:58:30.000 What?
00:58:31.000 Really?
00:58:31.000 That's such a First Amendment violation.
00:58:33.000 Are you a communist?
00:58:35.000 Yeah, go back to Russia.
00:58:36.000 Michael keeps looking at me when Ian says stuff with this look of like, I can't believe he's here.
00:58:40.000 Social media is working in the commons.
00:58:42.000 I love him and I'm in love with him.
00:58:44.000 It's working.
00:58:45.000 He says something, and you look at me with this face.
00:58:45.000 I love you too, Michael.
00:58:47.000 What's he saying?
00:58:48.000 Is he for serious?
00:58:49.000 Yeah, because I think the huge social media networks, I mean, the ones whose code I think should be freed, they're working in the commons.
00:58:55.000 And at some point, if the advertisers are controlling the commons by blackmail or by saying, we will take our money away if you don't do what we say, that's bad for the commons.
00:59:04.000 I don't disagree with that, but what I'm saying is this is something that you're trying to kind of square a circle, and there's no easy answer one way or another.
00:59:11.000 If you're gonna have a subscription model, then it's gonna be kind of Karen-oriented, because the ones who complain the most are gonna have disproportionate amounts of power, so things are gonna be kind of inoffensive, so I don't see an easy answer here.
00:59:21.000 Right, because if you remove advertisers from YouTube, for instance, all these people with the Partner Program are, I don't think it's called the Partner Program anymore, it's something else, but everybody would start losing their ad revenue.
00:59:30.000 That could destroy tens of thousands of careers, you know?
00:59:32.000 I think Elon recognized the power of cancel culture in what they were doing.
00:59:37.000 I don't think it's the sole component of why he bought Twitter, but you can see him tweeting at Tim Cook.
00:59:42.000 Apparently, the Financial Times reported that he was calling CEO saying, why are you dropping us?
00:59:47.000 I think Elon knows the fear they have of Twitter.
00:59:51.000 I think anybody, I think most people listening understand this.
00:59:55.000 If you've got Twitter followers and you have a problem and you tweet about it, that company bends over backwards to help you.
01:00:00.000 I don't like this system.
01:00:01.000 I think it's really dumb.
01:00:02.000 Wait, wait, wait, wait, hold on, hold the phone.
01:00:06.000 I've got a lot of Twitter followers.
01:00:07.000 I can make problems for people just by calling someone?
01:00:10.000 No, no, no, you can get whatever you want.
01:00:11.000 Wait, I want what I want.
01:00:13.000 Okay, so here's an example.
01:00:15.000 If you are boarding a plane, you can just... I had a terrible experience, this is horrible!
01:00:21.000 And they'll say, we'll upgrade you, we're so sorry, we'll give you whatever you want.
01:00:24.000 Wait, you- really?
01:00:25.000 Yes!
01:00:26.000 I can do this?
01:00:27.000 I strongly recommend it. I was sitting in the middle seat on my flight here and let me tell you, as soon as I got off
01:00:36.000 the plane, I talked to the manager from Timcast.
01:00:38.000 Were you on a middle seat?
01:00:41.000 I was in a middle seat!
01:00:42.000 Do you prefer a window?
01:00:43.000 I felt like an animal!
01:00:44.000 Were you Sydney Watson'd?
01:00:46.000 Oh yeah, squeezed between two large, larger people, according to Sid.
01:00:50.000 It wasn't "according," she had receipts.
01:00:52.000 But the point I was making, just try and stay on subject I suppose, is um...
01:00:56.000 When you have a lot of followers and you have a problem with a company, you tweet about it, they will take care of you because they know you can direct a lot of bad reviews.
01:01:03.000 It's negative advertising.
01:01:07.000 Twitter has created a negative advertising space.
01:01:09.000 I'm going to tweet at White Castle right now.
01:01:11.000 I hate them.
01:01:12.000 They're really not good.
01:01:13.000 I'm not joking.
01:01:14.000 It's inedible.
01:01:15.000 So let's see what happens.
01:01:16.000 Hold on, hold on, hold on.
01:01:18.000 You misunderstand, Michael.
01:01:20.000 I'm tweeting anyway!
01:01:22.000 I don't want White Castle with White Castle.
01:01:25.000 Imagine this.
01:01:26.000 Elon Musk has 120 million followers.
01:01:33.000 So when he says Apple's pulling ads, or Apple's threatening to drop Twitter from their platform,
01:01:38.000 and then he tweets directly at Tim Cook, Tim Cook is on blast.
01:01:41.000 His phone starts going... And he's going, what's going on?
01:01:44.000 He's like, oh no, Elon's tweeting at me.
01:01:46.000 Oh man, he's got 120 million followers.
01:01:48.000 What's happened, according to the Financial Times, is that when Elon calls the CEOs, they reinstate their ads, but to a minimum level.
01:01:56.000 Okay.
01:01:56.000 To keep the money flowing.
01:01:58.000 These ad networks are terrified of whoever controls the communication sphere, and now Elon has it.
01:02:04.000 Okay.
01:02:05.000 Well, it looks like Elon has it, but whose money is Elon using?
01:02:09.000 I told you not to talk about it on the air!
01:02:11.000 Global bankers' money that they can turn off your account at any moment, we found with Kanye's situation.
01:02:16.000 So, let's talk about who controls social media.
01:02:19.000 It's banking.
01:02:20.000 I mean, it's the money.
01:02:21.000 But you think Elon isn't acting under his own volition?
01:02:25.000 I think he is within the bounds that he's given, which is this is your US dollar.
01:02:29.000 This is how you get by. This is how you get food.
01:02:31.000 You're not completely off, but there is agency among the individual, right?
01:02:36.000 Elon is very powerful. This is the point I'm making. He has Twitter.
01:02:39.000 He can wield 120 million followers.
01:02:41.000 That's why they're scared to go too hard against him.
01:02:44.000 Because you look at what's going on in China right now.
01:02:47.000 Even with the power and the iron fist of the CCP, people have reached their point.
01:02:51.000 So, there are powerful interests, there are people, and it's self-interest too, like these big corporations, they don't like that Elon can push him around.
01:02:59.000 There are powerful government interests.
01:03:01.000 They also don't like the spotlight on them.
01:03:04.000 They want to stay, like, okay... you know, the way it works is, with Black Lives Matter, you don't want to be that one corporation that didn't have that black square.
01:03:13.000 It's much easier to go with the herd because then no one could pick you off.
01:03:15.000 So when he calls someone or puts them on blast, all of a sudden there's no good answer for you as a corporation because now you're like, I'm going to offend somebody and that's my nightmare scenario.
01:03:23.000 Their stock price will drop.
01:03:24.000 Yeah.
01:03:25.000 Apple's stock price, yeah.
01:03:27.000 I mean, I don't know if it did.
01:03:28.000 It did, actually, yesterday, by 3%.
01:03:29.000 Oh, is that right?
01:03:29.000 I just saw a clip, a screenshot, yeah.
01:03:32.000 And they know that, and they're like, a 3% drop is how many billions of dollars?
01:03:36.000 Like, how many less iPhones are we gonna sell, and how much in ad revenue is that gonna, like, if we spend $10 billion on ads, how many, or $10 million, or whatever.
01:03:36.000 Exactly.
01:03:44.000 Yes, Apple's stock price is down 2%.
01:03:45.000 Well, a lot of stocks are down, but seeing the war between Elon and Apple is, Pretty glorious.
01:03:52.000 It's pretty amazing.
01:03:55.000 It's spectacular to see because, you know, Elon even said, hey, you know, if this gets too crazy, if you guys ban us from the Apple store, we're just going to create our own phone.
01:04:05.000 Yeah, which I think is awesome.
01:04:05.000 Which you should do.
01:04:06.000 And I think the market is going to provide a solution to a lot of the censoriousness, a lot of the insane algorithms and control systems that are trying to, of course, suck the humanity out of existence.
01:04:20.000 Well, do you think that the government should strip 230 protections from these companies?
01:04:24.000 I think if we have organizations there using these kind of speech platforms, using what is essentially our modern day town halls and censoring people, editorializing people, I think that right there is them taking advantage of a situation.
01:04:43.000 And what should be the larger solution to that?
01:04:47.000 I think they do have a monopoly.
01:04:49.000 I think it's fair to say that.
01:04:51.000 And I think it needs to be broken apart.
01:04:53.000 Oh, God.
01:04:55.000 Oh, my God.
01:04:56.000 Okay, now you both can go back to Russia.
01:04:58.000 Okay, how do you deal with Google?
01:04:59.000 How do you deal with the problem?
01:05:01.000 State power, obviously.
01:05:01.000 Yeah, because they are
01:05:06.000 an institution that is a part of the state.
01:05:10.000 Right, well, I'm not arguing there. They're absolutely intertwined with the state. Yeah.
01:05:14.000 You can't, like, break up Twitter.
01:05:16.000 Like, that doesn't really... No, I'm not saying break up Twitter.
01:05:18.000 I'm saying stop the government... Oh, I'm sorry, I misunderstood.
01:05:22.000 The government working with Google, working with Twitter, needs to stop immediately and needs to be broken up, and not have this unfair advantage that the government gives them.
01:05:29.000 I heartily endorse literally everything he just said.
01:05:32.000 Okay, thank you.
01:05:32.000 I appreciate it.
01:05:33.000 I know you got scared there.
01:05:34.000 I did!
01:05:35.000 Trust me, I don't want more state to solve the problem. I thought you were getting all Sherman-y.
01:05:39.000 No, no, no, no, no.
01:05:40.000 I'm not getting there.
01:05:41.000 I'm just saying the state empowers these monopolies.
01:05:44.000 The state needs to stop immediately.
01:05:45.000 Section 230 is government protection for big corporations to be free from any liability for what they publish, recommend or otherwise, while they are also simultaneously free to editorialize and promote whatever they want.
01:05:58.000 That is insane.
01:06:00.000 Imagine if we had a law in this country that said you as a newspaper can never be sued for defamation.
01:06:05.000 That's ridiculous!
01:06:06.000 This is one of the big arguments for anarchism, meaning the separation of commerce and state, for the same reason you have separation of church and state. Because we saw with the pharmaceutical companies, there was this deal on behalf of literally everyone in America: you're going to make your shot, I don't even know what words I can say on YouTube anymore, and we really needed to rush it to market, understandably, we're in an emergency, a kind of unprecedented situation, but as a result of this, there are going to be no consequences for you if things go bad.
01:06:34.000 It couldn't even be like, you know what, here's what's going to happen.
01:06:36.000 If things go bad, we as the government are going to make some kind of welfare and pay out for people who had consequences for our wrongdoing.
01:06:43.000 Just like if there's certain situations, like you're in war and you're a soldier and you come back, we're going to take care of you.
01:06:48.000 They didn't even do that.
01:06:49.000 They're just going to be like, you know what, no one's going to be responsible.
01:06:51.000 And that to me is government at its most malevolent.
01:06:54.000 No liability at all, which is absolutely criminally insane, especially when there are clearly a lot of people getting hurt by this.
01:07:01.000 They have no one to go to.
01:07:02.000 Who's going to be paying for their medical bills?
01:07:04.000 Who's going to be taking care of them?
01:07:05.000 There's no ability to have any kind of restitution at all, which is insane.
01:07:09.000 Or even acknowledgement of wrongdoing.
01:07:11.000 Exactly.
01:07:13.000 Just use the blank check.
01:07:14.000 Yeah.
01:07:14.000 Go for it.
01:07:15.000 No, there's no blank check.
01:07:16.000 There's nothing.
01:07:17.000 Shut up.
01:07:18.000 Get away.
01:07:18.000 We told you to do this.
01:07:19.000 We manipulated you to do this.
01:07:20.000 We extorted you to do this.
01:07:22.000 We told you you can't travel.
01:07:23.000 You can't live.
01:07:24.000 You can't go see your grandma die unless you do this.
01:07:27.000 You did this?
01:07:28.000 You got hurt?
01:07:29.000 Who's responsible?
01:07:31.000 No one.
01:07:31.000 You're responsible for believing us.
01:07:32.000 That's essentially what they're saying at the end of the day here.
01:07:34.000 Relatively recently in history that they did that too, right?
01:07:38.000 No, this has been going on since like Teddy Roosevelt.
01:07:41.000 Oh, vaccine immunity?
01:07:44.000 Yeah, in fact, protections are probably the new thing.
01:07:47.000 Vaccine immunity protections is new.
01:07:48.000 No, no, no.
01:07:49.000 Corporate protections.
01:07:50.000 Corporations aren't... Do you know when the first corporation was?
01:07:53.000 Oh, I bet this has been an issue since like the 1890s.
01:07:55.000 I don't know when the first corporation was.
01:07:57.000 It might be the 1890s around then.
01:07:59.000 I'm not sure... I don't think so.
01:08:00.000 I think it's got to be earlier.
01:08:02.000 Well, there was the East India Trading Company.
01:08:03.000 I know it existed.
01:08:03.000 Yeah, exactly, yeah.
01:08:04.000 That was like a country, man.
01:08:05.000 That was so powerful.
01:08:06.000 Right, basically.
01:08:07.000 But, uh, you think you had any rights working for the East India Trading Company?
01:08:10.000 Oh, they would issue you scrip, like their own currency that you'd have to use on their territory to buy, like, food from their stores, you know?
01:08:17.000 And imagine what would happen if they're like, we want you to go take this machete and go clear bush.
01:08:20.000 Like, what if I get hurt?
01:08:21.000 They'd just laugh.
01:08:22.000 Like, how could you even ask that question?
01:08:24.000 Do it or don't eat.
01:08:26.000 Because you're replaceable.
01:08:27.000 There's no shortage of people from Ireland or Eastern Europe who, you know, we can kind of fill your shoes.
01:08:32.000 It's only recently in human history we started saying like, hey, you know, people should be responsible if they're misleading, if they're causing problems, or at the very least, if you release a toy that has an issue that needs to be recalled and you don't, you'll get sued.
01:08:46.000 You're responsible.
01:08:47.000 If someone hurts you, the party that hurt you is responsible, is liable.
01:08:51.000 The government steps in and says, no, they're not liable.
01:08:53.000 But also, to add insult to injury, they're also acting like the PR agency of Big Pharma.
01:08:58.000 If we look within the last few years, the White House has essentially become a marketing and advertisement arm of Big Pharma and other multinational corporations that they agree with, that enforce their agenda.
01:09:11.000 That to me is another layer of communism that we have to deal with that's absolutely insane and needs to stop immediately.
01:09:16.000 I feel so redundant.
01:09:18.000 He's saying all my things.
01:09:20.000 What do I do here?
01:09:22.000 I need you here when the statists are here.
01:09:24.000 When the collectivists, when the communists, when the right-wingers who want more government here.
01:09:28.000 I need you here.
01:09:30.000 Can we talk about this?
01:09:31.000 Thank you.
01:09:32.000 Oh my god.
01:09:33.000 Let's talk about this because this is something that blew my mind.
01:09:35.000 Because it really drives me crazy how conservatives use the word communist in the same way that progressives use the word racist.
01:09:42.000 Just means something I don't like.
01:09:43.000 I was on The Blaze on election night with Steve Deace, who's a Christian conservative, who I was very impressed by.
01:09:49.000 He was very, very bright.
01:09:51.000 He really kind of understands how politics works.
01:09:53.000 But his conclusion at the end was to have Twitter be run as a public utility by the government.
01:09:59.000 And I'm like, you're fighting communism by having these organizations be run by the state under socialist...
01:10:05.000 That's literally socialism!
01:10:07.000 And then people take it even further by saying, you know, I think I know what's better for everyone.
01:10:11.000 I think they need this in their life and that in their life, and we need the state to mandate this and this and this.
01:10:15.000 And I'm like, you're sounding exactly like what you're fighting against.
01:10:19.000 Because the left is...
01:10:19.000 Supposedly fighting against.
01:10:21.000 The leftists are making the same argument saying, I know what's better for you, I know what's right for you, and this is why we need the state to intervene and use their monopoly of violence in order to push my wills onto you!
01:10:30.000 And I'm like, stop violating human rights!
01:10:32.000 One more thing, one more thing.
01:10:33.000 When the conservatives talk about we need to be teaching morality in schools, that's what CRT is.
01:10:38.000 That is progressives teaching their morality in schools.
01:10:40.000 We'll have a conversation about this.
01:10:42.000 I think you can do something like put Twitter under government control that's not communism or socialism.
01:10:49.000 It may be socialistic, but it would also enshrine, theoretically, First Amendment protections on this platform that's become a weapon for one political faction.
01:10:57.000 Yeah, but look at, like, NPR, right?
01:10:59.000 Like, NPR is heavily subsidized by the state, and they're so- Are they, though?
01:11:05.000 Minorly, I believe.
01:11:06.000 It's very low percent, surprisingly.
01:11:08.000 Then why are they allowed to call themselves National Public Radio?
01:11:10.000 Why were they called Federal Express?
01:11:12.000 Exactly.
01:11:12.000 It's totally misleading, and it should be illegal.
01:11:14.000 Well, you can't call yourself- Like, there was, like, an issue calling yourself Bank of something that, like, this was a big issue, because it was- Really?
01:11:20.000 Bank of America?
01:11:21.000 Well, I've been told.
01:11:21.000 Good for me.
01:11:22.000 I don't- What if I- What if I- You're right, I'm not sure.
01:11:24.000 What if I called my company official U.S.
01:11:26.000 government- Post office business.
01:11:29.000 Official U.S.
01:11:30.000 government candy.
01:11:31.000 That was the name of my company.
01:11:32.000 I mean, that would have to be illegal.
01:11:34.000 That makes no sense.
01:11:35.000 It would make the kids change their genders.
01:11:37.000 When you look it up, and maybe this is wrong, it says NPR does not receive any direct federal funding.
01:11:40.000 It receives small competitive grants from CPB and federal agencies like the Department of Education and the Department of Commerce.
01:11:47.000 So how are they funded?
01:11:48.000 Do they have advertisers?
01:11:49.000 From viewers like you.
01:11:50.000 I think it's mostly money.
01:11:52.000 Independent money.
01:11:52.000 It's actually kind of crazy where they get their money from.
01:11:56.000 I don't want to come out and say, I think I definitely agree that the problem is government, I think all power systems tend towards corruption over time.
01:12:04.000 And the problem is that government programs can't fail.
01:12:04.000 Yes!
01:12:07.000 They're surrounded by the monopoly on violence, like Luke was saying, and so they can never just stop.
01:12:13.000 So perhaps sunset clauses in anything, like maybe a constitutional amendment: all government programs and laws will have a five-year sunset, must be re-voted upon by the people, or something like that.
01:12:23.000 The problem with that is if you look at, like, copyright law, right?
01:12:26.000 It used to be that after X amount of years, a character becomes public domain.
01:12:31.000 And Disney is just lobbying Congress.
01:12:34.000 So Mickey Mouse and Superman, because they're made in the 30s, should have been public domain
01:12:37.000 I don't know how many years ago, meaning anyone can make a Superman movie, anyone can make
01:12:41.000 a Mickey Mouse movie, and they just keep pushing the bucket down the road because there's such
01:12:45.000 an asymmetry.
01:12:46.000 Disney has a huge interest in this.
01:12:48.000 The rest of us have little power or interest in this.
01:12:50.000 The asymmetry is never going to be in favor of freedom.
01:12:53.000 I think in regards to how you deal with a monopolistic social media network, I kind
01:12:58.000 of agree.
01:12:59.000 I don't want the government controlling who gets to say what on the network, because that's
01:13:01.000 just another kind of monopoly.
01:13:03.000 If the government was to be like, no, we're going to- How do you say monopoly?
01:13:05.000 It's just popular.
01:13:06.000 It's very difficult to have several search engines, because one is effectively going to be much more optimal than the others.
01:13:15.000 Yeah.
01:13:15.000 And because it's better, it's become... Maybe you're right.
01:13:18.000 Maybe the word monopoly is not the right word, but I feel like Google has a monopoly on internet search.
01:13:22.000 Maybe not, because Brave is there.
01:13:24.000 DuckDuckGo exists.
01:13:25.000 But if those other things had a competitive advantage for some way... Because Google beat Yahoo!
01:13:30.000 I'm old enough to remember Yahoo was the search engine.
01:13:31.000 There was AltaVista, and there was WebCrawler, and there was Ask Jeeves.
01:13:34.000 I don't know if those still exist, but the point is Google won because their search results were more useful.
01:13:39.000 So I don't know how you would even beat Google in that regard.
01:13:43.000 Well, I don't think you can... What do you mean beat them exactly?
01:13:48.000 What you can do is there's that film, sorry to interrupt myself, there's the film The Creepy Line, which is what they really should be going after, is how Google operates.
01:13:56.000 And as an example, if I am searching for, let's suppose, the guy, Robert, I forget, Robert Levine maybe his name was, The professor.
01:14:04.000 Let's suppose I'm searching for Hillary Clinton, right?
01:14:06.000 And Google makes it so the top ten search results are articles that are positive toward Hillary Clinton.
01:14:10.000 And if I search for Donald Trump, Google makes it so the top ten articles are critical of Donald Trump.
01:14:14.000 That is going to skew the electorate one way or another, and all of us would be oblivious to the fact that Google has their thumb on the scale.
01:14:21.000 So that is the kind of thing where the government can be investigating and being like, alright, something here is not adding up because you're acting as a political agency and in that case there's all sorts of things that go with it.
01:14:31.000 All I come to is to, I mean, the audience has heard this before probably, but free the software code.
01:14:36.000 Yes, yes.
01:14:37.000 Amen, brother.
01:14:37.000 I agree with Ian on that in terms of, so we know the algorithms that are manipulating us.
01:14:41.000 Yes, yes.
01:14:42.000 But not the property line of it, right?
01:14:44.000 Correct.
01:14:44.000 You know, I would bring in experts to talk about what actually needs to be freed.
01:14:47.000 I don't think it's every ounce of lettering in every code base.
01:14:49.000 You should have him on.
01:14:50.000 The guy who- he's like, he got kicked off of Gmail because of this. Like, he basically did the work, and he's like- For example, another thing he found is, like, Facebook: if you liked Hillary, they would hit you with "you should go out and vote." What was- if you liked Trump- Robert Epstein. Yeah, I think that's the name. Yes, correct. Hell yeah, we should have him on. Yeah, no relation. You can't- you can't search for him anymore. Yeah, he was basically saying that the search algorithm was flipping votes.
01:15:20.000 And the way he described it, not only does it make perfect sense, you'd have no way of knowing.
01:15:25.000 There was even many scientific studies detailing how big tech social media companies can swing elections and help candidates win when they had no chance of winning.
01:15:34.000 There were Twitter hearings in Congress.
01:15:37.000 Republican brought up that when you are in D.C.
01:15:40.000 and you sign up for Twitter, Twitter recommends Democrats and only Democrats.
01:15:44.000 Wow.
01:15:46.000 He was like, why is it this is what you get?
01:15:48.000 Well, let's play devil's advocate.
01:15:50.000 It could be that the D.C.
01:15:52.000 is overwhelmingly Democrat.
01:15:54.000 So the odds are, if you're in D.C., you want to follow a Democrat.
01:15:57.000 And that's, you know, when I did this phone call with this guy, that's basically what he was saying.
01:16:01.000 He's like, don't you want us to be able to show you what you want?
01:16:04.000 And I'm like, I want to be shown what I choose to watch.
01:16:09.000 This is the way they think.
01:16:10.000 We know what you want more than you do.
01:16:13.000 You know, sometimes I get these Instagram ads and I end up buying a UFO.
01:16:17.000 and then buying a second UFO. Because it was a cool ad.
01:16:20.000 Wait, where's the second one?
01:16:22.000 In Adam Quigler's house.
01:16:23.000 It's gone. Yeah, unfortunate.
01:16:25.000 Well, this is the second one.
01:16:26.000 How much are those?
01:16:27.000 It's like 200 bucks.
01:16:27.000 Oh, okay.
01:16:28.000 It's cool. I saw it on Instagram and I'm like, oh, I want to get that. And so they kind of
01:16:32.000 figured it out. You know, I want something, right? But when it comes to political content,
01:16:37.000 it doesn't work the same way.
01:16:38.000 It becomes extremely nightmarish when you start... This is what I was explaining to Dorsey and Vijaya Gadde, that if they keep doing what they're doing, there's going to be... I don't think I said the word civil war back then, but maybe.
01:16:50.000 Maybe, if you go watch the episode.
01:16:52.000 Because what I was trying to get to is, you're basically, you're forcing everyone into boxes.
01:16:57.000 You are creating polarization on purpose.
01:17:00.000 And they were like, whatever.
01:17:01.000 They didn't care.
01:17:02.000 But I think in their defense of using that term very loosely, the thing is with social media and you have these conversations, it works kind of evolutionarily in that anyone's philosophy is going to be driven to its logical conclusion because you run enough iterations that any kind of contradictions are kind of going to go away and you're going to go towards the extreme or the logical conclusion of whatever their premises are.
01:17:22.000 So that is why kind of in many ways this moderate middle is vanishing because being in the middle is not really a coherent position.
01:17:28.000 It's just kind of a reaction to these two other poles.
01:17:31.000 Yeah, the middle can be like extreme right left really fast.
01:17:34.000 So fast that it looks like it's in the middle.
01:17:36.000 So it's still there's still a balance in the extremes if you can handle it.
01:17:40.000 Hopefully most people can.
01:17:41.000 I mean you kind of have to when you're on social media because it's so extreme when you log on.
01:17:44.000 It's so extreme.
01:17:45.000 Everything is so extreme.
01:17:45.000 It's like not everything.
01:17:46.000 It's like Mountain Dew.
01:17:47.000 They got me talking extreme.
01:17:48.000 I'm curious, some people have suggested that Elon Musk's moves are actually against the right, because there's this idea I've talked about.
01:17:56.000 You have two kids.
01:17:58.000 One is covered in filth and dirt and mud, and the other looks beautiful and pristine with a nice little suit and tie on.
01:18:04.000 Let's call them Ian and Luke.
01:18:05.000 Ian and Luke.
01:18:08.000 There's a mother.
01:18:09.000 No, you listen!
01:18:10.000 Listen to me, Michael.
01:18:13.000 Hear me out.
01:18:15.000 When you approach this woman and you see one child smeared with ice cream all over his face.
01:18:18.000 I should have stormed off.
01:18:19.000 That was my chance.
01:18:20.000 I blew it!
01:18:21.000 I blew it!
01:18:23.000 When you see a kid all messy covered in ice cream and one kid who is looking very clean, you assume, man, that kid must be unruly and crazy.
01:18:29.000 That kid must be responsible.
01:18:30.000 In reality, the mom just doesn't give the kid who's clean any treats or reprieve and is mean to him.
01:18:38.000 The kid who's messy gets whatever he wants.
01:18:40.000 They're both actually bad kids.
01:18:42.000 What happens on Twitter is they start removing people on the right.
01:18:45.000 Okay, this is bad, you're gone, bad, you're gone.
01:18:47.000 And all that's left are moderate and moderate conservatives to center, you know, like center-right to moderate conservative voices.
01:18:55.000 And on the left, they say you can do whatever you want.
01:18:57.000 So you end up with Antifa, violence, extremist posts.
01:19:01.000 So it skews and ends up making the left look insane.
01:19:06.000 Elon comes in, and I'm not saying he's doing this on purpose.
01:19:08.000 Maybe he's just genuinely like free speech.
01:19:10.000 Some people are saying allowing everybody back on who's crazy is going to actually end up making the right look bad because it's going to bring back the worst possible voices.
01:19:19.000 No, I don't think- he hasn't- first of all, he hasn't brought back Carpe Donktum, who's like number one on my list.
01:19:24.000 Because Carpe Donktum was making memes for Trump.
01:19:26.000 There's no reason for him to be banned.
01:19:28.000 It's completely crazy.
01:19:29.000 So Elon, please bring back Carpe Donktum.
01:19:30.000 You brought back- Well, he's bringing back everyone.
01:19:32.000 He's bringing back tens of thousands of users.
01:19:35.000 Very soon.
01:19:36.000 So is he bringing back everyone?
01:19:37.000 Is he bringing Milo?
01:19:38.000 Is he bringing back Chuck Johnson?
01:19:39.000 He did a poll saying a blanket amnesty for anyone that didn't spam or clearly violate the bigger rules here.
01:19:46.000 But everyone's going to be back.
01:19:47.000 There's that asterisk.
01:19:49.000 62,000 people.
01:19:50.000 Fine.
01:19:50.000 My point is we don't know that that's going to include the people who were like where Twitter had previously regarded as the worst of the worst.
01:19:54.000 I agree.
01:19:55.000 I agree.
01:19:56.000 Like Jones.
01:19:57.000 He said he wouldn't bring him back.
01:19:58.000 Does this mean he will?
01:19:59.000 That was interesting.
01:20:00.000 Someone asked Elon, I think it was Viva Frey, maybe?
01:20:03.000 Viva Fry?
01:20:04.000 Frey?
01:20:05.000 David, how do you say your last name?
01:20:06.000 Frey?
01:20:07.000 If he would bring back Alex.
01:20:08.000 And he was like, no, no, I have no respect for people that would... I'm paraphrasing what Elon's response was in text.
01:20:13.000 That he values children, and anyone that would demean and use children for political gain or power, he's totally disgusted by.
01:20:21.000 That's all of Washington.
01:20:22.000 And that's an example of, like, you're gonna let one guy run your social network.
01:20:25.000 I responded to that.
01:20:26.000 Do you know what Obama did in the Middle East?
01:20:27.000 He said how many weddings and children he bombed.
01:20:29.000 But it's an example of, like, you let one guy run a network, he has total emotional control of who gets to play.
01:20:34.000 That's not emotional.
01:20:35.000 I think he's making a coherent, if incorrect in your opinion, perspective.
01:20:40.000 Yeah, but I think he paid the price for saying those things in public and free speech.
01:20:48.000 Can we talk about this?
01:20:49.000 I really, really hate that term free speech.
01:20:51.000 I don't ever use it because it means so many different things to different people.
01:20:55.000 Like some people think, oh, if you're blocking me, you're blocking my free speech.
01:20:57.000 It's like if I'm not letting you in my house.
01:21:00.000 No, there's a lot of conservatives who say that, too.
01:21:02.000 Oh, I thought you were for free speech and now you blocked me off your page because I'm an idiot who's babbling MAGA stuff.
01:21:08.000 Did you tell them that freedom of speech doesn't mean freedom from consequences?
01:21:11.000 Well, I can reply to them once they're blocked.
01:21:14.000 That's the argument, because is it free speech to go on Twitter and say all this racist stuff and then you get banned?
01:21:19.000 It's still Elon's free speech to block whoever he wants.
01:21:22.000 But then you're like, well, is it a public company?
01:21:23.000 Is it a private company?
01:21:25.000 Now we're in new territory.
01:21:26.000 Free speech is a principle.
01:21:28.000 It's like an ethic.
01:21:29.000 It's a position that we hold.
01:21:30.000 Culturally, at least, we did for a long time until, you know, culture started shifting, I guess.
01:21:34.000 Everybody opposes free speech when you can knock them from power, as opposed to varying degrees.
01:21:37.000 I remember being a small kid, being like, yeah, it's a free country.
01:21:41.000 I can say what I want.
01:21:42.000 Now we don't have that.
01:21:44.000 On Twitter, if you are allowed to go on and say your opinions, as repugnant as they may be, that is free speech.
01:21:49.000 If Elon Musk says some opinions are not allowed, he does not believe in free speech.
01:21:53.000 Free speech.
01:21:54.000 There are reasonable limits, I think is fair.
01:21:57.000 I don't think doxing should be allowed, and that is First Amendment-protected speech.
01:22:01.000 So, you know, even I am not an absolutist.
01:22:04.000 Let's talk about doxing.
01:22:05.000 If someone's contact information is public, right, like in a phone book, whatever it is, and you republish that, do you consider that to be doxing?
01:22:15.000 That's tough.
01:22:16.000 Right?
01:22:16.000 That's a tough one.
01:22:17.000 Is it doxing if someone could just Google someone and find their address?
01:22:20.000 I don't think that actually qualifies.
01:22:21.000 But I don't like it.
01:22:24.000 I think that's really kind of egregious and I think you guys would probably agree.
01:22:27.000 Oh, I think it is egregious, yeah.
01:22:29.000 Right?
01:22:29.000 The government also sells your address many times, through the DMV, to private corporations.
01:22:34.000 Is that right?
01:22:35.000 And then, yeah, there are services online where people go and buy government registries, so you could buy people's addresses online and find out where they live.
01:22:44.000 It's like if I send your address to one person, is that doxing?
01:22:47.000 If I send it to 10, now is that?
01:22:49.000 If it's 100?
01:22:49.000 At what point does it become a dox?
01:22:52.000 I think it becomes a dox when you're sending it to strangers.
01:22:55.000 So if it's public, or privately to strangers, to people I don't know?
01:22:58.000 Yeah, I think, with the intent certainly, it kind of plays into here.
01:23:02.000 It's not like, hey, wish Ian a happy birthday, send him a card to his address.
01:23:06.000 Is that doxing?
01:23:07.000 It's a fine line.
01:23:08.000 We've got to respect people's privacy, but also at the same time, we've got to respect people's speech.
01:23:12.000 Where do you draw the line?
01:23:13.000 Yeah, because I'm allowed to say what your address is.
01:23:16.000 I'm allowed.
01:23:17.000 In public, under the First Amendment, doxxing is protected.
01:23:21.000 You can hold up a big sign with someone's address and walk around with it.
01:23:24.000 It's actually also really kind of crazy.
01:23:26.000 I'm reading an autobiography of this woman, Mabel Dodge Luhan, who had these salons in her house in the early 20th century, where modernism started.
01:23:33.000 And the articles in the New York Times would be like, Mrs. Mabel Dodge, comma, of 5 East 23rd Street, New York.
01:23:39.000 And it would just have her address there.
01:23:41.000 It's like, this seems like it's a problem.
01:23:43.000 Yeah.
01:23:44.000 Well, I guess culture is the biggest issue, in my opinion, when it comes to everything.
01:23:50.000 Most of the political stuff we're dealing with, it's cultural.
01:23:52.000 When you have a culture that is cohesive and agrees on morals and ethics, you can leave your door open at night.
01:24:00.000 You don't have to worry as much about crime.
01:24:03.000 But when you have a society, a country, where everyone's just like, you're not a part of my community, I don't know you, I don't care, now you gotta lock your doors.
01:24:12.000 It seems like when you have small localities, there tends to be a homogenization of culture because there's only 70 of us, you know, we all know where 7th Street is.
01:24:19.000 But as soon as you introduce 100,000 or 10 million unknowns to the culture, you're in a state of like, I can't leave my address online anymore.
01:24:28.000 We used to have the yellow pages.
01:24:30.000 Yeah, like they used to be books of everyone's addresses.
01:24:32.000 This used to be the big appeal of cities, that if you're from some small town where
01:24:36.000 everyone knew your business, you could go somewhere and get lost and vanish.
01:24:39.000 And that it's not just a thing for criminals.
01:24:41.000 It could be things like, you know, I want to rediscover myself.
01:24:43.000 I want to create a new identity for myself.
01:24:45.000 I want to get away from this kind of upbringing I had.
01:24:47.000 So this was, back in the day, a benefit of cities, but thanks to the internet, that's kind of gone away.
01:24:51.000 I don't need to go live proximately to people who share my worldview.
01:24:54.000 I could just find it through social media, so on and so forth.
01:24:57.000 And given how just deleteriously cities have been collapsing in the last 10 years in terms of just basic safety and public services, it's just the question on the table, I think, for many of us is, are cities an outdated mode of organization?
01:25:14.000 Or governments, because cities have way, way more government than a lot of other places that have a lot less government.
01:25:22.000 You guys are minarchists.
01:25:25.000 That's the most insulting thing you can ever tell me to my face, Ian.
01:25:27.000 He just called you gay!
01:25:29.000 Worse than that!
01:25:29.000 I know!
01:25:31.000 You said that social media needs a little bit of government oversight.
01:25:34.000 Those are fighting words.
01:25:35.000 Earlier, I think you were saying about Twitter that— My goodness!
01:25:37.000 Maybe you—are you indicating— I have a machete here.
01:25:39.000 You're down to break up monopolies with government?
01:25:42.000 Is that what you're—like, you said earlier that you want the government to be involved with social media.
01:25:45.000 Do you see what I—I love this show so much!
01:25:47.000 Do you see what I have to deal with every day?
01:25:49.000 Well, you've told me you're a minarchist.
01:25:50.000 Every day.
01:25:51.000 No, he's never said that.
01:25:52.000 Yeah, to my face here, like, three weeks ago.
01:25:52.000 I've never said that.
01:25:54.000 Anarchist, maybe.
01:25:56.000 But zero government, because then the corporations just take control.
01:25:59.000 Ian, please read this one.
01:26:02.000 If you think Klaus Schwab has not created a government?
01:26:04.000 Are you going to talk about roads soon?
01:26:05.000 He's going to be talking about roads.
01:26:06.000 I would be happy to bring up public roads.
01:26:08.000 He's going to bring up roads.
01:26:09.000 Who controls the roads?
01:26:10.000 It's me, because I have the most guns.
01:27:14.000 Malice, you deal with this.
01:26:15.000 One sentence.
01:26:16.000 If you have a job, that's basically a government.
01:26:18.000 But think about corporations can become a government if they're unchecked.
01:26:21.000 The only entity or institution anywhere for any reason at any point that could make a road is the government.
01:26:28.000 And dominoes.
01:26:29.000 Well, I think that, well, yes, but that's besides the point.
01:26:31.000 And pornhub and pigs.
01:26:32.000 A socialized government can protect free roads for people better than a corporation, in my opinion.
01:26:38.000 But that's, your opinion is, you sound like Jason Whitlock.
01:26:41.000 Your opinion's based on nothing.
01:26:42.000 I love Jason Whitlock.
01:26:43.000 Your opinion's based on nothing.
01:26:45.000 We can look very quickly to find out private roads versus public roads and how safe they are, how often they break down, potholes, things like this, and it's not even a question.
01:26:54.000 It costs.
01:26:54.000 Well, hold on.
01:26:55.000 You know, when the government builds the road, there's a simple solution to the potholes.
01:26:59.000 KFC comes in and fills them and then puts the KFC- It was Domino's.
01:27:02.000 It was Domino's who did that?
01:27:03.000 Oh, okay.
01:27:04.000 That's what you were saying.
01:27:05.000 I think the evidence would be that corporations are authoritarian by nature.
01:27:09.000 You have one person in control that decides who stays, who goes.
01:27:11.000 And if they control a busy roadway... You do this.
01:27:14.000 I deal with this every day.
01:27:16.000 Your turn.
01:27:17.000 So a busy roadway, for instance, if Google owned I-77... I'd rather talk to Kanye!
01:27:25.000 Most people might agree with you.
01:27:27.000 You have an opportunity.
01:27:28.000 He's going to storm out.
01:27:29.000 Yeah.
01:27:29.000 So what do you do if... We've got to both storm out.
01:27:32.000 If Alphabet or Microsoft owned like I-77 North and then like, tomorrow we don't want any black people on that road.
01:27:39.000 And you're like, where's our government to protect us from this psychotic corporation?
01:27:43.000 Okay.
01:27:43.000 So let's walk through this thought experiment.
01:27:47.000 And I've never used the word thought more loosely.
01:27:49.000 What do you think would happen to Alphabet if they publicly said, we are going to have a part of our company that is forbidden for use by black people?
01:28:01.000 Forgetting lawsuits and discrimination law.
01:28:03.000 Literally, what do you think would happen to that company?
01:28:05.000 Antifa would show up with crowbars.
01:28:06.000 The only thing that would happen is the stock price.
01:28:09.000 Overnight.
01:28:10.000 Let's think about this.
01:28:11.000 Who controls the stock market?
01:28:13.000 What if people tried to storm the headquarters of Antifa overnight and they have armed guards outside that killed them all?
01:28:18.000 And where's the government to stop that from happening?
01:28:21.000 Okay, I'm talking about this peacefully.
01:28:23.000 Forget Antifa.
01:28:24.000 If a company declares, we are not going to have part of our organization accessible to black people, do you think there will be no extreme, immediate consequences for that company?
01:28:37.000 I would hope that there were, but I think that when Vanderbilt shut off the railroads through New York, there was obviously, there comes times where he had so much control of the system.
01:28:45.000 I'm not talking about Vanderbilt.
01:28:46.000 Talk about now, 2022.
01:28:47.000 A general boycott isn't enough because it controls so many aspects of society already.
01:28:52.000 I think this is, you don't even need a boycott because anyone who's a publicly traded company is there at the behest of the stock owners and the board.
01:29:02.000 So if I am, whoever's running Alphabet now, I don't even know his name, and say... First of all, I don't even know how they're going to... I don't even know how they're going to get this plan through.
01:29:10.000 But if someone... Let's... Elon.
01:29:13.000 If Elon Musk tomorrow says, alright, here's the new rule on Twitter.
01:29:17.000 No one who's black can be on Twitter.
01:29:19.000 And that even includes, what's his name?
01:29:23.000 Shaun, whatever.
01:29:25.000 Shaun King.
01:29:26.000 Shaun King.
01:29:27.000 Even Shaun King, you're not allowed.
01:29:28.000 Talcum X.
01:29:28.000 Talcum X, thank you.
01:29:30.000 He's not allowed on Twitter.
01:29:31.000 The audience numbers would implode.
01:29:34.000 The number of news articles would be through the roof.
01:29:37.000 And the stock price would be nothing.
01:29:39.000 And all that money that he's owed, I don't know how that would work, but those banks would call in that debt immediately.
01:29:44.000 So you might be right.
01:29:45.000 Because of the emotional charge about racism, you're right, there could be reactions.
01:29:50.000 Now let's take something more insidious.
01:29:51.000 BlackRock wants to buy farmland in the United States.
01:29:53.000 What if they buy it all?
01:29:54.000 Where's our government protection?
01:29:56.000 Okay, do you understand what determines price?
01:30:00.000 So, if I am buying, let's suppose just eggs, right?
01:30:03.000 I mean, if they're colluding, the government decides you can't set your prices.
01:30:10.000 You know, we have protections so that the corporations aren't deciding what the price
01:30:13.000 is.
01:30:14.000 Wait, so if I am buying, let's suppose just eggs, right?
01:30:18.000 The more eggs I buy, the higher the price of eggs becomes, right?
01:30:22.000 And then it becomes asymptotic, right?
01:30:24.000 In like a mechanical system that's supposed to happen.
01:30:27.000 But real estate is a great example of this.
01:30:29.000 Like Austin and New York.
01:30:30.000 As more and more people are trying to buy real estate, the prices are extremely quickly increasing, right?
01:30:35.000 We know this.
01:30:36.000 Taylor Swift tickets.
01:30:37.000 Yeah, right.
01:30:38.000 I don't know how much farmland there is in America.
01:30:40.000 It's a lot.
01:30:41.000 But at a certain point, the costs are going to increase very, very high.
01:30:46.000 And BlackRock, if they're not earning a return on their... I'm not a fan of BlackRock, by the way.
01:30:50.000 I'm not saying this is something we should be, like, applauding.
01:30:52.000 I'm just saying there are mechanisms already in place, that the idea that BlackRock can buy America... I don't even know how much all the real estate in America would cost.
01:30:59.000 It just would be insane.
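The pricing point being made here, that the marginal price climbs steeply as one buyer absorbs more of a fixed supply, can be sketched with a toy model. The 1/(1 - fraction) curve and the numbers below are illustrative assumptions, not anything stated on the show:

```python
# Toy model of the argument above: as one buyer acquires a larger share
# of a fixed supply (eggs, farmland, real estate), the marginal price of
# the remaining units rises sharply, heading toward a vertical asymptote
# as the buyer approaches 100% of the supply.

def marginal_price(base_price: float, fraction_bought: float) -> float:
    """Illustrative inverse-supply curve: price grows without bound
    as fraction_bought approaches 1.0 (all supply acquired)."""
    if not 0.0 <= fraction_bought < 1.0:
        raise ValueError("fraction_bought must be in [0, 1)")
    return base_price / (1.0 - fraction_bought)

if __name__ == "__main__":
    for f in (0.0, 0.5, 0.9, 0.99):
        print(f"{f:.0%} of supply bought -> marginal price {marginal_price(2.0, f):.2f}")
```

The specific curve is just one convenient shape with the asymptotic behavior described; any inverse-supply function that blows up as remaining supply goes to zero makes the same point.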
01:31:00.000 And I want to point out, too, your concerns about BlackRock are based on the fact that they get free money from the government.
01:31:04.000 Yeah, exactly.
01:31:05.000 It's not the government.
01:31:06.000 I mean, it's the Bank for International Settlements.
01:31:07.000 So you want more government to deal with the problem of government?
01:31:10.000 No, it's not the government.
01:31:12.000 It's the Federal Reserve and the Bank for International Settlements that are supplying the funds to BlackRock.
01:31:15.000 It's not the American government.
01:31:16.000 We are in favor of treating the members of the Federal Reserve as war criminals.
01:31:20.000 I think I can speak for Luke in this.
01:31:22.000 Is that fair to say?
01:31:22.000 Yes.
01:31:24.000 Private company. It was... No, no. War criminals. War criminals. The Reserve is a private company. Are you...
01:31:29.000 So am I speaking? I'm not saying they're not war criminals, I'm just saying it's a private company.
01:31:34.000 They're not a private company at all, and they should be treated like war criminals.
01:31:38.000 Are you suggesting they're a government agency? I'm not suggesting it, I'm saying it.
01:31:42.000 No, the Federal Reserve is a quasi-private public organization, but it was invented by J.D. Rockefeller, John Rockefeller, Paul Warburg in 1913.
01:31:50.000 Ian, this is the nightmare scenario of when you have this kind of corporate government intertwining, where you have this almost like a Frankenstein creature that's half one foot, half the other, and no accountability, no liability, and that is exactly what anarchists like me and Luke, and anarchists of all colors of the black flag are opposed to.
01:32:07.000 Because this is what Marx referred to as capitalism.
01:32:10.000 Fascism.
01:32:11.000 Now, I agree that fascism is the other end of the spectrum, whereas corporatism is equally as dangerous.
01:32:16.000 Corporatism is fascism.
01:32:17.000 It's just fascism with a better brand name.
01:32:18.000 Fascism was going to bring government into it.
01:32:20.000 Corporatism could be like, John Rockefeller owns every road, and now if John Rockefeller doesn't like you, you can't drive.
01:32:26.000 Okay.
01:32:27.000 That's bad.
01:32:27.000 How is he going to afford—do you know how much every road would cost?
01:32:31.000 Like, even Elon Musk with $45 billion, that can buy you maybe the roads in, like, one city.
01:32:37.000 Yeah, but they know the bankers that print the money, and they can... But again, that's the Federal Reserve.
01:32:42.000 It's not American government.
01:32:43.000 That's different.
01:32:44.000 It's a private company the American government has given their power to.
01:32:46.000 It's total co-opt of our government.
01:32:48.000 Yes, that is corporatism.
01:32:50.000 They're using government to do this.
01:32:53.000 The government is the fall guy.
01:32:54.000 The American Republic is legit.
01:32:56.000 What is the American Republic?
01:32:57.000 A republic is the government.
01:32:58.000 It's supposed to smash up monopolies.
01:33:00.000 Supposed to is a blue-pilled word.
01:33:02.000 Supposed to doesn't exist in reality.
01:33:03.000 We have, you know, antitrust laws to break up, like, Rockefeller Standard Oil at the end of the 1800s.
01:33:08.000 We had to break it up.
01:33:09.000 We have antitrust laws so monopolies will play ball with whoever's in power.
01:33:13.000 Even Teddy Roosevelt and Wilson, who were the first two progressive presidents, both explicitly said there are good trusts and bad trusts, meaning the monopolies that do what we want and monopolies that don't.
01:33:23.000 This is exactly a mechanism, just like Facebook and Google and other things, Biden will call you up because he's got these antitrust laws behind him, or whatever other laws at his disposal, and be like, you know what?
01:33:35.000 We'd really like it if you were censoring this misinformation.
01:33:38.000 It's dangerous to our democracy.
01:33:40.000 And they're like, oh, of course, we're doing it privately.
01:33:42.000 But it's not privately because it's a complete collusion between the state and the free market.
01:33:47.000 Ostensibly free market.
01:33:48.000 Do you think that if government were totally removed and it was just market that it would work out?
01:33:53.000 That everybody would be happier and wealthier?
01:33:57.000 Not everyone.
01:33:59.000 Certainly the politicians, it's going to be really hard for them when consequences will never be the same.
01:34:03.000 I feel like corporations' tendency to profit over human goodness is like... You can only profit... You can only... Slavery is not profitable.
01:34:11.000 It's expensive, number one.
01:34:12.000 And besides, it's immoral.
01:34:14.000 Would you be buying candy bars from a company that's built on slave labor?
01:34:19.000 I mean, I got this thing.
01:34:22.000 It's probably built in a Chinese lab somewhere.
01:34:24.000 I think everything in here might be like...
01:34:27.000 Not everything, but a lot of it.
01:34:29.000 This table is actually American-made locally, but the cameras are foreign-made by people in horrible conditions.
01:34:34.000 Well, that's a problem.
01:34:35.000 It is, I agree.
01:34:36.000 That's why we make governments to stop that, in my opinion.
01:34:38.000 The benevolence of government is that it can protect us from those things.
01:34:41.000 But in fact, over the past couple of decades, the government has been colluding with foreign corporations to sell all of our jobs overseas to benefit China.
01:34:50.000 It's what Nixon did with Rockefeller and Henry Kissinger.
01:34:54.000 It was an open policy, the open China policy that they implemented.
01:34:56.000 So you gotta understand, Ian, at the end of the day, all roads lead to, of course, the government abusing its power because they don't need to provide a service based on any kind of reputation or any kind of consequences.
01:35:11.000 We're gonna go to Super Chats!
01:35:12.000 Sorry to cut you guys off.
01:35:13.000 If you haven't already, smash the like button, subscribe to this channel, share the show with your friends, become a member at TimCast.com.
01:35:20.000 I have gotten word from the crew here that there is going to be a behind-the-scenes breakdown of what happened with Ye and the crew when they came, because we filmed their journey here.
01:35:29.000 There's conversation and stuff that happened, so we are going to have some kind of members-only footage of how everything went down, and it'll be interesting.
01:35:36.000 But we're going to have a members-only show up for you guys tonight around 11 p.m.
01:35:39.000 is when we upload it, so smash that like button, subscribe to this channel, share the show if you really want to support us.
01:35:44.000 All right, we got SetMeFree who says, Balenciaga is for abusers only.
01:35:49.000 If you rock it, we know how you get down.
01:35:50.000 What's wrong with you?
01:35:53.000 There you go.
01:35:53.000 Yeah, I was hoping to talk to Ye about that yesterday, because he did a lot of work with Balenciaga.
01:36:00.000 And, you know, a lot of people are being questioned about it, rightfully so.
01:36:04.000 And I think, you know, I wonder what he knows from inside of the industry, if it's even, you know, sinister from there.
01:36:10.000 But sadly, that didn't happen.
01:36:12.000 Yeah.
01:36:13.000 Raymond G. Stanley Jr. says, Tim, dude, everywhere I look, Tim Pool this, Tim Pool that, I can't stand all of the weak AF BS out there. We have a nation to save. We have a movement to continue. There's no time for childish BS games.
01:36:26.000 I think forgiveness is the flavor of the day.
01:36:29.000 So, I want to point something out.
01:36:30.000 Forgiveness?
01:36:31.000 I want vengeance.
01:36:32.000 Are you kidding?
01:36:32.000 For what they did for two years?
01:36:34.000 Maybe do both.
01:36:35.000 Yeah, we could forgive them after they- Vengeful forgiveness.
01:36:37.000 Vengeance and forgiveness.
01:36:38.000 I think justice.
01:36:40.000 I try to be careful with the idea of vengeance because- But it's so fun.
01:36:43.000 I agree.
01:36:44.000 It feels good.
01:36:45.000 Yeah, it does.
01:36:46.000 And if it feels good, you should do it, especially if children are involved.
01:36:49.000 No!
01:36:50.000 That was the lesson of the day, no?
01:36:51.000 No.
01:36:52.000 You know that song?
01:36:52.000 I mean, it is episode 666.
01:36:54.000 You're wonderful, just who you are.
01:36:55.000 You know that song, if it makes you happy, then why the hell is it so bad?
01:36:58.000 Yeah.
01:36:58.000 I always heard that and thought about, like, just drug abuse.
01:37:00.000 And I was just like, what is wrong with this person singing this?
01:37:03.000 Like, what is she saying?
01:37:04.000 That's true.
01:37:05.000 But I know the song isn't about drug abuse, but, you know, I just kind of was like, there's a lot of things that make you happy that are very, very bad for you.
01:37:10.000 What were you just saying before Michael derailed?
01:37:14.000 Because you were about to say a point that I wanted Michael to hear.
01:37:17.000 Which one?
01:37:17.000 I don't remember.
01:37:18.000 What was the super chat?
01:37:19.000 About everything?
01:37:20.000 Vengeance.
01:37:21.000 Oh, yeah.
01:37:21.000 Vengeance and justice.
01:37:24.000 After the Kyle Rittenhouse trial, you know, the conversation was like, oh, do we try and book Kyle Rittenhouse?
01:37:29.000 And I was just like, no.
01:37:31.000 If they reach out to us, of course.
01:37:33.000 He just followed me last night on Twitter.
01:37:34.000 There you go.
01:37:35.000 But what I don't like doing is being like, let's be the first to get the big person.
01:37:39.000 And then I just got to say, I am not happy with how things went down yesterday.
01:37:46.000 And I wake up in the morning and it's like, I'm trending.
01:37:49.000 I hate trending.
01:37:50.000 I have never, you know, I've never trended and it really drives me crazy because like all my friends have trended.
01:37:54.000 I'm such a loser.
01:37:55.000 I know, I know.
01:37:56.000 Same.
01:37:57.000 Everybody right now, if everybody right now tweeted Michael Malice just right now, you'd be trending.
01:38:02.000 I'm a very, very, very little man.
01:38:06.000 In every sense of the word.
01:38:07.000 But we end up doing this booking because it was like, hey, we could have this big story on the show and it implodes on us.
01:38:14.000 And I'm like, I'd rather just have people in that we find are interesting and talk about the news.
01:38:20.000 I didn't think it imploded on you at all, to be honest.
01:38:23.000 I'll put it this way.
01:38:24.000 Our plan was not to have them storm out.
01:38:27.000 You know what I mean?
01:38:28.000 I thought we were going to talk about news and we just happened to have the people who are in the news here.
01:38:32.000 And it turned into... I feel like, yeah, he planned it.
01:38:36.000 I really do.
01:38:37.000 Because when we were talking beforehand, he was calm.
01:38:40.000 The points I made during the show were similar points I had just made 20 minutes before.
01:38:45.000 You didn't raise your voice, you didn't interrupt him.
01:38:49.000 It's funny because people don't know what happened before the show, so there's all these comments about what I did wrong and everything, and I'm like, the pre-show, you know, they're like, you shouldn't have brought up that article about Pence, and I'm like, I brought it up an hour before the show started.
01:39:02.000 There's a big screen right there that everyone can see, and it was right there.
01:39:06.000 Ye looked at it, read it, and then said, What, did Pence, like, betray Trump?
01:39:10.000 Is that what happened?
01:39:10.000 And then Fuentes started talking about what, what, January 6th and Pence.
01:39:13.000 But bringing up Pence was germane.
01:39:15.000 Pence was condemning the dinner, so ask him his opinion about it.
01:39:19.000 Well, you weren't saying, Pence is right, get out of my house.
01:39:20.000 No, I said, this is the news.
01:39:21.000 Yeah.
01:39:22.000 There's a dinner that happened, tell us.
01:39:23.000 And he immediately went into the subject, and it was very, very different to everything that happened before.
01:39:29.000 Can you describe how his affect changed?
01:39:31.000 Because watching him sitting here, he really seemed like he had a huge chip on his shoulder.
01:39:35.000 Before the show, he was sitting here texting on his phone, minding his business.
01:39:40.000 He looked up, he was calmly talking to Milo.
01:39:42.000 He was smiling.
01:39:43.000 He asked questions.
01:39:45.000 There was something that came up about sin.
01:39:47.000 Are all sins created equal?
01:39:48.000 And Milo said, certainly some sins are worse than others.
01:39:51.000 Like, no one's going to claim, you know.
01:39:53.000 And then he makes a few comparisons.
01:39:54.000 And then, yeah, he smiles and he says something of like, you know, all sin.
01:39:57.000 And then he looks at the article, he makes a comment.
01:39:59.000 It was very, very calm and we were chill and we were chilling.
01:40:02.000 Yeah, he was talking about doing the show every week.
01:40:04.000 He was like, we gotta do this more often.
01:40:06.000 We should do this once a week.
01:40:07.000 And I'm like, that could be Hollywood talk though.
01:40:09.000 Sure.
01:40:10.000 But he brought up something about how... I can't remember exactly what he was saying, but it was about Jewish people.
01:40:15.000 I said something... No, did he say Jewish people or did he say the Jews?
01:40:18.000 He said Jewish people.
01:40:19.000 Okay, that's interesting.
01:40:20.000 Okay.
01:40:20.000 Yeah, yeah, yeah.
01:40:20.000 I mean, isn't that what... That's what I thought he kept saying, but whatever.
01:40:25.000 He said, you know, Jewish people or something, very calm, laid back, smiling.
01:40:30.000 And I made a point about Bezos, Elon, I mean, these are wealthy people, and he goes, yeah, but come on, Elon works for the Jews, right?
01:40:37.000 And then he looks at Milo and he says, didn't they have him do that thing or whatever, that ceremony or something?
01:40:42.000 Very calm, and even when I pushed back and said, he's on Twitter, he's unbanning people, he was chill, totally calm.
01:40:48.000 Camera goes on, he starts going off, I gotta talk about this, and don't interrupt me, because I'll walk out on your show, I tell you, very, very different.
01:40:54.000 That's why, if you look, I'm like confused, look at my face when he walks out, I'm like, is he walking out?
01:40:59.000 Because you were just, you said, I don't agree.
01:41:01.000 And he's like, deuces.
01:41:01.000 It's like, you didn't yell at him.
01:41:04.000 Yeah.
01:41:04.000 But anyway, anyway, I don't want to keep rehashing this.
01:41:06.000 I just wanted to bring up the point that I'm wary of doing shows like that because there are people who are like, I want to get press.
01:41:17.000 Oh yeah, yeah, yeah.
01:41:18.000 And so I was concerned and I said it, I was like, him walking out made the story very big.
01:41:23.000 Like, I want to do a show where we talk about ideas and, like, we don't do drama.
01:41:28.000 We don't like drama.
01:41:30.000 And now, like, before the show started today, we were talking about, like, what's the big story to talk about?
01:41:34.000 And it's like, you know, Ian, you were like, we're the news.
01:41:36.000 And I'm like, that's, you know, you're not wrong.
01:41:39.000 I'm like, that kind of sucks.
01:41:40.000 I want to talk about what Trump's doing, I want to know what his plans are, I want to know how that affects us, and here we are in sort of a... I shouldn't say we're the center, Ye is the center, but we're on the edge of it, and it's our fault, because we want to do a show that turned into a PR spectacle instead of a conversation about what was going on.
01:41:56.000 I think it's the natural evolution of the show as well, that it's just going to keep getting bigger and bigger, and it'll become more and more the focus of the human consciousness, the conversations like this, I think.
01:42:06.000 You're right, like Rogan.
01:42:07.000 You know, Joe said, I just want to hang out with my friends and talk, and then all of a sudden his shows became the news.
01:42:12.000 And that worries me.
01:42:14.000 I'm going to play devil's advocate because it is 666.
01:42:17.000 I'd rather that be the news than whatever is going on in Anderson Cooper or Sean Hannity that night.
01:42:22.000 I think it's much more honest.
01:42:24.000 We'll keep bringing Super Chats because I don't want to leave you guys hanging.
01:42:26.000 Deprived Dolphin says Project Veritas has a new video on federal-sponsored human trafficking.
01:42:31.000 I saw a bit of it.
01:42:32.000 Did you guys see some of this report?
01:42:33.000 Like a whistleblower from the government talking about this stuff?
01:42:36.000 I gotta watch the whole thing though.
01:42:37.000 I just had James on my show.
01:42:38.000 He did a great job.
01:42:39.000 It was a lot of fun.
01:42:39.000 Oh, Keith?
01:42:40.000 Yeah.
01:42:40.000 Excellent.
01:42:42.000 I'm excited they got unbanned.
01:42:44.000 Did James personally get unbanned?
01:42:45.000 Yes, I believe so.
01:42:46.000 I'm 90% sure.
01:42:48.000 That was so egregious.
01:42:50.000 It's crazy.
01:42:52.000 And the creepy cultists who cheer that on.
01:42:55.000 Good news though, good news though.
01:42:57.000 Is that episode out yet?
01:42:58.000 Yeah.
01:42:58.000 On You Are Welcome?
01:43:00.000 Yes, on You Are Welcome.
01:43:03.000 All right.
01:43:04.000 Bub Savvy says, Tim, Destiny thinks Fuentes would run circles around you in a debate.
01:43:07.000 Hilarious.
01:43:08.000 You should hear Ruslan KD and the No Jumper Pod episode today.
01:43:13.000 All think you're anti-Semitic, apparently.
01:43:15.000 Oh, because they didn't watch the hour of me ranting against anti-Semitism that resulted in people saying we get it, Tim.
01:43:21.000 Shut up.
01:43:22.000 Literally how, dude?
01:43:23.000 Yeah, like, we were complaining about all of that stuff, but they're not real people.
01:43:28.000 And I'll tell you this.
01:43:29.000 We're not real people?
01:43:30.000 And you're not an anti-Semite?
01:43:32.000 You all heard him.
01:43:33.000 You all heard him.
01:43:35.000 The trolls online are drama baiters.
01:43:39.000 Thank you.
01:43:41.000 This guy.
01:43:42.000 This guy, Michael Malice.
01:43:43.000 Master drama baiters.
01:43:44.000 Let me tell you.
01:43:45.000 You've seen my videos.
01:43:47.000 Destiny thinks Fuentes would run circles around me in a debate?
01:43:51.000 Oh yeah, he probably would.
01:43:52.000 I don't debate people.
01:43:55.000 He prepares for this, it's all he does.
01:43:58.000 I don't think I'm the smartest person who can... I have convictions, I have beliefs.
01:44:03.000 Sometimes they're wrong.
01:44:04.000 Seamus, there's a really important point.
01:44:07.000 I was arguing with Seamus on a show.
01:44:09.000 Boy, was it embarrassing.
01:44:10.000 I kept saying abortion means this.
01:44:11.000 You're wrong.
01:44:11.000 No, you're wrong.
01:44:12.000 And then, like, a few days later, after reading it, I was like, oh boy, I was wrong.
01:44:16.000 Wow, that's embarrassing.
01:44:17.000 And then I came out and I was like, Seamus, you were right about that.
01:44:19.000 I was just wrong about NPR today.
01:44:20.000 Dude, it's so valuable to find out when you're wrong.
01:44:24.000 I think everybody's wrong at some point in their life.
01:44:27.000 When you realize it, that is so valuable for your mind.
01:44:29.000 Here's my point.
01:44:31.000 I love Ian so much.
01:44:34.000 Someone who is confident will win a debate knowing nothing over a scientist with a PhD in that field.
01:44:41.000 If you are a talker and you can speak, you can make it sound like you were right.
01:44:44.000 That's debate.
01:44:45.000 There's this great debate, which I couldn't sit through because I was bored, with Bill Nye, the science guy, against a creationist.
01:44:51.000 And the creationist knew what Bill Nye was going to say, whereas Bill Nye thought the guy was just going to be like, God put Noah blah blah, idiocy.
01:44:57.000 And the creationist just knocking him out.
01:44:59.000 And Bill Nye's like, oh, what do I do?
01:45:01.000 Yep.
01:45:02.000 Well, Bill Nye's not really a science guy.
01:45:04.000 Correct.
01:45:05.000 You know, that's a funny thing to say.
01:45:06.000 He's an engineer, isn't he?
01:45:07.000 But there are people who are really good at debate, and you don't need to know facts to be good at debate, but it does help.
01:45:12.000 And those people tend to become lawyers.
01:45:14.000 Yeah, debate tactics.
01:45:15.000 I'm looking them up now, but there's things like argument.
01:45:17.000 That's a debate tactic.
01:45:18.000 There's awe.
01:45:19.000 Is that a debate tactic, where you can actually put your opponent in awe, and then they back down?
01:45:23.000 Yeah.
01:45:24.000 Gish gallop, I think, is the best example.
01:45:26.000 Yes.
01:45:27.000 It's where you throw out a whole bunch of points all at once, so they can't answer any of them.
01:45:31.000 But if they only answer one, it's gonna take 15 minutes.
01:45:33.000 That leaves the others on the table, yeah.
01:45:34.000 Yep, yep.
01:45:37.000 So that's why I'm like, I like having discussions.
01:45:39.000 I'll invite people in, we'll have a talk, and if I disagree, I will say what I disagree on.
01:45:43.000 And if I'm wrong, we had Matt Bender here, and I mixed up a city, and I was very confident, arrogant, about like, you're wrong!
01:45:51.000 And I pulled up and went, whoops!
01:45:53.000 I hit the wrong city, you got me.
01:45:55.000 And then they clipped the video, and they're like, aha, Tim's wrong.
01:45:57.000 I'm like, yeah, I know.
01:45:58.000 I don't think people appreciate at home how often, if you're doing this live and the conversation's dynamic, you're gonna even just have brain farts and misspeak.
01:46:07.000 It happens.
01:46:07.000 Here's a good one.
01:46:08.000 Mr. P says, Episode 666 on the 333rd day of 2022.
01:46:10.000 How about that?
01:46:10.000 Whoa!
01:46:15.000 Okay.
01:46:15.000 Does that mean something?
01:46:16.000 Mechanical simulation, baby.
01:46:18.000 Amen, baby.
01:46:19.000 For Ian, it does.
01:46:20.000 I got chills right now, and I'm not lying.
01:46:21.000 I got goosebumps.
01:46:22.000 And it's currently 9.45 p.m.
01:46:25.000 I don't understand that.
01:46:26.000 You know what that means?
01:46:26.000 4 plus 5 equals 9.
01:46:27.000 Oh, yeah.
01:46:29.000 3, 6, 9.
01:46:29.000 The sacred numbers, the universal numbers.
01:46:31.000 And it's 6, 6, 6 on the 333rd day at the 9th hour.
01:46:33.000 What's 90 cut in half?
01:46:35.000 45.
01:46:35.000 I agree.
01:46:35.000 It's 9:46 now.
01:46:40.000 Let's dismember Ian.
01:46:45.000 The numbers have spoken.
01:46:46.000 This was a cult the entire time?
01:46:48.000 Yeah, man.
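The superchat's calendar arithmetic is easy to check with Python's standard library. For the record, November 30, 2022 (this episode's air date) works out to the 334th day of the year; November 29 is the 333rd:

```python
# Verify the day-of-year claim from the superchat using only the
# standard library's datetime module.
from datetime import date

def day_of_year(d: date) -> int:
    # tm_yday is the 1-based ordinal day of the year
    return d.timetuple().tm_yday

if __name__ == "__main__":
    print(day_of_year(date(2022, 11, 30)))  # 334
    print(day_of_year(date(2022, 11, 29)))  # 333
```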
01:46:48.000 Alright, let's see what we got here.
01:46:50.000 What do you think happened to Lydia?
01:46:53.000 Takfuji says, I wasn't very interested in Malice's opinions, but then I saw his underwear ads in Times Square and I decided I would fight the state.
01:47:01.000 Then I saw a cat and noticed I had Twitter notifications and now lost.
01:47:05.000 Sheathunderwear.com promo code Malice.
01:47:08.000 Was it in Times Square?
01:47:09.000 No.
01:47:10.000 Oh, okay.
01:47:10.000 Oh, he's talking about your ad.
01:47:12.000 Yeah.
01:47:12.000 Oh, okay.
01:47:13.000 He said underwear ads in Times Square, but it was just you.
01:47:15.000 Yeah.
01:47:15.000 Should we put you up in your underwear in Times Square?
01:47:17.000 I mean, maybe when I get my cum gutters back.
01:47:19.000 I think you're going to be back up in Times Square again, too.
01:47:22.000 Oh, awesome.
01:47:22.000 We're doing New Year's Eve.
01:47:23.000 Oh, heck yeah.
01:47:23.000 You told me that.
01:47:24.000 Yeah, yeah.
01:47:24.000 Yeah, the whole tower.
01:47:25.000 Yeah.
01:47:25.000 That was very depressing going back to New York.
01:47:27.000 I was very, very upset.
01:47:28.000 When were you there?
01:47:29.000 Just in August to see the billboard that you guys put up.
01:47:32.000 It really upset me a lot.
01:47:33.000 What was the worst?
01:47:35.000 Seeing the billboard?
01:47:37.000 It's your face.
01:47:38.000 If you think I'm ugly now, look at me when I'm this tall.
01:47:42.000 No, it was that I'd been gone for a year.
01:47:44.000 This is the city I lived my entire life, and you would think there'd been a year to heal, and it remained exactly the same.
01:47:52.000 Anything that had changed had changed for the worse.
01:47:54.000 To have like 30% of the storefronts not be occupied, and within 10 minutes, or 15 let's say, I saw someone peeing on a van in the street, not on the sidewalk.
01:48:04.000 And people are like, that's always been New York.
01:48:06.000 I've lived in New York all my life.
01:48:07.000 That's not been a common occurrence.
01:48:08.000 It was just corporate chains.
01:48:09.000 I was just back a couple days ago, too.
01:48:11.000 It's horrible.
01:48:12.000 I hate it.
01:48:12.000 I'm so saddened by it.
01:48:14.000 The Curly Afro says, I have waited so long to see Michael Malice return on Timcast.
01:48:18.000 Just to say you, sir, are a national treasure that must be protected.
01:48:21.000 Oh, that is so nice.
01:48:22.000 Thank you.
01:48:24.000 Thank you.
01:48:25.000 How much did you pay him to say that?
01:48:27.000 He paid a couple bucks.
01:48:29.000 50 shekels.
01:48:31.000 Vacant Stare says, I prefer Malice's Razor.
01:48:33.000 Never attribute to Malice that which can be explained by Hanlon.
01:48:39.000 I like that.
01:48:39.000 You guys are brilliant.
01:48:40.000 That was really smart.
01:48:41.000 My fans are smart.
01:48:44.000 Okay.
01:48:45.000 Okay.
01:48:46.000 Keto Thor says, Ian looking mighty cozy tonight.
01:48:48.000 Thanks dawg.
01:48:48.000 Yo, it's cold in here.
01:48:49.000 Yeah.
01:48:50.000 The heater stopped working.
01:48:51.000 So like the AC gave out and then we got the AC fixed, but we didn't realize the heat wasn't working.
01:48:56.000 So now it's cold out.
01:48:57.000 Now the heat's not working.
01:48:58.000 Yeah.
01:48:58.000 So I'm like, I was texting the show like, yo, I'm cold.
01:49:00.000 These are some first world problems.
01:49:01.000 Yeah, it was cold too.
01:49:01.000 He was wearing that big coat.
01:49:02.000 He was like, why is it so cold in here?
01:49:03.000 I'm like, bro, I don't know.
01:49:04.000 You know why?
01:49:05.000 Cause we control the weather.
01:49:06.000 In this room specifically.
01:49:08.000 Like why he walked out.
01:49:09.000 It was just cold!
01:49:10.000 It's like average cold in here!
01:49:12.000 F this, I'm out.
01:49:14.000 Okay, what do we got here?
01:49:16.000 You know where it's not cold?
01:49:17.000 Israel.
01:49:18.000 Oh yeah.
01:49:18.000 What's the temperature like there?
01:49:20.000 Perfect.
01:49:20.000 It gets cold there.
01:49:21.000 It's always perfect.
01:49:22.000 Is it though?
01:49:23.000 Is it like a good 70 degrees?
01:49:25.000 It's the desert.
01:49:25.000 It's got to be super hot.
01:49:27.000 One day, I want to take the Balfour Declaration, the creation of Israel, I want to have a long conversation about it in a safe place, in a place where we can talk about it online with people.
01:49:35.000 I think that that place does not exist.
01:49:37.000 Then we need to create it.
01:49:39.000 A safe place to discuss that on the internet?
01:49:41.000 I don't know about that.
01:49:42.000 Yeah.
01:49:42.000 Ian wants a safe space.
01:49:44.000 Hey, look, maybe the government can build him one.
01:49:48.000 Yeah.
01:49:49.000 On their road.
01:49:50.000 All right.
01:49:51.000 AI says, I've always said if the Dems were serious about investigating Trump, they would have a special committee on Epstein.
01:49:56.000 What do they do?
01:49:57.000 Investigate a fake dossier and a bunch of MAGA nobodies.
01:50:00.000 Priorities.
01:50:01.000 If they were serious about Trump, then it would have been Epstein first, but they're not serious about any of it.
01:50:07.000 They're just going after political opponents.
01:50:10.000 All right.
01:50:11.000 One Pissed Off Hippie says, The Machine Elf Michael Malice, please unblock me on Twitter.
01:50:15.000 I won't call you a lawn gnome or share the meme.
01:50:18.000 Please end my struggle session.
01:50:19.000 What's his username?
01:50:20.000 I'll do it right now.
01:50:21.000 One Pissed Off Hippie.
01:50:21.000 He's cool.
01:50:22.000 Is it the number one or the word one?
01:50:24.000 The number one.
01:50:24.000 Is that his Twitter handle?
01:50:25.000 Yeah.
01:50:26.000 One Pissed Off Hippie.
01:50:26.000 Number one Pissed Off Hippie.
01:50:27.000 Is that you, Ian?
01:50:28.000 No, no.
01:50:29.000 Just some cool dude online.
01:50:30.000 That's totally Ian.
01:50:31.000 I want to make sure it's the number one.
01:50:33.000 No results, it says.
01:50:34.000 Maybe it's the word one.
01:50:36.000 How do you spell it, Tim?
01:50:37.000 Is it H-I-P-P-Y?
01:50:40.000 Just at me, Ian, with his username, and then I'll block him.
01:50:44.000 Yeah, I'm gonna go for this and make sure it happens tonight, dude.
01:50:47.000 Thanks for super chatting that.
01:50:49.000 All right.
01:50:50.000 Eraserhead says, if YouTube and other sites lose the ability to do search, doesn't the internet just revert back to where everyone uses YouTube to host their videos, and those videos get embedded in the personal websites of creators and brands?
01:51:02.000 Or, YouTube is a reverse chronological feed where people see the videos that were recently posted, which will... Oh, like an Instagram almost.
01:51:10.000 Exactly.
01:51:11.000 Because they can't recommend it, they can just, when someone posts it, it'll appear if someone chooses to see it.
01:51:16.000 I think that would be a good thing.
01:51:17.000 Yeah, that actually would make a lot of sense.
01:51:19.000 A lot of big, prominent people would disappear.
01:51:21.000 I guess, according to Darren B., Lex Fridman would be gone.
01:51:24.000 Really?
01:51:24.000 Why?
01:51:24.000 Because he's made the argument that YouTube's algorithm is just forcing Fridman on everybody.
01:51:29.000 In all fairness, I mean, you can't be surprised that one robot's going to recommend another robot.
01:51:34.000 Yeah.
01:51:37.000 But it is an interesting point, though.
01:51:38.000 You go on YouTube and you look at these things and it's like, instantly recommends Lex Fridman podcast.
01:51:42.000 Yeah.
01:51:43.000 Is Lex Fridman as awesome in person as he is on TV?
01:51:48.000 I absolutely... Lex and I are next door neighbors, right?
01:51:52.000 And there may have been times when I've never... I feel bad saying this, but I'll say it anyway.
01:51:58.000 My Nest, my video camera, looks at his house.
01:52:01.000 So sometimes when I'm bored, I keep... I'm like, when is Lex leaving?
01:52:05.000 That's weird!
01:52:06.000 And there was this one time where there was, like, this chair in the garbage, and I was like, oh, I wonder when he threw this chair out. And then I looked, and he actually went to the garbage can and made it stick out more. I have to ask him about this. But you know what you guys should do? A funny bit where, like, you're watching his door, and then he walks out, and he just, like, walks out the door and it stops and slowly... No, no, no, Tim, this is the joke, and I told this to Lex.
01:52:30.000 I hear when the garage opens because my office is right there.
01:52:34.000 He had a guest leaving his podcast and he was saying goodbye to them and I'm like, I gotta get like a Klan uniform here so that, like, when I hear him saying goodbye to guests, I, like, open the door, I'm like, hey Lex, how's it going?
01:52:46.000 Like the wacky sitcom neighbor and he has to explain, oh no, it's just Michael, he's just like that.
01:52:51.000 So if someone wants to send me a Klan uniform to add to the seven I already have, yeah, that'd be really great.
01:52:57.000 No, but Lex is really one of, he has, for someone who is made of string and wires, he has like the biggest heart maybe of anyone I know.
01:53:08.000 Does he like bring you fruit baskets?
01:53:10.000 No, he's very kind of keeps to himself and I really don't want to be like always knocking his door.
01:53:14.000 I want to respect his space.
01:53:15.000 I don't like that kind of thing.
01:53:17.000 But he really takes a lot of stuff to He's very passionate about things.
01:53:21.000 Him online is a lot like what he is in real life.
01:53:25.000 Maybe he's a lot funnier in person, a lot more jokey, maybe it's because of me, but he really cares a lot about people.
01:53:31.000 Something I can't relate to at all.
01:53:34.000 Bro Cody says, Tim, not that I'm counting or anything, but you're 0-2 with rap artists on your podcast.
01:53:40.000 Fair point.
01:53:40.000 That's true.
01:53:42.000 Have we had any other rappers?
01:53:43.000 I could bring more.
01:53:44.000 Tom McDonald, please come on the show.
01:53:46.000 Give us one victory.
01:53:47.000 I could bring some more if you want.
01:53:49.000 Oh yeah, Luke.
01:53:52.000 Luke was the one who wanted Kanye on the show.
01:53:54.000 Hey, one pissed off hippie, tweet at me really quick so I can see your account.
01:53:57.000 And the other guy who, you know.
01:54:01.000 Ready to Rumble says, Tim, your chat has become toxic, hater vomit.
01:54:06.000 You know, people are allowed to say things even if they don't like me.
01:54:09.000 So it's like, what do you do?
01:54:11.000 If people show up and they get in the chat and they start saying that they don't like the show or I'm bad or whatever, I'm like, okay.
01:54:17.000 But your chat's really dynamic.
01:54:18.000 I mean, it goes very, very fast.
01:54:20.000 I try to read all the chat.
01:54:22.000 We have to put it on slow mode.
01:54:24.000 Oh, okay.
01:54:24.000 And subscriber only, and so all the people who hate me are subscribing, and they want to say they don't like me, and they're allowed to say they don't like me.
01:54:33.000 Yeah, of course.
01:54:34.000 Catch that check.
01:54:35.000 During Occupy, I was livestreaming, and it was like all these leftists, like liberal people who were watching, of course, and then a bunch of conservatives came in and started smack-talking and everything.
01:54:45.000 The viewers that were like more like, we had like 2,000 viewers at the time on my phone.
01:54:49.000 They were like, Tim, you got to ban these people.
01:54:51.000 You got to ban them.
01:54:52.000 And then I just said, why would I ban them?
01:54:53.000 They're allowed to talk.
01:54:53.000 They're allowed to dislike me.
01:54:55.000 They don't have to agree with Occupy Wall Street.
01:54:57.000 They want to know what's going on too.
01:54:58.000 And then all of a sudden they're like, oh, this Tim guy's pretty cool.
01:54:59.000 And they started like saying like, oh, okay, we're going to keep trolling, but we appreciate that you're letting us.
01:55:03.000 And I'm like, just don't spam guys.
01:55:04.000 At the end of the day, you got to appreciate the trolls, you know?
01:55:06.000 I got to call bull.
01:55:07.000 No one's ever called you cool.
01:55:08.000 I don't believe this story for a second.
01:55:10.000 I bet you could.
01:55:11.000 It's not Tim cool.
01:55:13.000 No, not yet.
01:55:14.000 But there are other words that rhyme with cool.
01:55:17.000 Do you know what I think the funniest name for Trump was, and it took them four years to think of this?
01:55:21.000 I've called you those things on this show.
01:55:26.000 There's a lot of people.
01:55:27.000 He calls me Pim Tool off the air.
01:55:31.000 You know what I think?
01:55:32.000 Dim Fool.
01:55:33.000 Pim Tool is a dim fool.
01:55:34.000 Do you know what I think the funniest name for Trump was, and it took them four years to think of this?
01:55:38.000 Tronald Dump.
01:55:39.000 I'm like, that's actually a good one.
01:55:43.000 I sent you his Twitter, by the way.
01:55:44.000 I think it's one pissed hippie.
01:55:46.000 I'll just reply to yours.
01:55:47.000 Did you DM me?
01:55:47.000 I just messaged it, yeah.
01:55:48.000 Okay, perfect.
01:55:50.000 Jack Ryan says, Milo declared on your November 9th show, you cannot give people the First and Second Amendment unless they're Christian.
01:55:56.000 You just can't.
01:55:56.000 Those are rights that I, a non-believer, fought for, and better men have died in war to protect for all American citizens.
01:56:03.000 He said non-believer.
01:56:04.000 Wait, can you repeat that?
01:56:06.000 I don't remember that.
01:56:07.000 Yeah, I'm pretty sure he said something like that.
01:56:09.000 You said you can't call a non-believer Christian?
01:56:11.000 No, he said, you cannot give people the First and Second Amendment unless they're Christians.
01:56:14.000 You just can't.
01:56:15.000 It was a quote.
01:56:17.000 Or I think it was from him.
01:56:18.000 He said, those are rights that I... Okay, I think Jack Ryan is saying that.
01:56:24.000 Okay, I'm sorry, I'm sorry.
01:56:25.000 Milo said you cannot give people the First and Second Amendment unless they're Christian, you can't.
01:56:28.000 Then Jack Ryan is saying those are rights that I, a non-believer, fought for and better men have died for to protect for all American citizens.
01:56:34.000 Yeah, I think a lot of Christian beliefs are in our system, our agnostic system, so you can still run the—I don't know if you need to be a Christian.
01:56:42.000 I think literally over 90% of Christians who believe in the Second—95% would agree that everyone has the right to protect themselves.
01:56:49.000 Michael?
01:56:51.000 It's episode 666 on the 333rd day of the year, which equals 999.
01:56:56.000 And I read it at 945, and half of 9 is 4.5.
01:57:00.000 And there were currently 54,000 viewers, which equals 9.
01:57:05.000 Wow.
01:57:06.000 That proves it.
01:57:07.000 Herman Cain is spinning in his grave.
01:57:09.000 He's a numerologist, too.
01:57:10.000 No, he had that 999 plan, remember?
01:57:13.000 And then Michelle Bachmann said, turn upside down.
01:57:17.000 There's a German joke there.
01:57:20.000 Prosciutto Liv says, so sick of Lex being shoved down my throat.
01:57:23.000 Lex has the most boring vanilla opinions on everything.
01:57:27.000 He does get recommended everywhere.
01:57:28.000 That's the point that we're making is that like YouTube, he probably has more recommendations than any other podcast.
01:57:36.000 I'm sorry, I never thought... I'm not saying anything bad about him.
01:57:40.000 Well, that person is, and I'm happy to defend Lex.
01:57:43.000 I don't think he has boring vanilla opinions on everything.
01:57:45.000 I think his big principle, and what he's trying especially to do, is for people to be kinder to one another, to listen to one another.
01:57:54.000 And to be less antagonistic and more cooperative.
01:57:56.000 And I think that's a great message that I don't entirely agree with.
01:58:15.000 Community governance, well, there's going to be a form of governance no matter what.
01:58:20.000 Correct, I'm not disagreeing with that, yeah.
01:58:22.000 Troy Rubert says don't forget he has... He doesn't agree with that either.
01:58:26.000 Right, there's a scope and scale of the current government that's the big problem.
01:58:29.000 Troy Rubert says, don't forget he has synesthesia and he sees sound.
01:58:32.000 Do you Ian?
01:58:33.000 Kanye.
01:58:35.000 I think I do too sometimes.
01:58:36.000 When I listen to music and play music, I visualize it.
01:58:39.000 The note, if the note goes up, I visualize it on a line graph and I can see it and sometimes it'll become a three-dimensional graph.
01:58:44.000 Oh cool, okay.
01:58:46.000 That's interesting.
01:58:47.000 Is it?
01:58:48.000 I do as well.
01:58:48.000 Fun fact.
01:58:50.000 I'll explain it later though.
01:58:51.000 Red Vista says, just a point for the discoverability of your channel.
01:58:54.000 The way I discovered this channel is I looked up Joe Rogan drama when he was being cancelled and just filtered search by max views and got here.
01:59:01.000 Wow.
01:59:02.000 Interesting.
01:59:03.000 People complain often that you'll search for my name and it's nothing but haters.
01:59:06.000 Like people are trying to find this show and they find nothing but people hating on me.
01:59:09.000 Wow.
01:59:09.000 And this has happened to other people I'm not going to drag into the conversation, but other prominent people who are not establishment-aligned.
01:59:15.000 It's like, you get the establishment left channels ragging on you instead of the people actually trying to find the show.
01:59:21.000 Wow.
01:59:22.000 Yeah.
01:59:22.000 I do think that, algorithmically, if I type in someone's name, it should take me to their channel first.
01:59:27.000 Of course.
01:59:27.000 And then...
01:59:28.000 That's a no-brainer.
01:59:29.000 Yeah.
01:59:30.000 Yep.
01:59:32.000 All right.
01:59:32.000 Track Media Only says, someone needs to tell Malice that some people still buy iPhones.
01:59:36.000 So yes, people will buy products made by slave labor.
01:59:38.000 I think that point was made, right?
01:59:40.000 You were like, oh.
01:59:41.000 Thank you for informing me that people still buy iPhones.
01:59:44.000 Thank you for this new information.
01:59:45.000 I'm not going to buy Apple.
01:59:47.000 Not anytime soon.
01:59:48.000 I'm fed up with Apple.
01:59:48.000 You won't be able to.
01:59:49.000 Well, to be fair, I want to make sure I clarify.
01:59:52.000 We have Apple devices here because we're building an app.
01:59:54.000 And we started building an iOS app.
01:59:57.000 I'm not a big fan of Apple, but I recognize people use it.
01:59:59.000 I would love to see X-Phone from Elon or something, but I prefer Android.
02:00:03.000 I use Android.
02:00:03.000 I got Android here.
02:00:05.000 It is what it is.
02:00:05.000 Maybe you could make phone.
02:00:07.000 P-H-O with the umlaut, N.
02:00:09.000 Just phun.
02:00:10.000 Phun.
02:00:11.000 Phun.
02:00:11.000 Yeah.
02:00:12.000 Phun.
02:00:13.000 Phun.
02:00:13.000 Alright everybody, if you haven't already, would you kindly smash that like button, subscribe to this channel, share this show with your friends, become a member at TimCast.com.
02:00:18.000 We're gonna have a members-only show coming up for you at 11 p.m.
02:00:21.000 over at TimCast.com, where you can click the Join Us button, sign up, and support our work.
02:00:25.000 You can follow the show at TimCast IRL, follow us on Instagram, and you can follow me personally at TimCast basically everywhere.
02:00:31.000 I post stupid things on Twitter if you want to see them.
02:00:33.000 Michael, do you want to shout anything out?
02:00:35.000 You can follow me at Twitter.com slash Michael Malice.
02:00:37.000 There's a few hundred copies left, signed hardcovers, and once they're gone, they're gone.
02:00:40.000 That's the top tweet on Twitter.com slash Michael Malice. And You're Welcome is going to be great for the rest of the year, and I'll be back very soon to launch The White Pill: A Tale of Good and Evil, which is a book I've been working on for two years.
02:00:52.000 We'll order pizza and wings when we do it.
02:00:55.000 Okay.
02:00:56.000 Thanks for coming and helping me deal with the statists and collectivists here.
02:01:01.000 I appreciate the additional help.
02:01:02.000 You're all a bunch of socialists!
02:01:03.000 Wait, can I tell that story quickly?
02:01:05.000 Go ahead.
02:01:05.000 When Ludwig von Mises was at the Mount Pelerin Society and Milton Friedman and all of them were discussing how in a free society you could still have some kind of progressive income tax and Ludwig von Mises stood up and said, you're all a bunch of socialists and stormed out of the room.
02:01:18.000 You bunch of socialists!
02:01:20.000 My YouTube channel is youtube.com forward slash WeAreChange.
02:01:23.000 I just did a video about Elon, China.
02:01:25.000 Check it out.
02:01:26.000 It's up there right now.
02:01:27.000 And if you're a member of lukeuncensored.com, I did a video about all the behind-the-scenes things that were happening here.
02:01:32.000 Check it out.
02:01:33.000 lukeuncensored.com.
02:01:34.000 See you there right after this broadcast.
02:01:36.000 Thank you again so much for having me.
02:01:37.000 Follow me, iancrossland.net, all my social media, Ian Crossland: YouTube, Facebook, Twitter, Minds, the list goes on.
02:01:42.000 Michael, great to see you again, brother.
02:01:44.000 Uh, where, when people watch You're Welcome, what time of day, what day?
02:01:48.000 I drop it Wednesdays at, like, in the evening.
02:01:50.000 Like, usually around 7.
02:01:51.000 All right.
02:01:52.000 Thanks for coming, man.
02:01:53.000 What's happening, Serge?
02:01:54.000 And I'm still Serge.com with the high energy coming through.
02:01:59.000 Love you guys.
02:01:59.000 Thanks for checking it out.
02:02:00.000 All right, everybody.
02:02:01.000 We will see you all over at TimCast.com.