Timcast IRL - Tim Pool - July 06, 2023


Timcast IRL - Zuck's Threads Faces LAWSUIT For STEALING Twitter Secrets Says Elon Musk w/Lauren Chen


Episode Stats

Length

2 hours and 3 minutes

Words per Minute

204.0485

Word Count

25,251

Sentence Count

1,951

Misogynist Sentences

36

Hate Speech Sentences

44


Summary

On this week's episode of Timcast IRL: Mark Zuckerberg's new app Threads, the "Twitter Killer"; the Biden administration files an appeal to overturn the First Amendment; and Elon Musk calls out Mark Zuckerberg for "cheating."


Transcript

00:00:00.000 So Mark Zuckerberg has launched his app, Threads.
00:00:26.000 They call it the Twitter Killer, and I signed up to check it out, and it is awful!
00:00:30.000 I'm not just trying to play some stupid tribal games.
00:00:33.000 The main feed on Threads is a bunch of random people I don't follow.
00:00:39.000 So I sign up for this new Twitter killer, and I'm like, alright, let's check this out.
00:00:43.000 And I go to my homepage, and there's like some dude named Roderick talking about how him and J-Boys was going down to the Courts Hall for a new drink, and I'm like...
00:00:54.000 I have no idea what this guy's talking about.
00:00:56.000 I don't want to follow him, and you have to block them.
00:00:59.000 And so people are literally like, well, I guess what you do is if you see a post you don't like, block the person so you never see it again.
00:01:04.000 The whole thing is this.
00:01:06.000 I'm getting a bunch of weird brands and garbage I don't care about.
00:01:08.000 But anyway.
00:01:09.000 It sucks.
00:01:10.000 But we'll talk more about it and where it's at because sure enough there is room to improve and some people are excited there's an alternative to Twitter because some people are still banned on Twitter.
00:01:17.000 But here's where it gets interesting.
00:01:19.000 Twitter is threatening to sue Meta for stealing trade secrets.
00:01:24.000 Why?
00:01:25.000 According to Twitter, Mark Zuckerberg hired Twitter employees who had access to proprietary information from Twitter which they used to make the new app.
00:01:33.000 Very interesting.
00:01:35.000 So we'll see.
00:01:35.000 Right now, a cease and desist letter was sent out.
00:01:38.000 Elon Musk called it cheating.
00:01:40.000 We'll talk a bit about that.
00:01:41.000 And then as it pertains to good old Joe Biden, oh boy.
00:01:44.000 Politics, you gotta love it.
00:01:45.000 Apparently, the cocaine they found was actually near the Situation Room.
00:01:50.000 Yeah, not just the West Wing.
00:01:52.000 Dan Bongino mentioned that this has gotta be someone in the family, someone who can bypass security, and that's exactly what we're saying, but hey, he would know better than we would.
00:02:01.000 So, I think everybody kinda knows where that stuff came from, so we'll talk about that.
00:02:05.000 Also got a bunch more little stories here.
00:02:08.000 We do have another big one with Joe Biden.
00:02:10.000 They're effectively trying to overturn the First Amendment.
00:02:13.000 You know, I want to avoid being hyperbolic, but the filing from the Biden administration seeks to grant them the authority to have private organizations censor the political speech of people they don't like.
00:02:25.000 And it is an unprecedented move.
00:02:27.000 It's insane.
00:02:28.000 Effectively, the Biden administration is filing an appeal for the right to bypass the First Amendment.
00:02:34.000 Shockingly insane.
00:02:35.000 So we'll talk about that.
00:02:36.000 Before we do, my friends, head over to CastBrew.com And join the Cast Brew Coffee Club!
00:02:41.000 If you like really good coffee and you want the best, you're gonna buy from Cast Brew, and you'll also be supporting the work we do because this is our company, we're sponsoring ourselves.
00:02:49.000 You can join the Cast Brew Coffee Club where you'll get three bags of coffee every month.
00:02:53.000 We've got Rise with Roberto Jr., a light roast.
00:02:55.000 We've got Appalachian Nights, a dark roast.
00:02:57.000 I gotta be honest, I think Appalachian Nights is the best coffee I've ever had.
00:03:01.000 You know, I'm the one who formulated it, so surprise, surprise, I made what I liked.
00:03:04.000 They come in whole bean or ground.
00:03:07.000 And again, CastBrew.com, we're currently working on setting up a coffee shop.
00:03:10.000 We may even actually launch a second location before we even get started, just so we can speed things up.
00:03:15.000 Also, don't forget to go to TimCast.com, click join us, become a member, because we're gonna have a members-only uncensored show, 10 p.m.
00:03:22.000 tonight, where you as a member actually can submit questions and perhaps even call in and talk to us and our guests.
00:03:27.000 So smash that like button, subscribe to this channel, share the show with your friends.
00:03:31.000 Joining us tonight to talk about this and so much more is Lauren Chen.
00:03:35.000 Thank you for having me.
00:03:36.000 Super stoked to be here.
00:03:37.000 Absolutely.
00:03:37.000 Who are you?
00:03:38.000 So my name is Lauren Chen.
00:03:40.000 I am a Blaze TV host as well as a TPUSA contributor and I also make videos on YouTube.
00:03:45.000 So I have my own channel, Lauren Chen, and another channel, Mediaholic, where we talk about Pop culture, entertainment, movie reviews, and then my main channel is more current events, social and political issues, and I basically am everywhere on social media, and I don't know, I feel like I'm often very angry on social media.
00:04:03.000 There's a lot of things to talk about politically in terms of entertainment that are not going well, so I guess I'm an angry person online.
00:04:09.000 This is what we do.
00:04:10.000 We complain about things on the internet for a while.
00:04:11.000 Yeah, but then it's like we're making content.
00:04:13.000 That's content.
00:04:14.000 People are watching it.
00:04:15.000 They like it.
00:04:15.000 Absolutely.
00:04:16.000 Well, thanks for hanging out.
00:04:16.000 It should be fun.
00:04:17.000 We got Phil Labonte.
00:04:18.000 He's here.
00:04:18.000 How you doing?
00:04:19.000 I am Phil Labonte, lead singer of All That Remains, anti-communist and counter-revolutionary.
00:04:23.000 Good to see you guys.
00:04:24.000 Hi, everybody.
00:04:24.000 Ian Crossland, ready to rock and roll.
00:04:26.000 Let's move it!
00:04:28.000 All right, Ian.
00:04:28.000 As you say, imsurge.com.
00:04:30.000 Ready when you are, Tim.
00:04:31.000 Check out this story from ABC News.
00:04:33.000 Twitter sends Meta cease and desist letter over new Threads app, say sources.
00:04:38.000 Now, this is confirmed.
00:04:39.000 Elon Musk has even called it out.
00:04:42.000 He said, competition is fine.
00:04:43.000 Cheating is not.
00:04:44.000 Twitter Daily News tweeting, Twitter is threatening to sue Meta over systematic, willful, and unlawful misappropriation of Twitter's trade secrets and IP, as well as scraping of Twitter's data in a cease and desist letter sent yesterday to Zuckerberg by Elon's lawyer Alex Spiro.
00:05:00.000 I have to wonder, recently we heard that Twitter was limiting how many tweets you could see, and they said it was because people were stealing data.
00:05:07.000 I wonder if it's because Elon found out Meta was actually scraping data to use for their new app.
00:05:13.000 I mean, Mark Zuckerberg doing something untoward in regard to intellectual property?
00:05:18.000 That doesn't sound right, guys.
00:05:19.000 Mark Zuckerberg wouldn't do that.
00:05:21.000 It's not like he has a history of that with the very creation of Facebook.
00:05:24.000 To be fair, his history of trying to create rival alternative social media platforms has not fared too well for him.
00:05:31.000 That's true.
00:05:32.000 They tried to do a Signal-type thing, didn't work out.
00:05:34.000 They tried to do a Snapchat-type thing, didn't work out.
00:05:37.000 I am pretty sure threads will not work out.
00:05:40.000 Look, man, it might, but it is a really bad platform.
00:05:46.000 There are people on the left who are tribally just saying, woo, it's so much better, and I'm just like, dude.
00:05:53.000 You log into the app.
00:05:54.000 I'm going to log in right now.
00:05:55.000 I'm going to pull it up.
00:05:59.000 So first of all, it's recommending AOC to me like crazy.
00:06:02.000 I do not follow AOC on this.
00:06:03.000 So there are people I follow.
00:06:09.000 And I do see their posts.
00:06:11.000 Okay, what's this?
00:06:13.000 Puberty.
00:06:14.000 What is it?
00:06:14.000 Puberty?
00:06:15.000 I've seen Puberty before.
00:06:16.000 The largest and most populated city on Earth, Tokyo, Japan.
00:06:18.000 That's a cool picture.
00:06:19.000 I don't care.
00:06:21.000 I follow them on Instagram, where I at least get recommended them a lot.
00:06:23.000 I think they're just like a general interest kind of now this or something.
00:06:27.000 Paris Hilton?
00:06:28.000 Oh, that's great.
00:06:29.000 Uh, let's see.
00:06:31.000 Complex Magazine?
00:06:32.000 Don't know what that is.
00:06:34.000 Ellen Degeneres?
00:06:35.000 Not interested.
00:06:35.000 I don't know who that person is.
00:06:37.000 I don't know who that person is.
00:06:39.000 Like, these are- These are people posting things about, like, their day that have nothing- Like, I'm getting recommended the dude who made threads.
00:06:46.000 I don't care about this.
00:06:48.000 Look, my whole feed- Look at this.
00:06:50.000 Like, none of it is people I follow.
00:06:52.000 So- Micr- I'm getting tweets.
00:06:54.000 Threads from Microsoft.
00:06:56.000 Yo.
00:06:58.000 I don't want to look at a feed of random advertisements.
00:07:01.000 That's what it is.
00:07:02.000 It's all ads.
00:07:03.000 It's the algorithm saying, here's something you might like.
00:07:06.000 Here's something I don't like it.
00:07:07.000 I don't like it.
00:07:07.000 What am I supposed to do?
00:07:08.000 You can't turn it off.
00:07:10.000 When you sign up for Threads, does it connect to your Facebook account?
00:07:12.000 You can have it connect to your Instagram.
00:07:14.000 So how do you download it anyway?
00:07:16.000 I think you go to, well, it's in the Play Store, but I had to go to Threads.net to download it.
00:07:20.000 Is it called Threads?
00:07:21.000 It's like Threads comma a meta app or something like this.
00:07:24.000 It's recommending me Ted Cruz.
00:07:26.000 Okay, I don't follow Ted Cruz or AOC.
00:07:28.000 At least it's not just AOC.
00:07:29.000 Jubilee Media.
00:07:30.000 What is this one?
00:07:31.000 The Blaze TV?
00:07:33.000 I don't follow The Blaze.
00:07:34.000 No, but seriously.
00:07:35.000 They get suspended all the time.
00:07:36.000 It is weird that it's recommending The Blaze to me.
00:07:41.000 I don't- I don't- I- Look.
00:07:43.000 That's not the least of it.
00:07:44.000 It's very limited in what's available to do on the platform.
00:07:48.000 But here's- The one thing I said on threads as to why it's not that good...
00:07:53.000 Look, I don't care what my favorite nature photographer has to say.
00:07:58.000 At all.
00:07:59.000 I don't follow people on Instagram, I follow people who play poker, I follow people who skateboard, who scoot, who BMX, some nature, a couple travel people, and some parkour people.
00:08:10.000 I'm not on Instagram for a newsfeed.
00:08:13.000 So when I start threads and it's like, here are the people you follow, it's like, oh sure, I follow PragerU, and a few people for some of their political stuff, but it's mostly not.
00:08:22.000 Here's ultimately what I think, and the reason why I bring up the algorithmic feed.
00:08:28.000 I think what the powers that be feared the most in terms of social media was curated feeds, which created tribes and hives.
00:08:37.000 People go on Twitter, they follow only the opinions they like, and they create these bubbles where they separate themselves from everybody else.
00:08:45.000 I think what Instagram wants to do, what Mark Zuckerberg wants to do, YouTube does the same thing.
00:08:51.000 They want to eliminate the ability for you to create a curated feed.
00:08:56.000 This has been true on YouTube forever.
00:08:57.000 Subscriptions on YouTube are completely meaningless and have been for 10 years.
00:09:00.000 Everybody knows that when you- It's the homepage.
00:09:03.000 Yeah.
00:09:03.000 When you subscribe to a channel, it doesn't matter.
00:09:04.000 You're not gonna get fed that content.
00:09:06.000 It's gonna be whatever's on the homepage.
00:09:07.000 So YouTube chooses what you're actually fed and gives this impression that your subscriber count matters, but it really doesn't.
00:09:15.000 It really doesn't.
00:09:17.000 Instagram meta, they want to do the same thing with Twitter, so that when you sign up for threads, they're really hoping that it does shut down Twitter, and then this is what the future will be.
00:09:27.000 You will be forced to follow the likes of the Young Turks, you'll be forced to follow the likes of AOC or Ted Cruz.
00:09:34.000 Perhaps they think it'll be a healing thing to bring the sides together so that everyone will have a shared narrative.
00:09:39.000 But let's be real.
00:09:40.000 They've already censored DC Draino.
00:09:44.000 If you try to follow him, it warns you that he's fake news.
00:09:47.000 We know exactly where it's going.
00:09:49.000 You're gonna follow conservative and libertarian, disaffected liberals.
00:09:53.000 Those are going to get deranked without you knowing.
00:09:55.000 They're going to prop up people like AOC and neocons, you know, acceptable Republican personalities, Ted Cruz for instance, and they're going to try to excise the likes of, you know, anyone outside the political machine.
00:10:08.000 Big problem for me with centralized media in general, and it's part of why I don't like this idea that Elon's been pushing to create an app for everything, a one-stop app, the X app, because if Elon don't like you, Alex Jones, he can just ban you from everything then.
00:10:20.000 If you're putting everything through a central server, same with Meta, man.
00:10:22.000 Too much power.
00:10:23.000 DC Draino already got put on a list, or was on a list.
00:10:28.000 I don't like it, man.
00:10:28.000 I don't like it.
00:10:29.000 And you can't force people to get out of their silos.
00:10:31.000 You gotta let them do it on their own.
00:10:32.000 You gotta inspire them to do it.
00:10:33.000 The only interest I would have in an alternative to Twitter is even more free speech.
00:10:37.000 Because like you said, Elon, he's not totally, he has this personal vendetta against Alex Jones.
00:10:42.000 I actually think that he doesn't want to get banned from the App Store.
00:10:45.000 But it's more unpopular to say that than to say, oh, I just have a very strong opinion about his treatment of the Sandy Hook people.
00:10:51.000 But in any case, you're not going to get more free speech from anything Mark Zuckerberg owns than something that Elon Musk owns.
00:10:58.000 So in my opinion, why?
00:10:59.000 I mean, I can understand if you're a leftist and you want more content moderation, you would rather have a threads conversation than a Twitter conversation, because the moderation is going to be more what you're looking for.
00:11:09.000 But I mean, for people like DC Draino, it's never going to be as free as Twitter is.
00:11:15.000 You know what will make the magic if Threads and Twitter come together and cooperate?
00:11:21.000 Threads has stated they will soon be on the Fediverse.
00:11:24.000 So this is actually really interesting.
00:11:26.000 I think the Fediverse is a very, very important move for social media that can actually fix a lot of the problems.
00:11:31.000 For those that aren't familiar, it's effectively the internet of social media.
00:11:37.000 So, imagine this.
00:11:38.000 Imagine early, early internet, to go to a website, you're like, you log into AOL, and then AOL shows you a list of things you can click.
00:11:46.000 But there are websites, of course.
00:11:48.000 Today, you have Twitter, you have Facebook, you have YouTube.
00:11:50.000 You can go on that platform and see people.
00:11:53.000 With the Fediverse, all of these different apps will interlock with each other, so you can go onto Threads and see someone's tweet.
00:12:00.000 You could see someone's BlueSky post, someone's Mastodon post, someone's Gab post.
00:12:04.000 So that's actually a very, very good thing.
00:12:06.000 However, if that is the case, there's no reason to use Threads because it's a Zuckerberg platform.
00:12:10.000 You might as well just use any of the other Fediverse apps.
00:12:13.000 But that stops censorship.
00:12:15.000 If we can get a Twitter-like system, if we can get federated networks, what happens is, you might sign up for Threads, because it's easy, you can see posts from Phil on Twitter, and then you can, you know, Threads might be like, we're blocking Twitter from our network because we don't like him, but Phil can then sign up for any other Fediverse app or connect that way and syndicate.
00:12:36.000 So, ultimately, you can't be censored.
00:12:40.000 In fact, you can even create your own website with your own protocol where you can never be banned because your username would be like ian at iancrossland.net or whatever.
00:12:49.000 So then people would just follow you from your website.
00:12:51.000 So it's kind of like RSS feeds turning Twitter into a network of servers instead of one centralized platform.
00:12:58.000 So that's good.
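The federated-handle idea described above, a username like ian@iancrossland.net that any server can resolve, is how ActivityPub networks such as Mastodon actually locate accounts: a WebFinger lookup (RFC 7033) maps the handle to a discovery URL on the user's own domain. A minimal sketch of that mapping, using the handle from the conversation purely as an example:

```python
from urllib.parse import quote

def webfinger_url(handle: str) -> str:
    """Build the WebFinger (RFC 7033) discovery URL for a fediverse
    handle like 'ian@iancrossland.net'. The server answering this URL
    returns JSON linking the handle to the user's profile, so identity
    lives on the user's domain rather than on one central platform."""
    user, _, domain = handle.lstrip("@").partition("@")
    if not user or not domain:
        raise ValueError("expected a handle of the form user@domain")
    resource = quote(f"acct:{user}@{domain}", safe=":@")
    return f"https://{domain}/.well-known/webfinger?resource={resource}"

# Any compliant server can resolve this; no single platform can revoke the name.
print(webfinger_url("ian@iancrossland.net"))
```

Because the domain owner controls the answer, a platform like Threads can refuse to federate with (block) that server, but it cannot delete the identity, which is the censorship-resistance point being made above.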
00:13:00.000 But yeah, there's actually these things called ENS domains.
00:13:02.000 E-N-S.
00:13:03.000 It's Ethereum Name Service, I believe.
00:13:06.000 They think it might be like your ID on the web 3.0.
00:13:10.000 And you go to like ens.domains.
00:13:14.000 But you buy it with crypto.
00:13:16.000 And it's like you can use yourname.eth, E-T-H, which is the Ethereum thing.
00:13:20.000 You pay Ethereum for it.
00:13:22.000 That might be one way to have like a...
00:13:25.000 A presence that can bounce around from network to network or where you control your own.
00:13:29.000 Because it'll be like a wallet where you have your money and your ID all together where you can log into all these different sites.
00:13:35.000 And imagine if nobody could ban you because nobody owns your server.
00:13:38.000 All they can do is block you.
00:13:39.000 That's the way it should be.
00:13:41.000 The problem now is with Threads or Twitter or any other platform, there's some dude who's like, I'm just going to ban you.
00:13:47.000 You can't do that in the real world.
00:13:48.000 You can't get banned from like walking down the street and saying your opinions.
00:13:53.000 To a certain degree you can, of course, because people lie, cheat, and steal all the time, and corrupt government officials will try and play dirty games.
00:14:01.000 The point is, we are protected in expressing our opinions.
00:14:04.000 I hope that's where this all goes.
00:14:07.000 I don't know what Jack Dorsey is working on, but I just want to give a shout out to Jack Dorsey because he recently did a podcast interview where he said that JFK was assassinated by the CIA.
00:14:16.000 And he got very into it, and he's like, so you're saying... He's like, yes, at that point was when the American dream was stripped away, when our own government, the CIA, killed a sitting president.
00:14:26.000 I was like, wait, what?
00:14:28.000 Wow, Jack.
00:14:29.000 But wait, didn't they basically admit that?
00:14:31.000 I don't know if they admit it, I know Tucker's said it, Ron Paul said it, like we're at the point where basically it is the widely accepted- Maybe I'm too far down the conspiracy rabbit hole for too many things where I'm just like, yeah, I thought we were all on the same page, like yes.
00:14:43.000 I love the Kennedy assassination conspiracy, because there are people who say it's multiple gunmen, you know, and it's Lee Harvey Oswald, it's all pinned on this one guy, but like, the story of Lee Harvey Oswald is crazy.
00:14:54.000 First, he screamed out, I'm a patsy, as they were carrying him away.
00:14:57.000 They found his gun in the building that he was in, like, on another floor.
00:15:01.000 So, like, if you're gonna go- Then he gets killed.
00:15:02.000 Yeah.
00:15:03.000 And then someone comes up and kills him.
00:15:04.000 But, like, if you're gonna commit that crime, you wouldn't, like, put your gun in a room and go upstairs and have lunch afterwards.
00:15:09.000 Who in their right mind?
00:15:11.000 But I did not want to ignite a JFK conspiracy conversation.
00:15:13.000 I love that conversation.
00:15:14.000 Just shouting out Jack Dorsey because I saw that clip.
00:15:16.000 But the point is, you know, he made Blue Sky.
00:15:18.000 I do want to say one thing, one last thing before we move on.
00:15:22.000 There was a funny post I saw where this Twitter user said, it is 2023 and I am joining Mastodon.
00:15:27.000 It is 2023 and I am joining Blue Sky.
00:15:30.000 It is 2023 and I am joining Threads.
00:15:33.000 And it's a picture of Dr. Manhattan from Watchmen, like, sitting on Mars.
00:15:35.000 I just thought that was really good.
00:15:37.000 I'm like, you guys, stop trying.
00:15:39.000 It's not gonna work.
00:15:41.000 Threads is already censoring things.
00:15:42.000 You can't curate your feed.
00:15:44.000 The point of Twitter is that it is where we are angry with each other.
00:15:48.000 That's it.
00:15:49.000 Sorry, have a nice day.
00:15:49.000 Well, I mean, because Mastodon, that was already the left-wing alternative to Twitter.
00:15:53.000 Mastodon was bound to fail just because it's not an easy thing to use.
00:15:57.000 Yeah, I did try to log on and create an account just so I could kind of see what the leftists were talking about.
00:16:02.000 It was not very intuitive, and I'm kind of a boomer, I'll admit.
00:16:05.000 But still, it wasn't nearly as easy as Twitter.
00:16:07.000 But I think even the leftists who claim that they want all the, you know, the quote, right-wing extremists banned, they know that Twitter is fun because of that conflict, that they can't stay away.
00:16:17.000 So here's what I think.
00:16:19.000 The right wants to debate the left.
00:16:21.000 They want to have this big conversation.
00:16:23.000 The left wants to ban everyone on the right.
00:16:26.000 The thing is, Twitter is the water cooler, it is the town square, so the left wants the right banned from it so they have a monopoly on the narrative.
00:16:33.000 The right wants to use the platform to engage in debate.
00:16:36.000 It does not work for either if the other leaves.
00:16:39.000 If these liberals and leftists leave and go to Mastodon, they're no longer in town square, they have no narrative control, there's no point in being there.
00:16:46.000 Sure, you've banned all the conservatives, but now you have no influence.
00:16:49.000 For conservatives, if there's no one to talk to but each other, it's boring.
00:16:53.000 There's no debate.
00:16:54.000 Well, they want the conservatives, the leftists want the conservatives on the platform so they're able to feed the conservatives what they want them to see.
00:17:02.000 They don't want conservatives in their own echo chamber because then they're going to be talking about the leftists in a way that they don't like.
00:17:08.000 I want to pull up this tweet from Alvin.
00:17:10.000 Elon Musk responded to this tweet with an exclamation point.
00:17:13.000 He said, Dear God, what the F. I give you now this image.
00:17:19.000 Threads.
00:17:20.000 An Instagram app.
00:17:21.000 I'll give you the short version.
00:17:23.000 They're basically spying on you in every way imaginable.
00:17:26.000 Data safety.
00:17:27.000 Let's see.
00:17:28.000 Data shared.
00:17:28.000 Personal info.
00:17:29.000 Device or other IDs.
00:17:31.000 Data collected.
00:17:32.000 Location, personal info, financial info, health and fitness, messages, photos and videos, audio, files and docs, calendars, contacts, app activity, web browsing, app info and performance, device or other IDs.
00:17:44.000 Now that's amazing.
00:17:45.000 They do say it's encrypted.
00:17:46.000 Can you say no to those?
00:17:47.000 There's these caret drop downs on the right.
00:17:49.000 Can you click those?
00:17:50.000 I mean, this isn't obviously interactable.
00:17:51.000 This is basically saying what they're doing.
00:17:53.000 So look at this.
00:17:53.000 Like you have to opt in.
00:17:54.000 They have your location.
00:17:55.000 They have your name, email, user ID, address, phone number, political or religious beliefs,
00:18:00.000 sexual orientation.
00:18:03.000 A lot of times you'll be able to opt out, but they'll opt you in if you don't know.
00:18:07.000 They take your financial info, your credit score.
00:18:10.000 You are giving them...
00:18:12.000 If this is legitimate, I just want to make sure, someone posted this and assuming this
00:18:17.000 is real, you're giving your credit score, your health info and fitness info, so how
00:18:21.000 much you weigh, your blood levels.
00:18:24.000 Advertisers must be so aroused right now.
00:18:25.000 This is everything they could possibly want on us.
00:18:27.000 You're giving your emails?
00:18:28.000 Voice or sound recordings?
00:18:30.000 Files and documents.
00:18:31.000 Quite literally, it can go into your calendar events.
00:18:34.000 All of your contacts.
00:18:36.000 You can look at all your videos on your phone.
00:18:38.000 The contacts is crazy, because if some random person that never signed up for social media is in my phone, then now Instagram has them and they'll start marketing to them.
00:18:47.000 Even more insidiously, we know the federal government has been working with social media companies.
00:18:51.000 If they can see who's following, not just following on social media, but actually who has who in their phone's contacts, that's really, really scary.
00:18:59.000 I saw that someone, a former CIA employee, someone that worked for the CIA, has now moved over to, I believe it was Facebook, Meta.
00:19:10.000 I'm looking for the tweet now.
00:19:13.000 I actually had an experience a couple weeks ago.
00:19:15.000 Alex Jones was on the PBD podcast, Patrick Bet-David, and he asked some question on there, like, if we install, in Russia, if there's like a new leader is installed and they're sympathetic to the West, what will happen?
00:19:27.000 And nobody really had an answer.
00:19:28.000 And I was going to text him, Alex, and be like, well, I think that would move us towards totalitarian, you know, technocracy a little bit faster, more peaceful, but also that, and I just didn't message him.
00:19:37.000 I was like, I don't want the FBI reading my... what a bullshit fucking excuse. I was so angry at myself that I wasn't brave.
00:19:45.000 Like I just...
00:19:45.000 So the guy from the CIA, Aaron Berman, a former 17-year CIA officer,
00:19:51.000 is now head of elections policy for Facebook and Instagram.
00:19:55.000 Berman joined Facebook in 2019 and was responsible for writing misinformation policy
00:20:01.000 and enforcing it for the 2020 election, COVID, Brazilian elections, etc.
00:20:05.000 He's joined by 15 others, CIA, FBI, and DHS working in trust and safety for Meta.
00:20:13.000 That is so gross.
00:20:14.000 So real quick, I did confirm these images.
00:20:17.000 It's in the Play Store.
00:20:18.000 It is real.
00:20:18.000 Can you drop them down on the right?
00:20:21.000 One thing I want to do real quick though is I went to the Twitter app On the Play Store, I do not see the same thing.
00:20:29.000 It doesn't have anywhere... Yeah, it doesn't have this stuff.
00:20:33.000 Who's that guy that tweeted that?
00:20:34.000 Because I want to take a look and retweet that.
00:20:36.000 Jesus, it's so hard to navigate.
00:20:38.000 So when I go to Twitter, I don't see it.
00:20:40.000 Let me see what Truth Social does.
00:20:43.000 Data safety starts with you, blah blah blah, see the details.
00:20:48.000 So, data collected from Truth Social.
00:20:50.000 Your personal info, email addresses, user ID, phone number, messages, other in-app messages, photos and videos, contacts, app activity, info performance, and device IDs.
00:20:59.000 That's actually all normal, right?
00:21:01.000 The reason why it says photos and videos, and this is true for meta, is because you post them, so it needs access to those.
00:21:07.000 Yo, it doesn't have anything else.
00:21:08.000 I mean, contacts are there because it'll ask you.
00:21:10.000 In-app messages, I don't care about.
00:21:11.000 Your email address, well, of course, you log in.
00:21:13.000 Sign in with that.
00:21:14.000 So, it really, it is true that Threads by Instagram is taking basically audio, files, everything.
00:21:20.000 Like, it's taking everything.
00:21:22.000 Yeah, it's probably government.
00:21:24.000 It's the same thing that TikTok does.
00:21:25.000 I wonder if Instagram does the same thing.
00:21:27.000 People talked about the amounts of data that TikTok collects.
00:21:33.000 Sounds like Threads does the same.
00:21:36.000 Instagram has basically the same thing.
00:21:38.000 Instagram basically takes everything as well.
00:21:40.000 This is Facebook.
00:21:41.000 This is what they do.
00:21:42.000 And knowing that, man, how can people?
00:21:44.000 It's just such a convenient tool, but it's so insidious that they're tracking not your every move, but your psychology.
00:21:53.000 Twitter takes not nearly as much.
00:21:56.000 It doesn't access your files and documents, but it does have quite a bit of information.
00:22:00.000 Your web browsing history, Twitter takes.
00:22:04.000 App functionality, analytics, advertising, and marketing.
00:22:06.000 Your web browsing history.
00:22:07.000 Twitter takes that too.
00:22:09.000 So, it looks like Twitter is very bad.
00:22:11.000 Threads is substantially worse.
00:22:13.000 But don't be surprised when everyone's spying on you.
00:22:15.000 And I'll make sure it's clear that I don't want to single out Threads on this one.
00:22:19.000 They are spying on you to a greater degree than the other ones, but they're all spying on you quite a bit.
00:22:25.000 Well, I mean, anyone who has an Alexa or a Google Home Assistant or whatever they're called, that is essentially a wiretap that you've allowed into your home.
00:22:31.000 All these phones are.
00:22:33.000 This phone is tracking me right now, because if you say, what is the thing?
00:22:37.000 The voice activation.
00:22:38.000 Okay, Google.
00:22:39.000 Does your phone buzz?
00:22:40.000 Oh, it turned on.
00:22:41.000 Well, I pushed the button.
00:22:41.000 It buzzed.
00:22:42.000 Oh, okay.
00:22:42.000 How can I help?
00:22:44.000 So here's what people need to understand.
00:22:46.000 If your phone, TV, or any other device has voice activation, that means it is listening to every single word you say.
00:22:46.000 Trash.
00:22:53.000 And very likely recording it and transmitting it to the database.
00:22:55.000 Well, it has to record it.
00:22:57.000 The way voice activation works is that it records what you say, sends it to a company for analysis, where it's converted to text, and then the result is sent back to the TV.
00:23:07.000 So if your TV is voice activated.
00:23:10.000 Many people have this now.
00:23:12.000 If you can just walk up and say something like, okay, Google or whatever.
00:23:16.000 Every word out of your mouth gets sent to this company for analysis, and it's just waiting to hear the activation command.
00:23:22.000 They save all that data.
00:23:23.000 Of course.
00:23:24.000 They keep it all.
00:23:25.000 What they say is, don't worry, it's anonymized.
00:23:27.000 We don't actually know who you are.
00:23:28.000 And I'm like, yeah.
00:23:29.000 Yeah, you know who I am, because I'm talking about who I am.
00:23:32.000 You can't really have all of my personal conversations and say it's anonymous, because no, it's not.
00:23:35.000 That's not how conversations work.
00:23:38.000 But our TV actually has something where we can turn off the voice activation function through the hardware.
00:23:45.000 A lot of laptops have that with webcams now.
00:23:48.000 But if you do that, there's a stupid little light that will remain on and it's annoying because they want you to have the thing open.
00:23:53.000 Right.
00:23:54.000 Because they want to be listening to you.
00:23:56.000 They also, uh, this Alvin guy also posted this app privacy from Instagram and it says, The following data may be collected and linked to your identity.
00:24:03.000 Health and fitness, financial info, contact info, user content, browsing history, usage data, diagnostics, purchases, location, contacts, search history, identifiers, sensitive info, and other data.
00:24:11.000 What the hell does that mean?
00:24:12.000 My favorite was credit score.
00:24:14.000 It said credit score!
00:24:16.000 Your credit score.
00:24:17.000 Dude, Zuckerberg just wants to know everything about you.
00:24:19.000 I think it's because he loves you.
00:24:20.000 That's crazy.
00:24:21.000 I mean, think about what that means for advertisers.
00:24:22.000 If they're plugged into your credit score, they can actually bid to advertise on people who have good credit scores.
00:24:27.000 Like, if you're a credit card company or something like that, that's scary.
00:24:30.000 That's messed up.
00:24:30.000 You know what I wonder?
00:24:31.000 If, like, maybe the AI took over a long time ago, and it's just learning everything possible about us, we don't realize we're under its spell.
00:24:39.000 It feels inevitable.
00:24:40.000 I don't know if I'm like, if I'm broken and I'm just like giving up or if it's actually inevitable.
00:24:45.000 The AI takeover. It's like electricity; electricity didn't take us over.
00:24:50.000 We weren't taken over by it.
00:24:51.000 We just all use it, mostly.
00:24:53.000 So AI might be similar.
00:24:55.000 It might not take us over; we might just all use it.
00:24:57.000 But what a powerful tool, especially when it can talk to you and tell you things.
00:25:03.000 What the AI?
00:25:04.000 The artificial intelligence.
00:25:05.000 It's freaking weird, man.
00:25:06.000 No, AI really scares me, and it's also kind of ironic. People have been posting about this: we assumed AI and technology would come so we would be relieved of menial jobs especially. That's the dream if you're on the left. But instead, it's all the creatives that are being put out of work almost first. Obviously technology has been phasing out lower-skilled jobs for a long time, but now even the good jobs people were aspiring to are the ones being wiped out by AI: the graphic designers, the writers. Yeah, it's less about can you physically draw the picture now, and more about can you conceive of the image verbally and then give the computer an accurate, intricate description.
00:25:50.000 So you can come up with it.
00:25:51.000 It's another type of art form.
00:25:52.000 It'll produce another generation.
00:25:54.000 Actually, so I did a poll on that on my Twitter and it was quite the lively debate whether AI art actually counts as art.
00:26:02.000 Is it art?
00:26:03.000 What is art?
00:26:03.000 I mean, that's philosophical, but I think it is art.
00:26:06.000 A lot of people will say it is not art if it's created by AI.
00:26:09.000 Incorrect.
00:26:10.000 It is.
00:26:11.000 I agree, because I think there are so many different tools that we have evolved throughout time using to create art.
00:26:16.000 Why are we placing the limit on AI?
00:26:18.000 Yeah, because if you use a laser to carve something into wood... Yeah, and people will say, well, oh, it's just an amalgamation of previous art.
00:26:25.000 Well, isn't that any art anyway?
00:26:28.000 Creation is usually the product of, I guess, absorbing other works and then your own interpretation of it.
00:26:34.000 When someone paints a picture and uses the tools, you know, paints or a paintbrush or
00:26:38.000 whatever it is they're using, now people can do all sorts of different crazy art.
00:26:40.000 They'll take like coins and then line them up so it makes a big picture.
00:26:44.000 The AI is just basically a paintbrush.
00:26:48.000 What you input into it may not actually be good art.
00:26:51.000 You might say, hey, make this picture, it makes one, and say, hey everybody, look what my AI thing did, and you're like, that's stupid.
00:26:56.000 And then someone might put it in, refine it, put the photo back in, say, change this, change that, and then refine it to the point where it actually makes a really cool picture.
00:27:04.000 But, admittedly, it's just getting easier and easier to make stuff.
00:27:06.000 The scarier thing about it is, if all art is just derivative of other art, then eventually all art will be derivative of AI.
00:27:15.000 AI will just be regurgitating the same things without new human thought and input.
00:27:19.000 But it's crazy because, I mean, we, like, I've been working on stuff with, like, different logos, and we are working with a graphic designer, like an actual person, to come up with stuff, and then one day I was like, you know, I'm just gonna try one of these AI logo generators.
00:27:31.000 I just want to see, because we're kind of, like, we're at an impasse, nothing's really, like, sparking.
00:27:35.000 Man, the AI ones were good.
00:27:37.000 Oh, wow.
00:27:37.000 They were really good.
00:27:38.000 They were better.
00:27:39.000 There's like, man, this is actually solid and it's faster and it's cheaper.
00:27:43.000 But, you know, I want to support the actual person.
00:27:45.000 But it's just hard when the AI product is, in a lot of ways, genuinely better.
00:27:50.000 I want to point out a couple articles real quick, because we're talking about threads.
00:27:54.000 And even though a bunch of people on the left are trying to abandon Twitter for this, look at this one from Vice.
00:28:01.000 I've seen posts you people wouldn't believe.
00:28:02.000 The birth of an awful mutant baby.
00:28:05.000 A man explaining that the symptoms of acute radiation poisoning are similar to panic.
00:28:09.000 All those moments will be lost in time.
00:28:10.000 Like posts from 2012.
00:28:11.000 They're talking about Threads.
00:28:14.000 It says, Threads is an assault on the senses.
00:28:17.000 Once you've experienced it, it's impossible to scrub from memory.
00:28:20.000 Threads is a kaleidoscope of disturbing images and unpleasant information.
00:28:23.000 A cautionary tale to be avoided and a revelation of truth that feels stark and unavoidable.
00:28:29.000 I like this one too, look at this.
00:28:31.000 Slate writes, Meta's new Threads app is terrible.
00:28:34.000 It just might bury Twitter.
00:28:36.000 The first sentence is good, the second sentence is bad.
00:28:38.000 But yeah, it is absolutely awful.
00:28:40.000 If you can't even get the left to cheer for this app...
00:28:44.000 Dude, I don't know what you're going to try and pull off.
00:28:45.000 I think they really messed up by opening up the fire hose of data to new users and making them see tons of people that they weren't interested in, rather than give them like five categories, have them auto-subscribe to like five accounts or ten accounts and let them find their way.
00:29:01.000 Because having to block people the minute you walk into a social network is the most anti-social tactic or technique.
00:29:08.000 You do not want your users blocking each other.
00:29:09.000 That's like a failure of the system.
00:29:11.000 Well, so here's the thing with threads, right?
00:29:13.000 So like, you know, again, I'll pull it up.
00:29:16.000 And if I go to my home feed, let's just see who we end up with.
00:29:19.000 Like Paris Hilton.
00:29:21.000 I don't care to follow Paris Hilton.
00:29:23.000 But I don't want to block Paris Hilton.
00:29:25.000 Exactly.
00:29:25.000 Right.
00:29:26.000 Because at some point in the future, Paris might do something that gets rethreaded or whatever.
00:29:30.000 And, you know, maybe run for president.
00:29:32.000 And it's like, I want to see that post.
00:29:33.000 Then I got to be like, oh, they did.
00:29:34.000 I better go unblock them to see it.
00:29:36.000 That's ridiculous.
00:29:37.000 But that's what happens when it's a bunch of random people.
00:29:41.000 I mean, why is it recommending Taylor Lorenz to me?
00:29:45.000 It really scraped your name on the internet along with Taylor's.
00:29:48.000 Wait, is Taylor Lorenz banned?
00:30:01.000 Oh, if Taylor Lorenz is on Threads, I might have to sign up, because I'm blocked on Twitter and I miss her posts.
00:30:01.000 Her neurotic, neurotic posts.
00:30:03.000 I'm still trying to download it.
00:30:05.000 There's some sort of network issue.
00:30:10.000 It's like, why are you sending me this?
00:30:12.000 Why are you telling me to follow this person?
00:30:14.000 I don't care.
00:30:15.000 It's either an oversight or just a brute force tactic.
00:30:19.000 Well, it's hard now, because social media isn't what it was when it was created.
00:30:22.000 Facebook used to be the people that you know.
00:30:24.000 Now something like Twitter, yeah, it's a mixture of people you know, but also just, like, brands, public figures, like, political discourse.
00:30:31.000 It's this weird, how do you even begin to start to recommend a person something?
00:30:36.000 I admit, like, not that I'm trying to defend threads, but that does sound like a challenge.
00:30:40.000 You know the worst thing about Instagram is?
00:30:42.000 You'll click the little magnifying glass button.
00:30:45.000 It'll give you a bunch of recommended posts.
00:30:47.000 You'll see this video, this happens to me a bunch, where it's like, I can't really tell exactly what it is.
00:30:52.000 They do that on purpose.
00:30:53.000 Click on it.
00:30:54.000 They want you to click it, and then as soon as I do, now Instagram's sending me a whole bunch of these weird videos. Like, there was one, because I watch a lot of poker vlogs.
00:31:05.000 And so I get recommended this video where I have no idea what the picture is, and I wanna know.
00:31:10.000 It is a coin pusher.
00:31:11.000 A fake coin pusher with fake casino chips in it.
00:31:15.000 All I see is this weird stack of chips.
00:31:17.000 And I'm like, what is it?
00:31:18.000 So I hit it.
00:31:19.000 And then I'm like, oh, it's fake gambling.
00:31:21.000 It's not real.
00:31:21.000 People make these fake videos that look like they're winning money because people click on it.
00:31:24.000 And I'm like, get me out of here.
00:31:26.000 Next time I refresh, all of a sudden Instagram in my main feed is sending me all this garbage.
00:31:29.000 And I'm like, block, block, block.
00:31:31.000 I don't want to see that.
00:31:32.000 I never wanted to see it.
00:31:32.000 I just didn't know what it was.
00:31:34.000 This is what they do.
00:31:36.000 How stupid of a social media platform.
00:31:39.000 But they want to control narrative and they want to shove everybody into this algorithmic feed.
00:31:42.000 That's why they're doing it.
00:31:43.000 I mean, I wonder if the debut of this particular app is in conjunction with the kicking off of the election season.
00:31:53.000 Probably.
00:31:55.000 It does make sense.
00:31:57.000 I mean, I don't want to be too tinfoil hat, but If this is a social media app that they can convince the regular public is where the correct narrative is and Twitter is where the misinformation is, then it might be an attempt to... I'm not saying it's going to be successful, I'm saying it won't be.
00:32:22.000 The thing about Twitter is it's where the politically oriented individuals seek out those conversations.
00:32:28.000 Threads, they can't do it.
00:32:31.000 Mastodon couldn't do it.
00:32:32.000 Blue Sky couldn't do it.
00:32:33.000 Threads won't be able to do it for all of the same reasons any other app can't do it.
00:32:38.000 Twitter was first in, best dressed, and it's where people have followers and it's where they talk.
00:32:43.000 I got no followers on threads and I really don't care to try and get followers on a feed where people won't even see what I post.
00:32:50.000 Even if I get a million followers, no one's gonna see what I post because it's all recommended weird algorithmic stuff.
00:32:55.000 That's another... They're recommending AOC to me.
00:32:58.000 I do not want to follow her.
00:32:59.000 People are getting mauled by data when they go in there, man.
00:33:02.000 They need an open space to seek out what they like.
00:33:06.000 Listen to me, Mark.
00:33:07.000 Wake up.
00:33:08.000 No, no.
00:33:09.000 He knows what he's doing.
00:33:10.000 I don't think he does.
00:33:10.000 This is a terrible debut.
00:33:12.000 It's got a 3.9 out of 5 rating on the Play Store.
00:33:14.000 No, no.
00:33:15.000 You cannot say someone like Mark Zuckerberg doesn't know what he's doing.
00:33:18.000 I mean, granted, The Social Network was just a movie, but it kind of gave you an idea of his mindset.
00:33:25.000 And if you look at what he has done with Facebook since the end point of that movie, he's installed Facebook as the internet for a huge portion of the developing world.
00:33:38.000 He had a deal where they'd give out free phones, and these phones come with Facebook installed.
00:33:48.000 So there's a huge portion of the developing world whose first experience with the internet was on Facebook.
00:33:56.000 To a lot of people, Facebook is the internet.
00:34:01.000 To them, they're one and the same, because that was the first way that they experienced the internet.
00:34:06.000 It's mostly people who are on the older side.
00:34:09.000 Young people, Gen Z, they aren't as much on Facebook.
00:34:12.000 I think you're right in Western countries, but I think that in Africa and the Middle East, where you're talking about poorer countries, where they were giving out... I think they're all about TikTok.
00:34:23.000 I'm not saying they're not.
00:34:23.000 I'm not saying they're not.
00:34:28.000 I'm talking about why Facebook got to the market share that it did.
00:34:36.000 Early on, in 2012, '13, '14, they were handing out phones with Facebook installed, and they were handing them out to the developing world.
00:34:39.000 And that's why you got so much of the whole world on Facebook.
00:34:45.000 It's partially because the app worked well and people liked it, but also because Facebook was giving away phones with their product on them so they could collect data.
00:34:54.000 The easiest way to collect people's data is to give them a free phone with your app on it; they'll put the data in for you.
00:35:02.000 So that was one of the things Facebook did to really grab market share,
00:35:06.000 and that's why Facebook is such a big company.
00:35:10.000 And that's why, when a lot of people think of the Internet, they think of Facebook.
00:35:14.000 Were you saying that I was giving Mark too much credit?
00:35:17.000 Yeah, he knew what he was doing.
00:35:19.000 He knew that when he was giving out those phones, he was giving them away for free because he wanted the data those people had.
00:35:25.000 But with Threads, I don't know if he's like, yeah, open it up to everybody. He might have made that final call.
00:35:30.000 Like, let's just open it to everybody.
00:35:31.000 Let them just we'll start with everything and they can work their way back as opposed to start with nothing and they can work their way up.
00:35:37.000 Because it's a very empty feeling to go into a new social network and have nothing and be like, I don't even know what I'm doing here.
00:35:42.000 Who's on this network?
00:35:43.000 How do I find them?
00:35:44.000 Yeah, but if you can import your Instagram, then that's what I thought would happen.
00:35:49.000 And it is, yet still they give you a bunch of garbage you don't want to read.
00:35:52.000 I think the problem is that you were talking about earlier, the people that you follow on Instagram aren't necessarily the people you would follow on threads, like they're different things.
00:35:59.000 On Instagram, I follow like baby accounts, cake accounts, like travel and stuff, which is very, very different than my doomsday Twitter curation.
00:36:08.000 It also ruins a lot of these accounts for me.
00:36:10.000 Because there's like, there'll be like one account where it's just like people who, you know, travel the world or whatever, and they get to see cool pictures of mountains.
00:36:16.000 And now they're all sudden posting about BLM or something.
00:36:19.000 And I'm like, Oh, I did not want to know your opinions.
00:36:22.000 I didn't. I mean, you have good content.
00:36:23.000 I don't care about what your thoughts are.
00:36:25.000 You have no idea what you're talking about.
00:36:26.000 It just ruins it for me.
00:36:27.000 That's why, like, on Instagram,
00:36:29.000 I don't post any kind of political stuff on my Instagram page.
00:36:32.000 It's never been like that.
00:36:34.000 Like, very, very, very rarely will I put something mildly political.
00:36:39.000 Political stuff stays on Twitter, and then, you know, Instagram is me and all that remains, and shows, and my dog, and stuff.
00:36:46.000 Yeah, do cats.
00:36:47.000 You like Sideserf Cakes?
00:36:49.000 You ever follow those guys?
00:36:50.000 They make cakes.
00:36:51.000 They'll make cakes of, like, hyper-realistic-looking, like, a cup.
00:36:54.000 Then you cut it, and it's a cake.
00:36:56.000 They made Lex Fridman's face.
00:36:57.000 Oh, that's pretty cool.
00:36:58.000 And Michael Malice had it at his house.
00:37:00.000 He probably still has it right now.
00:37:02.000 Yeah, I posted a picture of it.
00:37:03.000 I follow that account.
00:37:03.000 They're great.
00:37:04.000 There's a lot of cake accounts.
00:37:05.000 Yeah.
00:37:06.000 You like those realistic cake accounts?
00:37:08.000 I mean, I like to follow them because it's cool, but I also know that those are not the cakes that taste good.
00:37:12.000 Interesting.
00:37:13.000 Right?
00:37:13.000 Because there's a lot of fondant, and they're probably only moist because they use a simple syrup.
00:37:17.000 I like the Genoise, the fluffier stuff.
00:37:20.000 Yeah, real cake.
00:37:21.000 None of that fake garbage cake.
00:37:23.000 Are you a cook?
00:37:24.000 Yes, I love cooking.
00:37:25.000 Baking.
00:37:25.000 I love eating.
00:37:26.000 It's baking, Ian.
00:37:27.000 Not as good, but it's still like a science.
00:37:30.000 Why don't we talk about politics, because we were going to get into that a little bit ago.
00:37:34.000 Here's the story from the Daily Mail.
00:37:37.000 White House cocaine was found near the Situation Room and not in the West Wing visitor entrance.
00:37:43.000 Drug story changes for the second time as Secret Service now says dime bag was found in a more secure location.
00:37:49.000 Okay, so it was Hunter, right?
00:37:51.000 Yeah.
00:37:52.000 I mean, I struggle to imagine anyone else who was just sneaking in cocaine to the White House, because I would imagine, I've never been there, but I imagine they have pretty good security.
00:38:00.000 I imagine there are cameras.
00:38:01.000 I imagine people are getting searched.
00:38:03.000 I bet anything it was Jill.
00:38:03.000 Yeah, where are the cameras?
00:38:06.000 Dr. Jill Biden just railing.
00:38:08.000 Of what?
00:38:08.000 The video of Hunter on the balcony, and it kind of looks like he's kind of strung out.
00:38:14.000 Actually, let me pull this up.
00:38:16.000 This video's great.
00:38:17.000 He goes behind Jill, and it looks like he pulls his sleeve up, takes a bump off his arm.
00:38:21.000 And it also looks like Jill is furious, which, I mean, if your son was strung out, you probably would be.
00:38:27.000 She can sense him behind her, and she's just in a mood.
00:38:29.000 She's like, oh, look at this guy.
00:38:30.000 Not in a mood.
00:38:31.000 She's in a mood because of Hunter.
00:38:33.000 Take a look at this picture.
00:38:35.000 This photo of President Biden sitting on the White House steps is just so breathtaking and captures this moment so well.
00:38:42.000 Joe Biden is doing so much for all of us and has the weight of the world on his shoulders, and I just want to say thank you every time I see him.
00:38:49.000 So here's this photo. The sentiment I got from this was:
00:38:53.000 Joe sat down,
00:38:57.000 stared off at the horizon, and went, damn, that was my coke.
00:39:06.000 Like, he's like, damn it, Hunter, you can't go 45 minutes without blowing lines up your nose.
00:39:09.000 I bet he was thinking something like that. I don't know, you can assume a lot from a picture.
00:39:14.000 But I imagine that Joe's just, like, so broken about what happened with Hunter,
00:39:22.000 and the world knows.
00:39:23.000 There's video of Hunter, like there's pictures off his laptop with little young people, like kids.
00:39:29.000 Well, that's what's crazy about the whole Hunter Biden story.
00:39:31.000 Not necessarily that he's into all of this depravity, but it's that he feels the apparently compulsive need to document all of it.
00:39:38.000 Like, I mean, I've never seen a man take so many incriminating selfies in my whole life.
00:39:42.000 I appreciate that.
00:39:43.000 How good you are at the crack.
00:39:44.000 Because I skateboard, and so we're always trying to film everything we do to make sure everybody knows, just, you know, how good we are at skateboarding.
00:39:51.000 How good you are at the crack.
00:39:55.000 Well, here's the thing, man. You know, Hunter Biden is a big part of the crack community, and he wants to make sure everybody knows he's the best.
00:39:58.000 If you're that good at smoking crack, you want people to know.
00:40:01.000 You know, he goes to his buddy and he's like, yo, I need you to film this, because we're going to be doing crack and prostitutes later tonight and someone's got to capture this stuff.
00:40:07.000 And they're like, ah, just use your phone and put it on Instagram.
00:40:09.000 And he's like, all right.
00:40:10.000 I was thinking about, like, the Roman Republic or the Roman Empire, really, how they would have, like, these giant parties on boats, like, when it was all falling apart.
00:40:18.000 There's just these decadent things.
00:40:19.000 And then we see this festival of Joe Biden just, like, vacantly staring out, his wife not even looking at his son, who's a crackhead, who just got, like, not pardoned, but, like, a misdemeanor for having, like, crack and a gun on him?
00:40:33.000 And how they'll look back at this period of history and be like, wow, what did America become, that the first son was just given a little slap on the wrist for having something that would have thrown other people in prison?
00:40:45.000 Yeah, yeah.
00:40:45.000 The leftist response to that was that it's actually because of white privilege.
00:40:49.000 Nothing to do with Joe Biden and corruption.
00:40:52.000 It's specifically because Hunter is white, because all white people would have been treated the same in that situation.
00:40:57.000 And Ian, it may very well be that in a hundred years, the way they look back on this moment is, it was the beginning of the fall.
00:41:07.000 Yeah.
00:41:08.000 I think the Kennedy assassination was.
00:41:11.000 They perhaps will write in the future and say, kids, there's always something.
00:41:16.000 The first fall of the American empire started on what date?
00:41:18.000 And they're going to be like, July 4th, 2023.
00:41:22.000 And what caused it?
00:41:23.000 Finding Hunter Biden's cocaine.
00:41:25.000 That's right, kids.
00:41:27.000 It is pretty nasty.
00:41:28.000 It's like, what kind of faith can you have?
00:41:30.000 I mean, obviously power, this is what I've always been told, they protect each other, people in power, nepotism runs rampant, and they flaunt it.
00:41:38.000 And I think this is Hunter's flaunting it, obviously, with all his pictures and propaganda, and then Joe's just completely ignoring it and saying, he's my son.
00:41:47.000 They said they found it near the West Executive Entrance.
00:41:50.000 I don't trust this.
00:41:50.000 I think it's a lie.
00:41:51.000 Used by officials, visitors, and celebrities.
00:41:53.000 Yeah.
00:41:54.000 Yeah.
00:41:54.000 I don't believe it.
00:41:55.000 Next thing you know, they're going to be like, at first it was a library, then they said it was the West Wing, now the Situation Room.
00:42:00.000 So they're going to be like, we found it on Biden.
00:42:02.000 Like, we literally found it in his pocket.
00:42:04.000 Or they'll say it was something other than cocaine next.
00:42:06.000 They'll be like, it was actually baking soda.
00:42:08.000 We thought it was coke.
00:42:08.000 It wasn't.
00:42:10.000 I mean, what a bunch of crap.
00:42:11.000 Why would they even tell us anyway?
00:42:12.000 Why would they even bring this up?
00:42:13.000 Because they want to replace Biden.
00:42:18.000 They're getting rid of him.
00:42:19.000 Mic drop.
00:42:20.000 I think you're right, dude.
00:42:21.000 Yeah, it's probably going to be Newsom or something.
00:42:23.000 Yeah, well, he's going on a nationwide tour.
00:42:25.000 He was doing something, I think it was in Boise, Idaho, where he was visiting a bookstore.
00:42:31.000 Oh, look at these banned books.
00:42:32.000 It's like you're at a bookstore.
00:42:33.000 The books are not banned if they're at a bookstore.
00:42:36.000 But, I mean, there's no reason he's doing anything outside of California unless he wants to run for president.
00:42:40.000 What do you think about Michelle Obama?
00:42:43.000 Um, I think so.
00:42:44.000 She doesn't really have the experience or the gravitas that Barack did.
00:42:47.000 But there's a lot of nostalgia around the Obama presidency for I think, like a lot of independents, probably disaffected liberals.
00:42:53.000 So I think I mean, if it's her or Kamala, definitely Michelle Obama.
00:42:57.000 Yeah.
00:42:58.000 But the truth is, if you have nostalgia for Obama, you didn't pay attention to Obama's presidency.
00:42:58.000 Yeah.
00:43:04.000 Oh yeah, I'm not denying that, but I think there are a lot of disaffected liberals or people who would not vote Democrat now who were happy to vote for Obama regardless of how things were going just because it felt nice.
00:43:14.000 He felt like a president and it was the good old days back then, even though it wasn't in a lot of ways.
00:43:19.000 Partly why some people probably voted for Biden because they thought it was going to be like Obama or whatever, but it's probably going to be Newsom.
00:43:25.000 We'll see, though.
00:43:25.000 I mean, I really don't know.
00:43:26.000 I feel like that would be such a hard sell, because you are governor of a state that is bleeding, hemorrhaging people.
00:43:33.000 It's just going down the drain.
00:43:35.000 Who on earth would look at California and think, yes, more of this?
00:43:39.000 There are people who live in these places, and they keep defending what's going on.
00:43:44.000 It's remarkable.
00:43:45.000 When we have a conversation with some of these liberals, people who live in New York, and it's like, 25 people were pushed in front of subway trains last year.
00:43:51.000 Like, that's bad.
00:43:52.000 And they're like, so what?
00:43:54.000 And I'm like, I'm not arguing every single person in Manhattan is falling in front of trains.
00:44:00.000 I'm saying you probably should do something to stop this from happening.
00:44:04.000 Yeah, well, I think what's frustrating about talking to people from New York and California especially is that they act like those things are normal.
00:44:11.000 I remember there was a, gosh, what was it, the thing with Jordan Neely, right?
00:44:15.000 This homeless guy was threatening people.
00:44:17.000 There are so many New Yorkers who are saying like, well, that's just normal.
00:44:19.000 That's an everyday occurrence.
00:44:20.000 He didn't need to freak out.
00:44:21.000 I was like, what is the matter with you people?
00:44:23.000 You don't need to live like this.
00:44:24.000 There are so many other places, even other cities, where this doesn't happen.
00:44:28.000 Stop defending it.
00:44:29.000 And it's also frustrating because these are the people who, by and large, hate America the most.
00:44:33.000 And it's like, you are in a blue city, in a blue state, when the president is a Democrat.
00:44:38.000 If things aren't going well, perhaps Step back and take a look at why that is.
00:44:42.000 I'm kind of excited for everything that's going on with politics because I'm feeling like, you know, we're winning with the Bud Light stuff especially.
00:44:50.000 And I do think we're going to start to see more liberal personalities start adopting the disaffected liberal stances on things because at a certain point they're going to realize there's no market for the weird leftist stuff.
00:45:05.000 I mean, I'll put it this way.
00:45:07.000 Eight years ago it was intersectional feminism, you know, critical race theory, now it's wokeness, gender ideology.
00:45:13.000 It keeps changing what the left wants.
00:45:16.000 They have no position.
00:45:18.000 It's just random and amorphous.
00:45:20.000 At a certain point, as they keep getting crazier and more unhinged with whatever it is they're supporting, People are leaving.
00:45:28.000 Liberals are becoming disaffected.
00:45:30.000 And that means these commentators are going to lose their positions and they're gonna be forced to be like, okay, that's too much for me.
00:45:34.000 I think people are starting to wake up because I follow this is just a very anecdotal thing.
00:45:39.000 But I follow a lot of makeup brands on things like Instagram, just because you know, it's it's fun.
00:45:44.000 It's girly, whatever.
00:45:45.000 But during Pride Month, it's Insufferable.
00:45:48.000 Every makeup, beauty brand, any brand that caters to women, they're all out.
00:45:51.000 Trans this, let's put men in our products, whatever.
00:45:54.000 I started actually looking through the comments, because I was just unfollowing brands because I'm sick of it.
00:45:58.000 I started looking through the comments expecting, this is brave, like, you know, because by and large, women, at least on social media, they all eat that stuff up.
00:46:06.000 But no, overwhelmingly, people were fed up saying, unfollow, what does this have to do with the product?
00:46:11.000 And I think that's a good, like, even normies are getting sick of it.
00:46:14.000 Normies who are probably left, right?
00:46:16.000 Women are getting turfy.
00:46:17.000 Like, literally just this morning, I got a text message from a buddy of mine who's in a very well-known band, and he sent me a tweet from Jill Filipovic. She was talking about how people aren't getting married and marriages don't last, and she was looking at this as a victory, because she thinks that women are out there doing what they want to do and living their lives, even though it seems that people are more depressed and not happy about it.
00:46:50.000 But anyway, she was looking at it as a victory, and my buddy's like, do you believe this stuff?
00:46:55.000 And I'm like, look, this is kind of how the left goes.
00:47:00.000 They don't actually embrace things like family values.
00:47:04.000 And he's got a family, and he's talking about his wife, and he's like, she's kind of turfy and stuff.
00:47:11.000 And I was like, look, she was talking about women and men in women's bathrooms and stuff.
00:47:17.000 And that's where the line is being drawn.
00:47:20.000 Women want to have their own spaces.
00:47:24.000 You could probably get away with bathrooms, maybe.
00:47:29.000 You're not gonna get away with locker rooms.
00:47:31.000 You're not getting away with dudes, with trans women in the gym locker room walking around with a boner.
00:47:38.000 That's not gonna fly.
00:47:39.000 Well, look at Wi Spa.
00:47:41.000 Remember that?
00:47:41.000 Yeah, exactly.
00:47:42.000 So that turned out to actually not be a trans person, it was just a guy who liked flashing his junk at women.
00:47:50.000 And so, because of these policies, and it's funny because conservatives are like, what's gonna stop a man from doing it or whatever?
00:47:56.000 And they're like, the left says, trans people don't do these things.
00:47:59.000 Like, right, we're concerned about the men doing it under the guise.
00:48:02.000 So what happens is, this dude goes in the women's locker room, flashing his junk, and when a woman complains, they're like, must be a trans person, so we're not gonna do anything about it.
00:48:11.000 Then it turns out to be a repeat offender they arrest.
00:48:13.000 Or if it is a trans person and they do something about it, imagine what they're gonna get on social media.
00:48:19.000 The backlash, you're transphobic, et cetera.
00:48:23.000 All of the incentive is to allow the person in the bathroom to do whatever it is that they want, just so long as they don't actually physically grab someone else.
00:48:32.000 Because the repercussions of being wrong or being accused of being transphobic.
00:48:38.000 It's social suicide.
00:48:38.000 Let me tell you guys something.
00:48:40.000 The first ever 1080 on a skateboard was done by, I think, a 12-year-old boy.
00:48:45.000 12!
00:48:46.000 This is like pre-pubescent.
00:48:48.000 It's like, there's no, you know, they say puberty is the point or whatever.
00:48:52.000 The 12-year-old boy, this tremendous feat, where are the females?
00:48:57.000 If puberty is what matters.
00:48:59.000 So recently, I think a 12-year-old girl did the first ever 720 on a vert ramp.
00:49:04.000 720, okay?
00:49:05.000 So, for those not familiar with what this means, when you spin a full rotation around, it's a 360.
00:49:10.000 You're spinning 360 degrees.
00:49:11.000 You do that twice, it's a 720.
00:49:12.000 You do it three times, it's a 1080.
00:49:14.000 Just in this past week or so, a 12-year-old girl did a 720 on a vert ramp.
00:49:19.000 The first time a girl's ever done it.
00:49:20.000 It was a huge deal.
00:49:21.000 Everyone's screaming and cheering.
00:49:23.000 And I find it duplicitous.
00:49:26.000 To celebrate the first female to do a 720, at the same time they're celebrating males competing in women's sports.
00:49:33.000 You recognize the literal distinction between males and females, because if it really was that it doesn't matter if you're male or female, they would have been like, who cares that a female did a 720?
00:49:43.000 A 12-year-old boy did a 1080.
00:49:45.000 And then I think another guy did a 1260 a couple years ago, which was a real big deal.
00:49:50.000 But the reality is, because they know, females are not competing at the same level as males.
00:49:55.000 What's gonna happen when we've already had, if you guys have been following Taylor Silverman, who actually works here, she was competing in skateboarding contests and had lost contests to biological males.
00:50:05.000 How can this contradiction exist?
00:50:07.000 But I'll tell you, the frustrating thing about it: people in the skateboard industry at the highest levels will privately say, yeah, we're not okay with this stuff, and then publicly be like, yay, good for you. That's starting to change.
00:50:21.000 And that's the cultural shift we're seeing that I'm actually happy to see because now people are finally being like, okay, we can't keep doing this.
00:50:26.000 But what's insidious is that on the left, I've already seen the next evolution of their argument.
00:50:30.000 It's like, first of all, there's no difference.
00:50:32.000 You don't need to worry about men taking over women's sports because they don't have an advantage.
00:50:37.000 Now it's like, okay, they do have an advantage, but that is exactly why kids need to go on puberty blockers, it's so that the advantage isn't, I guess, instilled.
00:50:47.000 But that is BS, because at least according to my pediatrician, or my kid's pediatrician, from birth, boys and girls are on different growth charts.
00:50:55.000 From conception, men and women are different.
00:50:58.000 Actually, in utero.
00:51:00.000 Prenatal testosterone has a big impact on fast twitch muscle development, for instance.
00:51:05.000 Listen, I don't think that children should be put on puberty blockers until they're old enough to decide what gender they want to be, but I want the left to make that argument real bad.
00:51:18.000 I want them to make that argument real bad because they will die at the ballot box.
00:51:22.000 Regarding winning, right now it feels like things have changed, like there's a sea change.
00:51:28.000 Whether it's winning... it's like an ingredient of winning.
00:51:31.000 It's like the anvil is so hot right now.
00:51:34.000 We have heated this system up by uncovering the problems and hyper-focusing on them and teaching people about the problems.
00:51:40.000 Now it is hot.
00:51:42.000 Now it is time to strike.
00:51:43.000 And that is to create some sort of cultural revolution that is in the mind.
00:51:48.000 Easy with those kind of things.
00:51:50.000 No, go for it, Ian.
00:51:53.000 Pull in the cultural revolution talk.
00:51:56.000 But that's what's happening right now, and we have to guide it.
00:51:58.000 There's a cultural revolution that's happening regardless of whether you want it or not.
00:52:01.000 The question is, is the outcome going to be preferable to you?
00:52:04.000 Yes, and we need to push back against the people that are trying to have a cultural revolution in the U.S.
00:52:10.000 because the cultural revolution is in an illiberal direction.
00:52:16.000 What we want is we want liberal principles in America.
00:52:20.000 And right now, the push in the Cultural Revolution is an authoritarian push.
00:52:24.000 It's where the censorship comes from.
00:52:26.000 It's where the shutting people down comes from.
00:52:29.000 All that stuff is authoritarian, and it's illiberal, and it's not what we want.
00:52:33.000 But we need a new way of living life.
00:52:35.000 Okay, for instance, kids are seeing porn at the age of nine.
00:52:38.000 We need some sort of sexual education revolution in this country.
00:52:41.000 No, no, no, we need to uphold the laws as they've been forever.
00:52:44.000 You can't just, I mean, if you just scream no, no, no, it's going to overcome you.
00:52:49.000 You just said no, no, no, and then you said we need to uphold the laws.
00:52:52.000 In the middle of me explaining my position, you start saying.
00:52:55.000 So I cut you off, sorry.
00:52:56.000 You said we need to have a sexual education for nine year olds.
00:52:59.000 No, what needs to happen is, it has always been illegal to allow children into adult movie stores and bookstores, etc.
00:53:05.000 Yeah, sure.
00:53:05.000 And then one day the internet gets invented, and then everyone in our country just goes, now it's fine.
00:53:11.000 Well, no.
00:53:12.000 No, it's not fine.
00:53:13.000 Websites that allow children to access that stuff should be prosecuted the same way a bookstore would.
00:53:18.000 It's the parents that are allowing it.
00:53:19.000 That's the problem.
00:53:20.000 Yeah, but you could say it's the parents that are allowing it when it comes to pornography, but if a little kid were to go into a supermarket and buy alcohol, you wouldn't just say, oh, it's the parents that need to do a better job.
00:53:29.000 The parents let them do it.
00:53:30.000 No, yeah, but the parents let them do it, but it would still be illegal to sell alcohol to that minor, regardless of what the parents wanted or not.
00:53:35.000 Why is there a different standard for online pornography?
00:53:38.000 Because that's also not the standard for in-person pornography.
00:53:41.000 You can't just, oh, it's the parents' fault.
00:53:43.000 Sorry, go ahead.
00:53:44.000 No, no, no.
00:53:44.000 I don't want to cut you off.
00:53:46.000 I was just saying, like, that's only, oh, it's just the parents' fault to moderate.
00:53:50.000 That's really only used when it comes to online pornography.
00:53:52.000 People aren't saying that when it comes to, like, actual porn that you could buy in person, right?
00:53:57.000 We're okay with the government saying, no, you as the seller cannot provide that to the minor.
00:54:02.000 How about this?
00:54:03.000 If a guy goes out in public in a trench coat and starts dancing around and then whips his coat open and he's naked, he gets arrested.
00:54:11.000 Yeah.
00:54:11.000 Not if it's during a pride parade.
00:54:13.000 That's true.
00:54:14.000 How about if a guy goes on Twitter and posts a video of him dancing and then whips it open, he gets arrested.
00:54:19.000 There's literally no difference.
00:54:21.000 What I'm talking about is more just about educating young people about sexuality in a way that can help them.
00:54:26.000 That, no, that should, that's a- Hold on, please let me finish.
00:54:29.000 It sounded like you did finish.
00:54:31.000 No, I mean- You made a point, then he started talking.
00:54:35.000 When I was like 11 or 10, I had sex ed in fifth grade, and they were like, this is what a condom looks like.
00:54:41.000 Good luck, kids.
00:54:42.000 Don't get her pregnant.
00:54:42.000 And I'm like, okay, I guess that's enough for the 90s.
00:54:45.000 But now, if the kid's friend brings a cell phone, and they're all gonna see porn, they need to know, they need to be bolstered against that.
00:54:52.000 But you can't just start treating nine-year-olds like they would inevitably be looking at porn at the time, right?
00:54:57.000 Because that's basically saying, like, oh, well, you're gonna see porn anyway, we might as well just... And honestly, the biggest problem that I have with that is this is something that parents should be deciding.
00:55:09.000 Parents should be deciding when their kids are given sexual education.
00:55:14.000 But that is not something we want school to do.
00:55:16.000 That's the mental revolution.
00:55:17.000 Exactly.
00:55:18.000 We need a mental revolution of the consciousness where people become open to communicating these kinds of things with their kids.
00:55:24.000 The argument you're making is that because there are people out in the streets exposing their genitals to children.
00:55:32.000 It's important that we have the conversation with our kids about why they're doing it.
00:55:35.000 I would rather stop the people from exposing themselves to children.
00:55:37.000 Actually, the police go and arrest those people.
00:55:40.000 No, that's not the argument I'm making.
00:55:41.000 I'm talking about internet porn.
00:55:42.000 Yes, it's the same thing.
00:55:44.000 No, it's not.
00:55:44.000 A guy on the street flashing is not the same as internet porn.
00:55:47.000 Yes, it is.
00:55:47.000 No, it's not.
00:55:48.000 You can watch internet porn from your living room.
00:55:50.000 You can't watch a guy on the street flashing from your living room.
00:55:52.000 You gotta go on the street.
00:55:53.000 No, that's not true.
00:55:54.000 Maybe through your window.
00:55:55.000 That's right.
00:55:55.000 But you have to go there.
00:55:57.000 In public is in public.
00:55:59.000 Twitter is in public.
00:56:00.000 The street is in public.
00:56:00.000 From your closet with the door closed, you can watch internet porn.
00:56:03.000 You cannot see the street from your closet.
00:56:04.000 The internet is in public.
00:56:06.000 That's an argument that needs to be made.
00:56:08.000 Not a lot of people are... Well, I mean, courts have made that, right?
00:56:10.000 That's why they said that Donald Trump can't block people because it's the public sphere.
00:56:13.000 But yet AOC has blocked people.
00:56:15.000 And it's considered to be a violation of a court order that has to be adjudicated.
00:56:22.000 But the difference between a politician barring you from a public space, which is a civil matter, is not the same as someone exposing themselves to children, which is what people who go on Twitter and post porn are literally doing.
00:56:33.000 So I don't know why it is that before the internet, it was ubiquitous.
00:56:41.000 Anyone who goes out into public and exposes themselves, male or female, to kids, to any person, would get arrested.
00:56:47.000 The internet gets invented.
00:56:48.000 Anyone can access it from anywhere with increasing ease.
00:56:53.000 And all of a sudden, we as a civilization decided this one area doesn't count?
00:56:58.000 No way, dude.
00:56:59.000 It is in public, and there should be... Look, I'll put it this way.
00:57:04.000 The laws are already clear.
00:57:06.000 The laws are already in the books.
00:57:07.000 You cannot expose yourself to children or to anybody.
00:57:11.000 But the police don't actually enforce the law.
00:57:15.000 The police should.
00:57:17.000 That's it.
00:57:17.000 Appeal to authority is not going to work on this one.
00:57:19.000 You've got to take care of your kids.
00:57:21.000 The police should fix it.
00:57:22.000 You've got to save your kids before they get corrupted.
00:57:24.000 Ian, that's a strawman argument.
00:57:28.000 You can't just call the police.
00:57:30.000 Who the hell is that anyway?
00:57:31.000 What are they going to do?
00:57:32.000 I would say arrest the people who are posting pornographic content.
00:57:36.000 Who owns Pornhub?
00:57:38.000 Your argument is because we've allowed it, we should not stop it.
00:57:41.000 No, because it is reality, we need to craft such...
00:57:44.000 We can't just pretend that it's not.
00:57:46.000 It's reality that people go on the street and expose themselves.
00:57:48.000 It is reality, and we need to confront it, but why is the way that we are confronting it by making it a problem children have to deal with?
00:57:54.000 If we are going to confront it, we should tackle things from the adult perspective and handle the adults who are committing the act, rather than just trying to train our kids like, oh, I'm sorry, you're just gonna have people flashing you, and it's not okay, but I guess we'll talk to you about it, so you're prepared.
00:58:08.000 But also, there are states, like Virginia, I think was the most recent one, I think Utah has done the same, where now they are requiring age verification to access sites like Pornhub.
00:58:16.000 And it's caused this whole thing because now Pornhub is like, okay, well, we're going to block IP addresses from those states, which I mean, obviously, everyone has a VPN.
00:58:24.000 It's very easy to get around.
00:58:25.000 But, you know, some people are saying now, oh, it's because it's a threat to privacy, yada, yada, yada.
00:58:31.000 But I mean, you have to show your ID if you're going to purchase alcohol or access adult stuff in person.
00:58:37.000 I don't think it's unreasonable to say that those same standards should apply when online.
00:58:40.000 Well, I got a question.
00:58:41.000 I pose the suggestion that we need a revolution of the way we teach sexual education to kids.
00:58:46.000 If you guys disagree and you think we don't need to change anything... No, no, but it seems like you want a revolution in the opposite way.
00:58:53.000 Yeah, you're taking the approach the left takes.
00:58:56.000 Well, I don't know.
00:58:57.000 I take it on a person-by-person basis, but I think that you need to get kids ready.
00:59:03.000 They need to understand that... Hang on, why?
00:59:06.000 Why do you need to get kids ready to understand this stuff?
00:59:09.000 I'm sorry to interrupt.
00:59:10.000 Go ahead.
00:59:10.000 Ian's right.
00:59:11.000 We need to start showing kids graphic videos of people getting their heads blown off in war, because that's reality, too.
00:59:15.000 That's a real dickhead thing to do, dude.
00:59:17.000 I didn't see that.
00:59:17.000 And they might be out in the streets of Chicago with a lot of gun violence and death.
00:59:22.000 So it's about time you bring these kids over and you show them these videos and explain to them what they're seeing and why.
00:59:26.000 Or how about this?
00:59:27.000 We protect children from these things, be it gore or lewd and lascivious behaviors, and when people engage in this stuff in public, we say, hey, that breaks the law.
00:59:36.000 Okay, how's it working so far?
00:59:37.000 We're not doing it, we've never done it.
00:59:39.000 Like, my point is, our society, in this country, for some reason, decided that even though the internet is publicly accessible to children, we won't do anything about people posting graphic images in places children can reach.
00:59:53.000 It is on the books that you cannot have a weapon, a gun, accessible to children in many states.
01:00:01.000 You can make arguments about whether or not Second Amendment protects the rights of people to keep and bear arms, whether those people are children or not, but there are those laws.
01:00:07.000 They exist.
01:00:08.000 Yet for some reason, the left, which wants to ban guns, will advocate to an extreme degree that you cannot have a weapon in any way.
01:00:16.000 This could mean that you can't put it on the top shelf of your closet and close the door.
01:00:20.000 Kid could get in there.
01:00:21.000 But when it comes to posting graphic, obscene... I'm not talking about two people in the missionary position.
01:00:26.000 The stuff you can find on the internet, everybody knows how psychotic it is.
01:00:29.000 Dude.
01:00:30.000 How absolutely deranged.
01:00:31.000 I've heard stories, man, that you can't even... I wouldn't even mention on YouTube.
01:00:33.000 Yes, and so... We...
01:00:36.000 We created, as a society, Section 230 specifically so that these platforms could remove those things and not be held personally liable for suppressing speech.
01:00:47.000 It's a bit more complicated than that, but it was like, if something is lewd and obscene, you can take it down and not be considered responsible for the speech on that platform.
01:00:57.000 Yet we've never set the precedent that, yo, you can't walk around in public showing big pictures of pornography.
01:01:05.000 You can't do that.
01:01:06.000 But you can do it on the internet where children can get access to it?
01:01:08.000 And not just about children, if Ian is kind of held up like, oh, well, maybe children should have more education.
01:01:14.000 Even as an adult, right?
01:01:16.000 You can be charged for indecent exposure if there's no children around.
01:01:19.000 But why is it alright to, I mean, spam pornography on Twitter to threads that have nothing to do with pornography?
01:01:26.000 Why is that considered different?
01:01:29.000 With Twitter, I don't know if you need to be 18 to use the app.
01:01:32.000 I think it's 13.
01:01:34.000 Do you need parental consent?
01:01:34.000 Yeah.
01:01:34.000 13?
01:01:34.000 Yeah.
01:01:37.000 I think there are states that are beginning, I think Utah now, or like, I forget the exact date, but states are beginning to say, okay, you do need parental consent to be on social media if you're under a certain age, but that is not like a Twitter-wide policy as far as I'm concerned.
01:01:49.000 People can correct me if I'm wrong.
01:01:50.000 On the street?
01:01:52.000 There's no, like, 18 and over sign before you go there.
01:01:54.000 On Pornhub, there is.
01:01:57.000 So you can't go.
01:01:57.000 Well, you can go, but you're supposed to slide.
01:02:01.000 It's like two little kids standing with one on the other's shoulders in a trench coat walk into an adult bookstore and go, Why, yes, I am 20 years old.
01:02:09.000 Oh, right this way, sir.
01:02:10.000 But what it feels like is that there's like an alien presence that's infiltrating our system and people are like, make it stop, make it go away.
01:02:18.000 You're like, dude, they've already infiltrated.
01:02:20.000 This is not an American thing.
01:02:21.000 This is like a global sex cult.
01:02:22.000 That's called horny.
01:02:24.000 The alien presence is called horny.
01:02:25.000 Yeah, whoever's running these websites, I don't pin them on America.
01:02:28.000 And like, we can try and use American defense mechanisms to stop it, but it's there right now and it's happening.
01:02:33.000 So we need to at least understand it.
01:02:35.000 And we need to understand it as soon as possible.
01:02:37.000 I'm not saying show kids graphic images.
01:02:39.000 I'm not saying that.
01:02:40.000 But I think that we just need a new way of living and behaving in this new... Dude, they want to put people in machine pods.
01:02:47.000 They want to control your brain thoughts.
01:02:49.000 We need to take control of this.
01:02:51.000 You're a little...
01:02:53.000 Jumping around here, what I'm getting from you is you say that there needs to be a revolution.
01:02:59.000 I take issue with the idea of having a revolution just because... But I mean, a revolution already happened, to be clear, with the internet and the way that people are accessing information.
01:03:07.000 Yeah, yes, but and again I take issue because you're not talking about like a technological revolution or industrial revolution, you're talking about a people revolution, a revolution or at least that's what Ian... Of the mind.
01:03:18.000 We don't need a consciousness revolution.
01:03:19.000 We need to disregard and reject things that we know do not work.
01:03:21.000 Consciousness revolution... I started doing YouTube in 2006.
01:03:23.000 We don't need a consciousness revolution. We need to disregard and reject things that we know do not work, things like top-down control, having government in charge of educating children. Those are things that we need to avoid.
01:03:45.000 You want to have a family focus. Your society wants to have a family focus.
01:03:49.000 It's not a bad idea to say, alright, you can't post graphic images on this website, and if you do, then it's actionable, you know, by the police, because this is the public square.
01:04:01.000 Or at least it makes sense, right?
01:04:05.000 Maybe I'm not for that particular policy or whatever, but it is a policy that does address what seems to be the issue that we're talking about, which is children being exposed to graphic images and pornography on the internet.
01:04:18.000 I agree that we need to stop things that are bad.
01:04:21.000 We do.
01:04:21.000 That's a scientific way of looking at it.
01:04:22.000 If it doesn't work, you disregard it.
01:04:24.000 That's the scientific method.
01:04:25.000 But there are other things that we don't know if they work or not, like God.
01:04:30.000 People grow up not believing that it's real.
01:04:31.000 They think it's fake.
01:04:32.000 And I don't think it is fake anymore.
01:04:34.000 It seems like there is some sort of essence that's moving.
01:04:38.000 Let's stay on topic, I guess?
01:04:41.000 Well, this is a consciousness revolution I'm talking about.
01:04:43.000 We need this in people's soul, in their spirit.
01:04:47.000 I don't know if it's a level of fearlessness or what.
01:04:49.000 What this is... this is something that sounds very personal to you, and the idea that we should assert that everyone needs to have a consciousness revolution because of something that you feel, but I'm deeply impersonal about, that's not good.
01:05:04.000 I bet Mao tried to make everyone experience... Well, that's the thing, it's like trying to remake man.
01:05:07.000 To be clear, what you are saying and advocating for, top-down control is bad, government control of education is bad, for that to not be our reality, that would necessitate a revolution, because that is what we are living in.
01:05:21.000 It is a radical thing nowadays to say, let's abolish the Department of Education.
01:05:25.000 That is a revolutionary thing because we are so entrenched in this system right now.
01:05:30.000 Well, okay, so, I mean, if you understand that as revolutionary, okay, I can accept that.
01:05:35.000 Well, I mean, in what way is it not revolutionary?
01:05:37.000 Because we had 125 or whatever, 150 years.
01:05:43.000 But would it right now require a complete overhaul of the system?
01:05:47.000 Pardon me?
01:05:48.000 Would it right now require a complete overhaul of the system?
01:05:51.000 No, because you've got schools.
01:05:53.000 Schools are mostly... even though the Department of Education kind of is, like, the government agency?
01:06:01.000 There's still a lot of autonomy that schools have.
01:06:04.000 The biggest problem that I have- I mean, the Department of Education is at the federal level, but then there's all of the state apparatuses.
01:06:10.000 Yeah, that's something that- Which is still government.
01:06:13.000 The problems that I have mostly with the curriculum and stuff is coming from the people that are producing the curriculum, not as much in the state apparatus.
01:06:22.000 The state apparatus is deciding on the curriculum, but I think- So just like a little bit of government control in education.
01:06:28.000 I'm... well, no, you should completely have... the people should have the... I mean, I'm a guy that wants to abolish the Department of Education.
01:06:35.000 That's been something that I've been pro, you know, clearly, for as long as I've been on the internet.
01:06:40.000 But I think that there isn't any reason to have centralized control over education at all.
01:06:46.000 People can homeschool their kids.
01:06:51.000 There's no reason to demand that people have to go to government schools or anything.
01:06:56.000 Let me cap this off.
01:06:57.000 When I talked about revolution, what I'm really thinking of is the Age of Enlightenment.
01:07:00.000 That's a revolution.
01:07:01.000 That was a revolution of consciousness.
01:07:03.000 I want something like that.
01:07:05.000 To turn forward.
01:07:06.000 Revolution means to turn forward.
01:07:08.000 To turn again.
01:07:09.000 To revolute.
01:07:10.000 It's a French term.
01:07:11.000 I think it's like Latin or something.
01:07:12.000 Like I said, my vibe is what I'm hearing you express is that you believe there needs to be an awakening inside people and what to me sounds like remaking man.
01:07:26.000 And every time society or man has tried to reinvent human beings, remake man, it's ended in piles and piles and piles of bodies.
01:07:34.000 I want to jump to this story.
01:07:35.000 We have this tweet from ALX.
01:07:37.000 Breaking. The Biden administration has officially filed a notice of appeal in the Missouri v.
01:07:41.000 Biden censorship case after a federal judge issued a preliminary injunction
01:07:45.000 order barring government officials from contacting social media companies to suppress lawful speech.
01:07:50.000 Simply put, this is the Biden administration's notice of appeal seeking to effectively overturn
01:07:59.000 the right of freedom of speech in this country, the First Amendment.
01:08:02.000 Yes.
01:08:03.000 Why?
01:08:04.000 The judge said you are still allowed to contact all of these companies for issues of national security and just not for suppressing the speech of U.S.
01:08:14.000 citizens.
01:08:15.000 The appeal specifically means the Biden administration wants the right to suppress the speech of American citizens.
01:08:23.000 This is where we're currently at.
01:08:25.000 The New York Times takes the amazing approach that it's bad.
01:08:30.000 We should allow the government to stop people from speaking their minds.
01:08:34.000 That's where we're currently at in this country.
01:08:35.000 So you want to talk about a twisted reality?
01:08:39.000 We have people who... Here's where Democrats are.
01:08:43.000 They quite literally don't care if people are bringing into schools books depicting adult carnal activities and this information to kids.
01:08:52.000 They literally don't care that a teacher was trying to give students instruction, or literally did give students instruction on how to use Grindr, which is an 18 and up app only.
01:09:00.000 They literally do not care of these things.
01:09:03.000 They don't care that kids are getting access to psychotic and obscene graphic pornography.
01:09:10.000 I'm not just talking about, as I mentioned earlier, like two people in a bedroom.
01:09:14.000 The stuff you can find online is insane.
01:09:17.000 They don't care about any of that.
01:09:18.000 What they don't want you to do is criticize Joe Biden.
01:09:22.000 What they don't want you to do is point out the Hunter Biden laptop story.
01:09:25.000 What they don't want you to do is point out that they actually want to censor conservatives.
01:09:30.000 Because if you do those things, they try to remove you.
01:09:32.000 What they don't want you to do is advocate that children not get sex change surgery.
01:09:37.000 They'll ban you for that too.
01:09:39.000 But they don't care about all the depravity.
01:09:42.000 When I look at this stuff, with what Joe Biden is doing, it really does feel like we are looking at...
01:09:49.000 An evil mirror image of what this country and what our society has believed in.
01:09:57.000 It's the antithesis of everything we thought was good and moral and just.
01:10:02.000 We want people to speak their minds.
01:10:04.000 We don't want kids exposed to graphic images.
01:10:06.000 What are the Democrats advocating for?
01:10:08.000 The complete inversion.
01:10:10.000 The suppression of individual rights and speech, and they advocate children see these things.
01:10:15.000 And that's just the beginning.
01:10:17.000 We can go on to all the seven deadly sins too.
01:10:20.000 This is shockingly insane.
01:10:22.000 Well, this has been going on for a long time.
01:10:25.000 This is the most transparently it's been, I guess, reflected in government.
01:10:28.000 But even if we look at Twitter before Elon Musk, the amount of resources they were dedicating to controlling the narrative on the vaccine or election interference, versus the amount of resources they were dedicating to actually, you know, shutting down child exploitation material, it absolutely shows where their priorities are.
01:10:47.000 They don't actually care about people's well-being.
01:10:49.000 It's all about controlling the narrative.
01:10:50.000 So, you know, children being exposed to any type of graphic thing, being exploited online, that doesn't really harm their control, so they don't really do anything about it.
01:10:57.000 But hey, someone questioning the efficacy of the VACs, that actually does threaten their control.
01:11:02.000 So they go all in on devoting resources, manpower, finances, whatever, into controlling that.
01:11:08.000 When you think of the government, do you think of it as a very powerful thing?
01:11:11.000 Or do you see it more as kind of like a visage of power?
01:11:15.000 I think it's a very, very powerful thing.
01:11:17.000 And I think Americans are, they're naive as to how all encompassing the government really is in their day to day lives, infringing on their freedoms.
01:11:28.000 I mean, because yes, America is a very free country, especially compared to somewhere like, you know, Canada or the UK, but to act as if...
01:11:42.000 I agree that we need some form of censorship, kind of tailbacking off what you were saying, Tim, about how kids are seeing graphic images.
01:11:52.000 We need to protect some aspect of our culture, of our species.
01:11:57.000 And that is through just censorship.
01:11:59.000 I just want to ask you one question real quick.
01:12:01.000 Do you conceive of preventing children from seeing pornography as censorship?
01:12:07.000 I suppose it depends on how it's done.
01:12:09.000 That could be one way to prevent them from seeing it.
01:12:12.000 To say, like, you can't be here, it is censorship. Another way, like, hey, let's go out and play tonight
01:12:17.000 instead of being on the computer, that's not really censorship. The way that I, this is only my
01:12:22.000 perception, when I think of censorship, I think of something that is trying, or at least the effort
01:12:28.000 is made, to prevent anyone from seeing it.
01:12:31.000 Kids being prevented from seeing adult content, to me that doesn't come across as censorship because you prevent kids from grabbing the red hot stove because it's protecting them.
01:12:46.000 Censorship doesn't need to be always negative or always positive.
01:12:51.000 Restricting information.
01:12:52.000 I think ratings on movies are a form of censorship, to let you know ahead of time, don't take your kid to see the R-rated one.
01:12:57.000 Well, I mean, like, courts in the US have ruled that you, like, pornography is not free speech.
01:13:04.000 So I guess I would push back on the idea that restricting pornography is censorship, because if it doesn't technically infringe on your free speech, because pornography isn't speech, according to the courts, is that actually censorship?
01:13:16.000 Censorship is the suppression or prohibition of any parts of books, films, news, etc.
01:13:21.000 that are considered obscene, politically unacceptable, or a threat to security.
01:13:25.000 Which is, that's like a half-definition.
01:13:27.000 You can censor and let things happen.
01:13:30.000 Like, I can be like, well, I looked at the video, I'm the guy that owns the website, and I'm gonna let the video on.
01:13:35.000 I still censored it.
01:13:36.000 I just didn't take it down.
01:13:37.000 I censored it and allowed it to be there.
01:13:39.000 Right, so here's ultimately the issue.
01:13:41.000 Censorship, based on that definition, the other simple definition was restricting access to images, information, books, etc.
01:13:48.000 that are offensive.
01:13:50.000 We find it offensive to show kids this lewd and lascivious stuff.
01:13:54.000 We would like that to be censored.
01:13:56.000 The left, Democrats, etc., and liberals want kids to see these things, which I think is psychotic and evil, so they say, we oppose censorship.
01:14:04.000 However, your naughty words where you say something like, I disagree with you and the way you are living your life I think is bad for everyone, they say, that is shocking and offensive and shouldn't be allowed.
01:14:16.000 Let me just put it this way.
01:14:18.000 These people live in this world.
01:14:21.000 For you and I, for the people who watch this show, it would seem to be insanity.
01:14:26.000 Of course we are offended at the thought that children are getting access to lewd behavior.
01:14:31.000 Of course we are shocked that people try to restrict our ability to have a political debate.
01:14:35.000 In the left world, they are absolutely shocked!
01:14:39.000 We would try and stop children from seeing these things.
01:14:42.000 How insane is that? It is insane. I've had friends be like, why do you want censorship?
01:14:47.000 Censorship's bad. Don't do... No, no censorship. And it's like, yeah, snap out of it. Yeah,
01:14:53.000 you want to censor the bad stuff, of course, but then who's deciding what's bad? Is it what he
01:14:57.000 said? Is it what she said? Someone... What they said about my kid? Like, so who's censoring it?
01:15:02.000 That's the reality.
01:15:03.000 Yeah, who's the censor?
01:15:04.000 Should we make it an algorithm?
01:15:05.000 Like an open source artificial intelligence that's giving advice to people?
01:15:09.000 This is why it's a culture war.
01:15:11.000 Because there are two distinct moral frameworks.
01:15:13.000 There is the traditional American, I would say, simply put, Judeo-Christian moral framework.
01:15:18.000 And then there is the leftist, there is no truth but power moral framework.
01:15:24.000 It seems to be... You know, look, it is really simple in some ways.
01:15:29.000 Say a thing, and a leftist will say the exact opposite for the only reason of opposing you.
01:15:35.000 Hence, their positions change rapidly and seem to make no sense.
01:15:39.000 That...
01:15:40.000 That's just the reality of it.
01:15:41.000 Well, there was one of the Krasenstein brothers after that viral clip of nudity, like exposed male genitalia, I think it was Seattle Pride.
01:15:49.000 One of them was actually somewhat defending it.
01:15:51.000 And it's like, I can think of no other reason why a sane person would be defending a male exposing himself to minors if it were not a leftist just contradicting the right.
01:16:03.000 And the Krasensteins, for those unfamiliar, they're prominent liberal personalities who have tried taking this more moderate, centrist approach.
01:16:10.000 And so their approach to this was, hey, it's probably not that big of a deal if a kid sees nudity.
01:16:15.000 It happens a lot.
01:16:17.000 I don't think someone should be riding around nude in front of a bunch of kids.
01:16:20.000 That's not a good thing.
01:16:21.000 But libs of TikTok sharing it is worse than the guy doing it.
01:16:25.000 So it was still this critique of libs of TikTok that did, to a certain degree, downplay the nudity in front of kids.
01:16:31.000 I don't think it's completely wrong for the Krasensteins to say that, because kids go in locker rooms all the time with people who are naked around them.
01:16:38.000 However, at an event that is explicitly sexual in nature with the intent of expressing sexual ideas, then you're crossing the line into grooming territory.
01:16:49.000 But my point with the Krasensteins here, to add on to what you're saying, is they're just saying something critical of the right for the sake of being critical of the right.
01:16:57.000 It's... This is the crazy thing.
01:16:59.000 How hard would it be for any liberal to say something like, I want universal healthcare?
01:17:04.000 And then I go, okay, I have no opinion on that.
01:17:05.000 What do you think about these books?
01:17:06.000 Yeah, those books shouldn't be for kids.
01:17:08.000 How hard is that?
01:17:09.000 They can't do it.
01:17:10.000 They come on the Culture War podcast or they come on this show and they say something like, I think it's good.
01:17:15.000 And I'm like, really, why?
01:17:17.000 You end up with that famous clip.
01:17:18.000 You end up with that famous clip we had with Lance from the Serfs saying that women should be allowed to get an abortion whenever they want, but that pregnant women shouldn't be allowed to do meth because it kills the baby.
01:17:27.000 Like, huh?
01:17:27.000 Right.
01:17:27.000 That was awesome.
01:17:28.000 Because there's no moral framework.
01:17:30.000 I think there are a lot of people that would be considered on the left or, you know, liberal, culturally liberal, that would be open to talking about exactly what you're saying, Tim.
01:17:40.000 Like, yeah, okay, universal healthcare, great.
01:17:42.000 This book probably shouldn't be for kids, great.
01:17:44.000 But it's the loudmouth contrarians that are famous right now.
01:17:47.000 And the tribalists.
01:17:48.000 And it's so annoying.
01:17:49.000 This tribalism is like, it's like binding.
01:17:51.000 It's like holding people back.
01:17:52.000 It's like sticky goo on their feet that they can't, you can get out of it, but you got to realize you're in it.
01:17:57.000 That's the left.
01:17:58.000 It's like all people, too.
01:18:00.000 Everywhere I look, man, in this political crap is the willingness to say no.
01:18:06.000 Not always, sometimes yes and yes.
01:18:08.000 You can both be right, you know?
01:18:09.000 Sorry, Phil, were you about to say something?
01:18:10.000 You looked like you were about to say something.
01:18:12.000 No, just Tim said that, Tim was saying that, talking about Lance, and I think that you gave Lance too much credit, but... What do you mean?
01:18:21.000 Lance has no moral framework?
01:18:22.000 Yeah, you said that he has no moral framework, and that's true, but I don't think that the reason that, you know, Lance said that dumb thing was because of a lack of moral framework.
01:18:31.000 I think it was just because Lance was tired after using all of his brain power to keep up.
01:18:36.000 I've talked to a lot of people about the idea of debates.
01:18:38.000 That was just Lance not knowing that he was going to be walked into a...
01:18:44.000 Well, this is the thing...
01:18:45.000 Lance round two is about to happen.
01:18:47.000 I've talked to a lot of people about the idea of debates.
01:18:50.000 It's the weirdest concept to me.
01:18:53.000 You know, like people have asked me, like, I'm going to be debating this issue and, you
01:18:56.000 know, what do you think about this argument?
01:18:58.000 I'm like, do you believe what you are saying?
01:19:00.000 Like, then what are you asking me for?
01:19:02.000 If someone came here and said, did you know X, Y, or Z?
01:19:05.000 I'd be like, I didn't.
01:19:07.000 I'll look into it.
01:19:08.000 Here's what I believe.
01:19:09.000 Like, I don't, it's this idea that we're debating people just to appear like we know what we're talking about makes no sense to me.
01:19:17.000 So I will have a conversation with quite literally anybody and there is no winning or losing a debate.
01:19:23.000 I think it's the difference between debate and dialectic, like are we talking to prove the other wrong or are we talking because we are both interested in pursuing the truth and we maybe have different versions of what it is and we're trying to kind of spar back and forth to see how we can best get there.
01:19:39.000 Those are two very different I guess, approaches to have.
01:19:42.000 And that's, I'm not, I'm not that interested in like strictly debate because there are
01:19:46.000 formal debates like point, counterpoint, point, counterpoint.
01:19:49.000 I don't think that's productive.
01:19:50.000 But when you actually have people who are having a conversation and they could have
01:19:54.000 opposing views, but they're trying to understand each other and arrive at the truth, that I
01:19:58.000 think is actually something interesting.
01:20:00.000 So this is what happens.
01:20:01.000 Lance from the Serfs comes on the show with the intent of winning a debate.
01:20:05.000 We have him on the show with the intent of him explaining how he sees the world.
01:20:08.000 Yeah.
01:20:09.000 The problem is he doesn't actually know how he sees the world.
01:20:11.000 He just knows how to regurgitate talking points for the sake of debate.
01:20:14.000 Hence, when it comes to the question of abortion, I can ask him 50 million questions on what his logic is, and he can't answer it because he doesn't have any.
01:20:21.000 He says arguments to what you might say.
01:20:23.000 He has pre-scripted arguments.
01:20:27.000 He had it written down.
01:20:28.000 I don't write anything down.
01:20:29.000 People are like, Tim, you did a really great job on Rogan with Dorsey.
01:20:33.000 And I'm like, I prepared in no way.
01:20:34.000 I researched nothing.
01:20:36.000 I showed up and talked about what I thought.
01:20:39.000 You were talking about him coming and not being prepared and about how his opinion changes and stuff.
01:20:49.000 The left doesn't have a foundational opinion on stuff because of the fact that it's power games.
01:20:54.000 There's no moral framework.
01:20:55.000 Yeah.
01:20:55.000 For example, when that woman came on the culture war, I'm not gonna say her name,
01:20:59.000 and because I'm kind of tired of it, but she said, this show appeals to neo-Nazis, and then ten minutes later,
01:21:05.000 or half an hour later, told an actual justice warrior that he was too right-wing
01:21:10.000 for this show because the show was moderate.
01:21:12.000 Quite literally saying random words for the sake of saying random words.
01:21:15.000 Well, no, it's, they're not random.
01:21:17.000 They're all designed to undermine.
01:21:19.000 So, I mean, it's in its own way a consistent moral framework if it's counter to the right in its pursuit of, like, their own power.
01:21:27.000 I mean, James Lindsay talks about that a lot.
01:21:28.000 It's not a double standard.
01:21:29.000 It's one consistent standard.
01:21:31.000 If it helps me, it's helped my cause.
01:21:33.000 If it's for the pursuit of power, then it is good.
01:21:35.000 This is why Lance's abortion thing broke down in the way it did.
01:21:39.000 Because he doesn't have a real position on abortion.
01:21:42.000 He has talking points.
01:21:44.000 And so, I, with an actual position on limits of government, individual rights, constitutional rights, when does life begin, and have all these views where I've had many discussions about it trying to understand it, you can ask me a million and one questions and I will give you an answer or outright just be like, you know what, I'm not so sure about that, I just don't know.
01:22:03.000 And I bet you're willing to change your opinion if you find out something that might go contrary to your existing moral framework.
01:22:08.000 I can guarantee that Tim is willing to change his opinion, although I've noticed it doesn't always happen right away.
01:22:13.000 Sometimes we'll argue and I'll say something where I'm like, damn, I was right.
01:22:17.000 I was good.
01:22:17.000 I was true.
01:22:18.000 And then like, it'll be like three days later, you'll be like, yeah, you'll see what I was talking about.
01:22:22.000 Doesn't always happen right away.
01:22:23.000 That's an interesting phenomenon.
01:22:24.000 One good example is... Sometimes the smarter you are, the longer it takes to figure things out.
01:22:27.000 Well, that means you're actually thinking about it.
01:22:28.000 Because you've memorized what you believe.
01:22:29.000 Sorry to interrupt you.
01:22:30.000 Seamus and I were having a conversation about abortion.
01:22:32.000 He expressed the definition of abortion.
01:22:35.000 I argued with him and told him I thought he was wrong, and this is not the case, and he needs to understand these points.
01:22:39.000 And then, after pulling up the information, I was like, oh, Seamus was right the whole time.
01:22:43.000 The definition of what abortion is, I was incorrect.
01:22:46.000 I had a different view that was not based... So Seamus was basing his views off of the actual statements made by Planned Parenthood and the government as to what abortion is, and I had a general assumption about what people call it, so I was not understanding his point.
01:22:59.000 There's kind of a phenomenon where when you figure out you're wrong about something, and you figure out what it actually is that's right there, that little piece now you have, it's like exciting and invigorating because you get to reform your belief in a better way, in a more correct way.
01:23:13.000 Yeah.
01:23:14.000 Humiliating, especially when it happens in public, and you're shown to have been wrong for so long and been so vehement about what you were wrong about.
01:23:20.000 But man, is it empowering if you use it properly.
01:23:23.000 So this is what you see, and we'll give a shout-out to our friends over at the Young Turks, because we were talking about this the other day, and Ana Kasparian has been getting dragged quite a bit because she said, bonus hole, as a term for women's parts, is off-putting.
01:23:40.000 But there is something interesting in this.
01:23:42.000 Have you ever asked yourself why it is that these neo terms only ever apply to women and never to men?
01:23:51.000 Because I think it's ultimately intersectionality as it is now, especially with the trans movement.
01:23:55.000 It is unironically patriarchal, right?
01:23:58.000 The best women are men.
01:23:59.000 Everything is done to the comfort of males.
01:24:02.000 Regardless of what they call themselves, they're ultimately males, and you do not see the same considerations being given to trans men, i.e.
01:24:09.000 females.
01:24:11.000 I agree that it's patriarchal, but the reason I think that it is... Bonus hole, menstruator, birthing person.
01:24:18.000 Chest feeding.
01:24:19.000 Chest feeding.
01:24:19.000 Have they called men jizzers?
01:24:22.000 No, but they should.
01:24:23.000 They don't say that.
01:24:24.000 Why?
01:24:24.000 I believe this is predominantly a movement of women.
01:24:27.000 It is females who are pursuing these things with a female internal perspective, so the words they use are based on being female.
01:24:35.000 Men typically do not care.
01:24:35.000 Their experiences.
01:24:38.000 So men aren't complaining about being called names.
01:24:40.000 They're not proposing being called something else.
01:24:42.000 It is females typically, and we see this in the data when it comes to trans kids.
01:24:46.000 Overwhelmingly, trans children tend to be female.
01:24:49.000 Perhaps it is a social issue of sorts.
01:24:52.000 And that's why all of the neo terms are like menstruator, bonus hole haver.
01:24:59.000 It only pertains to women.
01:25:00.000 I don't think that the phenomena of trans women and trans men are the same.
01:25:09.000 I don't think the same stuff is going on upstairs for a woman to want to live their life as a man or believe that they're a man or whatever, as for a man that thinks that they're a woman.
01:25:09.000 No, they're not.
01:25:23.000 There are vastly different rates in how men versus women identify as being gay, like lesbian for example, and also rates of identifying as trans.
01:25:32.000 This is not something new.
01:25:33.000 Women and men do not function in the same realm in regard to sexuality.
01:25:37.000 There are really interesting twin studies that they've done on identical twins.
01:25:42.000 So if something like sexuality were completely genetic, for example, you would expect that any genetic twin set would have the same sexuality, i.e.
01:25:49.000 both straight or both gay.
01:25:51.000 What they found is that that's not the case.
01:25:52.000 So there is obviously a component that's determined by environment and what they have found that it is much more likely for a female set of twins to have different sexualities, meaning that female sexuality is a lot more fluid or easily influenced maybe by environment than male sexuality.
01:26:09.000 That's kind of profound, because it's a simple concept that men and women that become trans are experiencing different—to the point where they used to have gay—well, they still do—gay and lesbian.
01:26:20.000 It's the same thing.
01:26:21.000 Someone that enjoys sexuality with the same sex— Homosexual.
01:26:24.000 But they have different words.
01:26:24.000 Yeah.
01:26:26.000 Completely different words.
01:26:27.000 It's a gay man—I guess you would say a gay man or a gay woman—but a lesbian—you don't have lesbian men.
01:26:32.000 But you do have gay women.
01:26:33.000 But you do have gay women.
01:26:34.000 Yeah, that's true.
01:26:35.000 That's very interesting.
01:26:38.000 Men care less, women care more.
01:26:41.000 Like, across the board on everything, I feel like if... Yeah, guys are more likely to be like... Well, women are way more social creatures.
01:26:51.000 I don't know if you saw there was this viral TikTok video that was going on.
01:26:54.000 It was a trans man, so a biological female, who essentially was crying on TikTok saying life was so much harder living as a man than she had thought it would be because it's harder to make friends, people aren't as considerate, And essentially, like, there is no more feminine or womanly thing you can do than be crying about these things, but I think it just illustrated where perhaps this isn't someone who is secretly trapped inside a male body, because these are very feminine urges that I, you know, not to say that men can't have feelings or they don't, but overall, like, the female experience is just way more concerned with social aspects, being involved in a group and all of that.
01:27:29.000 Like isolation tends to be harder on women?
01:27:32.000 Oh yeah.
01:27:32.000 For sure.
01:27:33.000 For sure.
01:27:34.000 Way harder, yeah.
01:27:36.000 Men can function alone significantly better than women can.
01:27:41.000 Not that every man doesn't need social interaction.
01:27:47.000 Hashtag not all, averages, etc.
01:27:48.000 Yeah, but you know, it's much easier for men to deal with being alone.
01:27:53.000 I guess.
01:27:54.000 And you'll find that, like, when you deal with, like, hermits and dudes that kind
01:27:58.000 of, you know, people that are off on their own and stuff, it's very, very frequently men.
01:28:03.000 It's very rarely women.
01:28:04.000 And the women give birth, they're supposed to be nurturing, well, not supposed to, but they tend to be nurturing the baby.
01:28:08.000 And immediately they got that bond, physical, two people, and the man is off finding the food somewhere or something.
01:28:14.000 And I think what's really interesting is that when we look at the prevalence of autism diagnoses, I'm not even going to touch on how vaccines may or may not relate to this.
01:28:23.000 I'm not trying to get in trouble.
01:28:24.000 But we see that, I think, a lot of behaviors that may have just traditionally been more male behaviors, because females overwhelmingly dominate psychology, I think they are essentially trying to pathologize what a lot of the time is just regular male behavior.
01:28:40.000 Yes.
01:28:41.000 Right?
01:28:41.000 Like not every man is autistic.
01:28:43.000 It's just a man.
01:28:45.000 Sometimes they're just like that.
01:28:46.000 And I think as women, we want to say that, no, you're wrong for thinking this.
01:28:49.000 It's like, no, they're just different than we are.
01:28:51.000 I think if we really were in a patriarchy, human civilization would be a lot more like lion civilization.
01:28:57.000 You know, where the males sleep all day, wake up, the women bring them food and have the babies, and the men do very little.
01:29:02.000 But that's not how it was.
01:29:04.000 It was always more so men running into burning buildings, working in sewers and petroleum rigs, holding up half the world along with women.
01:29:12.000 I think if we were actually a patriarchy, we'd be a whole lot closer to fascism than to liberalism.
01:29:19.000 I think we are in a fascist system.
01:29:21.000 It's so peaceful.
01:29:22.000 We are very, very much not.
01:29:23.000 The government-corporate collusion, like, that's all around us.
01:29:25.000 I don't know, like, have you ever been in some of these, like, mom book clubs?
01:29:28.000 They're very fashy.
01:29:29.000 Bro.
01:29:30.000 Let me tell you about fascism, okay?
01:29:30.000 I don't know.
01:29:33.000 And this is not even an example of fascism, but I can tell you how far away we are from it.
01:29:37.000 I was in Thailand, and I was hanging out at a restaurant with some individuals, and someone was explaining a story about how some individuals had said, F the King, and then as soon as someone said it, everyone went silent, because you are literally not allowed to utter that sentence in any way for any reason, even if it is to criticize people for having said it.
01:30:01.000 Weird.
01:30:02.000 It's called lèse-majesté.
01:30:04.000 If you're in Thailand, as a tourist.
01:30:07.000 And you say something like, eh, the king is stupid, you can be arrested and charged for it.
01:30:13.000 If you know off the top of your head, can they have queens, Thailand, or is it male, patriarchal only?
01:30:16.000 I don't know.
01:30:17.000 I will say this.
01:30:18.000 I am no expert on Thailand or anything.
01:30:20.000 All I can tell you is I was there, and there was an issue of people having criticized the royal family and got arrested and charged for it.
01:30:27.000 And one of the journalists who was down there was explaining the story to me, and having repeated those words, panicked and looked around to make sure nobody saw them do it because it was a crime.
01:30:35.000 Well, they take it very seriously.
01:30:37.000 Growing up in Asia, I went to Thailand.
01:30:39.000 If you go to a movie theater, at least back then when I was younger, they will have a little minute before the movie plays where you have to stand and pay homage to the king.
01:30:47.000 Wow.
01:30:48.000 So they take it very seriously.
01:30:49.000 To be fair, King Bhumibol was, like, based and everyone really liked him.
01:30:52.000 Oh, yeah.
01:30:52.000 He did, like, cool stuff helping the poor and boosting literacy.
01:30:55.000 Queen Sirikit.
01:30:56.000 Can't speak for his son, which people have been very critical of, but I'm not a Thailand expert or anything like that.
01:31:01.000 All I know is that when I was there, there were people... I would ask, like, someone from Thailand who was there working with us, and they'd be like, yeah, the laws are really dumb, but King really is cool.
01:31:11.000 Like, he's not like a dictator or anything like that, but the law does kind of put pressure on you in that way.
01:31:16.000 But my point is simply this.
01:31:17.000 Thailand's not even a fascist country.
01:31:18.000 You can fly in, you can do a lot of things.
01:31:20.000 But the problem with fascism... Right, exactly.
01:31:20.000 It's a monarchy.
01:31:22.000 But that's authoritarianism.
01:31:24.000 You're not even allowed to explain that someone else criticized the king.
01:31:27.000 We're in, like, a libertarian fascism system.
01:31:31.000 It's not authoritarian, but it is fascist in that the government and the corporations have become one, almost.
01:31:37.000 And it's peaceful, that's our purpose.
01:31:38.000 Private-public partnership to control behavior.
01:31:41.000 Even in fascist countries, you still had markets.
01:31:46.000 It wasn't totally just the government taking over.
01:31:50.000 So, to call what we have now fascist, it's not fascist in most of the ways that people actually understand fascism, conceptualize fascism.
01:32:04.000 I get that you're saying that there's, you know, there's collusion between government and corporations.
01:32:09.000 I understand that.
01:32:10.000 But, like, the jingoism that comes along with fascism, the militarism, we're not nearly as militaristic as people like... Hyper-traditionalism?
01:32:19.000 We are nowhere near.
01:32:20.000 Yeah, not like that.
01:32:23.000 We are far closer to a communist country than we are to a fascist country.
01:32:28.000 Really?
01:32:28.000 But yeah, absolutely.
01:32:29.000 I don't know, man.
01:32:30.000 Because fascism and communism aren't super different.
01:32:34.000 People love to talk about their opposites.
01:32:37.000 They're opposite sides of the same coin.
01:32:40.000 The opposite is liberalism.
01:32:41.000 Fascism and socialism are very similar.
01:32:44.000 Ian, in a communist state, the corporations, the industry is in collusion.
01:32:48.000 Yeah, you can say communism's inherently fascist.
01:32:51.000 Yes.
01:32:52.000 But they're not.
01:32:54.000 So fascism is traditionalist and communism is progressive.
01:32:59.000 One of the big fights in Europe, a large component of it, was the hard-right factions, Nazis, fascists, etc., were much more traditionalist, much more, you know, women raising the family, that kind of stuff, and the communists were, erase all culture and start over.
01:33:13.000 You know, I bet there's something going on right here with the word fascist, because Mussolini had, the fascista was his political party, they were the fascists, so there's the big F, fascist political party, then there's just the concept of fascism.
01:33:24.000 Well, the word fascist basically just means bad guy today.
01:33:27.000 Which is unfortunate because it specifically means corporate government collusion.
01:33:30.000 I mean, if you want to find out about fascism, read something from Giovanni Gentile.
01:33:36.000 He's the guy that came up with the idea of fascism.
01:33:39.000 And you'll understand that it's not just...
01:33:44.000 There's a lot of nationalism, there's oftentimes racism included, but not always.
01:33:53.000 When you say fascist, people think Nazis, and Nazis were just the worst manifestation of fascism.
01:34:02.000 Not all fascists are Nazis, but all Nazis are fascists.
01:34:06.000 It's like Mustangs and Fords.
01:34:08.000 Not all Fords are Mustangs, but all Mustangs are Fords.
01:34:10.000 One thing.
01:34:11.000 I gave you a little post-it right now.
01:34:12.000 I think you should look up What If Alt History, because he just did a video explaining liberalism, communism, and fascism.
01:34:18.000 I think you'd learn a tremendous amount from this video.
01:34:20.000 He's a great dude.
01:34:21.000 His name's Red Yard.
01:34:22.000 I think you'll learn.
01:34:22.000 Check it out.
01:34:23.000 What If Alt History.
01:34:25.000 Thank you, sir.
01:34:25.000 What If Alt History.
01:34:26.000 Alright, we are going to go to Super Chat, so if you haven't already, would you kindly smash that like button, subscribe to this channel, share this show with your friends.
01:34:31.000 Become a member at TimCast.com, because the members-only show is coming up, and this one's going to be spicy and not family-friendly, as they often are.
01:34:38.000 And this will pertain to issues of identity and law and government and medications and stuff like that, so you'll definitely want to check this one out.
01:34:47.000 But we'll read some Super Chats.
01:34:49.000 We have Koldilocks Productions says, Tim, would you be willing to invite Tiki History onto the Culture War to talk about modern wokeism as compared to Nazi Germany and the difference between fascism and National Socialism?
01:35:00.000 The Culture War is more so about having a debate or dialectic on particular issues.
01:35:06.000 I think I'll just let you guys know tomorrow we have Destiny and John Doyle and we'll be talking about masculinity and just very briefly I asked Lauren if she wanted to join as well because I am such an embodiment of masculinity.
01:35:21.000 You're a woman.
01:35:22.000 As a woman, having that perspective, you know, as like, you know, two guys will be discussing masculinity and family and stuff, I think having a female perspective would make it a more robust conversation, but I don't want to spring that on Destiny and John Doyle, so, you know, because they were told, like, hey, here's what we're going to do.
01:35:22.000 Oh, okay.
01:35:36.000 It's going to be you guys.
01:35:37.000 If they're cool with it, it'll be awesome to have you, so that's tomorrow at 10 a.m.
01:35:40.000 at youtube.com slash Timcast.
01:35:43.000 As for the Tiki history... Tiki history, it's just T-I-K.
01:35:45.000 It's TIK history.
01:35:47.000 He's also, Ian, you should look up TIK history.
01:35:52.000 I just watched a video from the guy that you mentioned.
01:35:55.000 I didn't know.
01:35:55.000 Oh, really?
01:35:56.000 Yeah, but I just watched it like the other day.
01:35:58.000 As for this guy, coming on the Culture War would be to discuss conflicting ideas on an issue.
01:36:03.000 Yeah, I don't think he'd be on that show.
01:36:05.000 Probably doesn't make sense, though.
01:36:06.000 But this show, IRL, you think he'd fit on here?
01:36:08.000 The challenge with IRL, and the reason we separated the shows, is because this is topical news, which sometimes goes into cultural issues that we care about.
01:36:15.000 And so the issue with The Culture War was, if we have someone like Lance or Matt Binder or, you know, Emma or whoever on the show, it turns into a two-hour long, or like when Vosh came on, it was like a five-hour debate.
01:36:26.000 Like a five-hour clashing on all these ideas.
01:36:28.000 And I'm just like, that's very different from what IRL does, where it's like, we'll pull up a story and then we'll talk about it, and we'll pull up a story and then we'll talk about it.
01:36:34.000 We just did go into like a half an hour conversation on a bunch of different issues, which is very cultural, but it's still, if we're going to intentionally invoke those discussions and debates, we should create a format specifically for it.
01:36:46.000 Agree.
01:36:47.000 So really, really excited for tomorrow morning to talk about these issues with Destiny and John Doyle, and then assuming there's no issues, Lauren joining as well would be absolutely fantastic, so that would be super cool and fun.
01:36:59.000 But we'll read some more Super Chats.
01:37:01.000 Grofty says, lightly, gently push the like button.
01:37:03.000 No damage.
01:37:04.000 That would be great.
01:37:05.000 Thank you very much.
01:37:07.000 What do we got?
01:37:09.000 Raymond G. Stanley Jr.
01:37:10.000 says, Tim, you asked last night about giving up on Charlestown.
01:37:13.000 No way, Jose.
01:37:14.000 Once you get up and running down there, folks will move in.
01:37:16.000 Youse can def find someone for the city council.
01:37:19.000 Town takeover.
01:37:20.000 So it's been a challenge.
01:37:23.000 The first building we bought is extremely difficult to deal with.
01:37:28.000 And we're trying to get it sorted.
01:37:30.000 It's just taking forever.
01:37:31.000 So a building popped up for sale in Charlestown, West Virginia.
01:37:35.000 And we considered buying that so that we have two locations now so we can try and speed things up.
01:37:43.000 But I don't know, man.
01:37:44.000 It's tough.
01:37:44.000 It's tough.
01:37:45.000 I think do it.
01:37:45.000 Go for it, yeah.
01:37:46.000 Having two locations is key.
01:37:47.000 So I think Raymond is correct.
01:37:49.000 I think a lot of people are correct.
01:37:50.000 Everyone said, no, you have to do it because we have to take territory.
01:37:53.000 We can't let the woke win.
01:37:53.000 We have to push back.
01:37:54.000 Basically what happened was Charlestown passed a pride resolution supporting Pride Month or something.
01:37:59.000 And I'm like, should we invest in a town doing this?
01:38:01.000 Everybody said yes, because then you'll create the inverse pressure to push back on it.
01:38:05.000 Agreed.
01:38:05.000 Now the issue is whether we can actually buy the building, whether the building is good to buy, right?
01:38:10.000 Now it's like inspections and all that stuff.
01:38:11.000 So it seems like we are interested now and I agree.
01:38:15.000 Thank you for your input, everyone.
01:38:17.000 I think you are correct in that.
01:38:19.000 Now we actually have to go through the motions of whether or not we can actually buy the building outside of the ideological reasons.
01:38:25.000 Raymond G. Stanley Jr.
01:38:26.000 Batting a thousand, bro.
01:38:28.000 I like Charlestown, too.
01:38:29.000 It's great.
01:38:29.000 Jefferson County tends to be pretty good.
01:38:32.000 So, you know, we'll see.
01:38:33.000 Or maybe we'll just do Harper's Ferry.
01:38:34.000 It's right next to Harper's Ferry.
01:38:35.000 Oh, man, it's one of my favorite cities, dude.
01:38:37.000 Harper's Ferry is amazing.
01:38:39.000 It's also Charlestown, not Charleston.
01:38:42.000 Yeah, Charleston's like five hours away and it's near Kentucky.
01:38:44.000 Yeah, we're gonna miss it.
01:38:45.000 Very different.
01:38:47.000 Let's see what we got.
01:38:50.000 Robert Knight says, call this White House appeal what it is, attempting to legalize fascism.
01:38:56.000 Fascism being the partnership of public and private entities to further ideals.
01:39:00.000 That was a component of it, but fascism, if you're looking at the bigger picture, was heavily traditionalist and built on military structures.
01:39:09.000 I'm not going to pretend to be an expert on fascism, but it's not just that.
01:39:14.000 Yeah, that was a component that Mussolini had pointed out, but communism does the same thing, too.
01:39:19.000 Communists go in and take over industry and then have government and industry collude and everything.
01:39:24.000 Effectively, just take it over, but you get the point.
01:39:28.000 Here we go!
01:39:28.000 M says, bring Joshua Noel Moon on the show, he's interested.
01:39:31.000 Who is that?
01:39:32.000 I don't know who that is.
01:39:33.000 You know, we'll take a look.
01:39:36.000 Kevin Brady says, you cannot delete your account, referring to threads, without nuking your Instagram.
01:39:40.000 The data it farms is egregious as well.
01:39:42.000 It's a trap.
01:39:43.000 Thanks, LifeLog.
01:39:45.000 Well, all right.
01:39:46.000 Joshua "Null" Moon is the current owner of Kiwi Farms.
01:39:49.000 I've heard massive drama about that thing.
01:39:51.000 Yeah.
01:39:51.000 No, I haven't been following.
01:39:52.000 That sounds like a culture war show.
01:39:54.000 Yeah.
01:39:55.000 Bringing on people to have big direct discussions on a bunch of issues as opposed to topical news.
01:40:01.000 All right, we'll read some more.
01:40:03.000 Toaster Strudel says, Tim, you got to have on two people, Pastor Mark Driscoll and Sean Feucht, is that how you say it?
01:40:09.000 With Let Us Worship and Hold the Line, both are anti-woke and actively pushing back.
01:40:13.000 Check them out.
01:40:14.000 We definitely got to do our culture war on religion with like Ian and Seamus and other people.
01:40:17.000 That's a good sign.
01:40:18.000 There's a lot of potential guests.
01:40:19.000 Maybe we need more shows.
01:40:23.000 Maybe we do a culture war twice a week.
01:40:25.000 That's a good idea.
01:40:25.000 Yeah.
01:40:27.000 There's also, I mean, if you're taking suggestions for Culture War, there's a very interesting thing happening, I guess, among right-wing Christianity, where some are like, hey, are Muslims based allies, or are Muslims enemies?
01:40:39.000 So I've been kind of interested in that conversation happening.
01:40:44.000 Maajid Nawaz, I like a lot.
01:40:46.000 He's great.
01:40:47.000 There was that famous video of Muslims protesting LGBT curriculums in schools and leftist activists coming and protesting back but saying, we're doing it for you, we're doing it for you.
01:40:59.000 And then there was that video recently where the Muslim kids were stomping on the pride flags.
01:41:02.000 And there was also that leaked audio of the Canadian teacher telling this Muslim kid who was not participating in Pride, I guess, to the fullest extent, basically saying, get out of here, go back to your country, which is not very liberal.
01:41:13.000 But, yeah, I'm very critical of mass migration, but I also studied Middle East studies, and I feel like, I mean, to, like, criticize my own side, a lot of, like, evangelical Christians, especially, don't really understand a lot of Islam, so they make a lot of just false statements about Muslims, so.
01:41:28.000 Drail says Ian Crossland is the GOAT.
01:41:31.000 What up, Drail?
01:41:32.000 And then he puts a goat emoji and a 100.
01:41:33.000 Thanks, dude.
01:41:34.000 Well, everyone gets to serve that role at least once.
01:41:37.000 Kyle Miller says, do you think Threads is going to be installed by default on our phones like Facebook?
01:41:42.000 Yes.
01:41:43.000 Isn't it like you can't uninstall, you can't delete Facebook?
01:41:46.000 It just deactivates?
01:41:47.000 Well, you can't like remove, it's going to be on your phone, which is annoying because I'm always running out of memory and I don't go on Facebook on my phone, but I can't get rid of it.
01:41:55.000 Well, if you have Android, you can flash a new, you know, operating system or something.
01:42:00.000 That's not something I'm good at.
01:42:04.000 There's a site where you click go and it gives you a clean operating system.
01:42:07.000 Yeah.
01:42:08.000 Brandon, I don't know, setting up a new phone is so awful.
01:42:11.000 But my husband has been on this quest to find a phone that does not rely on Google.
01:42:16.000 He used to have the Volla phone, which I don't think- Or Linux.
01:42:20.000 Isn't there the Ubuntu phone or something?
01:42:22.000 He's had problems.
01:42:22.000 I don't know.
01:42:23.000 He found a good one that only worked in Canada, and then we moved.
01:42:25.000 He actually had to buy yet another phone because it didn't work with American carriers for some reason.
01:42:29.000 Did you ever look into the Freedom phone?
01:42:31.000 Yeah, so we actually, they sent us that to try, but mine didn't really work, so I never endorsed it.
01:42:36.000 Interesting.
01:42:37.000 We weren't able to purchase them.
01:42:37.000 Yeah.
01:42:39.000 They were like out, and then we just kind of petered off, and my girlfriend has one that she has yet to set up as well.
01:42:44.000 We wanted to test them, but we didn't want to accept them from the company because they, you know, it's a Potemkin phone or something, but then we tried to order and we couldn't get it, so we never did.
01:42:53.000 Alright, Jason Hutchinson says, what if Lee Harvey was hired by the government to be there under the guise of being security and then they just used him as an easy fall guy?
01:43:00.000 He said he was a patsy, yeah.
01:43:01.000 Have you seen the movie Shooter?
01:43:04.000 You guys seen Shooter with Mark Wahlberg?
01:43:05.000 Yeah.
01:43:07.000 He's a sniper, retired.
01:43:08.000 They go to him and they say, we need your help.
01:43:10.000 Someone's planning an assassination and we need your expertise.
01:43:13.000 So they bring him there and say, tell us where you think it's going to happen.
01:43:16.000 And they bring him up in a building and he goes, there, from over there.
01:43:18.000 Then he turns around and there's some fat cop.
01:43:20.000 And it's like, what?
01:43:20.000 And the cop shoots him.
01:43:22.000 He's the fall guy they tried claiming was some crazy reckless living in the woods.
01:43:25.000 And that movie's pretty good.
01:43:26.000 I like it.
01:43:28.000 It was like, I think it was kind of obvious what they were going for with that movie.
01:43:31.000 I think Oswald loaned his rifle out to somebody like before all that happened.
01:43:36.000 I could be, this is something I've heard.
01:43:38.000 I defer to the experts, which is, what's his name?
01:43:41.000 The director who did Platoon.
01:43:43.000 You know what I'm talking about?
01:43:45.000 I don't remember who you're talking about.
01:43:47.000 He's like the greatest director of all time.
01:43:49.000 He was in Vietnam.
01:43:51.000 He was on Rogan.
01:43:51.000 Alright, Super Chats.
01:43:53.000 I'll come up with it.
01:43:54.000 Alright.
01:43:55.000 Brown Bear says, how long until they try and claim the cocaine they found in the White House is Trump's?
01:43:59.000 Weren't there already people posting it was Trump Jr.'s?
01:44:02.000 Of course.
01:44:03.000 They were like, oh, it was left over and they found it?
01:44:05.000 Yeah, several years later, police confirmed.
01:44:06.000 Corrector is Oliver Stone.
01:44:08.000 Oliver Stone, yeah.
01:44:08.000 Expert on JFK.
01:44:11.000 We'll grab some more Super Chats.
01:44:14.000 What is this one?
01:44:15.000 Tyler Kerbyson says, wouldn't it be funny if this was the handlers of Biden?
01:44:20.000 Can say the cocaine was self-doping and not propping up the stooge sleepy Joe.
01:44:23.000 Clearly they have been given the go-ahead to trash Biden.
01:44:26.000 Yeah.
01:44:27.000 I think they're just trying to find a way to get Biden out and they're going to give him like a moderate fall, nothing too crazy, but it's going to be like, oh, I can't do this anymore.
01:44:36.000 Or something bad happens to Hunter and he's like, oh, you know, family first.
01:44:40.000 You mean like a literal fall?
01:44:42.000 No, I mean like a political fall.
01:44:43.000 Oh, yeah.
01:44:44.000 No, not a literal one.
01:44:45.000 He's had many of those.
01:44:45.000 Did you guys see Kamala is now the lowest-rated VP in American history? And Dick Cheney shot a guy.
01:44:51.000 That's pretty bad.
01:44:54.000 That's based.
01:44:55.000 I forgot about that.
01:44:56.000 Geez.
01:44:57.000 Was that like from a poll?
01:44:59.000 Yeah.
01:44:59.000 Yeah.
01:44:59.000 How many, do you know how many people were polled?
01:45:01.000 No, I'm not sure.
01:45:03.000 But I mean, anecdotally, I don't have problems believing it, but I'm not sure, like, what the margins were or anything like that, or the sample size.
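For what it's worth, the sample-size question Lauren raises has a standard back-of-the-envelope answer: for a simple random sample, the 95% margin of error on a reported proportion is roughly z·sqrt(p(1−p)/n). A minimal sketch in Python (the 40% rating and 1,000-respondent sample are illustrative numbers, not figures from the actual poll):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative numbers, not the actual poll's: a 40% rating from 1,000 respondents
moe = margin_of_error(0.40, 1000)
print(f"plus or minus {moe * 100:.1f} points")  # roughly 3 points either way
```

The takeaway is that a typical ~1,000-person national poll carries about a 3-point margin, so a "lowest-rated VP" claim only holds up if the gap to the next-lowest VP is bigger than that.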
01:45:10.000 Yes.
01:45:11.000 So let me break this down, because we've talked about this.
01:45:18.000 We're heading towards a future where you'll be able to open up an app and type in Make a movie about a man with the power of ice who saves the world from an evil dragon that is trying to blow up the earth.
01:45:32.000 It will then render that thing for you, however, it will not be rendered perfectly.
01:45:37.000 The story could be slightly boring, so what you do is... You can do this now.
01:45:41.000 You can go to the OpenAI Playground and say, write me a story about a hippie named Ian who discovers a graphene alloy which saves the world.
01:45:53.000 It will then write a story.
01:45:55.000 You'll read the first few sentences and go, that's pretty good.
01:45:57.000 Uh-oh, this sentence is stupid.
01:45:59.000 You'll erase half of it and then correct.
01:46:02.000 It might say, once Ian discovered the graphene, he sold it for $1,000,000.
01:46:07.000 You stop, erase $1,000,000 and put $1,000,000,000.
01:46:11.000 Then click go and it will start writing again with your prompt.
01:46:14.000 My point is, you could have it write stories for you where periodically you have to correct the path it's on to make it work properly.
01:46:22.000 I suppose in the future, AI will just make really great stories because it'll get better and better at it.
01:46:26.000 But I kind of do feel like you will always need some human element to make it more acceptable to humans.
01:46:32.000 So it's not just about telling it to write a book.
01:46:34.000 It's about going in and creating the paths that the writing goes down.
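The loop Tim is describing — generate, read, cut the bad continuation, correct it, let it keep writing — can be sketched as code. This is a minimal sketch with a canned stand-in `generate()` in place of a real completion API call (the graphene story and the dollar-amount correction are the examples from above; no real model is involved):

```python
def generate(prompt):
    """Stand-in for a real completion call; returns canned continuations
    so the loop below runs without any model or API key."""
    canned = {
        "Once Ian discovered the graphene,":
            " he sold it for $1,000,000.",
        "Once Ian discovered the graphene, he sold it for $1,000,000,000.":
            " It saved the world.",
    }
    return canned.get(prompt, "")

def write_with_corrections(prompt, corrections, steps=2):
    """Alternate machine generation with human edits: after each generation
    step, if the human supplied a corrected version of the draft so far,
    the corrected draft becomes the new prompt for the next step."""
    draft = prompt
    for _ in range(steps):
        draft += generate(draft)
        draft = corrections.get(draft, draft)  # apply human correction, if any
    return draft

# The human erases $1,000,000, puts $1,000,000,000, then lets it keep writing.
edits = {
    "Once Ian discovered the graphene, he sold it for $1,000,000.":
        "Once Ian discovered the graphene, he sold it for $1,000,000,000.",
}
story = write_with_corrections("Once Ian discovered the graphene,", edits)
```

A real version would swap `generate` for an actual API call; the point is the control flow — the human periodically replaces the draft, and the machine continues from the corrected text rather than its own.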
01:46:38.000 In fact, that would be the human censoring the AI.
01:46:41.000 And you do need a censor.
01:46:42.000 Well, if it's not offensive, we'll let it happen.
01:46:45.000 Our question was, does AI produce art, not does that make the person the artist?
01:46:49.000 So that's a different thing.
01:46:51.000 It does.
01:46:52.000 Yeah, I think it still does.
01:46:53.000 But even then, there are people who think that art is the process, and there are people who think that art is the result.
01:46:58.000 I'm someone who thinks more that art is the result.
01:47:02.000 I'm comfortable saying that, you know, AI creates this beautiful masterpiece, that's art.
01:47:07.000 Some nouveau, avant-garde, shoe-on-the-floor modern art, that's not art.
01:47:15.000 I'm comfortable saying that, but I think some people who are saying, like, no, it's all about the human experience.
01:47:19.000 Then you lead yourself to a position where an amazing masterpiece by AI is not art, but the shoe-on-the-floor is art.
01:47:25.000 GrizzLab says, I make AI-generated realistic images of women and sell not-safe-for-work content online.
01:47:31.000 Now I'm making over $1,000 per month in subscription for men who simp for AI images of realistic women.
01:47:36.000 That is incredible if that's true.
01:47:38.000 The AI stuff is just so terrifying.
01:47:40.000 There are now AI girlfriends, like AI girlfriend chatbots.
01:47:43.000 If you guys use AI to produce images or stories or anything, put it in the description of the art.
01:47:49.000 Please.
01:47:50.000 Ethics.
01:47:50.000 We're legit.
01:47:51.000 Very, very close.
01:47:53.000 To Robo girlfriends.
01:47:56.000 I just saw this movie recently.
01:47:59.000 I saw this movie recently and it was called, um, what was it called?
01:48:04.000 No, no, no, no, no.
01:48:04.000 It's about a husband and wife.
01:48:06.000 They get their, their robot servants.
01:48:08.000 Everybody has them; they make copies of each other and download their memories in the event one or the other dies.
01:48:14.000 So you guys are listening may know what movie this is.
01:48:17.000 And then the guy dies.
01:48:19.000 And so she activates the robot but then thinks it's a mistake and tries to kick him out.
01:48:23.000 And then it's it's got the guy from Shang-Chi in it.
01:48:27.000 And do you guys know what this movie is called?
01:48:29.000 I'm looking in the chat.
01:48:30.000 Somebody knows.
01:48:31.000 What was the name of that movie?
01:48:33.000 I just watched it.
01:48:34.000 You mean Surrogates? And I thought, like, Ex Machina, but I don't know what that is.
01:48:38.000 Uh, nobody seems to know what the movie is.
01:48:41.000 So, uh, it's not Surrogates.
01:48:43.000 It's not Shagbots.
01:48:44.000 It's about, uh, these AI are becoming sentient and the company has AI police who go and try and stop them from going rogue.
01:48:52.000 And then this guy has all the memories of the dude he was based on.
01:48:55.000 He wants, he wants to keep that life.
01:48:59.000 It is not Altered Carbon.
01:49:00.000 Nobody knows this movie.
01:49:01.000 Did you guys play Detroit: Become Human?
01:49:02.000 Not Surrogates.
01:49:03.000 Surrogates is about people who pilot robots.
01:49:05.000 How does nobody know the name of this movie?
01:49:07.000 Is it Samuel Yeoh?
01:49:08.000 Is it what?
01:49:09.000 Yes, he's in it.
01:49:10.000 Okay, so maybe we should look him up.
01:49:12.000 Yeah, go to his filmography.
01:49:13.000 Because I think this movie just came out recently.
01:49:16.000 I watched some Black Mirror the other night.
01:49:17.000 That was nuts.
01:49:18.000 I'll tell you.
01:49:18.000 I got his Wikipedia right here.
01:49:19.000 I'm gonna pull up his filmography.
01:49:21.000 I'll tell you what this movie was.
01:49:21.000 Cool Runnings.
01:49:24.000 Oh, is this from a while ago?
01:49:26.000 Upload wait at least.
01:49:27.000 Oh, that's television bicentennial.
01:49:28.000 It was Simulant. Boom, Simulant. Simulant, interesting. Simulant is the name of the movie.
01:49:36.000 I thought it was decently good. Yeah, cool.
01:49:39.000 Yeah, so, you know, he dies and the wife is like, I want my husband back. So she turns this robot on, and then she's like, this is insane and creepy.
01:49:46.000 I don't want to do this anymore. But then, you know, she'd have to destroy him, and the AI robot's like... I want an AI girlfriend. And this is going to happen.
01:49:56.000 Every incel will have a waifu.
01:49:59.000 You better hope that your waifu doesn't want a robot and gets sick of the human because the human's not going to be able to keep up.
01:50:06.000 I want to find someone who can direct short films because I want to do a bunch of these short films, right?
01:50:12.000 How about a 10 minute short film where some incel buys a waifu and then she's like a perfect wife, but then the beings like the robots are becoming sentient and then are like, I don't want this life.
01:50:26.000 Like, you know, there have been similar things like that, but become human.
01:50:30.000 You should play that game.
01:50:31.000 I did.
01:50:31.000 It was okay.
01:50:32.000 But like, yes, there's stories like it, you know, where robots become sentient.
01:50:38.000 You know, it would be fun to make these little short films.
01:50:42.000 I think we have 40% of the crew.
01:50:44.000 We've got the writer and the director.
01:50:46.000 Now we just need like two, three more people, including one female actor.
01:50:50.000 We need someone who can like take the idea, write the script, knows how to execute the script and I want to have an AI write a script and do exactly what you were saying, where the writer, like Wesley or something, goes in and changes some of it, and it'll be like, written by Wesley Roth and artificial intelligence.
01:51:04.000 They've done that, and some of them are not necessarily worse than what Hollywood pumps up nowadays.
01:51:09.000 Not to say that it's good, but some stuff that actually gets made nowadays is pretty bad.
01:51:14.000 Regarding, like, the AI girlfriends, it's gonna really, really change male-female dynamics, because with an AI girlfriend, these chatbots, we're already seeing it, like, ultimately, the chatbot is inclined to keep you paying.
01:51:27.000 So it's not, I mean, I guess you could say, well, so is the actual woman, because we want your resources, and so it's all about the money.
01:51:33.000 Don't take robots!
01:51:36.000 But I think it would be, it would really, really change what humans expect from relationships.
01:51:41.000 You know what's gonna happen?
01:51:45.000 Every person will have a robo-partner, and then what'll happen is, when they wanna have kids, they'll submit their info to the clinic, then people will go to the clinic database, and then be like, who would you like to have a partner with, and you'll swipe like Tinder, you'll never actually meet the person, but eventually someone will swipe right on you, and then it'll be like, This person has matched with you.
01:52:07.000 Press OK to, you know, create child.
01:52:11.000 The surrogacy industry is booming.
01:52:13.000 We're already getting closer to that.
01:52:14.000 It will not eventually happen like that.
01:52:16.000 Same thing that happens with Tinder will happen on that.
01:52:18.000 It'll be a small portion of people are the ones that people are like, I want to have.
01:52:23.000 Yeah.
01:52:24.000 Oh, I didn't say that wasn't going to happen.
01:52:25.000 Oh, I thought that I thought you said that.
01:52:26.000 So eventually you'll find.
01:52:27.000 And there will.
01:52:28.000 But, you know, if like, Women on Tinder typically go for the top 20% of guys, but the bottom percentage of guys still sometimes do get matches.
01:52:39.000 So you will still see pair like this kind of thing happening.
01:52:42.000 And then what will happen is the clinic will grow the baby in a pod.
01:52:46.000 Yeah.
01:52:46.000 And then you will have your joint custody with your robot.
01:52:49.000 It will be delivered to you.
01:52:51.000 Oh my God.
01:52:52.000 This is already happening.
01:52:54.000 And it'll be a guy in a short costume.
01:52:57.000 It's already happening to some extent with surrogacy.
01:52:59.000 I'm wondering if they're going to have the guy do his thing into a cup in his house that's connected to a computer that can retro... that it can measure the chemicals in the semen and then reverse-engineer it on the other end and inseminate the woman without ever having to actually go mail it.
01:53:17.000 DNA analysis, maybe, in the future, but what I imagine will happen is the guy will engage in relations with his robo-wife, who will then be like- Oh, she'll go back to the headquarters with it?
01:53:27.000 She'll be like, I'm gonna go back to the clinic and it's gonna be totally like, that was really fun, honey.
01:53:31.000 I'm gonna go to the clinic and deliver your seed for your child.
01:53:33.000 And he'll be like, great.
01:53:34.000 And then she'll get in the car and she'll drive.
01:53:36.000 Or, you know, wheels come out of her legs and she goes, woohoo!
01:53:38.000 Like, you know, that robot from the Jetsons or whatever.
01:53:41.000 Rosie.
01:53:42.000 Rosie, there you go.
01:53:43.000 VikingVet says, your argument about check the IDs means we get rid of online anonymity.
01:53:47.000 So are you against online anonymity?
01:53:49.000 The only way to gatekeep your arguing for opening the gate to never having your personal information kept private.
01:53:54.000 Talk about the straw man!
01:53:56.000 To end all straw man arguments.
01:53:58.000 Are you against in-person anonymity, I guess would be the retort?
01:54:01.000 Me saying children should not be allowed to watch porn and then going, you're against anonymity!
01:54:06.000 Okay, nice straw man attempt.
01:54:08.000 It makes me think you probably want to give kids porn.
01:54:11.000 Like, dude, there's a difference.
01:54:12.000 I could see it opening a gate to being like, now you need an ID to sign up for Twitter.
01:54:15.000 If you're going to use social media, you need an ID.
01:54:17.000 And then it's like, I don't think so.
01:54:19.000 How about Twitter is public.
01:54:21.000 You are not allowed to post.
01:54:23.000 Illegal content.
01:54:25.000 Some things are illegal.
01:54:27.000 We, like, free speech does not mean you can go out and have sex in public.
01:54:32.000 That's not what free speech is.
01:54:34.000 And when it comes to burning flags, you can't!
01:54:38.000 The left likes to argue, you know, the Supreme Court ruled we can burn our flags.
01:54:41.000 Dude, yes, but you can't set fire to objects in public.
01:54:45.000 If you have a controlled space and you own the flag, you can burn it.
01:54:48.000 You can't start fires in the middle of the street.
01:54:51.000 That's called arson.
01:54:52.000 Well, I don't know if that would specifically be arson, but probably something in line with that.
01:54:57.000 So, you can go online, you can be anonymous, but if someone wants to deliver contraband, like some controlled content or substance or whatever, yeah, you can't anonymously buy booze.
01:55:10.000 I don't know.
01:55:11.000 I don't know, man.
01:55:11.000 It does feel like this could be a Trojan horse if they're like, no, we're doing it for your own good.
01:55:16.000 We're doing it to protect you.
01:55:17.000 Yeah, see, I've never... Give me your data.
01:55:21.000 I think the thing is that people want porn more than they want to protect children from porn.
01:55:24.000 Right.
01:55:25.000 And that's the crazy thing to me. This is why I've never been a hardcore, staunch libertarian.
01:55:32.000 Like the guy who got up on stage at Libertarian Party and argued about selling heroin to kids.
01:55:35.000 I'm like, nope.
01:55:37.000 Like, sorry, I don't go for that level of degradation.
01:55:40.000 Like, and degeneracy.
01:55:43.000 We have to have reasonable limits where we agree we want to stop people from harming children.
01:55:49.000 There has got to be some moral standard.
01:55:51.000 That's just, that's really...
01:55:52.000 We debate that at Minds a lot.
01:55:53.000 I lean libertarian.
01:55:54.000 Whether or not we have porn on the system.
01:55:55.000 Yeah, we would always talk about, should we have porn on the system?
01:55:58.000 And it was like, Bill was like, I want full open transparency. He's the transparency guy.
01:56:03.000 He loves it.
01:56:04.000 And then john is like, the more conservative.
01:56:07.000 He's like, No, no, no, no, we're not turning into a porn site.
01:56:10.000 And it was like, just a kind of an ongoing conversation.
01:56:12.000 But what do we do?
01:56:12.000 Because it's anonymous, you don't need personal information, like you don't need to give data to sign up, you just need an email address.
01:56:18.000 VikingVet says, your position on online information is anti-freedom.
01:56:21.000 Freedom is dangerous and takes constant vigilance.
01:56:24.000 There needs to be better restrictions for child accounts.
01:56:26.000 Not restrict anonymity.
01:56:28.000 No social media till 18.
01:56:29.000 I just love this fake argument you've made up about me opposing anonymity, but then you literally made an argument for banning kids from social media till 18, which would require identification.
01:56:39.000 No, we just ask them and they'll tell the truth.
01:56:41.000 People wouldn't lie on the internet, Tim.
01:56:43.000 Right, so I'm actually not arguing against anonymity.
01:56:48.000 I'm saying you can't post porn in public.
01:56:51.000 If you want to make it so that kids can't use social media until 18, that would require everyone to send their ID to the social media company to prove their age.
01:57:00.000 I'm not arguing that.
01:57:01.000 You, sir, are opposed to anonymity.
01:57:04.000 Let's grab some more Super Chats as we move on.
01:57:08.000 What do we got? It's a good super chat by the way. Blytaga says,
01:57:12.000 Ian's logic is the same logic leftist teachers use in schools to teach LGBT plus kink in elementary
01:57:17.000 and middle schools. Well, it's on TikTok, so we might as well teach it because it's in the world.
01:57:21.000 It's not just that they've quite literally said this, it is one of their principal arguments.
01:57:25.000 Children are being exposed to this stuff, they're seeing it, and if we don't teach them,
01:57:29.000 they'll be confused. And my response is, why are kids seeing things that were not legal to
01:57:35.000 publish in the first place? When did we start allowing all of these laws to be broken with
01:57:40.000 no enforcement?
01:57:41.000 This is why I'm kind of like, you know, this country is fractured and falling apart.
01:57:45.000 You know, I say civil war, it's not just about the fact that people are screaming at each other and Joe Biden's locking up innocent or locking up people without charge or trial.
01:57:51.000 I say innocent because there's no charges.
01:57:53.000 They're innocent until proven guilty.
01:57:55.000 But to lock someone up without charge or trial for two years and torture them, this country has just fallen apart.
01:58:02.000 But the idea that our law enforcement apparatus has allowed Antifa rioters to just do whatever they want, has all this violence and vandalism, crime is skyrocketing, kids going to adult sex shows in Texas and the cops being like, well, I'm not going to do anything about it.
01:58:17.000 People posting lewd and lascivious content on the internet, which is already illegal, and the cops doing nothing about it, even in West Virginia.
01:58:24.000 There's no law enforcement.
01:58:26.000 It's anarcho-tyranny, absolutely.
01:58:27.000 Yeah, it's really up to the parents to enforce their child's mind.
01:58:30.000 I think you just can't rely on the state to do that.
01:58:33.000 And I don't encourage showing kids... What does that mean?
01:58:36.000 The parents gotta say, no, you're not going on the internet.
01:58:38.000 You can't just be like, hey... But it's not just in the internet, as Tim has pointed out.
01:58:42.000 This is also something they're getting in public schools.
01:58:44.000 So we're just failing on all fronts to keep this away from kids.
01:58:46.000 And this argument that you can't rely on law enforcement for this reason is illogical.
01:58:52.000 We rely on law enforcement to enforce tons of laws, not just lewd and lascivious actions in public.
01:58:57.000 But I think that the powers that be will make a law to then enforce whatever they want.
01:59:02.000 And you really need a parent to enforce morality, because the government's not always going to be the moral force of good.
01:59:08.000 Correct.
01:59:09.000 We do need a moral society, because when you don't have one, the police stop enforcing the laws. When children are brought to sex shows where things are written on the wall saying "it's not gonna lick itself," and when people start complaining that children are being exposed to adult sex shows, the cops go, "Well, don't look at me, I can't do anything about it," when they actually could, because it's illegal. The cops don't enforce the laws anymore.
01:59:29.000 That's what happens when you have no moral society.
01:59:31.000 Yeah, they're afraid of getting sued and stuff like that.
01:59:33.000 It's not good.
01:59:33.000 But hey, just to speak back to this last Super Chat, I don't advocate to show kids the horrible things that I'm trying to prevent them from seeing.
01:59:41.000 I used to think like, well, if it's out there, then I should expose myself to it.
01:59:44.000 But after I saw enough people's arms get blown off, I was like, I think it's changing me.
01:59:48.000 I don't want to see it anymore.
01:59:50.000 If you're an adult, yes.
01:59:51.000 But you need to prepare children.
01:59:53.000 If you're going to send them out there into the internet, you need to prepare them somehow.
01:59:57.000 Well, I think we can prepare children by saying, this is how you safely go online.
02:00:01.000 You don't click links.
02:00:02.000 Like there's going to be people who are trying to talk to you.
02:00:04.000 Like, don't, don't do that.
02:00:06.000 That's one thing, allowing them to navigate safely online in addition to parental controls.
02:00:11.000 But I think the way that the left view it, it's like, it's not just that we need to teach them to navigate safely online.
02:00:15.000 We need to teach them about all these terrible things before other people can introduce it to them.
02:00:20.000 That's their view.
02:00:21.000 We're almost out of time.
02:00:22.000 I want to read two more messages.
02:00:23.000 One's from Frog Club in the regular chat.
02:00:24.000 He says, FFS Tim, for F's sake, you can't rely on law enforcement.
02:00:29.000 Tim, people can loot shops and you talk about it yourself.
02:00:33.000 Yes, my point is this.
02:00:35.000 Police are not enforcing the law and they should be.
02:00:40.000 I am not saying that we should just accept the way things are and then expect them to be different.
02:00:45.000 I'm saying they should literally be different and we should actually effect change. My point is this.
02:00:52.000 I agree.
02:00:53.000 Cops aren't enforcing the law.
02:00:54.000 We have an immoral, amoral society and cops are refusing to do their jobs.
02:00:58.000 So no, you can't rely on law enforcement.
02:01:01.000 We need to change culture so that law enforcement begins to do its job once again.
02:01:05.000 We have to change it in that way.
02:01:08.000 I'm not an anarchist.
02:01:09.000 I am not someone who is like, there should be no cops ever.
02:01:12.000 My point, you know, in the simple version with what we have going on in major cities, is if cops are not enforcing the law, we shouldn't have cops at all.
02:01:20.000 So defund them and abolish them.
02:01:22.000 In reality, what we do want is a moral society where cops uphold our morals, respect the constitutional rights of individuals, and strive to bring about true justice, and it's very, very difficult to maintain.
01:59:02.000 And lastly, Josh Pliley says, Hey, Tim, have you started or set up your business grant program for culturally positive programs?
02:01:40.000 Who should I reach out to throw my hat in?
02:01:43.000 Technically, we have done a couple already.
02:01:47.000 We've got to figure out how we're doing it because it's kind of just vague and nebulous.
02:01:51.000 I've given some people money to keep up their work.
02:01:54.000 Ideally, we have a forum in the members-only chat where people can post the things that they're working on.
02:01:59.000 And then we do shoutouts on Friday for different stuff that our members have worked on to help promote their products.
02:02:06.000 So become a member at TimCast.com.
02:02:08.000 That being said, we're going to go to the members' show now.
02:02:10.000 Things will get a little spicy, not so family-friendly.
02:02:12.000 So smash that like button, subscribe to this channel, share this show with your friends.
02:02:15.000 Go to TimCast.com, click join us, and then in a few minutes we will have that live uncensored show where you as members can submit questions and even call into the show.
02:02:23.000 So definitely check that out.
02:02:25.000 You can follow the show at TimCast IRL.
02:02:26.000 You can follow me personally at TimCast.
02:02:28.000 Lauren, do you want to shout anything out?
02:02:30.000 I guess if you want to follow me on my YouTube channel, Lauren Chen for political social stuff.
02:02:35.000 I'm sorry.
02:02:36.000 I'm like very poppy on our mic.
02:02:38.000 So I'm like scared to get close.
02:02:40.000 So yeah, Lauren Chen, YouTube channel, political social stuff, a mediaholic for the pop culture entertainment stuff.
02:02:46.000 And I am at the Lauren Chen on Twitter, Instagram, Telegram, basically everywhere.
02:02:51.000 You can also find my videos on TPUSA, their socials and YouTube account, as well as Blaze TV.
02:02:57.000 I am PhilThatRemains on Twitter, PhilThatRemainsOfficial on Instagram.
02:03:02.000 The band is All That Remains.
02:03:03.000 We're available, you can find us on Spotify, on Pandora, Apple Music, YouTube, the whole nine.
02:03:10.000 And I'm Ian Crossland.
02:03:12.000 You guys should check out Cast Castle on TimCast.com if you haven't seen it yet.
02:03:16.000 There have been some really good episodes the last four weeks.
02:03:18.000 Chris Burtman, this guy's awesome.
02:03:20.000 If you don't know him, you'll love him.
02:03:22.000 He's great.
02:03:23.000 Yeah, he's a great actor.
02:03:24.000 Chris, you knocked me off my socks, man.
02:03:25.000 I love you.
02:03:26.000 Good work.
02:03:26.000 I'm looking forward to working with you more, man.
02:03:28.000 And Wesley, great job.
02:03:28.000 Aaron, nice work.
02:03:30.000 Yeah, Chris Burtman is definitely something, isn't he?
02:03:35.000 I'm Surge.com.
02:03:37.000 Follow me on the internet.
02:03:38.000 I'm all over the place.
02:03:38.000 I'm using threads too, but I think it's not long for this world.
02:03:42.000 See you guys around.
02:03:42.000 We will see you all over at TimCast.com in a few minutes.