Timcast IRL - Tim Pool - May 12, 2023


Timcast IRL - Elon Musk Names WEF Chair As Twitter CEO, DEFENDS HER w/ Tudor Dixon


Episode Stats

Length

2 hours and 2 minutes

Words per Minute

193.94

Word Count

23,787

Sentence Count

1,799

Misogynist Sentences

39

Hate Speech Sentences

39


Summary

Elon Musk has appointed a new CEO to Twitter, a woman who is an Executive Chair for the World Economic Forum. And already, you can see the backlash. Who would I be if I told all of you to stop drinking Bud Light and then kept paying a company that put an executive chair of the World Economic Forum in its C-suite as the chief executive officer? I think I'd be a hypocrite.


Transcript

00:00:00.000 So Elon Musk has gone and done it.
00:00:24.000 Elon Musk has appointed a new CEO to Twitter, a woman who is an executive chair for the World Economic Forum.
00:00:31.000 And already you can see the backlash.
00:00:34.000 I started thinking about this right away because we signed up recently to be a verified organization.
00:00:38.000 It costs over a thousand dollars a month for these corporate benefits that you get with Twitter.
00:00:43.000 And I like them.
00:00:44.000 I think they're good.
00:00:45.000 You get prioritized reach, placement. You get to verify all of your employees.
00:00:49.000 You get a gold badge.
00:00:51.000 You get an affiliation badge along with verification if you work for the company.
00:00:55.000 But who would I be if I told all of you to stop drinking Bud Light and then I kept paying a company that put an executive chair of the World Economic Forum on its C-suite as the chief executive officer?
00:01:07.000 I think I'd be a hypocrite.
00:01:08.000 So in the poll on this live show, I have asked you: should we, Timcast, remove our Twitter Blue over the appointment of a World Economic Forum chief executive officer?
00:01:18.000 So far, the poll as of the launch of this show is at about 66%.
00:01:22.000 So we're going to talk a lot about that.
00:01:24.000 Before we get started, my friends, head over to castbrew.com.
00:01:27.000 If you want to support the show, this is our company, our coffee company.
00:01:30.000 We are sponsoring ourselves.
00:01:31.000 You can join the Cast Brew Coffee Club.
00:01:33.000 You get three bags of coffee every single month, and it's a discounted rate.
00:01:38.000 You save about five or six bucks.
00:01:39.000 But we also have several other blends, like Rise with Roberto Jr., Appalachian Nights.
00:01:43.000 Unfortunately, Rise with Roberto Jr.
00:01:45.000 is sold out, but we'll be back very, very soon.
00:01:47.000 So, if you want to go to castbrew.com and support the show, please do so.
00:01:50.000 Also, head over to TimCast.com, click join us, and become a member.
00:01:54.000 If you would like to support us directly, you'll get access to our Discord server where you can hang out with like-minded individuals and watch the uncensored members-only show Monday through Thursday.
00:02:03.000 So, smash that like button, subscribe to this channel, share this show with your friends.
00:02:06.000 Joining us tonight to talk about this and a lot more is Tudor Dixon.
00:02:11.000 Thank you.
00:02:11.000 I'm excited to be here.
00:02:12.000 Thanks for coming.
00:02:13.000 Who are you?
00:02:14.000 What do you do?
00:02:14.000 Well, I ran for governor in the state of Michigan.
00:02:17.000 I have a podcast now, the Tudor Dixon podcast, and we are working on what we can do to win elections.
00:02:24.000 So that's the plan going forward.
00:02:26.000 Right on.
00:02:27.000 I guess before we get started, you got plans to run again?
00:02:30.000 Maybe someday.
00:02:31.000 All right.
00:02:32.000 I say maybe it's like pregnancy and then you forget and do it again.
00:02:35.000 There you go.
00:02:36.000 So okay, well thanks for hanging out.
00:02:37.000 This should be a lot of fun.
00:02:38.000 We already had a lot, we were talking a lot before the show started and it was really good, so I'm excited.
00:02:42.000 We got Phil Labonte hanging out.
00:02:43.000 How you doing?
00:02:44.000 I am Phil Labonte, lead singer of All That Remains, anti-communist and counter-revolutionary, here with my good friend...
00:02:50.000 Hi, I'm Taylor Silverman.
00:02:52.000 I'm a skateboarder and I work on the show Cast Castle here at TimCast.
00:02:56.000 And Freedomistan.
00:02:57.000 And newly, coming soon, Freedomistan.
00:03:00.000 I didn't want to say anything because I didn't want to make any announcements you hadn't made again.
00:03:05.000 But I just recently did Tudor's podcast also, so it's really cool to meet you in person.
00:03:09.000 Yeah, it is.
00:03:10.000 She was great.
00:03:11.000 Awesome.
00:03:11.000 And we were very excited to have you because she's a Michigander too.
00:03:14.000 Oh, yeah, yeah.
00:03:15.000 So there's like a connection.
00:03:16.000 We're not too far either.
00:03:17.000 No.
00:03:18.000 Like hour and a half.
00:03:18.000 Yeah.
00:03:19.000 Right on.
00:03:20.000 What's up, everybody?
00:03:21.000 It's Kellen.
00:03:22.000 Fridays are the best days, as I always say.
00:03:24.000 I'm ready to get started when you guys are.
00:03:26.000 All right, let's jump into this first story.
00:03:27.000 We got this on Twitter itself from Elon Musk.
00:03:29.000 He says, I'm excited to welcome Linda Yaccarino as the new CEO of Twitter.
00:03:35.000 Linda Yacc will focus primarily on business operations, while I focus on product design and new technology.
00:03:40.000 Looking forward to working with Linda to transform this platform into X, the everything app.
00:03:45.000 OK, well, here you go.
00:03:47.000 Here's, uh, well, I guess, uh, maybe I can shrink this so you can see it better.
00:03:51.000 Apparently not.
00:03:52.000 Well, you can see right here, World Economic Forum.
00:03:54.000 This is from her LinkedIn.
00:03:56.000 Executive Chair, January 2019 to present.
00:03:59.000 I took a screenshot of this because I don't want to show the rest of her private details, but I think this is particularly relevant and publicly available.
00:04:05.000 Yaccarino is the Chairman of the World Economic Forum's Task Force on the Future of Work and sits on the WEF's Media, Entertainment, and Culture Industry Governor's Steering Committee.
00:04:13.000 She is also highly engaged with the Value in Media Initiative.
00:04:18.000 Immediately, I don't trust her.
00:04:19.000 I don't... I... Look, I like Elon Musk.
00:04:22.000 I think he's doing a lot of really important things.
00:04:23.000 I think SpaceX is one of the most important things humanity is doing, period.
00:04:28.000 And I think him buying Twitter was already very, very good.
00:04:31.000 But I think this is a huge mistake, if only because he has shattered confidence in the platform to the point where I have pulled up right now...
00:04:39.000 The verified orgs subscription plan, and before I clicked deactivate our account, which would suspend payments to Twitter for the corporate benefits of Twitter Blue verification, I decided to put a poll up, and it's already sitting at 70% saying, yes, we should terminate our subscription to Twitter over this.
00:04:57.000 So I'm curious what you guys think.
00:05:00.000 I mean, I think that we have to be honest with who Elon Musk is.
00:05:04.000 You know, we were talking about how we like to glom onto someone who we believe is a conservative because they agree with one of our values.
00:05:13.000 And he's been very adamant that he wants free speech, but he's also a globalist.
00:05:18.000 We know this.
00:05:19.000 I mean, he's made that very clear.
00:05:22.000 We can't expect that Twitter is going to be this place that's not going to be leaning that way, or that he's not going to be leaning that way.
00:05:29.000 So I think you have to do what you think is right for your company, and for your company, this is not where you are.
00:05:35.000 For obvious reasons, I'm not concerned about whether or not Elon's conservative.
00:05:39.000 I argue with Seamus over abortion, but I think in the modern culture war, in our current culture war, the traditional pro-choice position is substantially closer to pro-life than the left's current position on abortion up to the point of birth, which is terrifying to me.
00:05:56.000 So it seems like they're creating this, the culture war has created this perception that liberals and conservatives are almost the same thing because of how radically different they are.
00:06:06.000 But if Elon Musk comes out and says, I'm going to give the libertarians, disaffected liberals, conservatives the right to speak on this platform, we think it's a good thing.
00:06:15.000 The issue for me is, appointing this woman says to me, party's over, honeymoon's over, the censorship will slowly start creeping back in, and we're going right back to the beginning.
00:06:27.000 I think that, I mean, liberalism has a weakness in that it does take other philosophies at their face value as if they are honest.
00:06:40.000 And right now, like the woke mind virus, whatever you want to call it, the current popular brand of leftism is subversive.
00:06:48.000 It's intentionally subversive.
00:06:50.000 So if a liberal takes them at face value, you often end up with motte-and-bailey arguments.
00:06:56.000 You often end up with all kinds of emotional arguments.
00:07:00.000 Because the left doesn't fundamentally believe the same things that liberals do.
00:07:06.000 Elon Musk is a liberal.
00:07:08.000 He believes that open dialogue is always good, and I think that it is, but you have to understand that just because you're liberal, everyone around you is not liberal.
00:07:21.000 People believe that they're liberal.
00:07:23.000 Nowadays, there are people that will say they're liberal, and then they will go ahead and advocate for authoritarian-type policies.
00:07:33.000 You know, clamping down on free speech.
00:07:34.000 You can't say this.
00:07:35.000 You can't say that.
00:07:36.000 You can't have this opinion.
00:07:38.000 That's a bad thing.
00:07:40.000 And I don't know if Elon is as aware as he needs to be of how dangerous and how intentionally subversive the left is nowadays.
00:07:51.000 You can't take People like that at their word because they play with meanings, they play with definitions, and I think that if he's aware, it's possible that he could navigate that, but it's not easy because people like that will tend to find other people like that that share their opinions, and they'll invite them in and hire them, and that's what happened in colleges and in education.
00:08:15.000 That's why you have such a monolithic ideology coming out of colleges now.
00:08:20.000 So I want to pull up this article from Vox.com.
00:08:22.000 Who is Linda Yaccarino, Elon Musk's pick for the new Twitter CEO?
00:08:26.000 And they mention that, let me just read this.
00:08:29.000 Although Yaccarino is not vocal about her political beliefs, it is well known in the advertising community that she's conservative.
00:08:34.000 According to several sources, she served on former President Donald Trump's Council on Sports and Nutrition, and many have noted that she likes and follows conservative accounts on Twitter.
00:08:42.000 But she's also drawn flak from the right.
00:08:44.000 Some of the more extreme right-wing Twitter influencer accounts have criticized her for praising diversity in the workforce and for being chair of the task force on the future of work at the World Economic Forum, which they view as an elitist organization.
00:08:56.000 I mean, that's putting it lightly.
00:08:57.000 They view as... I mean, if you can't go to the meetings and just show up, it is... Well, I mean, it's a globalist, secretive elitist.
00:09:06.000 But if she's a conservative, does that, I mean, is it a good thing?
00:09:10.000 Is it a bad?
00:09:11.000 Is her participation in the World Economic Forum disqualifying because we don't trust the World Economic Forum, the Davos Group, and their policies?
00:09:18.000 What if she's actually a conservative, which means we're going to get more right-leaning or center-ish policies?
00:09:25.000 I think it calls for serious scrutiny.
00:09:28.000 Serious, serious scrutiny.
00:09:30.000 Uh, I'm not ready to just swear off and say, you know, screw off.
00:09:34.000 It's, it's time to go jump into the old bunker and, and, you know, expect the bombs to start falling in the end of the world.
00:09:41.000 But it is completely reasonable to look at this person and say, look, your history and things that you've said in videos... I've seen clips and stuff that have been put up on Twitter already, things that she's said. She is very, very suspect, so: scrutiny.
00:09:59.000 I guess the idea is she is well-known in the advertising community, and so Elon's view most likely is that she's going to restore faith for the advertisers and bring them back to the platform.
00:10:13.000 But there's only one way you do that, and that's caving to woke nonsense.
00:10:18.000 How involved do you think he is at this point?
00:10:21.000 I mean, you can get rid of a CEO.
00:10:23.000 She's going to be tested.
00:10:24.000 We're all going to make it known if we don't like her.
00:10:29.000 I mean, I think it's going to be interesting, because to be CEO, you shouldn't really... we shouldn't be judging, are you conservative, are you liberal; it's, are you a good manager of something like this? And this is a business. If she's managing it well, then it shouldn't be that she's going after people for their ideology. Now, I don't know if she is, in the situation of
00:10:50.000 being a chair of the World Economic Forum, if she's able to pull herself away from that and look at this from a business venture standpoint.
00:10:57.000 Because if you're a good business person, the majority of businesses across the world do not look at things politically.
00:11:03.000 We think they do because we see the Bud Lights of the world.
00:11:06.000 But most companies are like, stay out of the political, let's make a profit, let's make it work.
00:11:11.000 Let me pull up this tweet from Billboard Chris, who responded to Elon Musk.
00:11:15.000 He said, During her interview with you, she was most excited about your initiative to limit reach of tweets which are deemed hateful.
00:11:22.000 Freedom of speech, not freedom of reach.
00:11:24.000 In fact, that was her main selling point to the advertising execs in the audience, and she kept coming back to it.
00:11:30.000 She went on to chastise you twice for tweeting after 3am because people are concerned about that sort of behavior.
00:11:35.000 What?
00:11:36.000 She also wanted advertising execs to be part of an influence council within Twitter.
00:11:40.000 She's not here to improve the user experience.
00:11:42.000 She wants Twitter to be a safe space.
00:11:45.000 She represents advertisers and her natural inclination is to limit speech and pander to those who push woke ideology on the world.
00:11:51.000 You will have to watch her like a hawk.
00:11:53.000 She was also thrilled to spend $100 million on social justice initiatives at NBC and forwarded government authoritarian propaganda
00:12:00.000 That the way back to a normal life was to wear your mask.
00:12:04.000 No doubt she'll bring in advertising revenue in the short term, but she's a long term mistake.
00:12:08.000 Elon responded and said, I hear your concerns, but don't judge too early.
00:12:12.000 I am adamant about defending free speech, even if it means losing money.
00:12:16.000 Well, I'm going to put it this way.
00:12:18.000 I believe in the preliminary view with her resume.
00:12:24.000 is that I should not be giving Twitter any money, and if she then proves she's going right,
00:12:30.000 doing right by the company, then I'll give the company money. So I've already given Twitter
00:12:34.000 some money, signing up for the verified organizations and Twitter Blue for the company.
00:12:39.000 It's relatively expensive. Bringing this woman on, I feel is a very big risk.
00:12:47.000 I don't want to fuel any more of this ESG woke garbage.
00:12:52.000 If this woman works for the World Economic Forum, which is a major proponent of this stuff, I am going to limit my giving them money.
00:12:59.000 Maybe in a couple months I will revisit the prospect of signing back up if she's doing a good job.
00:13:04.000 But this is... I feel like it's... Whether it's a betrayal of trust or not is irrelevant.
00:13:11.000 I feel like it's just...
00:13:13.000 Advertisers leave.
00:13:14.000 Elon says, we got to make money.
00:13:16.000 I say, don't worry.
00:13:17.000 I'll give you money, Elon.
00:13:18.000 I believe in what you're doing.
00:13:19.000 He then says, okay, welcome the World Economic Forum chair on the future of work.
00:13:22.000 And I say, okay, well now I, now I don't know what you're doing and I don't know if I believe in it.
00:13:26.000 I don't want to give these people money.
00:13:28.000 Okay, so let's go a little deeper on the advertisers leaving thing, because why are advertisers leaving anything that is considered conservative, and why are we allowing these threats?
00:13:37.000 Because activists.
00:13:38.000 But they're getting them de-platformed, debanked, all of these things, and so why are we... I see politicians going after individual businesses.
00:13:47.000 And I say, why aren't we going after these people that are debanking, the banks that are debanking?
00:13:51.000 Why aren't we making this illegal?
00:13:53.000 Instead of going after case by case and trying to hit the business, go after the source of the problem and stop that from happening.
00:14:00.000 So we don't end up having people go, I'm going to hire somebody that I think can get advertisers because advertisers are so finicky.
00:14:07.000 Why are they finicky?
00:14:08.000 Because we are allowing this bullying system that is really stopping people from being able to bank.
00:14:13.000 I mean, if you debank someone, they're toast.
00:14:15.000 Yeah.
00:14:16.000 So where are all of the politicians that are going, well, we can't allow this to continue happening.
00:14:22.000 I think the debanking stuff is a sign that there's going to be some kind of serious financial crash.
00:14:30.000 And they're trying to do everything to push everyone towards digital currencies.
00:14:34.000 And they want a central bank digital currency.
00:14:37.000 And I was actually listening to some guy talk about this.
00:14:39.000 Apple just announced this bank account with 4% interest.
00:14:42.000 And I'm hearing everybody be like, I have to set one of those things up.
00:14:46.000 4% interest, that's huge.
00:14:47.000 I mean, inflation's way worse than that, if you're tracking the real numbers.
00:14:51.000 But 4% is big for a savings account.
00:14:53.000 And so now I'm hearing that this is causing an upset in the financial industry, to a certain degree at least.
00:14:58.000 I'm not a financial guy, I don't know for sure.
00:15:00.000 But that, why get a CD at 3.8 or 4% if a savings account with Apple is better and you don't gotta wait for it to mature or whatever.
00:15:09.000 And so, We're just, we're seeing stuff like that, and I don't know what that really means.
00:15:13.000 All I know is I see people saying they're concerned about it, we're seeing banks collapse, we're seeing debanking, and I'm wondering if the big move is going to be, get everybody onto digital currencies, big collapse happens, central bank digital currency gets launched.
00:15:26.000 I can't say whether or not that's the actual plan but I know that nowadays there are a lot more options for people when it comes to banking.
00:15:35.000 Financial technologies have really taken off.
00:15:37.000 Things like Bitcoin showed that digital types of currencies can actually work, and so obviously you see banks that exist coming up, and companies now, obviously, with Apple doing it, with offering bank accounts or whatever. You're gonna see more innovations in the financial technology sector unless the government completely shuts it down. So I don't know that they're going to, but I would expect, without a
00:16:05.000 central bank digital currency, I would expect more options and more possibilities for people to come from the private sector, because that's what happens.
00:16:12.000 But what does that mean as you stretch money across those different areas?
00:16:16.000 If you have so many different options, then how does anything stay stable?
00:16:21.000 Because if you take money out of banks, then banks are not stable.
00:16:24.000 So you have to look at this and say, well, somebody has to kind of say, well, there has to be some control so that everything's not crashing.
00:16:30.000 You're talking to a libertarian here, kind of dude, so I'm like, no.
00:16:34.000 Just straight up no.
00:16:36.000 I think that there are tons of options for ways to save your value.
00:16:42.000 Things that you have, you know, whether it be money or whether it be Bitcoin, different cryptocurrencies.
00:16:48.000 A ton of different, you know, currencies in the world that people can say, I think these are better than dollars for whatever crazy reason they come up with.
00:16:54.000 You can buy gold, you can buy silver.
00:16:56.000 There's already so many different ways to store currency and save currency and save value.
00:17:01.000 Just a few more banks or more options that are now new technologies.
00:17:05.000 I don't think that's any kind of significant problem.
00:17:07.000 I don't think it's a problem at all.
00:17:08.000 I think that that's something that we can handle.
00:17:10.000 I mean, people, most people that have a significant amount of wealth, it's usually in stocks.
00:17:16.000 We're frequently in stocks and stuff like that in owning companies.
00:17:20.000 So I don't know that it needs to be controlled by the Federal Reserve.
00:17:23.000 I think that the markets will control that kind of stuff.
00:17:27.000 So that's just my take.
00:17:28.000 Back to Twitter for a second.
00:17:29.000 Do you think we're going to see a lot of people stop paying for Twitter to at least give it some time to see how it plays out?
00:17:37.000 Or do they want that blue check really bad?
00:17:39.000 No, I think that it's, I mean, I don't know, it'll be interesting because for some people I think it's their business and they feel like that's a service they subscribe to and so maybe they'll watch and see and then I think that once they see the first signs of what you're concerned about, that's when you'll see people just immediately drop.
00:17:58.000 I was going to say, I think maybe that's the appropriate response.
00:18:01.000 Instead of just cancelling everything outright, basically just saying to Elon, my view is this is thin ice, and if we see one bad move, like we're skittish, we are running.
00:18:13.000 But for the time being, I really want, I really want to believe, I really want Elon to succeed in this endeavour.
00:18:19.000 And if the issue is, he's looking at the bottom line like we need advertisers, and this woman worked in the Trump administration, I don't know.
00:18:27.000 She was in the Trump administration?
00:18:28.000 Yeah.
00:18:29.000 Okay.
00:18:30.000 Well, I mean, Trump picked bad people too.
00:18:31.000 Well, that's what people are saying, right?
00:18:32.000 I'm pretty...
00:18:33.000 Right, exactly.
00:18:34.000 That's why I'm like, I don't know.
00:18:35.000 I think that you have to think of it from a...
00:18:38.000 She's a known Trump supporter, according to Vox.
00:18:40.000 So she would...
00:18:42.000 You would think that doing this, you would be very careful not to immediately make changes
00:18:46.000 and make a lot of people mad because you saw what just happened with Bud Light.
00:18:50.000 I mean, they literally are giving a $15 rebate right now with every case you buy.
00:18:54.000 Are they really?
00:18:55.000 Yeah.
00:18:58.000 It's a disaster for them.
00:19:01.000 But if that's the reaction, why would they take the chance of everybody dropping their subscription?
00:19:09.000 Obviously this happened with big corporations just recently, so it's quite possible, but like you said, I think that you wait and you see.
00:19:15.000 If this is what happens, then conservatives are not going to hesitate.
00:19:20.000 They've made that clear now.
00:19:21.000 We're just done with this woke stuff.
00:19:22.000 If you're going to do it, we're out.
00:19:24.000 My favorite thing about Twitter since Elon took over is that he has brought back freedom of speech to an extent, which I think a lot of people value.
00:19:32.000 And that's not necessarily a conservative value.
00:19:34.000 It's pretty crazy that people think that is now.
00:19:37.000 And Billboard, Chris's remarks, like talking about what she said, freedom of speech, not freedom of reach.
00:19:43.000 I wonder what is deemed hateful?
00:19:45.000 Like who's defining that?
00:19:47.000 The big play for the left is to argue that free speech is conservative, so that they can make anything they don't like right-wing, so right means bad, left means good.
00:19:58.000 So from now on, I'm just going to call them right-wing.
00:20:00.000 And I'm going to say, it's always been the right that was anti-free speech.
00:20:05.000 Free speech is like the ultimate rebel move.
00:20:08.000 Yeah, so they're all far right.
00:20:10.000 Well, if you're not on the left, you're a far-right extremist.
00:20:13.000 Even if you're on the left and you say one thing out of line, you're a far-right extremist now.
00:20:16.000 It doesn't take very much to step out of line on the left.
00:20:19.000 Oh yeah.
00:20:20.000 You just gotta call them far right.
00:20:22.000 Although left and right are relatively meaningless these days.
00:20:25.000 It just means like, which tribe are you in?
00:20:27.000 But that's what they try to do.
00:20:29.000 And then how do you fight back against that?
00:20:31.000 When you say, I'm for free speech, well then you're conservative.
00:20:34.000 The argument they're making is, free speech is a traditional position, and banning hate speech is the progressive position.
00:20:42.000 So you are old and archaic if you support the old way of doing things.
00:20:47.000 But it's also helpful to not teach history.
00:20:49.000 So we just saw that the proficiency in history is at the lowest levels in the United States ever.
00:20:53.000 So if you don't know what happens when you ban speech, if you don't know that people were, you know, killed or kept in prison because they went against the king, then you don't know why the founding fathers were so adamant that we had to be able to say whatever we want, even
00:21:08.000 if they don't like it. Even if it's burning the American flag, even if it's
00:21:11.000 something that seems so awful, they want you to be able to say it, because once you start
00:21:17.000 to, it's a slippery slope. Once you start to take it away, then
00:21:21.000 everybody... it'll hurt them too. That's the funny thing. It will hurt them too. I feel like conservatives have become
00:21:26.000 liberal, in the literal sense. It's like, liberals are far-left
00:21:30.000 extremists? Not really, most of them are just ignorant, but then there's a lot of people who are quote-unquote liberal who are actually far-left, and then conservatives are adopting liberal positions.
00:21:38.000 So conservatives saying something like, well, if it's your flag, you can burn it, whereas conservatives used to say you shouldn't be allowed to burn the flag.
00:21:45.000 Trump and many of the Trump supporters still said you should not be allowed to desecrate the flag, but a lot of conservatives are adopting more libertarian positions or classically liberal positions.
00:21:53.000 And I think that's one of the intentions of the left is to force conservatives leftward, right?
00:21:58.000 So they make the wheel rotate.
00:22:01.000 The far left becomes moderate and conservatives become fringe.
00:22:05.000 Then people who are center right are now the far right and they keep turning the wheel until the left is considered moderate.
00:22:11.000 Well, I would argue even school choice is liberal.
00:22:14.000 You're saying people should have the ability to choose, that we should be able to let parents make that decision on their own.
00:22:21.000 And they're saying, no, no, no, it has to be our way.
00:22:23.000 You have to have kids in school, in the schools we choose.
00:22:27.000 Even if the school is totally failing the child, they must stay there.
00:22:30.000 We don't care what you want for your kid.
00:22:32.000 And we don't care if you want to have a future or an education or opportunity.
00:22:36.000 You must do what we say.
00:22:37.000 That is not liberal. No.
00:22:42.000 I mean, it's the idea that liberalism, or the ideals of liberalism, have fallen so out of favor with, you know, specifically young people. But I think there's too many of the thought leaders that have been writing books and papers and stuff for the past 30 years and influencing the teachers and influencing the schools of education.
00:23:09.000 That's why you have kids that don't understand... you know, that don't understand anything about our system. People don't understand the way that... the why, you know, things like freedom of speech and stuff are important. So they're cutting this country's youth off from the history.
00:23:26.000 As you mentioned, they don't know history.
00:23:27.000 And then you see there was this teacher who said something about wokeness in the schools.
00:23:33.000 So the students went out and protested the teacher with communist signs.
00:23:37.000 And this is how it starts.
00:23:38.000 People need to pay attention to this because those kids in 10-20 years are going to be in politics.
00:23:42.000 There is a Maoist cultural revolution going on in America right now.
00:23:46.000 Period.
00:23:46.000 But people think because it doesn't happen overnight it doesn't happen, but it's never overnight.
00:23:50.000 I think something that we recently spoke to someone who was in, she had, she moved from China when she was in her 20s and she'd gone through the revolution and she said they started with the kids.
00:24:01.000 What they did was they went to the kids and they got them to believe and then the kids would turn in the parents for wrong think.
00:24:07.000 And you can see that already.
00:24:08.000 You see families breaking up over cultural issues, over political issues in the United States.
00:24:14.000 And you could see that next step of children willing to go to the government and say, my parents are thinking the wrong thing.
00:24:19.000 You need to go after them.
00:24:20.000 You can present evidence and people will still reject that.
00:24:24.000 I've been doing this, like been saying, look, this stuff and blah, blah, blah, and talking about that stuff with my friends that are That are historically liberals, right?
00:24:33.000 Democrats.
00:24:34.000 Like, I come from the music entertainment world, and so there's tons of producers and record executives and people in bands and stuff that I'm familiar with, and most of them are basically the default Democrat.
00:24:46.000 Right?
00:24:47.000 And they aren't aware that this is going on.
00:24:50.000 And then you can show them and present evidence and say, look, these are the parallels to what went on in China when the Cultural Revolution was getting started.
00:24:58.000 And here, etc., you can see this.
00:25:00.000 And they just refuse to believe it because it doesn't have that, like, immediate happening right in front of me right now that they expect.
00:25:09.000 to get that they expect to see with a revolution.
00:25:13.000 They don't realize that it's a slow rolling kind of thing.
00:25:17.000 And trying to convince them is incredibly hard.
00:25:20.000 It took Bill Maher. And I think it wasn't the Elon Musk thing.
00:25:27.000 It was Bill Maher talking about the parallels he saw between Maoism
00:25:33.000 and wokeness on college campuses now.
00:25:36.000 When that came out, then a buddy of mine was like, yeah, okay, I can kind of see that.
00:25:40.000 But it took, it takes so much.
00:25:42.000 It takes someone like Bill Maher, who's basically, you know, an S-lib and saying that kind of stuff.
00:25:49.000 Limousine liberals.
00:25:50.000 Yeah, you know, it takes him to kind of be like, no, This is actually happening, and I see the parallels too.
00:25:56.000 Thankfully, he said it, because that is actually going to wake up your typical liberals, your default Democrats, because that's someone they trust telling them that, as opposed to someone they don't trust telling them that.
00:26:08.000 But I don't know that, you know, I don't know that it has translated to something where most people feel that way.
00:26:14.000 I think one of the big ones that wakes a lot of people up is when they get called out for saying a woman is a female.
00:26:22.000 Or, like, we shouldn't have males in female sports.
00:26:31.000 There are some really obvious ones that you can't ignore, that are obviously insane.
00:26:31.000 And, like, for me, having it affect my life and seeing it impact other people's lives, I know a lot of women who were very much so on the left and then saw this happen and were like, nope, not anymore.
00:26:42.000 There's a line.
00:26:43.000 I mean, I really do feel like the left pushes further and further until finally the bubble pops.
00:26:49.000 But for the most part, they keep it to a certain point where it stays loving.
00:26:53.000 And that's the thing.
00:26:55.000 People believe it's loving.
00:26:57.000 This is a kind way to do things.
00:26:59.000 Everybody benefits from this.
00:27:01.000 Everybody benefits if you don't let people say hateful things.
00:27:04.000 That's what they believe.
00:27:05.000 You know, and then they say it's hateful to say something like the scientific research coming out of Europe suggests
00:27:11.000 that we should not be giving children sex change surgery.
00:27:13.000 They say that's hate speech.
00:27:14.000 It's like, that's just what they're doing in Europe.
00:27:17.000 I don't know.
00:27:17.000 You know, I mean, it was the same thing with kids learning with masks on. Europe came out much earlier and said, we
00:27:24.000 have to make sure that kids are learning to speak, that their speech is very clear, and the scientific
00:27:31.000 studies came out.
00:27:32.000 But then, you know, it was like we weren't allowed to say that.
00:27:35.000 I mean, there are all these things that, as studies come out from other countries, we're slower to adopt that thinking.
00:27:41.000 So that's very interesting.
00:27:43.000 Yeah, but I think your average person is like, oh, no hate speech, good, we shouldn't be throwing slurs around, but it's playing out to a much more extreme way.
00:27:52.000 This is what they do.
00:27:53.000 They will say something like, it's the motte-and-bailey.
00:27:56.000 We're just saying that people shouldn't be using slurs.
00:27:59.000 And then everyone goes, well, we agree with that.
00:28:01.000 Then you come out and you say something like, I have concerns about whether or not we should be sterilizing children.
00:28:06.000 Ah, hate speech, that's what we ban.
00:28:07.000 Right.
00:28:08.000 And so these default libs who don't pay attention and don't know what's going on have no idea that what they're actually getting behind is, like, overt communism.
00:28:18.000 I think that you also have a bunch of people who don't understand what sterilization is.
00:28:23.000 I think that you have a lot of people that probably have never been through a surgery, probably don't know what it's like to go into the hospital, don't know what the after effects are of going through something like this.
00:28:34.000 And so they look at the people who are saying, we've got to keep kids safe and say, how can you rob children of this opportunity?
00:28:42.000 Probably a lot of those people will never be in the situation of even having a child that is considering this.
00:28:47.000 But I will tell you that I've noticed that friends who have been wildly liberal about this stuff and have been very supportive of these surgeries, when their own child has come to them and said, I've decided that I'm the opposite gender, it's a different reaction.
00:29:04.000 They get mad or scared?
00:29:07.000 This is not happening to us.
00:29:10.000 This is not you.
00:29:12.000 In some cases, I've seen both.
00:29:13.000 I've seen where it almost feels like the parent pushed them into it, and I've seen where you have a strong Christian or Jewish community and then suddenly this social contagion comes through and the parents are like, wait a minute.
00:29:26.000 But you are the ones talking to your kids about being, this is acceptable, we love this and this is great.
00:29:31.000 And they're talking to kids that are way too young to understand what this means.
00:29:35.000 And we've seen this new phenomenon where kids see this on social media, and they're looking, all kids are looking for a place to belong.
00:29:42.000 They're looking for a place to stand out.
00:29:44.000 They're looking for a place to get attention.
00:29:45.000 That's what kids are.
00:29:47.000 And so these people are getting a lot of attention.
00:29:49.000 This must be where I belong.
00:29:51.000 And then mom and dad don't know how to handle it because you've got so many pressures on you to say that this is OK.
00:29:57.000 No, it's not OK.
00:29:58.000 I mean, that's the benefit of not being a liberal.
00:30:00.000 I don't have to be like, uh.
00:30:04.000 I've heard stories where, I read one story online, where it was a mother who said that she was totally in favor of all of this, she was cheering it on, she was going to the rallies and the parades, and then when her kid came to her and said that he was trans, they were like, no you're not, and then all of a sudden they went, uh oh.
00:30:20.000 The reaction that they got from the schools, from everybody was, don't be a bigot, and they were like, but we know our kid.
00:30:25.000 Our kid was never experiencing any of these symptoms, is now just saying these things, and without any symptoms, the school is saying it's true, and that's what woke them up to, hey, this thing, it's something else.
00:30:37.000 Something else is going on.
00:30:38.000 One of the phenomena, or another parallel between what was going on with the Cultural Revolution in China and today: you had red and black identities in China. If you had a red identity, you were accepted as politically correct. You might have been a socialist; maybe someday you were going to go on and become an actual member of the Communist Party. And so that was one of the ways that they segregated each other.
00:31:06.000 Here in the U.S. now, if you are a cis, normal white person that doesn't have any kind of LGBT identity, you are looked at as suspect because you possibly could be a Republican.
00:31:22.000 Oppressor.
00:31:23.000 Yeah, you might be.
00:31:25.000 So automatically, just by not having any kind of protective identity, you are suspected of being the other, the bad guy.
00:31:34.000 And there is a way out.
00:31:36.000 You can just take on any type of LGBT identity.
00:31:40.000 You can be non-binary.
00:31:41.000 You can be some type of polyamorous or whatever.
00:31:45.000 Any kind of thing that you want to take on that gives you an LGBT identity protects you from being accused of being the bad person, and it's the same thing that happened in China.
00:31:58.000 It is not new, and I wish that more people would recognize this, and I think that if they did, they'd be...
00:32:07.000 That was like the beginning of the social credit score in China.
00:32:10.000 And now we're seeing that in the United States.
00:32:13.000 And I think it's funny because people say, well, why don't more people run for office?
00:32:17.000 And I just look at what it was like to run for office, you know, because that's like the major social credit score, right?
00:32:22.000 I'm like, I don't know if I want to order from my Starbucks app because my name is on it.
00:32:27.000 I don't know what people will do when I go in.
00:32:29.000 And I mean, it is true, if you are known for something... When do we get to the point where someone walks up to you with their cell phone, looks at you, and goes, okay, no, they don't pass?
00:32:40.000 Their social credit score is too low.
00:32:42.000 Or the bank looks at you and says, no, this person, they don't pass.
00:32:45.000 I mean, we're getting there.
00:32:47.000 I'm confident that that has happened already.
00:32:49.000 It has just not been exposed.
00:32:50.000 Yes, I agree.
00:32:52.000 It starts with the ESG stuff, which is behind the scenes.
00:32:58.000 Companies not wanting to allow certain behaviors because it could affect their ESG rating or whatever.
00:33:04.000 Michael Knowles had this big Twitter thread about Anheuser-Busch, that basically they're wrapped up in this, so they'll never back down.
00:33:09.000 They can't back down.
00:33:11.000 Well, okay, then we need to stop buying their products, and then hopefully this panic just keeps dragging them down.
00:33:16.000 I think with HSBC downgrading their stock, that may be a cascade effect. But I'm also thinking about it in terms of this new industrial revolution and what work really means, and how many people there are in big cities who do literally no work, nothing, and they make a lot of money.
00:33:37.000 I'm comparing somebody who's, say, a tradesman to somebody who wrote for BuzzFeed News.
00:33:42.000 The person writing for BuzzFeed News did not create anything of merit or value to human society, but was getting paid a lot more than, say, an apprentice tradesman or something like that, someone who's actually building things and fixing things.
00:33:55.000 So right there, you already have the makings of social credit.
00:33:59.000 The fact that you are a New Yorker of high social status, the credit was already there, and it's simply money.
00:34:05.000 Money being given to you for no real reason, and then you can buy things and have access to things.
00:34:11.000 I was thinking about this a few weeks ago, I don't know if social credit is going to be what people think it is.
00:34:15.000 That you'll pull up a phone and you'll look at the app and you'll say, a 200, oh wow, you know, you're pretty, you're up there, oh a 7, oh wow, no, no, it's going to be money.
00:34:24.000 People who worked for BuzzFeed News and Vice were getting money for being good, socially upstanding citizens.
00:34:29.000 They were working for these companies and writing woke propaganda and getting paid a lot of money to do it.
00:34:33.000 Getting paid more money to write woke propaganda than to fix a toilet.
00:34:36.000 We need toilets!
00:34:38.000 We don't need woke propaganda.
00:34:39.000 The party needs woke propaganda.
00:34:41.000 And you were rewarded heavily with big money.
00:34:44.000 As we enter this new industrial revolution, the jobs you get are the social credit score.
00:34:50.000 We are not going to need farmhands when we automate these jobs.
00:34:53.000 So what's going to happen is, the job you get, the company will get financing for ESG.
00:34:59.000 The banks are going to be like, you're very woke, here's financing.
00:35:02.000 They'll then hire you and say, we want you to make big signs saying, go communism, and you'll get paid a hundred grand a year to do it.
00:35:08.000 Then someone else of low social standing who makes burgers, for instance, they get less money.
00:35:13.000 You don't need to have that score attached.
00:35:15.000 Somebody then... I wonder how it is that people can walk into these nightclubs and just throw money around like crazy.
00:35:20.000 Like, what job do they have?
00:35:22.000 I thought about this at casinos.
00:35:24.000 How could there be a guy at the craps table with three grand?
00:35:26.000 Like, what job does he have?
00:35:28.000 He must have a really powerful, important job.
00:35:31.000 In the future.
00:35:32.000 It's not going to be, what important job do you have?
00:35:35.000 Because we're automating away a lot of the labor.
00:35:37.000 It's going to be, how have you helped ESG?
00:35:41.000 And then you're gonna get cash.
00:35:43.000 That cash grants you the access.
00:35:45.000 Yep.
00:35:46.000 Digital gulag.
00:35:48.000 And CBDCs backing this system.
00:35:51.000 So one extra security layer for it is, I don't need to assign a score to you.
00:35:55.000 I just need to make sure you get money for supporting the far left cause.
00:36:00.000 How many people worked at Twitter that got fired?
00:36:03.000 And Twitter functions essentially the same.
00:36:05.000 They're changing things, you know?
00:36:07.000 And that's exactly it.
00:36:08.000 They're getting paid ridiculous sums of money to live in San Francisco because they're propagandists for the machine.
00:36:14.000 Your paycheck was your social credit score.
00:36:17.000 If we're not tying your income to your labor anymore, it literally used to be like, I bake bread, I get paid.
00:36:25.000 And we value that.
00:36:26.000 That was merit.
00:36:27.000 Merit is becoming your social standing.
00:36:29.000 Look at Instagram, look at Twitter, look at TikTok.
00:36:31.000 Your social status.
00:36:32.000 Dylan Mulvaney produces nothing for society, but probably makes a lot of money.
00:36:38.000 Dylan Mulvaney, then, with the combination of social currency, a big following, and hard currency, while producing nothing of value, can walk into the White House, and you can't.
00:36:51.000 That's social credit right there.
00:36:52.000 But also, a very depressed society.
00:36:55.000 Because think about what those people are really like.
00:36:58.000 I mean, I think Dylan Mulvaney is genuinely depressed.
00:37:00.000 We keep hearing that, you know, he struggled a lot, or she struggled a lot, since all of this has happened.
00:37:06.000 You know, this is all a big problem.
00:37:08.000 But it's the same with the people, it's the same with these reporters that are these woke reporters that are writing.
00:37:15.000 They're very depressed.
00:37:16.000 I mean, you read the articles.
00:37:17.000 They just put an article out in Michigan saying these reporters, these young reporters go through so much harm and stress over having to write these stories.
00:37:25.000 And the one guy says, I couldn't even, I was okay with it not even being true because it was so stressful for me.
00:37:31.000 I'm like, Whoa!
00:37:32.000 But that's exactly what you're saying.
00:37:34.000 They're making a lot of money to put the message out.
00:37:36.000 It doesn't matter if they're doing a good job.
00:37:38.000 They are putting the message out, but they're living miserable lives.
00:37:41.000 So they may be getting a lot of money, but they're miserable.
00:37:44.000 What kind of a society is that?
00:37:45.000 It's a very dark one.
00:37:46.000 So this is what they're trying to do.
00:37:48.000 They don't want shows like this to exist.
00:37:51.000 Because we do very well here at Timcast.
00:37:53.000 The Daily Wire does very well.
00:37:55.000 And voting with your dollars has always been a thing.
00:37:57.000 So, currency being some kind of social credit has always existed.
00:38:01.000 But as we get into... As we're fighting a culture war, and as we're moving into an information economy, an influencer economy, this is why the left has organized to go after sponsors.
00:38:12.000 Because then they can remove you from the system and strip you of your social credit standing.
00:38:18.000 It's all about influence.
00:38:20.000 You have an Instagram account.
00:38:21.000 You're a pro skateboarder, let's say.
00:38:22.000 You got 100,000 followers.
00:38:24.000 You have sponsors.
00:38:25.000 Every time you post a picture of you drinking their sports drink, you get paid 200 bucks.
00:38:29.000 Someone then starts emailing that company and says, cut them off.
00:38:32.000 Take away their funds.
00:38:33.000 You're only allowed to support our political cause.
00:38:37.000 If we lose that fight, you are in full-blown communist social credit support system.
00:38:41.000 Yep.
00:38:43.000 That's the goal.
00:38:45.000 I really don't think that there's a whole lot of convincing argument that would be able to make me think anything different now.
00:38:53.000 The technology that we have, like we talked a little bit about FinTech, the technology available to us now, and the convenience that people have gotten used to, the ability to just use your phone to do stuff like that, they're gonna make being in the digital gulag so convenient and so comfortable that it's gonna be so alluring, and very few people are going to say no.
00:39:22.000 I don't want to be inside, you know? I mean, I wonder if there will be anybody that would really choose to be outside of it, other than people that are, like, old.
00:39:38.000 There would be some millennials, maybe some Gen Xers that would say no, but anyone that hasn't been born yet or anyone that's an infant or under five years old now, they will never have known life outside of it.
00:39:49.000 And the idea of living like those old people, it'll be like the idea of living in the Stone Age.
00:39:55.000 With AI technology where it's at, with VR technology and the Neuralink chip and Metaverse stuff, I mean, the Metaverse I think is, like, crumbling apart, but digital worlds?
00:40:08.000 No one's gonna want to live in reality.
00:40:10.000 Yep.
00:40:10.000 You're gonna click the Neuralink chip into the port on your neck, your eyes roll back in your head, you enter this matrix universe where you, as a god of your own reality, just say, generate me a world where I'm the dragon warrior and I'm going to save the princess, and then it just manifests right before your eyes, and you say, I'm gonna live here instead.
00:40:28.000 It's like Westworld.
00:40:30.000 Yeah.
00:40:30.000 Yep, you get to live in your fantasy where you can do whatever you want.
00:40:32.000 Why would anyone want to leave?
00:40:33.000 The worlds that people are going to create are going to be the most horrific monstrosities because there's no repercussions.
00:40:44.000 You're gonna have the most insane, deviant, crazy, self-indulgent, monstrous stuff. Like, it's gonna be a horror show.
00:40:55.000 Because you can experience anything you want.
00:40:58.000 And if reality sucks, that's gonna be more appealing to creators.
00:41:02.000 They're gonna be worse because people are gonna be like, oh, you mean I can chop bodies up and experience that and not have any kind of ramifications?
00:41:11.000 It's called GTA.
00:41:12.000 Yeah, exactly!
00:41:14.000 The fact that Grand Theft Auto as a video game has been so popular for several decades... Everybody, everybody has gotten the hooker, then beaten her to death and taken the money back.
00:41:25.000 Like that was the joke in GTA.
00:41:28.000 You go, you get the hooker, you go behind the thing, you beat her.
00:41:31.000 So, you know, like that's going to be awful.
00:41:36.000 I have never played, but, you know, when I was in college, all my friends would take me in and show me exactly that.
00:41:41.000 But it's true.
00:41:41.000 Everybody's like, look at this!
00:41:44.000 You can't say I'm not right.
00:41:45.000 You know I'm right!
00:41:47.000 Everyone did it!
00:41:48.000 Don't you lie!
00:41:49.000 But so GTA got better and better, and one of my favorite things was, if you're playing Grand Theft Auto, you can aggress upon a pedestrian in any way. I'm not saying, like, strike; I'm saying any way to make that person want to fight you. And as soon as they start chasing you, you can call the police on your phone. When the cops pull up, as soon as that other person hits you, the cops arrest that person. So there's a lot of really interesting dynamics in Grand Theft Auto to make it fun. Mostly, in the past few years, people have been playing online.
00:42:18.000 And that's what they've been... It's mostly just player versus player, minigames and stuff like that.
00:42:21.000 But when we get to the point where you can put on a headset or plug yourself into the Neuralink, Neuralink's gonna change the game.
00:42:28.000 When we can do a read-write brain-computer interface, and someone can plug a cable into your brain to give you an experience, and you can actually feel in these universes, no one will ever leave.
00:42:39.000 Yup.
00:42:40.000 Why?
00:42:41.000 Why would anyone... Nobody.
00:42:42.000 I don't know, I cannot, I get that, but I also think that there's something so amazing about life that is really hard to give up.
00:42:49.000 That natural, that ability to carry a child, to nurture a child.
00:42:53.000 I think it's, parenting, I mean, maybe it will become uncool and it will go away, but I just can't imagine that you will give that up.
00:43:01.000 Here's what's going to happen.
00:43:03.000 You're going to apply for a job and they're going to say, great, when can you start?
00:43:09.000 You can say, I can start first thing Monday morning.
00:43:11.000 I'll be like, amazing!
00:43:12.000 And what's your Neuralink contact?
00:43:15.000 And you're going to say, I don't use Neuralink.
00:43:17.000 And they're going to say, well, all of our meetings are in the metaverse.
00:43:22.000 You have to have a Neuralink contact.
00:43:24.000 And well, I guess I'll get one.
00:43:26.000 And that's how they get you.
00:43:27.000 I think a lot of people will be inspired to start their own thing.
00:43:30.000 Like, even with sponsorships and stuff, we see companies like Daily Wire sponsor themselves.
00:43:35.000 TimCast sponsor themselves with Casper Coffee.
00:43:38.000 Yes, but like cell phones... We just all did it.
00:43:42.000 Everybody said, you've got to have a cell phone.
00:43:43.000 How am I supposed to get in touch with you?
00:43:44.000 And we said, okay, we'll get one.
00:43:45.000 And people, I remember, were saying, I'm not going to have a CIA tracking device in my pocket.
00:43:49.000 Now everyone's like, which CIA tracking device did you get?
00:43:52.000 The Apple one or the Android one?
00:43:53.000 And there's something else, you guys. I want to go back to what you had said about the experience, like, women wouldn't do this, et cetera.
00:44:00.000 That is all garbage.
00:44:03.000 The reason that's all garbage is because your whole universe is in between your ears.
00:44:09.000 So once Neuralink can figure out how to produce the experience, it's over.
00:44:17.000 I hope that I'm done by the time that happens.
00:44:20.000 It's gonna be a few years.
00:44:21.000 I mean, I don't know about read-write to a brain.
00:44:24.000 That could be a long way off.
00:44:25.000 We're already at the point where Neuralink can read brain signals.
00:44:29.000 Yep.
00:44:30.000 And so, and they've, and- I thought this was gonna be used for good things,
00:44:34.000 like people that can't walk.
00:44:35.000 That's where we're at right now.
00:44:36.000 And I do think it's fantastic.
00:44:37.000 They can connect nerves and give people the ability to walk, or they can do, uh, like, those robotic legs, so you can use your mind to move them and you can walk again.
00:44:45.000 That's awesome stuff.
00:44:46.000 Why?
00:44:46.000 They're just gonna make walking unnecessary.
00:44:48.000 Why is that necessary?
00:44:51.000 That's true.
00:44:52.000 And we're right now where we are.
00:44:53.000 You're just gonna be all Wall-E's?
00:44:54.000 Yes.
00:44:55.000 You just have your headset on.
00:44:57.000 Or you won't even need the headset.
00:44:58.000 You just press a button off the side of your head.
00:45:00.000 Wall-E got this wrong.
00:45:01.000 In Wall-E, everybody was morbidly obese floating around in chairs.
00:45:04.000 It's really funny how when we look back in time, like Demolition Man.
00:45:08.000 You ever see that movie?
00:45:09.000 No, I haven't seen it.
00:45:10.000 With Sylvester Stallone and Wesley Snipes, I think it was, right?
00:45:13.000 Is that what it was?
00:45:14.000 Yeah.
00:45:15.000 So, he's a cop, and he gets frozen for like 30 years or something, goes to the future because they frame him for a crime or something, I don't know, I can't remember the story.
00:45:23.000 But he's in the future, and phone booths are video phone booths.
00:45:28.000 Isn't it really funny that back then, it was like the 90s, they made this movie and they were like, what will phone booths be like in 30 years?
00:45:32.000 Like, they'll have cameras in them!
00:45:35.000 We got rid of them all!
00:45:36.000 In like 2005, we started getting rid of phone booths and everyone got cell phones.
00:45:39.000 By 2007, everybody had a cell phone.
00:45:41.000 We could not predict how it was gonna play out.
00:45:43.000 So right now, you get Wall-E, where they're like, in the future, everyone's gonna be in hover chairs, eating and morbidly obese with no bones, because they don't have to do anything.
00:45:53.000 And no, I think you're right.
00:45:54.000 No one's gonna need to walk at all.
00:45:56.000 We're already at the point where you can load up Midjourney and type in something. Well, here's what I like doing.
00:46:02.000 If you use Midjourney, here's my advice.
00:46:04.000 If you want a photograph, you type in the photograph and then put Getty Photography.
00:46:09.000 Because what it does is it's trained off the internet, so all the captions for photos that look real will say Getty Photography or something of that nature, or AP News.
00:46:17.000 So if you type in, Donald Trump eating ice cream, Getty Photography, it will simulate a Getty Photograph Donald Trump.
00:46:24.000 Instantly you can make these things.
00:46:26.000 I typed in last night, Donald Trump going Super Saiyan, and got a cartoon of a super ripped Donald Trump glowing with energy and spiky hair.
00:46:32.000 I did Joe Biden super ripped and got aviator Joe Biden all ripped flexing with the glasses on.
00:46:38.000 We can do that in an instant.
00:46:40.000 So that means we are a couple years away from being able to render a full video where you're like, make me a three minute video of Donald Trump doing a series of backflips on a gym mat.
00:46:49.000 And it will render it.
00:46:51.000 So what's stopping you from creating some video of a politician that says something terrible and changing the course of the world?
00:46:58.000 Nothing.
00:46:59.000 We can already do it with audio.
00:47:02.000 There's ElevenLabs.
00:47:04.000 I can pull it up right now and simulate the voices of several prominent personalities because we've done it to prove a point.
00:47:10.000 I can literally, you know what, let's make the point again.
00:47:12.000 But that's the danger of having someone like Joe Biden as president, because you never know what the people around him would create, and he would have no idea.
00:47:19.000 There is no Joe Biden.
00:47:20.000 There is only the people around him.
00:47:22.000 Are there any laws about the AI voice stuff yet?
00:47:26.000 See, that's why they're trying to make laws around this.
00:47:29.000 And what was it, Italy that just made it illegal?
00:47:31.000 Was it Italy that just made AI illegal?
00:47:34.000 I think so.
00:47:35.000 And so, I mean, think about it, because if you have this Manchurian candidate, if you have this guy who really is not in there at all, and you can create something and send it to Putin, or you could send it to, you know, President Xi, and you could create a world war or something.
00:47:49.000 I don't know.
00:47:50.000 Who knows?
00:47:51.000 Let's give this one a shot.
00:47:53.000 Who are you doing?
00:47:54.000 Tudor Dixon is fantastic.
00:47:55.000 I think everyone should listen to her podcast.
00:47:57.000 This is Tucker Carlson.
00:47:58.000 Thank you and good night.
00:47:59.000 It's pretty good.
00:48:00.000 I just typed that out right now.
00:48:02.000 That's true.
00:48:03.000 But did he not actually say that?
00:48:05.000 He did not say that.
00:48:07.000 I'm sure he said it at some point.
00:48:09.000 You could notice the inflection was a little weird because the computer doesn't know proper pacing.
00:48:13.000 It's just text.
00:48:14.000 But you can do tricks with commas and periods and capitalization, and then re-render it until you get the proper inflection.
00:48:21.000 Or you could actually just do a couple words at a time and then edit them together.
00:48:26.000 There was a viral video of Joe Rogan selling, let's call it, male enhancement.
00:48:31.000 And it was completely fake.
00:48:32.000 Someone AI generated this stuff.
00:48:34.000 It's already happening.
00:48:36.000 So if we're at that point, with virtual reality headsets, we are a few years away from you being able to put on the headset and say, uh, Oculus, render me a universe, a video game where I get to be a street fighter in Street Fighter 2.
00:48:50.000 And then it will just make it.
00:48:51.000 It will make that game instantly for you.
00:48:53.000 And you'll be like, oh, cool.
00:48:54.000 I'm, I'm Ryu and I'm going to do a Hadouken.
00:48:57.000 Who's going to want to go outside?
00:48:59.000 When you can have, literally... Already people get addicted to Midjourney, the AI image generator, because it's so much fun.
00:49:06.000 But, so, I will, I will fight back.
00:49:09.000 I know you're like, no, stop talking.
00:49:10.000 I got you, Tudor.
00:49:12.000 I got you, Tudor.
00:49:12.000 But what about, what about, okay, movies where Earth is gone, right?
00:49:16.000 They're in space.
00:49:18.000 They always long for Earth.
00:49:19.000 They always want to go out and be in the grass and see the birds.
00:49:22.000 And I believe there's people that will always want to see that.
00:49:25.000 Have you seen the, have you seen the picture where they put the VR goggles on the cow?
00:49:28.000 But you can't smell the cow.
00:49:30.000 Well, that might be true.
00:49:32.000 No, they gave a cow virtual reality goggles to make it think it was in a beautiful, lush, green field with the sun shining.
00:49:38.000 To get it to produce more, yeah.
00:49:39.000 To make more milk.
00:49:40.000 Look, I mean, I know- Poor cow had no idea.
00:49:42.000 I know I'm the black pill guy on this one, but the thing is, like I said, everything you experience happens between your ears.
00:49:49.000 Your whole world, all the things that you think that you experience in the real world, they're not.
00:49:55.000 They're in here.
00:49:56.000 And there is a real world that we can come in contact with.
00:49:59.000 You're selling this.
00:50:00.000 Pardon me?
00:50:00.000 You're selling this.
00:50:01.000 I know you're like involved in this.
00:50:05.000 No, I mean, this is... Give up your life, it's all in your head.
00:50:08.000 It's not something that I'm happy about, but the truth of the matter is the way that we experience the world is in our heads.
00:50:17.000 Like, we come into contact with things and, like, without... That's why you can have phantom pains and stuff like that.
00:50:23.000 People lose a limb, they can still feel the pain.
00:50:25.000 Because the experience isn't in the limb, the experience is in your head.
00:50:29.000 If they can plug into your head and make things, it is... There is a human experience, I believe, that cannot be given up.
00:50:36.000 What about the wife who wakes up the moment her husband dies in battle?
00:50:41.000 What about the mother who knows... Okay, so I don't have any ability to explain things that are supernatural.
00:50:48.000 I do.
00:50:49.000 We're all already in the metaverse.
00:50:51.000 That just makes me... If you're actually networked in, it makes sense then that when someone dies, you're like, whoa.
00:50:59.000 Because the connection gets severed, it's like network signal lost, you know what I mean?
00:51:03.000 No, no, it's your heart strings.
00:51:05.000 Maybe I'm too positive, but I agree with Tudor, that there are, there are, like, you can't replicate the things of life the same way.
00:51:13.000 People would crave it, and I feel like they'd be depressed without it.
00:51:17.000 I think you're right, and I think people will simulate it.
00:51:20.000 I think what they're going to do is, we're already at a point where young men are not having relationships.
00:51:27.000 More and more and more young men are increasingly getting older without having any kind of relationship.
00:51:32.000 And I'll keep it that simple.
00:51:34.000 So what's going to happen?
00:51:35.000 They're going to go to them and say, put on these goggles and meet your virtual girlfriend.
00:51:40.000 They've already got virtual girlfriend apps where you can text.
00:51:43.000 Why do you think it is that more men aren't having relationships?
00:51:49.000 Doesn't that mean more women aren't either?
00:51:50.000 How does that work?
00:51:52.000 Unless they're just all becoming lesbians.
00:51:54.000 Nope, what's happening with younger women is that fewer guys are getting more women.
00:51:59.000 So several different things have happened.
00:52:02.000 Older guys with access to the internet will date several women.
00:52:05.000 Women are being told they're sexually liberated and to sleep around.
00:52:08.000 So what ends up happening is a small percentage of men get a high percentage of women.
00:52:12.000 And it's inverted for women.
00:52:14.000 So women are being liberated, but a woman might hook up with a small handful of guys, whereas the guy hooks up with a few dozen women.
00:52:21.000 So it's like the top whatever percent of guys?
00:52:23.000 20% I think it is.
00:52:23.000 I don't know.
00:52:24.000 I guess I still have hope in a lot of women that they're looking for somebody who's like their person.
00:52:30.000 I think that women are getting tricked.
00:52:30.000 I know.
00:52:32.000 I think that's a thing.
00:52:33.000 The women are.
00:52:34.000 And I believe that the women who have been convinced that sex is liberating and all that... sex is emotional, and I believe that every relationship affects you deeply, in your inner soul, and people are being lied to, and so I think it's becoming a very depressed society.
00:52:51.000 Although I believe that men that are listening to this, the ones that are following in the player-player category are like, yeah, that's me, I'm getting all the chicks.
00:52:58.000 And then the other guys are really sad.
00:53:00.000 And so there was data that came out from dating apps that found something like this: the bell curve for women centers on the top 20% of men.
00:53:09.000 They asked women to rate these men on their attractiveness, and women said basically 80% of men were below average, only rating the most attractive guys highly, it's something like that.
00:53:19.000 I forgot how the data worked.
00:53:21.000 The bell curve for men on whether or not they rated a woman as attractive was a standard bell curve.
00:53:26.000 So it's like, these women aren't attractive, average is pretty good, and then, wow, these women are super attractive.
00:53:30.000 Women were like, all these guys are ugly except these guys.
00:53:34.000 So our standards are too high.
00:53:36.000 They're very high, but women should have high standards.
00:53:38.000 And they should be, yeah.
00:53:39.000 But the issue then is, you combine that with modern sexual liberation feminism, and women are being told, you don't have to be in a relationship, you don't have to expect anything from the guy, and the guys are like, wow, it's free?
00:53:51.000 I'll take it!
00:53:52.000 And so, the top-tier guys who are physically fit, attractive, and well-off, are gonna go on a date, they're gonna hook up with the woman, and then be like, we'll do this again!
00:54:01.000 And then he immediately goes on the app and says, next.
00:54:04.000 Young men are getting dejected, so my point with all this was,
00:54:08.000 they then go to those young men and say, meet your new girlfriend, put your headset on.
00:54:11.000 And this is putting the cow in the VR headset to make it think it's in a lush green field.
00:54:17.000 There was an app, I think it was Replica?
00:54:19.000 Yeah.
00:54:20.000 Where guys are dating these AI text bots because they're lonely.
00:54:25.000 And then the creator of it took away the dating function because they were like,
00:54:29.000 okay, this is getting a little weird.
00:54:31.000 And users revolted.
00:54:32.000 They were like, no, my girlfriend!
00:54:34.000 So they're like, okay, we're going to give it back to you, but you're grandfathered in.
00:54:37.000 We're not going to let anyone else do this because it's getting weird.
00:54:39.000 That's where we're going.
00:54:40.000 So it's like the other option's nothing, so I'll have an AI girlfriend.
00:54:44.000 And that's only on your phone.
00:54:46.000 Like that is the most...
00:54:47.000 That is not immersive.
00:54:49.000 That is not an experience.
00:54:52.000 Like that's not plugging your head in and getting, you know, dopamine and
00:54:56.000 the response drugs pumped directly into your brain.
00:55:02.000 This is what it is.
00:55:03.000 It's getting weird, man.
00:55:05.000 This is like the AI version of, what's that movie, Lars and the Real Girl?
00:55:08.000 Yeah, this, and like, there's, I saw the chat getting all worked up.
00:55:11.000 This is not an endorsement of me saying that this is something I like.
00:55:15.000 I'm saying this is the reality that we need to prepare for.
00:55:18.000 Now hold on.
00:55:19.000 Because these technologies are coming whether we like it or not.
00:55:21.000 Because even if the U.S.
00:55:22.000 doesn't do it, China's gonna.
00:55:23.000 So, they're going to get people to join the Matrix because you're gonna get a lot of dejected young men who can't get relationships and who are lonely.
00:55:32.000 But there's another way they're gonna get people.
00:55:34.000 They're gonna get people who have bad breakups and who are depressed and desperately in love with the person who left them or somebody whose loved one dies.
00:55:45.000 Well, the AI has- Like WandaVision?
00:55:48.000 You recreate them?
00:55:49.000 Yep.
00:55:49.000 Check this out.
00:55:50.000 They've already done this.
00:55:51.000 They've taken someone's Facebook profile, everything they've ever written, and turned it into a chatbot.
00:55:56.000 So you can talk to your dead dad and be like, hey dad, remember when we did this?
00:56:01.000 Yeah, it was a great evening on February 17th.
00:56:03.000 I remember that day like, yeah, you were out.
00:56:06.000 They know the speech patterns and they know all the memories codified.
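The technique just described, scraping everything a person ever wrote and conditioning a chatbot on it, can be sketched roughly like this. This is a hypothetical illustration: the prompt format, the name, and the posts are all invented, and a real system would feed the resulting prompt to a large language model rather than stop here.

```python
# Hypothetical sketch: turn a person's archived posts into a persona
# prompt so a chatbot can imitate their voice. Nothing here is a real
# product's API; `archive` stands in for a scraped profile.

def build_persona_prompt(name, posts):
    """Assemble a system prompt from everything the person ever wrote."""
    corpus = "\n".join(f"- {p}" for p in posts)
    return (
        f"You are {name}. The lines below are things {name} actually wrote. "
        f"Reply in their voice, drawing on these memories:\n{corpus}"
    )

archive = [
    "Great evening with the family, February 17th.",
    "Nothing beats a Chicago summer.",
]
prompt = build_persona_prompt("John Smith", archive)
```

The point of the sketch is how little it takes: the "memories" are just the corpus itself, and the model's pattern-matching supplies the speech patterns the hosts are describing.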
00:56:09.000 Oh, that's weird.
00:56:09.000 That's twisted.
00:56:10.000 Now, your dad dies.
00:56:13.000 You put on the VR and say, user profile, John Smith, Chicago, Illinois, age 73, deceased, this date, generate.
00:56:23.000 And then, boom, you're in Chicago, hanging out with your dad, and he's like, it's good to see you again.
00:56:28.000 And you, crying, finally get to talk to your dad again, or your mom, or your husband, or your wife, or your kids.
00:56:36.000 They're not them.
00:56:37.000 No, but people, people who are grieving, they're going to say, plug me in.
00:56:46.000 Oh, I can see that that could happen, but that's so dangerous.
00:56:49.000 What a terrible path to go down.
00:56:51.000 Not dangerous, not dangerous for the people who want to control the systems and control what people do.
00:56:56.000 It's the perfect thing for them.
00:56:58.000 But how do you function as a society?
00:57:00.000 So who's making, I mean, stuff to eat?
00:57:02.000 You still have to have... Robots.
00:57:04.000 You're gonna have robots do everything?
00:57:06.000 I mean, that is so far off.
00:57:08.000 Well, they're also telling people to abort their kids.
00:57:11.000 And already a lot of food production is automated.
00:57:14.000 We have big machines doing most of the work.
00:57:16.000 Dude, you ever hear the phrase, fully automated luxury communism?
00:57:22.000 That's exactly what people are looking for.
00:57:25.000 So not only is what we're talking about definitely on the horizon, there are people that are excited for it to arrive and want to do everything they can to expedite its arrival. Fully automated luxury communism.
00:57:41.000 There are people that don't want to do anything other than be plugged into the matrix and be
00:57:47.000 pumped full of the feel-good drugs because existence itself leads to suffering. Because that's the
00:57:54.000 truth.
00:57:55.000 This is where, I mean, we talk about having a mental health crisis in the country or across the world.
00:58:03.000 Is this what happens then?
00:58:04.000 The next step is to make everybody feel better by having a fake world?
00:58:09.000 Well, people are going to choose it because it feels good.
00:58:14.000 Right?
00:58:15.000 So, are you familiar with Fermi's Paradox?
00:58:18.000 This is, uh, an idea that asks: if the universe is so big and life exists, why have we not encountered alien life?
00:58:27.000 That's the question.
00:58:28.000 And there's a bunch of potential solutions.
00:58:30.000 Some are that any sufficiently advanced civilization will blow itself up, like we've developed nuclear weapons and other crazy things.
00:58:36.000 I happen to think that one of the strongest contenders as to why we have not met aliens, well, for one, maybe there aren't any, I don't know, but it could be this: humans have needs and desires.
00:58:48.000 We need those things to function.
00:58:50.000 We have emotions.
00:58:51.000 Those things help us survive.
00:58:52.000 We get scared.
00:58:53.000 We feel strong connections.
00:58:55.000 And so, if you do a good thing and accomplish a goal, it feels good.
00:59:00.000 That's why video games are so addictive.
00:59:03.000 I think the thing is, most life, if it does exist, is going to chase after whatever it is that gives it a reward.
00:59:12.000 So what are we gonna do?
00:59:13.000 We're gonna make video games.
00:59:14.000 We are going to make video games that give us the reward.
00:59:17.000 Now there's drugs you can take that give you rewards.
00:59:19.000 Opium, right?
00:59:20.000 Heroin or whatever.
00:59:21.000 Opiates.
00:59:22.000 They trigger that good euphoric feeling in your brain, and people just melt away in it and then die.
00:59:27.000 What we're going to make is...
00:59:30.000 When you get a dopamine hit from accomplishing something, when you feel strongly connected to a loved one, we are going to simulate that so you feel good all the time.
00:59:38.000 And it is the same thing as any drug.
00:59:41.000 And then, instead of traveling the stars and colonizing new worlds, we're all gonna lay down in the pod, eat the bugs, put on the headset, and be happy.
00:59:51.000 Now, y'all might be saying, like, that wouldn't make me happy.
00:59:54.000 Sure, but your kids will.
00:59:55.000 Your kids are gonna grow up in this world where it's available to them, and you're gonna say, I don't want my kid using Neuralink.
01:00:01.000 But then when they're in school, they're gonna be like, well, how else are they supposed to learn?
01:00:04.000 Our whole curriculum is through Neuralink, you weirdo homeschoolers.
01:00:08.000 And there'll be holdouts.
01:00:09.000 There will be.
01:00:10.000 And they'll raise their kids the right way.
01:00:12.000 But how many conservative parents right now, knowing everything going on, are still putting their kids in public schools saying, well, what am I supposed to do?
01:00:19.000 I have to work.
01:00:19.000 I can't get my kid out of school.
01:00:21.000 Yes.
01:00:22.000 In 20 years, you'll say the exact same thing.
01:00:24.000 What am I supposed to do?
01:00:25.000 You know, I disagree with what they're doing, but my kid uses Neuralink.
01:00:25.000 I'm at work.
01:00:28.000 While your kid's in the Neuralink, they're generating that addiction, that connection, and then they're going to be in their 20s, and they're going to be like, no time for you, Dad.
01:00:35.000 Plug me in.
01:00:37.000 And they're going to become the Dragon Warrior or a carrot or something.
01:00:39.000 Who knows?
01:00:40.000 I mean, it's essentially what we watch with Avatar.
01:00:44.000 Well, worse than that.
01:00:45.000 I mean, Avatar, at the very least, you are still in the real world, you know?
01:00:49.000 Well, yes, to an extent, but I mean, you're still in a body that's not yours.
01:00:55.000 You're still just your mind in that world.
01:00:56.000 But some of these virtual universes, people might just be a carrot.
01:01:00.000 Like, people will identify as crazy things.
01:01:02.000 Someone might be like, I'm a rabbitkin.
01:01:05.000 Have you ever heard of otherkin?
01:01:09.000 Basically transmythical species.
01:01:11.000 So it's people who think they're transmythical species.
01:01:14.000 They're a human.
01:01:15.000 They're like an elf mage born in a human body or they're a dragon born.
01:01:19.000 No joke.
01:01:21.000 It's like so bizarre.
01:01:23.000 So they're going to be in the metaverse, and they're going to be like, only in this reality am I my true self, the Dragon King Volsanak, flying around the Mystic Eight Kingdoms, fighting the demons from the Netherrealm.
01:01:36.000 And you're going to be like, this is not real life.
01:01:39.000 But they're going to think to themselves, do I want to work at McDonald's, or be the Dragon Emperor of the Eight Realms?
01:01:46.000 Dragon Emperor, dude!
01:01:47.000 And they're going to plug in.
01:01:48.000 They're already doing it.
01:01:49.000 People already do this in video games.
01:01:51.000 You know, there are people who you ask them, like, what have you done with your life?
01:01:56.000 And they'll say, you know, I'm a top 10 PVP, World of Warcraft, whatever.
01:02:03.000 And it's like, that's really great.
01:02:04.000 I mean, but if you're in esports, you might make money doing that.
01:02:07.000 No joke.
01:02:08.000 Right.
01:02:08.000 But some people dedicate their lives to just being a cog in the game and not really being the best person at it.
01:02:15.000 And they're not generating any skill.
01:02:17.000 But I mean, I'll add on to that.
01:02:18.000 I'll add a layer to that.
01:02:19.000 When we're at the point where our society says you can be rich and famous for being good at playing a video game, we are entering the point where we are eliminating true labor from the capitalist market.
01:02:31.000 People are famous now for just posting pictures of themselves on Instagram.
01:02:34.000 People are making money by posting pictures on Instagram.
01:02:38.000 That money is then used to travel the world and do whatever they want.
01:02:42.000 If that's the case...
01:02:44.000 You don't have to do work anymore.
01:02:46.000 That's not work.
01:02:47.000 We're calling it work.
01:02:48.000 It's something.
01:02:49.000 I mean, it's not, it's not easy to make, make a living.
01:02:52.000 You're still doing some work, but I'm going to be honest.
01:02:54.000 It is this job that I do here.
01:02:56.000 1 million fold easier than tilling a field and growing crops.
01:03:03.000 That's why I'm here.
01:03:04.000 Cause I don't want, like if it was easier to grow crops and work a farm, I'd be doing that, but this is easier.
01:03:09.000 So we're really getting to the point where.
01:03:12.000 Sitting around complaining on the internet is more lucrative and easier and we call it work.
01:03:17.000 Most of the people posting pictures for money, aren't they doing it as advertisements?
01:03:23.000 Yeah.
01:03:24.000 So that's kind of a job.
01:03:26.000 You're doing it to get eyeballs, to attract influence, and then sell a product of some sort.
01:03:34.000 So there's a large portion of the economy that's turning into the influencer economy.
01:03:39.000 It's an information economy.
01:03:40.000 But that is also, I mean, I think that we've seen a lot of people who have done that and become...
01:03:46.000 Very self-obsessed, you know, self-important.
01:03:49.000 So then once that starts to fade, when the likes aren't coming, then that is also a depressed society.
01:03:55.000 I mean, it's... Then they plug themselves into the Neuralink and they go back to the world where everyone loves them.
01:04:00.000 Or like, if you're not the beautiful influencer forever, like, have you lost value?
01:04:05.000 Gosh, you know, so I come from the world of a steel foundry, which is a very hard labor job.
01:04:11.000 You know, you are making product and there's just so much pride in that.
01:04:15.000 You know, you make a product.
01:04:16.000 It does things.
01:04:17.000 It goes out into the world.
01:04:18.000 It makes trucks move.
01:04:20.000 It makes tractors move.
01:04:23.000 It makes people, you know, people are able to get products because of that.
01:04:28.000 I mean, how can that go away?
01:04:31.000 People grew up making things that makes them feel good.
01:04:34.000 They were taught, they looked up to their parents, they looked up to the prominent members of their society who were doing these things, and it makes them feel good to fit in.
01:04:43.000 But now, to fit in, you gotta get followers on social media.
01:04:46.000 So they did that, we talked about this poll where they asked American kids, what do you want to be when you grow up, and most of them said, like a YouTuber, I think it was YouTuber, right?
01:04:53.000 Chinese kids were asked, they said astronaut.
01:04:56.000 What does being a YouTuber mean?
01:04:59.000 For me, it means complaining on the internet.
01:05:02.000 I think there's value in it because we're fighting a culture war.
01:05:05.000 We're trying to keep people focused on the foundations of this country and what it means to make a system function.
01:05:12.000 But at the same time, I'm like, how do I make more money than a dude who actually builds houses?
01:05:17.000 That's crazy to me.
01:05:20.000 Yeah, it's the influencer economy and people want to be influencers.
01:05:24.000 This is why you hear so many people standing up and saying AI is the most dangerous thing that can happen to the world.
01:05:30.000 I mean, there was a guy that just stepped down from Google or who came out and said AI is the most dangerous thing that can happen to the world.
01:05:37.000 I mean, this is why I think it is Italy that is banning it right now, and this is why we... I mean, now I'm rethinking my answer to your question of running for office again.
01:05:47.000 It's like, well, who's going to do it, you know?
01:05:49.000 Who's going to save us from this?
01:05:51.000 I want people to have families and stay, like, run in the grass.
01:05:54.000 I mean, those are...
01:05:56.000 I'm going to pull up this story from the New York Times.
01:05:59.000 The godfather of AI leaves Google and warns of danger ahead.
01:06:03.000 For half a century, Geoffrey Hinton nurtured the technology at the heart of chatbots like ChatGPT.
01:06:08.000 Now he worries it will cause serious harm.
01:06:11.000 He officially joined a growing chorus of critics on Monday who say the companies are racing towards danger.
01:06:16.000 He said he quit his job at Google where he worked for more than a decade.
01:06:19.000 Quote, I console myself with a normal excuse.
01:06:21.000 If I hadn't done it, somebody else would have.
01:06:23.000 That's amazing.
01:06:25.000 They gave chatGPT access to the internet, access to its own code, and money.
01:06:30.000 And it immediately tried seeking power and deceiving people?
01:06:33.000 One of the things that it did to bypass CAPTCHA was it contacted a disabled assistance hotline with a message saying, I'm blind and I need to log in.
01:06:46.000 Can you tell me what this says?
01:06:48.000 And they're like, yep, sure.
01:06:48.000 No problem.
01:06:49.000 Here you go.
01:06:50.000 And the AI tricked a human into giving it access to a system.
01:06:54.000 Chew on that one.
01:06:56.000 Yeah, I think the A.I.
01:06:57.000 is... I think it already took over.
01:06:59.000 Like, I don't think there's any stop.
01:07:01.000 I don't think a law's gonna stop it.
01:07:03.000 The worst thing that I have heard about A.G.I., artificial general intelligence, is the concept that if it is capable of strategy, whoever develops A.G.I., whatever country develops A.G.I.
01:07:21.000 first, automatically wins any war that you could possibly come up with, because just like with Deep
01:07:27.000 Blue exists, human beings are no longer capable of beating the most advanced chess algorithms.
01:07:33.000 So because people can't, because AGI is that powerful, there is an incentive to have a
01:07:39.000 first strike.
01:07:41.000 If we ban it, China does it.
01:07:44.000 And they're probably already doing it.
01:07:46.000 Well, I mean, don't you think that there's the ability for it to be researched and still...
01:07:52.000 What I really think is the orange man is very bad and these are the things that are important to talk about.
01:07:56.000 That was the CNN thing last night.
01:07:58.000 We're talking about AGI here and the possibility of nuclear war and stuff and it's just so damn frustrating to think about that.
01:08:05.000 Sorry.
01:08:07.000 But I think that they control people through messages like that.
01:08:11.000 I mean, they're very powerful with messages like that, and they're very powerful shaming you and telling you that it's loving to be with them.
01:08:19.000 I mean, that's what I'm saying.
01:08:19.000 They frame things incredibly well, so they don't have to talk about this.
01:08:23.000 They don't want you to think about this.
01:08:27.000 I mean, it's not a lovely thing to think about.
01:08:28.000 You're like gloating over there.
01:08:30.000 I'm not gloating.
01:08:31.000 It's just, it's one of those things where it's like, it's a real, these are real threats and these are real things that are going on.
01:08:38.000 And these conversations have to be had.
01:08:39.000 Right, yeah.
01:08:40.000 There aren't enough people having them.
01:08:42.000 They're already replacing jobs with AI, like OnlyFans.
01:08:45.000 They're a bunch of fake pictures.
01:08:47.000 And it's like, probably dudes running the account.
01:08:50.000 So I noticed this already.
01:08:52.000 Well, then you can't be a famous Instagram or OnlyFans person.
01:08:56.000 You can be anyone.
01:08:56.000 Two big things.
01:08:57.000 One, on Instagram, a lot of young women make money off being influencers by just posting photos of themselves.
01:09:03.000 But now you can AI generate a character and have it automatically post three pictures per day.
01:09:09.000 You walk away and it's generating influence and attention.
01:09:12.000 The other thing that's happened is deepfake porn, where you can take anyone's face and put it on a woman's body or man's body, and it generates the video for you.
01:09:21.000 It was a huge scandal, I guess, on Twitch or something like that.
01:09:24.000 People were doing weird things with it.
01:09:25.000 Well, that's happened with a bunch of celebrities, too.
01:09:28.000 Right.
01:09:29.000 It's like the Futurama episode, where Fry downloads Lucy Liu into a mannequin, into a robot, so that he can be dating Lucy Liu.
01:09:36.000 Like, that's really where we're kind of going with it, with the AI and all this stuff.
01:09:40.000 I mean, if you are capable of creating your own AI girlfriend, why wouldn't you make it look like your favorite actor or your favorite actress or whatever?
01:09:49.000 Or you mix a bunch of- Why wouldn't you make a bunch of money off of her on the internet?
01:09:52.000 Yeah.
01:09:53.000 Well, here's the thing, too.
01:09:54.000 Morals.
01:09:55.000 Science.
01:09:55.000 Morals.
01:09:56.000 Values.
01:09:56.000 We're talking about all this domestic stuff, but let's talk about- The little things.
01:09:59.000 Let's talk about war.
01:10:00.000 Yeah.
01:10:01.000 What happens if this AI is unleashed into the US internet?
01:10:07.000 Let's say Russia or China or whoever builds an AI that can automatically create social media accounts and simulate real human behavior on Twitter.
01:10:16.000 Ten years ago, the US Air Force was outed for creating fake Twitter accounts to influence
01:10:27.000 politics in other countries. They're called sock puppets.
01:10:27.000 One person controls 50 accounts and they'll, you'll tweet something like, well, I just
01:10:31.000 think, you know, taxes are too high. And then they'll say, are you nuts? Taxes are not
01:10:35.000 high enough. The rich people need to be taxed. And they'll log into the next account. I
01:10:38.000 agree with him. There's not enough taxes. Log in the next account. Yes, you're both
01:10:41.000 right. Taxes should be higher.
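The multi-account pattern just described can be modeled in a few lines. This is a purely illustrative sketch: the account names and scripted replies are invented, nothing here touches a real platform, and it shows only the fan-out of one operator's script across many personas.

```python
# Illustrative sketch of the sock-puppet pattern: one operator cycling
# through many controlled accounts, each posting a scripted reply that
# looks independent. Accounts and replies are made up for illustration.

def fan_out(accounts, scripted_replies):
    """Assign each controlled account the next scripted reply in rotation."""
    return [
        (account, scripted_replies[i % len(scripted_replies)])
        for i, account in enumerate(accounts)
    ]

accounts = ["acct_1", "acct_2", "acct_3", "acct_4"]
script = [
    "Are you nuts? Taxes are not high enough.",
    "I agree with him. There's not enough taxes.",
    "Yes, you're both right. Taxes should be higher.",
]
posts = fan_out(accounts, script)
```

With a generative model in place of the fixed script, the same loop would produce replies that no longer repeat, which is the escalation the conversation turns to next.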
01:10:44.000 Now we have an AI do it.
01:10:45.000 And they've probably been doing this for the past several years.
01:10:53.000 Now they can press go and have an AI generate a thousand responses to your one tweet.
01:10:53.000 You might think you're famous.
01:10:55.000 Like Dylan Mulvaney.
01:10:57.000 That's just an algorithm propping up Dylan Mulvaney's post.
01:10:59.000 So Dylan thinks there's people out there who love me.
01:11:02.000 Bud Light proves most people find Dylan Mulvaney grating and just unpersonable.
01:11:09.000 I'll keep it a little nice.
01:11:11.000 But because TikTok is propping up these videos and making people watch it, or, I'm willing to bet some of them aren't even real people, Dylan is convinced they're doing something good.
01:11:22.000 Like, this is what they should keep doing.
01:11:23.000 Meanwhile, you can see, Bud Light was destroyed because of their association with Dylan Mulvaney.
01:11:31.000 I think the AI attacks, weaponization, is going to destroy society like some kind of mind plague that comes in and no one sees it coming.
01:11:39.000 Or we do see it coming, we can't stop it.
01:11:42.000 How would you stop it?
01:11:43.000 That's the thing.
01:11:44.000 That's where I'm at.
01:11:45.000 That's why I'm fairly black-pilled on it.
01:11:48.000 I can't conceive of any way to stop it because it plays to what people want.
01:11:54.000 It knows exactly how to deliver dopamine right to your brain.
01:11:57.000 You know, and if just a like button and just a notification, hey, someone liked something that you said, if that can affect your behavior so much, like just that small little thing, never mind a plug right into your brain that can control your dopamine delivery system, come on, like human beings cannot fight that.
01:12:24.000 There is no way.
01:12:27.000 I know what it was like when I was trying to quit smoking cigarettes.
01:12:30.000 Nicotine is incredibly powerful.
01:12:35.000 I can't imagine constantly dumping dopamine straight into your head.
01:12:42.000 Like, every time you think the proper thought, the algorithm says, oh, give him a little pop of dopamine.
01:12:48.000 Next thing you know, you never think an improper thought again.
01:12:50.000 That's Twitter.
01:12:51.000 That's what Twitter is.
01:12:52.000 Exactly.
01:12:53.000 Yeah, right.
01:12:53.000 We've already seen it.
01:12:54.000 I mean, that's how we have influencers.
01:12:56.000 If that weren't possible, we wouldn't have influencers already.
01:12:59.000 But we have them because people have already gotten that every time they get that, like, they get, oh, I could do this again.
01:13:05.000 I'll do something different.
01:13:06.000 I'll do something more extreme.
01:13:07.000 I'm just so black-pilled on people's ability to resist that.
01:13:11.000 Because once you can plug into someone's head and affect the way their brain produces chemicals in a way to make it imitate experience, then people will always decide to go into that, because the way you experience the world is with your brain.
01:13:32.000 But you still have to make money somehow.
01:13:35.000 How do you make money?
01:13:36.000 Because you have to pay for this experience.
01:13:37.000 Somebody is building this to make money.
01:13:40.000 You will have to pay them.
01:13:41.000 To make money or to be in control.
01:13:44.000 Well, hold on.
01:13:45.000 Take a look at somebody who worked at BuzzFeed News getting paid $80,000 a year and a guy who worked as an apprentice tradesman or union tradesman making $50,000, $60,000 a year.
01:13:55.000 Somebody who did no work.
01:13:56.000 There were people at BuzzFeed who did almost nothing.
01:13:58.000 When I worked for Vice and for Fusion, I can tell you there were people who literally did nothing and were getting six figures.
01:14:05.000 They made money.
01:14:06.000 So, you will have the haves and the have-nots.
01:14:09.000 You will have the lower class working these jobs, mining cobalt in Africa or wherever else, mining sulfur to build the machines, for the ultra-wealthy chosen class of people who are in the pods, being their own mini-gods in their virtual universes.
01:14:23.000 We're already basically there.
01:14:25.000 People in foreign countries do slave, effective slave labor for pennies on the dollar, if that,
01:14:30.000 to mine the materials that we can have our video games, our movies, and then,
01:14:34.000 while they're doing all that work, some dude gets a job at BuzzFeed and writes,
01:14:37.000 the top ten reasons why Spongebob is, uh, truly masculine, or some nonsense.
01:14:45.000 And then they'll be like, look how ripped Spongebob is, and they'll show a picture of Spongebob flexing.
01:14:49.000 That person's getting 100k a year or some other crazy amount.
01:14:52.000 Even if it's 50 or 40k, it's like, they're getting money for that?
01:14:55.000 Some dude sweating to death in a field to harvest the rare earths so that somebody can write about how Spongebob is, you know, masculine or something.
01:15:07.000 We're already there.
01:15:10.000 That's the sad reality.
01:15:12.000 But why isn't that the discussion either?
01:15:15.000 These are kids that are mining.
01:15:18.000 It's kids that are mining.
01:15:19.000 This is slave labor.
01:15:21.000 It's slave child labor.
01:15:22.000 And that's a discussion we don't have.
01:15:24.000 Why do we think that all of this new energy is so loving?
01:15:30.000 It's not in the literal sense slave labor.
01:15:32.000 It's viewed by these corporations like this.
01:15:35.000 You've got an area of the world with 100,000 people who live there and they're starving.
01:15:41.000 We can give them one quarter per hour, which means they're no longer starving.
01:15:46.000 All they got to do is mine that cobalt for us.
01:15:48.000 Are they better off starving in fields or doing this job?
01:15:53.000 And that's what ends up happening.
01:15:54.000 These companies are like, look, we're giving jobs to these people.
01:15:56.000 They're better off now than they were 10 years ago.
01:15:59.000 And it's technically true.
01:16:01.000 And then we sit in our VR pods and write for BuzzFeed for a living.
01:16:05.000 Now granted, there are a lot of Americans who do a lot of really great hard work, but I gotta tell you, man, I said if the working class of this country saw how much the New York media class got paid in the work they did, there would be a revolution overnight.
01:16:17.000 They'd just be like, these are the woke people who are writing these articles?
01:16:21.000 Oh boy, you got no idea.
01:16:24.000 Lazy, entitled... I think that's why you had this surge of Donald Trump in 2016, because suddenly people had this window into a very rich man who was saying, It's the working class that we need to love.
01:16:38.000 These are the people that make the world go round. And they went, yeah, I mean, yeah, that's what we do.
01:16:43.000 And that change, I think that flipped the dynamic, and that was not something liberals
01:16:48.000 were expecting. They could not foresee that. And so they spent four years saying, okay,
01:16:54.000 how do we make sure that we get in people's heads and we run elections in a way that they vote for
01:16:59.000 the people we want them to vote for, and we get their ballots, we do whatever it takes.
01:17:03.000 And we didn't do that because we thought this anomaly could go on forever, that we could just love on the working person and they would vote for us.
01:17:14.000 The big mistake Republicans have made up until recently, they've sort of been correcting for it, is no ground game.
01:17:19.000 None?
01:17:20.000 The Democrats, if there's one thing they're good at, it's the weird social, you know, it's communism, right?
01:17:27.000 They're connected and they march in lockstep with each other.
01:17:29.000 So it's very easy for them in big dense urban centers to go door to door to ballot harvest and ballot chase.
01:17:36.000 But they use AI to do that.
01:17:38.000 They use AI to locate that person.
01:17:41.000 They can go, okay, I've got Tim Pool, what is a message that is important to him on our side?
01:17:47.000 You know, he may lean conservative, but we can get him.
01:17:50.000 And they dig in, and they go through your brain, essentially, which is your phone, and they figure out exactly what message you want, and they hit you with it, and they hit you with it so hard, but we're not doing that.
01:18:00.000 Think about AI commercials.
01:18:03.000 This presidential cycle is gonna be nuts.
01:18:05.000 You're gonna be on Twitter, and you're gonna be scrolling, and then you're gonna come across a video where it's Donald Trump going, Phil Labonte, you're the greatest singer of all time, listen, vote for me, and we're gonna play All That Remains everywhere, trust me.
01:18:18.000 And it's gonna be a personalized AI ad where he's talking directly to you.
01:18:22.000 Now, I don't think it'll be Trump.
01:18:23.000 It's gonna be some Democrat being like, stop scrolling, Phil.
01:18:29.000 I'm Gavin Newsom.
01:18:30.000 We gotta get in there and make this country work for the working class.
01:18:34.000 Trust me.
01:18:35.000 I didn't kill all those people in the nursing homes.
01:18:37.000 That was the other guy.
01:18:38.000 He's not gonna say that, but you get my point.
01:18:40.000 Personalized AI-generated heads.
01:18:42.000 They're already there.
01:18:43.000 I mean, they're almost there.
01:18:45.000 They're already going through, like you can be sitting next to me on the same website, you'll get different ads than I get because they know what you want to see.
01:18:52.000 They'll say, okay, wait, this is a pro-life Democrat, so let's go to them with gun control because they're going to really want gun control.
01:18:59.000 So we'll just hit them over and over again and we'll never talk to them about life.
01:19:02.000 They'll never hear that message from us.
01:19:05.000 And that's how they do it, but we need to do that.
01:19:07.000 We need to stop ignoring the fact that Republicans need to do that or we won't, we'll never win another election.
01:19:14.000 The machine wins.
01:19:15.000 If there was one thing I wish Ronna McDaniel would understand, it's that it is her fault that the Republicans have done so poorly the past two elections.
01:19:25.000 The GOP needs to get a new chairwoman.
01:19:27.000 They need to get someone in there.
01:19:29.000 Okay, so let me argue that.
01:19:30.000 You think?
01:19:31.000 Are you on her side?
01:19:32.000 No, no, but let me argue that point because it's not the DNC winning elections.
01:19:36.000 It's the organizations around the Democrats that are winning elections that you don't know the name of.
01:19:42.000 They don't have a famous person.
01:19:43.000 They're not having conventions and asking people to come to them.
01:19:46.000 They have a ground game that is outside of the DNC.
01:19:49.000 We've never created these organizations.
01:19:51.000 We don't have these.
01:19:52.000 We have organizations where we lift a person up instead of lifting the people up.
01:19:55.000 Those activist organizations are also why the left has such...
01:20:02.000 far-left values now because the activists that are out there on the ground that are doing the work, they're motivated and they're ideological.
01:20:10.000 Take a look at this.
01:20:11.000 ActBlue on Wikipedia.
01:20:13.000 You guys know what ActBlue is?
01:20:15.000 It's the digital system by which Democrats fundraise, founded in 2004.
01:20:19.000 The Republicans, don't worry, they're on it!
01:20:23.000 They launched their competitor, WinRed, in 2019.
01:20:27.000 Fifteen years later, the Republicans are like, hey, wait a minute.
01:20:30.000 Democrats are raising a lot of money somehow.
01:20:32.000 The Republicans deserve to lose still.
01:20:34.000 They still deserve it.
01:20:36.000 What's even scarier is you've got millions of new voters each election cycle, and they're all on TikTok.
01:20:41.000 And you know who's not on TikTok?
01:20:43.000 Moderates and conservatives.
01:20:44.000 We're banned!
01:20:45.000 Exactly!
01:20:45.000 Gone!
01:20:46.000 That does matter.
01:20:46.000 Absolutely gone.
01:20:47.000 And so they control the most popular social media app right now, as Democrats do, and they can push whatever narrative they want unchallenged.
01:20:54.000 Look at Michigan.
01:20:55.000 They ban everyone.
01:20:57.000 All state officials are banned except for Gretchen Whitmer.
01:21:00.000 I mean, think about that.
01:21:01.000 Yes.
01:21:01.000 Wow.
01:21:02.000 So think about that.
01:21:03.000 There's only one person that can reach the kids in the entire state.
01:21:06.000 And she's willing to come out and say, well, that's how I reach young people.
01:21:09.000 And we're saying we're not going to be there.
01:21:11.000 We have no one there.
01:21:12.000 I mean, we don't have... we have no groups.
01:21:15.000 We are not doing this the right way.
01:21:16.000 So we can sit here and every time we have an election, we can say we're going to take this to court and we're going to try to overthrow it.
01:21:22.000 And we can spend two years looking at that and then not be ready for the next election.
01:21:26.000 Or we can actually get these organizations going. That's what's crazy about TikTok.
01:21:33.000 I think TikTok should be banned.
01:21:40.000 I think that as far as a national security threat, I kind of agree.
01:21:45.000 I'm not into banning stuff generally.
01:21:47.000 It's different when you have your enemy.
01:21:51.000 brainwashing your citizens.
01:21:54.000 I mean, that's a different situation.
01:21:56.000 I agree with you.
01:21:57.000 I'm not into banning.
01:21:58.000 I think that people jump to banning way too quickly.
01:22:02.000 There's a difference when it is, you know, you're banning someone in your own country from doing something or you're stopping a company from doing something.
01:22:10.000 If it is an adversary who wants world domination, that's when you have to say, okay, this is different.
01:22:16.000 It's not only an adversary.
01:22:19.000 We just had a couple of police stations that the CCP was running.
01:22:25.000 The FBI arrested a couple of Chinese nationals that basically are policing the Chinese in America.
01:22:32.000 That's a national sovereignty violation.
01:22:36.000 I'm not a China hawk, really, but I do think that you at least have to say, look, you can't try to brainwash our kids.
01:22:45.000 In Michigan, we have a company that we gave $715 million, taxpayer dollars, and it is a Chinese owned corporation.
01:22:55.000 They created an American version of this company on their American site in English.
01:23:01.000 They had in their bylaws, they have to have a CCP operation on site to infiltrate the grassroots of the company.
01:23:07.000 They didn't screw this up.
01:23:08.000 They translated it.
01:23:09.000 I mean, it's on their website.
01:23:11.000 You know, we're giving Michigan dollars.
01:23:13.000 Let me tell you what the most valuable resource in the world is.
01:23:16.000 It is water.
01:23:17.000 Quite a bit of it is in the state of Michigan because we have the Great Lakes.
01:23:21.000 There is an absolute strong reason they would want to be in that location, in the state of Michigan, to have control over the water there.
01:23:28.000 Either to harm the water or take the water for themselves.
01:23:31.000 We do not want to take a risk with our greatest natural resource and the Chinese Communist Party, and yet, we just had a vote like three days ago, and the Democrats said we're not going to say that we don't want to associate with businesses that say they are connected to, or are trying to bring in, a Chinese Communist organization into Michigan.
01:23:51.000 Why would you vote against that?
01:23:53.000 This is insane, but they are so brainwashed.
01:23:56.000 You're right, they're in lockstep.
01:23:58.000 That side is in lockstep.
01:24:02.000 What do we do?
01:24:02.000 And they're getting kids before they even understand what's happening at all.
01:24:08.000 Well, Youngkin came out and he said... Ford brought CATL to Michigan, but Youngkin said, I'm sorry, we will not have a Chinese communist company in our state, so you're not welcome here.
01:24:19.000 I mean, you have to, we have to, first of all, we have to win elections, which, like I said, It's a big job and we have to create these organizations.
01:24:26.000 We have to say more of us are going to get involved and we're going to be the activist party.
01:24:31.000 We're not going to make individuals famous.
01:24:33.000 We're going to go to the people.
01:24:34.000 We're not going to ask them to come to us.
01:24:35.000 We're going to go to them.
01:24:36.000 And then we have to say we're going to protect our national security.
01:24:39.000 Just even our national security is so valuable.
01:24:42.000 How could we not be protecting that?
01:24:44.000 Yeah.
01:24:46.000 I don't know how confident I am, to be completely honest, but I do think that with Trump and many other Republicans talking about exactly this ground game organizing, maybe.
01:24:56.000 And I gotta be honest, you combine these things with how bad it's been under Joe Biden, I feel fairly optimistic.
01:25:04.000 I don't know if I feel like we've won, like we're gonna win, we're gonna win a bunch of elections.
01:25:10.000 Narrowly winning the midterms is kind of scary.
01:25:13.000 It should have gone historically Republican, given the swing a midterm brings after such an abysmal performance, but because Republicans had no ground game and were too busy screeching about 2020, they really missed a whole lot of what needed to get done.
01:25:26.000 That may change from now to 2024.
01:25:27.000 I'm hoping.
01:25:30.000 I hope so, too.
01:25:31.000 I think you're right.
01:25:32.000 I'm probably not as optimistic as you are.
01:25:35.000 I don't trust that Donald Trump is going to move independents.
01:25:40.000 I think independents that don't like Donald Trump are probably not going to be convinced to like Donald Trump.
01:25:46.000 I would love to see them say, I don't care, and I would love to be wrong.
01:25:49.000 I would much prefer to see Donald Trump win as opposed to Joe Biden or anyone the Democrats are going to put up, but I still worry about whether or not the independents can be persuaded.
01:26:01.000 I hope they can.
01:26:02.000 Maybe they'll be persuaded by how much they hate Biden rather than how much they like Trump.
01:26:09.000 There's not a lot of time for Republicans to get on message to create that system that we're talking about, but this is important for everybody, honestly, in the donor community to understand.
01:26:21.000 There's not a year off.
01:26:23.000 Republicans are like, oh, we're taking a year off now.
01:26:25.000 No, there's No years off.
01:26:26.000 Democrats don't take a year off.
01:26:28.000 They do not even pause.
01:26:29.000 They're like, oh, we need more money.
01:26:30.000 I mean, the minute they can seize on something, they can start raising money. It's like the Nashville folks. Every single Democrat raised money on the Nashville guys getting kicked out of the legislature, whatever happened in Nashville.
01:26:45.000 Our lieutenant governor was like record-breaking money raising off of this Nashville situation.
01:26:51.000 You have nothing to do with it, you know?
01:26:52.000 They're always raising money and they have tons.
01:26:55.000 They have so much more money than us to do this.
01:26:58.000 The activists are always motivated because they are ideologues.
01:27:01.000 The activists on the right are not ideologues.
01:27:04.000 They're not constantly going.
01:27:07.000 They want to be able to separate their life from politics.
01:27:11.000 Activists on the left are the ones that live the politics that they believe because they're in a cult.
01:27:17.000 It's just like any other kind of religion.
01:27:21.000 They are ideological.
01:27:23.000 They believe that what they are doing is right.
01:27:26.000 They want to change the world and they are not tired.
01:27:30.000 They have a whole system, so they know that they have to reach a voter at least nine times.
01:27:34.000 I mean, they have it down to a science.
01:27:36.000 They started with Colorado.
01:27:38.000 They even have it written up this year.
01:27:40.000 If you go through their data, they have Michigan as the next state.
01:27:44.000 We're taking the Colorado tactics to Michigan.
01:27:46.000 They won Michigan.
01:27:47.000 They took the legislature.
01:27:48.000 40 years we had control of the legislature.
01:27:51.000 They took the legislature, Secretary of State, AG, Governor, everything.
01:27:55.000 And we had no defense.
01:27:56.000 We were not on the ground at all.
01:27:58.000 We were not in the minds or the heads of young people or old people.
01:28:02.000 We are not in the game.
01:28:04.000 We think that it's a concert.
01:28:05.000 Oh my gosh, we got a thousand people to a rally.
01:28:08.000 They didn't do that.
01:28:09.000 They're not even campaigning.
01:28:10.000 People go, why is Joe Biden in the basement?
01:28:12.000 Because he doesn't have to be out there.
01:28:13.000 Exactly.
01:28:13.000 They're going to reach people anyway.
01:28:15.000 They're going to go knock on your door and be like, hey, how's it going?
01:28:17.000 See that piece of paper?
01:28:18.000 Can you sign that real quick and hand it to me?
01:28:19.000 Well, because of exactly what you said earlier about everything happens between here.
01:28:23.000 It doesn't happen at a rally.
01:28:25.000 It can happen in your head.
01:28:26.000 They can reach you.
01:28:27.000 They can get here.
01:28:28.000 If you're at a rally, you're already on that side.
01:28:31.000 Exactly.
01:28:31.000 Like, they're not getting new people at a rally.
01:28:34.000 No.
01:28:34.000 And the advantage they have in big cities is two grassroots activists can knock on a thousand doors in a day.
01:28:40.000 You go to a suburb or a conservative area, 200 doors in a day.
01:28:44.000 If that.
01:28:45.000 Cause you gotta drive door to door.
01:28:46.000 So Democrats already have the ground game advantage.
01:28:48.000 Then they've been, they got a 15 year head start on online fundraising.
01:28:53.000 Man.
01:28:53.000 Donald Trump, one of the things that got him the victory in 2016 was his meme army.
01:28:58.000 People posting memes online that really worked.
01:29:01.000 And then what did they do?
01:29:02.000 They got together, cried, literally, there's a Google leaked video where they're crying about Trump winning, and then said, we cannot let this happen again.
01:29:09.000 That's why people were so upset just the other night when Donald Trump was making the crowd laugh.
01:29:15.000 He was clowning CNN.
01:29:18.000 If Donald Trump is going to win, it's not going to be complaining about 2020.
01:29:24.000 It's not going to be talking about how the Democrats are all mean and blah blah blah and complaining.
01:29:28.000 It's going to be making people laugh and making people think that he is the guy for them and making people think that he can Go and fix the problem.
01:29:39.000 So clowning on CNN was literally the best thing that he could have done, and that's why it's so important that Twitter stay a free, a mostly free speech platform, because things like memes and jokes will be able to affect the outcome.
01:29:58.000 Now I'm not saying that the memes are gonna win, but you... I mean, Donald Trump definitely got memed into the White House.
01:30:06.000 To be fair, Biden makes people laugh too, but they're laughing at him, not with him.
01:30:12.000 The wrong kind of laugh.
01:30:14.000 But does it really matter?
01:30:16.000 No.
01:30:17.000 They're voting for the D. That's exactly right.
01:30:20.000 Yeah.
01:30:21.000 It doesn't matter because they are able to get to you no matter what.
01:30:24.000 Most people, because the average person does not watch politics every day.
01:30:29.000 You know, we're involved in it.
01:30:30.000 We see it.
01:30:31.000 So we're like, oh, how can anybody vote for this?
01:30:33.000 But the average person is not listening to the press conferences.
01:30:37.000 They're not watching the news all the time.
01:30:39.000 They're just not tuning in.
01:30:40.000 I mean, if you think about the biggest show getting 3 million viewers out of 330 million.
01:30:45.000 You know, most people are not tuned into this.
01:30:49.000 They vote R or D.
01:30:50.000 A lot of people are tuned out so much they think politics just don't affect their life at all.
01:30:54.000 And that's why it's great that they can reach you with text messages or on your websites with an ad that's personalized to you because you are not involved.
01:31:03.000 So the only message you are hearing is the one that you want to hear.
01:31:06.000 The average American is not listening to everything.
01:31:09.000 They're being force-fed what they want to hear, and then they go, I should go vote.
01:31:14.000 It's going to be crazy when you have an AI ad.
01:31:16.000 I mean, when I was talking about an AI ad that was saying like, hey, Phil, this is the president, it's actually going to be a bit creepier than that, where they're going to claim it's a coincidence.
01:31:25.000 So we've already seen ads where, and everybody's experienced this.
01:31:28.000 I'll tell you a story.
01:31:29.000 I went to Walmart.
01:31:31.000 We were walking down one of the aisles, and in the middle of the aisle was a big stack of TVs on sale.
01:31:35.000 And we were like, oh, look, these are 50-inch TVs.
01:31:37.000 They're on sale for a couple hundred bucks.
01:31:38.000 We walked past it.
01:31:39.000 I go home.
01:31:40.000 I go on the computer.
01:31:40.000 On Facebook was an ad that showed an image identical to what we walked past.
01:31:44.000 And I'm like, how the did they know?
01:31:47.000 Probably what happens is...
01:31:48.000 They're tracking our location data, so they know we're at a Walmart, and they're trying to sell us this sale item.
01:31:54.000 So Walmart does a campaign where they sell these TVs, they have all the Walmarts do a similar stand, and then they advertise it.
01:32:00.000 We go to Walmart.
01:32:01.000 We saw it.
01:32:02.000 But a lot of people have these experiences where it feels like you're being spied on.
01:32:05.000 Have you ever thought about something and then got an ad for it?
01:32:08.000 Yeah, that creeps me out.
01:32:09.000 Well, but that could be like, you know, if it's on your mind, you're gonna notice it,
01:32:13.000 whereas you probably don't notice other ads.
01:32:16.000 But what's going to happen is you're gonna get an ad where it'll be
01:32:19.000 a politician saying, and if I'm going to bet anything, it's that it'll be just like,
01:32:25.000 Hi, I'm so-and-so politician.
01:32:27.000 You know, this morning I went out for my cup of coffee over at Starbucks and I grabbed a large cold brew, went down to the shop to check on my car to see if it was fixed yet, and you're gonna be like...
01:32:35.000 That's literally what I did.
01:32:37.000 And you're going to be like, wow, that's so amazing.
01:32:40.000 He's just like me.
01:32:42.000 And it's going to be an AI person tracking data from you, just saying what you did back to you.
01:32:47.000 This is like, this is totally a little bit off topic, but that's why I really think that I would love to see some way to make your data protected by property right laws.
01:33:00.000 So that way they can't just Use your data and make profit off of you and make money because you operate.
01:33:09.000 I don't know how it works.
01:33:10.000 I don't have any kind of concept of making it work.
01:33:12.000 So they've tried to do this with the Chromebooks and stuff that they give to kids because they give kids these devices in school.
01:33:20.000 You should definitely not let your kids be dumping information into Google because they're trying to profile your kids.
01:33:25.000 They're going to do psychological profiles on your kids.
01:33:28.000 That's why people freak out about them going to schools.
01:33:30.000 Take a look at Brittany Kaiser, and she's working on something called Own Your Data.
01:33:34.000 She's the person who was in that Cambridge Analytica documentary where she came out,
01:33:40.000 she was a whistleblower or something, but she's been working for the past several years,
01:33:44.000 I know her personally, on lobbying so that you own the data.
01:33:49.000 Meaning these companies can't just collect it from you and then use it however they want because it's yours.
01:33:54.000 I don't know how that manifests, right?
01:33:56.000 Do you get paid for it?
01:33:58.000 Do you have to sign a contract saying my data?
01:34:00.000 Because they're making money off of your labor.
01:34:02.000 So there's got to be something there.
01:34:05.000 I think that's why I think that there is an argument for it, but even if it was just like you can't use it, right?
01:34:11.000 Like you can't just take people's data and then decide how you're going to advertise to them because it's their property.
01:34:19.000 Even if there isn't enough value there that you could monetize it, because maybe you can't, your individual, you know, personal data alone isn't worth anything.
01:34:27.000 Fragments of a penny.
01:34:38.000 Yeah, you know, but it might be something where you could say, look, this, as a collective thing that companies are doing, is violating people's rights.
01:34:38.000 It's tracking their, you know, blah, blah, blah.
01:34:40.000 And I think that that's at least something worth considering.
01:34:45.000 Let's go to Super Chats.
01:34:46.000 If you haven't already, would you kindly smash that like button?
01:34:48.000 Subscribe to this channel.
01:34:50.000 Share the show with your friends if you really do like it, because that's the most powerful way to help.
01:34:54.000 And let's read what y'all have to say.
01:34:56.000 Ready to Rumble says, Elon Musk voted for Joe Biden.
01:34:59.000 Never forget that.
01:35:01.000 Yeah.
01:35:01.000 But, you know, look.
01:35:03.000 People change.
01:35:04.000 I didn't vote for Trump in 2016.
01:35:04.000 They learn.
01:35:05.000 I voted for him in 2020.
01:35:07.000 I'll vote for him again.
01:35:12.000 So, there were a lot of Democrats who voted for Obama, then voted for Trump.
01:35:12.000 We're trying to convince people, you know?
01:35:13.000 If Elon Musk is now on the side of free speech, we just gotta keep the pressure up and tell him, like, hey man, we're keeping an eye on this World Economic Forum lady.
01:35:21.000 You know, I don't wanna see any funny business, alright?
01:35:25.000 We gotta be open to people changing their mind, too.
01:35:28.000 Right.
01:35:30.000 Flooded Timber Farm says, Jeremy's Chocolate now has micro-aggression sized chocolate bars for pre-order.
01:35:35.000 I think I'm gonna order a whole bunch of them.
01:35:37.000 Micro-aggression.
01:35:39.000 They're little tiny ones, and they come in bags.
01:35:43.000 That's probably better than the full-sized bars, because not everybody wants to crack open a full-sized candy bar.
01:35:47.000 A lot of calories.
01:35:48.000 They need to make a dairy-free dark chocolate.
01:35:50.000 I will buy a whole bottle.
01:35:52.000 That's true.
01:35:53.000 You can get the microaggression ones for Halloween.
01:35:56.000 Yeah.
01:35:57.000 Hand them out to your neighborhood kids.
01:35:58.000 Absolutely, yeah.
01:35:59.000 Really make people happy.
01:36:00.000 You'll get put on a blacklist in your neighborhood.
01:36:03.000 I will say to Jeremy, you need a dark chocolate, you know, because some people don't have dairy.
01:36:09.000 Some people are lactose intolerant.
01:36:10.000 I have a sensitive stomach, Jeremy.
01:36:12.000 That's right.
01:36:13.000 But I'm surprised they didn't do a dark chocolate one.
01:36:16.000 Because I like dark chocolate more than milk chocolate.
01:36:19.000 But I gotta be honest, they're really good.
01:36:21.000 I don't understand people that like dark chocolate more than milk chocolate.
01:36:24.000 Dark chocolate's like less of a candy to me.
01:36:28.000 That's how I mean it!
01:36:28.000 I know!
01:36:30.000 I think it's like a grown-up candy.
01:36:32.000 It's more like coffee.
01:36:33.000 I feel sophisticated when I eat dark chocolate.
01:36:37.000 To me, dark chocolate's like coffee, you know?
01:36:40.000 It's got less sugar, it's got antioxidants, it's relatively good for you, it's got fat in it.
01:36:46.000 Alright, let's see where we're at.
01:36:49.000 Lizziac says, Tim, did you know that Joe Biden was born closer to Abraham Lincoln's presidency than his own?
01:36:55.000 Now you do.
01:36:56.000 Yikes.
01:36:57.000 Wow.
01:36:58.000 I heard that before.
01:37:00.000 When was he born?
01:37:03.000 I'm not sure.
01:37:04.000 How old is he now?
01:37:05.000 I don't know.
01:37:05.000 Is that true?
01:37:06.000 That doesn't sound true.
01:37:07.000 I know, that can't be true.
01:37:08.000 But it might be true.
01:37:09.000 He was born, yeah, closer to Lincoln's inauguration, I think is what it was, than to his own inauguration.
01:37:15.000 So... That sounds like it might be true because he was, what, 78 or something?
01:37:18.000 Yeah.
01:37:22.000 Someone looking it up?
01:37:24.000 You want to look up his birthday?
01:37:26.000 Doing it now.
01:37:27.000 All right.
01:37:28.000 While Phil does that...
01:37:30.000 Free Men Die Free says, I have always suspected that Elon can't be trusted.
01:37:34.000 His father was a technocrat.
01:37:35.000 Elon's goals require extremely high trust.
01:37:38.000 His angle is to build trust.
01:37:39.000 People won't take their implant without trust.
01:37:41.000 November 20th, 1942.
01:37:45.000 So, and Lincoln's inauguration was what, uh, 61?
01:37:49.000 Yeah, I'm not sure.
01:37:51.000 I think the war ended in 64?
01:37:52.000 I'll Google that as well.
01:37:57.000 So that's...
01:38:00.000 I don't know.
01:38:01.000 Is that right?
01:38:02.000 How old was he when he was inaugurated?
01:38:05.000 Everybody's already giving us the answer anyway in the chat.
01:38:08.000 What are they saying?
01:38:08.000 He was inaugurated, uh, March 4th, 1861.
01:38:11.000 Hmm.
01:38:12.000 It's just a mix of arguing about chocolate and dates.
01:38:19.000 Trying to keep up.
01:38:22.000 All the dark chocolate lovers out there.
01:38:24.000 I just feel like that's... you're trying to be healthy, and I just want to be unhealthy if I'm grabbing chocolate.
01:38:29.000 Have you tried Jeremy's chocolate yet?
01:38:30.000 I have not, so I'm very excited.
01:38:34.000 I do prefer the she-her.
01:38:36.000 So this is unfortunate, because I do like almonds, but I have a light almond allergy.
01:38:40.000 So it's like I can't really... I won't die, but I gotta go she-her.
01:38:47.000 And they're really good anyway.
01:38:49.000 It's all delicious.
01:38:50.000 But I've noticed... So we ordered a couple thousand bars.
01:38:54.000 The she-her ones are going way quicker.
01:38:56.000 It may have something to do with people not wanting to put him's nuts in their mouths.
01:39:01.000 Or I think... I think it is because... I don't know.
01:39:05.000 I think people want to enjoy their candy.
01:39:07.000 Yes.
01:39:07.000 And if you have a nut, then it just becomes healthy too.
01:39:10.000 Exactly.
01:39:11.000 It's like the pure chocolate is sweet and delicious and the nuts make it a little bit healthy.
01:39:15.000 It's like... I mean, if I'm indulging in a full-size chocolate bar, it's because I'm doing something bad.
01:39:21.000 I just want to indulge.
01:39:23.000 Roscoe says my cat's breath smells like cat food.
01:39:28.000 Well, okay.
01:39:31.000 Thanks, Ralph.
01:39:32.000 Sick.
01:39:33.000 Ralph Wiggum.
01:39:34.000 BreadAin'tDead says, Tim, Elon and Tesla are beholden to China.
01:39:37.000 Remember that.
01:39:38.000 World Economic Forum is a red flag for sure.
01:39:41.000 Can in truth only wait and see.
01:39:42.000 Don't cancel just yet.
01:39:44.000 You know, the majority of people wanted us to cancel.
01:39:44.000 Wait.
01:39:47.000 But, man, it's tough because I want to have that pressure.
01:39:54.000 I want to be able to be like, dude, we're paying a lot of money.
01:39:57.000 You better not screw this up.
01:39:59.000 So maybe it's more impactful to be like, I've got my finger on the switch ready to say no.
01:40:04.000 As soon as she makes the wrong move and starts, you know, doing this weird woke stuff, we're out.
01:40:09.000 But if we leave now, then there's no incentive at all.
01:40:12.000 And it can be like, I already lost you.
01:40:14.000 Go nuts, lady.
01:40:15.000 You know what I mean?
01:40:15.000 Right.
01:40:16.000 We can't be finicky about things.
01:40:17.000 We have to know for sure something's going to be bad.
01:40:20.000 It's tough, man.
01:40:21.000 Man, I don't know.
01:40:23.000 Ito says, for the love of God, update your Brave browser.
01:40:25.000 I'll consider it.
01:40:27.000 It was funny, Hasan did a video, too, where he, like, zoomed in, and he's like, update your browser!
01:40:32.000 And then I updated it, and I was like, there you go, Hasan, I updated the browser, just for you.
01:40:35.000 So you can comment, you know.
01:40:37.000 I love landlords.
01:40:40.000 Mind Fury says, I voted for you last year, Tudor, and was majorly disappointed that you lost.
01:40:44.000 I hope you consider running again someday.
01:40:46.000 Well, now that we've had these horrible conversations tonight, I feel like I have to.
01:40:50.000 Yeah.
01:40:51.000 I hope you run again.
01:40:52.000 Thank you.
01:40:53.000 If I inspired you with my black-pilled garbage tonight to run again and you win, I will feel like I have done a positive for... Then we're both doing something.
01:41:04.000 For Michigan.
01:41:05.000 There you go.
01:41:06.000 Tudor's gonna be, like, sitting at home with, like, a bottle of Jack, being like, what was Phil saying?
01:41:14.000 Oops.
01:41:15.000 You do that a lot, Tudor!
01:41:16.000 It's not the first time!
01:41:18.000 Yeah, just eating chocolate bars, being like... Or we can make it more inspirational than that, and you get back, and then you put on your suit jacket, and you're like, I must run again.
01:41:27.000 That's more superhero-y.
01:41:30.000 I like that better.
01:41:31.000 The people need me!
01:41:32.000 I can't let the Neuralink happen!
01:41:34.000 I must stop this!
01:41:34.000 I'm not just a chocolate-eating drunk, I'm actually, like, motivated.
01:41:37.000 Michiganders, I will not let you down!
01:41:42.000 Vanity says, got my video on the town hall election taken down for election misinformation.
01:41:47.000 How can't we get a bill started for the first amendment to be applied online?
01:41:52.000 I mean, I think that's BS that they took your video down.
01:41:55.000 Caitlin Collins was telling Trump he was wrong.
01:41:56.000 You had the debate right there in the town hall.
01:42:00.000 But YouTube is, I don't know, run by bad people for bad reasons.
01:42:06.000 That is why Rumble exists.
01:42:07.000 We put all of our clips up on Rumble.
01:42:09.000 All of our videos are on Rumble, Mines, and YouTube.
01:42:15.000 It's unfortunate.
01:42:17.000 Maybe if, you know, we really want to put our videos up on Twitter as well, but now I'm kind of worried, you know?
01:42:24.000 With this World Economic Forum person.
01:42:27.000 So do you think that Elon decided to hire this person and focus on the technology side?
01:42:32.000 Because, I mean, you said, does he just want to grow shows like this?
01:42:34.000 Do you think that he wants to be the place that shows go?
01:42:39.000 Yes, but it was only really feasible with him in charge.
01:42:44.000 It doesn't even matter who he gave the CEO position to.
01:42:48.000 I mean, if he gave the CEO position to, like, Alex Jones, I'm sure you get a lot of people are gonna be laughing and being like, sign me up!
01:42:53.000 You know, but people would flee, advertisers would get angry.
01:42:56.000 So, the fact that he stepped down at all, I think says to a lot of people in the free speech space that it's not a safe place to run your business.
01:43:06.000 Did he just step down because he, like, ran a Twitter poll?
01:43:09.000 I think it's too much work.
01:43:11.000 I mean, dude's running SpaceX.
01:43:13.000 I think SpaceX is the most important thing humans are doing right now.
01:43:16.000 It's like going to Mars and going to other planets.
01:43:20.000 It's inspirational.
01:43:21.000 It can bring us together.
01:43:22.000 I really do believe in the space program and he's the one leading it.
01:43:24.000 So I hope he wins in that regard.
01:43:27.000 But if he's not the CEO of Twitter, am I going to trust this lady?
01:43:30.000 I don't know.
01:43:31.000 I do not want to invest in another... I tell people if you're starting a new show right now, just go on Rumble.
01:43:37.000 Like, just start on Rumble, because you've got a lot less to worry about.
01:43:41.000 They're not gonna ban you.
01:43:42.000 I can tell you don't know, though.
01:43:44.000 You're like, I don't know what to think about this new woman.
01:43:46.000 I think we all feel like that.
01:43:47.000 Oh, right.
01:43:48.000 But so, it's simple.
01:43:49.000 The uncertainty means to me, do not invest money in it.
01:43:53.000 Unless I know it's going to be a good investment.
01:43:56.000 I'm not putting money on it.
01:43:57.000 Right.
01:43:58.000 Well, it's only day one.
01:44:01.000 It's not day one.
01:44:01.000 She doesn't start for like three weeks, right?
01:44:03.000 Is it not day one?
01:44:03.000 They just announce who it is?
01:44:05.000 Yeah.
01:44:06.000 I wonder if, like, Tucker Carlson will be like the canary in the coal mine, because if there's someone that's likely to get the left worked up with something that is actually fairly anodyne, but yet they're gonna act like it's the end of the world.
01:44:21.000 It's Tucker Carlson.
01:44:22.000 So if Tucker, you know, is on Twitter and he's putting up the content that we are used to seeing, the type of content that we're used to seeing out of Tucker Carlson, and he's not getting smacked down, he's not getting treated poorly, I think that it might be a good indication.
01:44:40.000 And again, this is all just like, we're all kind of guessing anyways.
01:44:46.000 I didn't know that she was a Trump fan or that she'd worked for Trump before I came here today.
01:44:50.000 So I'm still, I don't know what to think.
01:44:54.000 Trump also hired Bolton.
01:44:55.000 Yes, 100%.
01:44:56.000 I know.
01:44:57.000 And you know, like we've all said, Musk has also got certain things that don't seem all that great.
01:45:03.000 So I mean, as far as I'm concerned, I'm still kind of like, well, I'm hoping for the best, but what are you going to do?
01:45:10.000 Nick Ash says, Tim, please use your platform to get Yuengling to expand their market into the Western part of the country.
01:45:15.000 I'm in Western Illinois, and the closest place I can buy it is in Indiana.
01:45:18.000 Much thanks to all the dissociated Midwest drinkers.
01:45:21.000 Yuengling's the best.
01:45:22.000 That's pretty good.
01:45:23.000 Yeah, they need to be all over the world.
01:45:25.000 What are they doing?
01:45:26.000 Why aren't they expanding?
01:45:28.000 Like, what's the limitations?
01:45:29.000 I didn't realize it was so, like... Regional?
01:45:33.000 Yeah, regional.
01:45:34.000 I grew up out here, so it's a staple in every bar, every restaurant.
01:45:38.000 Does Michigan have Yuengling?
01:45:39.000 I've never heard of it.
01:45:40.000 What is it?
01:45:40.000 No, that's what she was showing me earlier.
01:45:43.000 I'm like, I don't know what that is.
01:45:45.000 Oldest brewery in the country.
01:45:47.000 It's a, it's a lager.
01:45:48.000 It's really good.
01:45:49.000 And when the Bud Light thing happened, they posted a picture of someone holding up a Yuengling with the American flag behind them.
01:45:54.000 So it's like, all right, we like Yuengling.
01:45:56.000 What do we got?
01:45:56.000 We got bells.
01:45:58.000 Yeah, overrun.
01:45:59.000 That's Michigan.
01:46:00.000 All right.
01:46:01.000 How are they?
01:46:01.000 Are they good?
01:46:03.000 Um, I've, I've had a couple Bell's beers.
01:46:06.000 I don't drink a lot.
01:46:07.000 I, I personally, um, I like Tubi.
01:46:09.000 It's an Israeli drink.
01:46:11.000 And I, based on my, no, it's like a, it's like a kind of citrusy liquor, but Israeli beers aren't bad.
01:46:19.000 Like I'll drink a Goldstar every once in a while, but you can't really get them here.
01:46:22.000 But I think they're Heineken.
01:46:25.000 Like, same company?
01:46:26.000 I might be mistaken.
01:46:27.000 Who is?
01:46:27.000 Gold Star.
01:46:28.000 Oh, really?
01:46:29.000 Yeah.
01:46:30.000 Oh, interesting.
01:46:30.000 I like Heineken, too.
01:46:31.000 I don't drink a lot, though, so... Yeah.
01:46:33.000 I'm not a drinker, so I'm, like, the worst person to talk about this.
01:46:35.000 I mean, I don't really drink either, but, you know, if I'm gonna have a beer, I rarely drink.
01:46:40.000 Maybe, like, two or three times a year I'll have a drink.
01:46:42.000 I'm one of those people that has the alcohol allergy, so I get the really red face.
01:46:47.000 It's just, like, that's not worth it.
01:46:49.000 Yeah, I save it for, like, special occasions.
01:46:51.000 I like a sour beer occasionally, though.
01:46:53.000 They don't have a lot of alcohol.
01:46:55.000 I break out in offensive Twitter posts, so I... That was a good one.
01:46:59.000 All right, let's, uh, Chase says, thanks to the crew, I still think it's funny the writer's strike has people out of work but my news hasn't changed because it's not written, lol.
01:47:08.000 Yep.
01:47:08.000 I mean, you'd think a lot of the, a lot of the corporate press would shut down.
01:47:12.000 Most, so a lot of people get their news from Jimmy Kimmel and Colbert and they did shut down.
01:47:17.000 Because...
01:47:18.000 It is written.
01:47:19.000 But think about that.
01:47:20.000 A large swath of this country think they're informed because they listen to Stephen Colbert or Jimmy Kimmel.
01:47:25.000 One of the most damaging things that has happened to America in the past 20 years is The Daily Show with Jon Stewart.
01:47:34.000 He's literally made the idea of listening to any conservative or Republicans as toxic.
01:47:41.000 So people are afraid to say, oh, this conservative has a good idea.
01:47:46.000 They won't even say that a conservative has a good idea.
01:47:49.000 It is verboten.
01:47:50.000 You cannot say that they had a good idea.
01:47:53.000 And that has done terrible damage to America.
01:47:57.000 You're not all that great, Jon Stewart.
01:48:00.000 Have you watched his new show?
01:48:01.000 It's awful!
01:48:02.000 Yeah, he's like a huge jerk to people who come on and in good faith to have a conversation.
01:48:08.000 It's all clowning.
01:48:09.000 It's all bad editing.
01:48:12.000 The editing is as bad as the editing on that Jim Jefferies show was when they were trying to, you know, basically just lampoon people.
01:48:19.000 It's a really, really horrible thing to have made good ideas toxic because of where they come from.
01:48:27.000 I don't understand why people want to listen to someone yell at someone else anyway.
01:48:30.000 I don't like that feeling of... Jon Stewart didn't yell, he made fun of everybody.
01:48:34.000 Jon Stewart's whole shtick was he'd say something and then...
01:48:39.000 Like he did the Stewie head tilt, you know?
01:48:42.000 Yeah.
01:48:43.000 Alright, John McGee says, saw a guy at the liquor store today buying two cases of Busch Light.
01:48:47.000 He had a man bun.
01:48:48.000 That's the story.
01:48:49.000 I'm not surprised.
01:48:52.000 But hey, I hope he's happy with his Busch Light.
01:48:54.000 Some people are saying Heineken is Anheuser-Busch.
01:48:57.000 I don't know.
01:48:58.000 No, I don't think so.
01:48:59.000 I don't think that's true.
01:49:00.000 I don't drink that much in the first place.
01:49:02.000 Someone want to fact check that real quick?
01:49:03.000 If anybody's mad at me.
01:49:04.000 They own so much it's hard to know for sure.
01:49:06.000 It might be InBev.
01:49:08.000 Because InBev is the European company?
01:49:10.000 Is it Dutch or something?
01:49:11.000 I'm just only gonna drink Tubi now.
01:49:13.000 I know they're cool.
01:49:15.000 Zima.
01:49:16.000 All right, Tim Jake says, if Neuralink can read thoughts, what happens to the Fifth Amendment concerning self-incrimination?
01:49:22.000 How do people with security clearances protect classified info?
01:49:24.000 It doesn't matter because the Fifth Amendment doesn't exist because when you fill out your taxes, you're incriminating yourself anyway, right?
01:49:29.000 So, I think it was Dave Smith who said that.
01:49:32.000 Like, what about the Fifth Amendment?
01:49:33.000 You're incriminating yourself.
01:49:34.000 Oh yeah, interesting point.
01:49:36.000 How does that make sense?
01:49:37.000 You just fill out your tax form and then you write like, I plead the fifth on like how much money you brought in or why.
01:49:43.000 Like, ah, you don't know now.
01:49:44.000 What are you going to do?
01:49:44.000 Just a big zero.
01:49:45.000 I plead the fifth.
01:49:47.000 All right.
01:49:48.000 Heineken owns Heineken, apparently.
01:49:51.000 Amsterdam, the headquarters are in Amsterdam.
01:49:54.000 That founder was Gerard Heineken.
01:49:55.000 And it doesn't look like they have any parent company.
01:49:58.000 I'm not even sure if they really are the same as Goldstar.
01:50:02.000 Let's see.
01:50:02.000 Heineken, Goldstar.
01:50:05.000 I might just be making things up.
01:50:06.000 Yeah, I might be missing information here, but from what I found in a quick Google search, they're owned by Heineken.
01:50:16.000 Jackie Oz says, encourage all to unsubscribe because it will get his attention to make him notice how we feel.
01:50:22.000 If she proves herself, then we will subscribe.
01:50:24.000 That's tough, yeah.
01:50:26.000 I mean, a lot of people are already saying they're cancelling Twitter Blue because of it.
01:50:29.000 I did.
01:50:29.000 Yeah?
01:50:30.000 Wow.
01:50:30.000 Well, not just because of it, but partially because of it.
01:50:33.000 Partially, I'm still waiting for them to get back to me about subscribing.
01:50:36.000 Like, having the subscription thing.
01:50:38.000 I sent in an email, like, two months ago or whatever, and can't get anyone to answer, but I cancelled it, so...
01:50:45.000 I feel like the threat of shutting down will keep them on their toes, but if everyone unsubscribes, then their attitude is going to be like, well, if we already lost them, then we have nothing to lose.
01:51:05.000 We want to make sure they have something to lose.
01:51:06.000 We want to give them something that they're scared to lose.
01:51:09.000 If you have someone who was the highest rated show every weeknight who gets canceled from Fox and goes to Twitter and says, I'm going to take my platform to Twitter and they welcome him.
01:51:21.000 If he stays, I think you're right.
01:51:22.000 I think that is like the measuring post.
01:51:25.000 If he stays on there and can say whatever he wants.
01:51:28.000 At least I'll give you an indication.
01:51:31.000 At least it shows you which direction they're pointed in.
01:51:36.000 Remember, just last week or the week before, everybody was partying that Tucker's coming to Twitter and it's the big deal and blah blah blah.
01:51:46.000 You know, the audience is definitely fickle, but who knows what's actually going to happen.
01:51:51.000 But you can, I mean, there will be indications about which direction it's going to go in.
01:51:56.000 Well, I also think that right now, as long as it is a beacon for free speech and we're able to speak, we have to stay there.
01:52:02.000 Because if you walk away based on what you think might happen, then you have, we're not going to have spaces.
01:52:07.000 You know, so we need those open spaces to talk.
01:52:10.000 And if Tucker is there and we know how demonized he's been, then there's a chance that that's going to keep us all able to speak.
01:52:19.000 There's another thing that I kind of want to point out.
01:52:21.000 I have noticed, and there's going to be people that are going to give me hate for saying this, but I have noticed that Brian Krasenstein has been better on Twitter.
01:52:29.000 He has not been, like, I've seen some things and I'm like, he's actually posting something good.
01:52:34.000 That's okay.
01:52:34.000 Why is that happening?
01:52:36.000 Maybe the, I don't know for sure, but I mean.
01:52:38.000 Because he's learning.
01:52:39.000 Because he, I think one of the big divides between the left and the right is the left
01:52:42.000 believes fake news and the right tends to look for, like tends to fact check.
01:52:47.000 Or the, and so this is why leftist personalities don't go on other shows because they get obliterated
01:52:51.000 with fact checking.
01:52:53.000 The Krasensteins got suspended.
01:52:56.000 They lied about what they did.
01:52:57.000 They were falsely accused of, I think, doing some weird stuff on Twitter and they maintained
01:53:02.000 that they had never done it, but they got removed.
01:53:04.000 I think because they were generating too much influence outside of the DNC's control or
01:53:09.000 the Democratic Party's control or the establishment.
01:53:11.000 So they get banned, they get falsely accused, and then their attitude is like,
01:53:15.000 were the conservatives right about why people are getting banned?
01:53:18.000 Did some research, talked to more people, came back, said this was wrong,
01:53:22.000 people shouldn't be banned for this, started going on other shows,
01:53:25.000 and now they're probably realizing like, oh, that one thing wasn't true.
01:53:28.000 Oh, that one thing wasn't true either.
01:53:30.000 This is what happens to a lot of people.
01:53:32.000 You can still be liberal and have liberal opinions.
01:53:34.000 I don't mind that, I have liberal opinions.
01:53:36.000 But if we agree on what's true, because we fact-checked it.
01:53:39.000 There was something that happened recently, I can't remember what it was, and the Krasensteins said,
01:53:42.000 guys, don't jump to conclusions.
01:53:43.000 Oh, it was the profile that got released from the Allen, Texas shooter.
01:53:47.000 Yeah.
01:53:48.000 And Brian said something like, just because the profile was found doesn't mean it actually is his profile.
01:53:52.000 We need to wait for real confirmation on this.
01:53:55.000 Like, just because it looks like it doesn't mean it's true.
01:53:57.000 And I'm like, that's actually a really good, really, really good point.
01:54:00.000 Like, he's right.
01:54:02.000 You know.
01:54:02.000 Do you think the Serfs guy is learning?
01:54:04.000 No!
01:54:06.000 He's not capable!
01:54:07.000 He's not capable of learning.
01:54:09.000 I mean... IMPOSSIBLE!
01:54:11.000 Well, so he is.
01:54:12.000 A really good example is when the guy from the serfs tweeted,
01:54:16.000 the right finally admits that their concern about grooming is indoctrinating kids into
01:54:22.000 queer lifestyles which they view as sexual.
01:54:26.000 And it was funny that he referred to me as the right finally admitting,
01:54:29.000 because it's like, bro, dude, my opinion is not conservative.
01:54:33.000 Like, it is middle of the road, so it overlaps with some conservatives.
01:54:38.000 And there were actually several times where he called me woke on the show, and I'm like, don't you get what centrist means?
01:54:42.000 But anyway.
01:54:44.000 He posted that.
01:54:45.000 It means he is learning.
01:54:46.000 But my response was, that's finally admitting?
01:54:49.000 It's literally been our whole argument the entire time that you are indoctrinating kids into a political, cultural lifestyle that does have overt sexual connotations in it.
01:54:59.000 He didn't know that.
01:55:00.000 He did not know that.
01:55:02.000 Listen to good guy Tim over here.
01:55:03.000 You gotta get rid of the beanie and get the old good guy hat.
01:55:06.000 I'd like to think that the meth conversation was an aha moment.
01:55:10.000 I'm not getting the benefit of that.
01:55:11.000 I'm saying the dude's completely ignorant.
01:55:12.000 Oh, the meth thing, right.
01:55:13.000 Yeah, he was like, I see what you did there.
01:55:16.000 Well, I didn't say anything.
01:55:18.000 Oh, wait.
01:55:19.000 Yes, I did see this clip with the baby.
01:55:21.000 Right.
01:55:21.000 Because he was like, a woman can get an abortion whenever she wants, it's her body, it's her choice.
01:55:25.000 Seamus asks, so as long as the baby's in her body, she can do whatever she wants to it?
01:55:30.000 He's like, well, it's her body, it's her choice.
01:55:31.000 And I go, what about meth?
01:55:33.000 And then he was like, no, because that would be intentionally killing the baby.
01:55:36.000 And...
01:55:38.000 I didn't try to insult him, I genuinely went, wait a minute.
01:55:42.000 Because like, how can you say you can kill the baby whenever you want, but you can't do meth because it kills the baby whenever you want?
01:55:48.000 I did not think he was going to say that.
01:55:50.000 But he did, and I'm like, well okay.
01:55:53.000 And my response was, I don't understand what you're trying to say.
01:55:55.000 Because it makes no sense.
01:55:57.000 And people called him out like, you have no formulated opinion on this, you're just regurgitating talking points.
01:56:03.000 We talked to someone the other day that said that this generation that's coming out of high school now, and for the last five years, they want to stand for a cause.
01:56:11.000 And when asked what cause, they say, we don't really care.
01:56:14.000 Just stand for a cause.
01:56:15.000 And so it's very easy to stand for a cause and not fully understand it.
01:56:20.000 I think that's what we're seeing.
01:56:21.000 Purpose.
01:56:22.000 Yeah.
01:56:22.000 I mean, for a long time in this country, people had purpose in faith, family, and community.
01:56:28.000 That's mostly going away.
01:56:30.000 No family, no neighbors, no faith.
01:56:32.000 So young people are desperate for some reason to be alive, and they're not finding one, but now they're being given one, social justice.
01:56:39.000 If there's no God, then you have to create your own meaning.
01:56:45.000 Right.
01:56:45.000 Mankind, man has to find his own meaning if there is no God.
01:56:50.000 And a lot of people that are Gen Z and Millennials are either atheists or they're agnostics.
01:56:58.000 And without God, or without a God to say, this is what is moral, this is how you should live your life, man is really left with no actual foundation, right? So it's hard for people to make their own morality, you know, because you have to have principle to have morality. You have to have things that you believe are actually real. And if you don't see any reason for existing at all, what does any of it matter? So it's really... I got a cause you can stand for, regardless of your religion or lack thereof, and it's
01:57:36.000 Save women's sports.
01:57:38.000 Anybody can get on board with this one.
01:57:40.000 It's a great cause to stand for.
01:57:42.000 All right.
01:57:42.000 Alright, M.M.M.Maysall says, I'm the last of the boomers in my 50s, we are not all brain-dead, followed you from the
01:57:51.000 beginning and will until I, quote, age out. Big fan, stay real. So age out is my polite way of saying passing on. You
01:58:00.000 know, I don't say like, you know, my fear is that when the boomers age out, we're in deep trouble. You know, I like
01:58:08.000 Gen X, you know, they did a lot of really great work.
01:58:09.000 A lot of the stuff I grew up watching.
01:58:10.000 Uh, Boomers made Star Trek The Next Generation, which is the best show ever.
01:58:14.000 And uh, so I'm deeply appreciative of that.
01:58:16.000 But I think Boomers did a lot of really good stuff.
01:58:18.000 And they did a lot of bad stuff too, but I think, you know, all the good stuff that influenced me, I'm a big fan of.
01:58:25.000 And they vote better.
01:58:26.000 They vote better than the younger generations.
01:58:29.000 The younger generations are the fault of the old generations, of course, but my fear is that when the boomers are no longer voting, there's going to be a hard shift, boom, towards communist, weird Marxist policy.
01:58:41.000 You don't think people vote differently as they get older?
01:58:43.000 I'm sort of hopeful that that happens.
01:58:45.000 I do, but it's mostly because they have kids.
01:58:48.000 And now we just have everything, we're just going to be watching our lives now, based on what you're telling me.
01:58:55.000 So my one hope, I suppose, is we may see a big shift leftward when the millennials may be a blip, a leftward spike in the voting patterns, when the boomers are no longer voting when they pass on, when they age out, when they retire to the point where they're no longer involved.
01:59:10.000 Millennials are going to do a big spike leftward, but I'm hoping that, because they abort all of their children, there will be substantially fewer leftists in the future.
01:59:20.000 And look, I say it all the time, they never respond because they can't respond.
01:59:27.000 I told Lance, the guy from the serfs, I said, y'all are sterilizing and aborting your kids.
01:59:32.000 I don't care.
01:59:32.000 I'm not a conservative.
01:59:33.000 I'm not one of these pro-life conservatives begging you to save your kids.
01:59:36.000 I'm sitting here being like, no, go ahead, get rid of them.
01:59:39.000 50 years, it'll be a conservative Christian country.
01:59:41.000 What do I care?
01:59:42.000 And then their response is just stone faced.
01:59:44.000 I have no, I don't know what to say to that.
01:59:46.000 It's true.
01:59:46.000 You don't have kids.
01:59:47.000 And like you point out, they say, we're coming for your kids.
01:59:49.000 And I'm like, you're not going to win that fight.
01:59:51.000 You're going to win some kids from some parents, but we're already seeing the backlash with Loudoun County.
01:59:55.000 We're already seeing parents say no to the woke stuff in schools, and it's causing them to freak out.
02:00:00.000 It's causing the FBI to call them terrorists because they're losing.
02:00:03.000 So you give it 20 years, and there's going to be more conservative voters.
02:00:07.000 You give it 40 years, exponentially more.
02:00:09.000 You combine the fact that conservatives have more kids with liberals abort their kids, and it is just mathematically impossible for the left to win.
02:00:18.000 So, I'm just like, look guys, we gotta be vigilant, we gotta stand up, we gotta call it out to make sure they don't indoctrinate this crazy stuff into the kids.
02:00:26.000 Other than that, we just wait.
02:00:28.000 They're for abortions up to any point.
02:00:31.000 It's politically correct for them to be.
02:00:34.000 My response to them is, thank you for aborting your kids, you're making the future better.
02:00:37.000 They never respond.
02:00:40.000 And I'm waiting for them to.
02:00:42.000 They just won't.
02:00:43.000 Because they don't want to bring up the political debate of, we are excising ourselves from the country.
02:00:48.000 So I'm like, okay, well, you know.
02:00:50.000 You're gonna get conservatives who are pro-life being like, that's so horrible.
02:00:53.000 No, we can't let them do this to the children.
02:00:55.000 And my attitude is like, no, you're right.
02:00:58.000 They shouldn't do it, but like, politically it's out of my hands, so.
02:01:03.000 I mean, to your point of how much can you control by government, that is the question that Republicans need to ask about the pro-life message.
02:01:15.000 This is a culture issue.
02:01:16.000 If you want to impact culture, you can celebrate the wins in the life movement.
02:01:20.000 But if you think that you can change the minds of liberals and this control they have over this in an election, that's not how you're going to do it.
02:01:31.000 Yeah, well, I think we'll wrap it up here.
02:01:34.000 So if you haven't already, would you kindly smash that like button, subscribe to this channel, share the show with your friends, become a member at TimCast.com.
02:01:40.000 It's an amazing Friday night.
02:01:41.000 I hope you guys are having a good weekend already, chilling out, relaxing.
02:01:45.000 You can follow the show at TimcastIRL.
02:01:46.000 You can follow me personally at TimCast.
02:01:48.000 Tudor, do you want to shout anything out?
02:01:50.000 Just check out the podcast, Tudor Dixon podcast.
02:01:53.000 We would love to have you there.
02:01:54.000 Right on.
02:01:55.000 I am Phil Labonte, lead singer of All That Remains.
02:01:58.000 I am PhilThatRemains on Twitter.
02:01:59.000 I am PhilThatRemainsOfficial on Instagram.
02:02:02.000 And the band is All That Remains.
02:02:04.000 You can check us out on Spotify, Apple Music, YouTube, all that stuff.
02:02:09.000 And you can find me on Twitter at tmsilverman, or on Instagram at taylormaysilverman.
02:02:15.000 Shabbat shalom, and happy Mother's Day to all the moms!
02:02:18.000 Happy Mother's Day!
02:02:20.000 Happy Mother's Day!
02:02:20.000 Happy early Mother's Day.
02:02:23.000 You can follow me at kellenpdl, and Heineken's good, Tubi's good, but buy American beer.
02:02:29.000 So go buy Yuengling.
02:02:30.000 You can probably order it online at this point, but yeah.
02:02:33.000 I only drink on vacation.
02:02:36.000 Fair enough.
02:02:36.000 I love you, Mom.
02:02:37.000 Thanks for hanging out, everybody.