The Charlie Kirk Show - January 17, 2026


THOUGHTCRIME Ep. 111 — Autistic Barbie? Hollywood Deepfakes? British DEI Video Games?


Episode Stats

Length: 1 hour and 31 minutes

Words per Minute: 185.18

Word Count: 16,987

Sentence Count: 1,635

Misogynist Sentences: 59

Hate Speech Sentences: 51


Summary

On this week's episode of Thought Crime Thursday, Andrew, Blake, Jack, and Tyler discuss a variety of topics, including immigration enforcement, Arizona's AG, AI deepfakes, and much, much more!


Transcript

00:00:03.000 My name is Charlie Kirk.
00:00:05.000 I run the largest pro-American student organization in the country fighting for the future of our republic.
00:00:11.000 My call is to fight evil and to proclaim truth.
00:00:14.000 If the most important thing for you is just feeling good, you're going to end up miserable.
00:00:19.000 But if the most important thing is doing good, you'll end up purposeful.
00:00:24.000 College is a scam, everybody.
00:00:26.000 You got to stop sending your kids to college.
00:00:27.000 You should get married as young as possible and have as many kids as possible.
00:00:31.000 Go start a Turning Point USA college chapter.
00:00:33.000 Go start a Turning Point USA high school chapter.
00:00:35.000 Go find out how your church can get involved.
00:00:37.000 Sign up and become an activist.
00:00:39.000 I gave my life to the Lord in fifth grade.
00:00:41.000 Most important decision I ever made in my life.
00:00:43.000 And I encourage you to do the same.
00:00:45.000 Here I am.
00:00:46.000 Lord, use me.
00:00:48.000 Buckle up, everybody.
00:00:49.000 Here we go.
00:00:56.000 The Charlie Kirk Show is proudly sponsored by Preserve Gold, the leading gold and silver experts and the only precious metals company I recommend to my family, friends, and viewers.
00:01:09.000 Hello ladies and gentlemen, welcome to another edition, this week's edition, of Thought Crime Thursday.
00:01:15.000 It's a great week.
00:01:17.000 It's a great week in America.
00:01:19.000 Donald Trump's ICE officers and agents are out on the ground in Minneapolis.
00:01:25.000 The lib hordes are running towards them and they are vomiting on the snow because of the tear gas that is being launched and volleyed in their direction.
00:01:35.000 Incredible scenes, incredible content.
00:01:38.000 Sorry to all the people who say nothing ever happened.
00:01:40.000 Sorry to all the blackpillers out there, the panicans.
00:01:43.000 You are losing.
00:01:44.000 We are winning.
00:01:45.000 Donald Trump is winning.
00:01:46.000 America is winning.
00:01:48.000 But tonight, we are here to commit some thought crime.
00:01:52.000 So who do we got tonight?
00:01:53.000 We got Andrew there.
00:01:54.000 I think we got Blake.
00:01:55.000 Yo, yo.
00:01:56.000 We've got three guys at the desk.
00:01:59.000 We're maintaining that pretty consistently.
00:02:00.000 I'm very proud of you guys.
00:02:02.000 I'm very proud to be here.
00:02:04.000 I, you know, I now live in this area.
00:02:06.000 Yeah, but you know, sometimes Tyler, Tyler wondered if I'd ever come.
00:02:10.000 And it just, you know, Charlie.
00:02:13.000 Charlie wanted to get Andrew to move to Phoenix for many years.
00:02:16.000 And he eventually sort of gave up.
00:02:18.000 And it's a weird, you know, I feel grateful to be here despite all the things.
00:02:25.000 I, on the other hand, think that God didn't intend for people to live in Arizona because it's a desert filled with nothing.
00:02:34.000 And you've thought about it.
00:02:36.000 Whereas when I'm talking about it, it just seems like God doesn't want people to be there.
00:02:39.000 Yeah, whereas when I go to D.C. or Pennsylvania and drive through Philadelphia, I really think this is the place God intended people to be.
00:02:46.000 No, D.C. is obviously satanic.
00:02:49.000 You're not going to convince me otherwise with that.
00:02:51.000 Pennsylvania, Pennsylvania.
00:02:53.000 Wait, didn't you in Pennsylvania just have that, like, what was it?
00:02:56.000 Bucks County, the DA there was like, or was it a mayor who was like,
00:03:04.000 We're breaking our agreement with ICE and we're not going to cooperate with DHS anymore.
00:03:09.000 That's Pennsylvania to me.
00:03:12.000 Excuse me.
00:03:12.000 Do you want to talk about the Arizona governor, the Arizona AG?
00:03:16.000 We're going to get rid of her.
00:03:16.000 Yes, I do.
00:03:17.000 We're going to get rid of her.
00:03:19.000 How about, wait, wait, Tyler, how about the state of Arizona's senators?
00:03:23.000 This state went six points for Trump in November.
00:03:26.000 How many points did Pennsylvania go for Trump?
00:03:29.000 I'm glad it was.
00:03:30.000 It was tight.
00:03:31.000 But it was a bigger, but it was a bigger swing.
00:03:33.000 But Tyler, no, in all seriousness, though, did you talk about thought crimes?
00:03:33.000 No, it wasn't.
00:03:37.000 You saw that story about Kyrsten Sinema and her bodyguard today, right?
00:03:42.000 I'd missed it.
00:03:42.000 You missed this?
00:03:43.000 What?
00:03:44.000 Oh, I've been busy.
00:03:46.000 Believe it or not, believe it or not, we have.
00:03:48.000 Literally, like the keeper of the tea of Arizona missed this.
00:03:52.000 Oh, Tyler, you're going to love it.
00:03:54.000 So, believe it or not, we've had two different Democrat lawmakers who won an election in 2018 who ended up having a weird, lurid sex scandal with a staffer.
00:04:03.000 Today.
00:04:04.000 Which is not today.
00:04:05.000 Oh, this is the one with her with her bodyguard.
00:04:07.000 Yeah.
00:04:08.000 So she got sued because she apparently had a drug-fueled, allegedly, a drug-fueled affair with her bodyguard and caused the dissolution of his 14-year marriage.
00:04:19.000 And in North Carolina, where the suit has been brought, alienation of affection is still a valid tort.
00:04:23.000 I love that.
00:04:23.000 That should be a rule everywhere.
00:04:25.000 Secondly, I do feel, I just feel like, it's the wife of the guy.
00:04:30.000 The wife of the guy is bringing the suit?
00:04:32.000 Yeah.
00:04:33.000 The wife of the guy.
00:04:35.000 So I do feel a little like, you know, tepid about my response here because Kyrsten Sinema came out in defense of Erika Kirk.
00:04:42.000 Like WaPo took a shot at her wardrobe choice or something like that.
00:04:47.000 And Sinema actually chimed in and was like, can we just stop this effing stuff once and for all?
00:04:53.000 And I was like, eh.
00:04:56.000 I haven't thought this highly of you, Kyrsten Sinema, since you blocked the nuking of the filibuster by the crazed Dems.
00:05:02.000 My favorite part of the story, which is not new exactly, but I learned of it, which makes it actually new because that's what matters, is that apparently her post-Senate career has been lobbying to liberalize laws around hallucinogenic drugs, specifically some medicine.
00:05:17.000 Oh, there's a lot of hallucinogenic drugs in this story.
00:05:21.000 Wait, hold on.
00:05:23.000 Scott Perry.
00:05:24.000 Not Scott Perry.
00:05:25.000 Who's the Texas guy?
00:05:27.000 Department of Energy.
00:05:28.000 Former Department of Energy secretary, ran for governor of Texas.
00:05:33.000 Perry.
00:05:33.000 Perry?
00:05:34.000 Rick Perry.
00:05:36.000 Rick Perry is really into ayahuasca.
00:05:39.000 There was a whole New York Times feature on it.
00:05:43.000 Very interesting coded.
00:05:45.000 Yes.
00:05:45.000 I was literally thinking like Cerno, Cerno.
00:05:48.000 Yeah, he's really into it.
00:05:49.000 Like, there's a weird cross-section of people that are into ayahuasca and, you know, getting high on this, you know, this stuff you get in the rainforest so that you can get over past traumas.
00:06:01.000 I happen to think it's all bunk.
00:06:03.000 I would love to hear your thoughts on it.
00:06:05.000 You know, pharmakeia in the Bible is what they often refer to as sorcery.
00:06:12.000 Sorcery, the word is pharmakeia.
00:06:15.000 I believe that when you put substance in your body, it's a highway to hell.
00:06:20.000 You're just inviting witchcraft.
00:06:21.000 So that you can't.
00:06:22.000 I believe all of them.
00:06:23.000 I 100% believe it.
00:06:24.000 Yeah.
00:06:25.000 So the people that are big ayahuasca, I'm like, if I was the devil and I wanted to convince you that taking drugs is really good, I would leave you with a positive impression of your drug experience.
00:06:36.000 Yeah.
00:06:36.000 And there are people who take it, though, that have really bad experiences too, though, to be sure.
00:06:40.000 Some people get sick.
00:06:41.000 Some people like there's been violent crimes associated with it.
00:06:46.000 So it's really kind of like playing Russian roulette for a lot of people.
00:06:50.000 But the way that I always look at it is like that's, you know, and, you know, as a Christian, right?
00:06:58.000 So you read the Bible and witchcraft is clearly discussed in the Bible.
00:07:01.000 The occult is clearly discussed in the Bible.
00:07:04.000 And we are told not to do it.
00:07:05.000 However, that doesn't mean it's not real.
00:07:08.000 It is real.
00:07:09.000 The problem is that you're connecting with spirits and entities on the spiritual plane that you have no idea who you're coming into contact with.
00:07:18.000 Okay, that's not a little machine elf.
00:07:19.000 Okay, that's a demon.
00:07:21.000 All right.
00:07:21.000 You're being connected with a demon right there, and you are being tricked by that demon to probably do something that you shouldn't do.
00:07:29.000 So the way that we're taught to do this is through church, is through the Bible, is through Christ.
00:07:34.000 Obviously, that's the way to connect to the spiritual side of good and not all of this insanity of the demons and fallen angels.
00:07:42.000 All right.
00:07:43.000 So check it out here, just real quick.
00:07:45.000 This is the New York Times.
00:07:46.000 The long, strange trip of Rick Perry.
00:07:49.000 The former Texas governor and Trump energy secretary has now dedicated his life to promoting the powerful psychedelic Ibogaine.
00:07:56.000 That's what it was, Ibogaine, not ayahuasca.
00:07:57.000 That sounds like a hair loss medication.
00:08:00.000 I say as an expert on hair loss.
00:08:00.000 Yeah.
00:08:03.000 Yeah, no, not the medication to prevent it.
00:08:05.000 All right, we should get into it.
00:08:06.000 We'll transition.
00:08:08.000 Where are the demons?
00:08:08.000 We got our, we already have our first Rumble Rant tonight from Kyrie.
00:08:11.000 I know she's a regular.
00:08:13.000 Thank you very much, Kyrie.
00:08:14.000 She says, first, hey guys, great to have four out of five of the TC crew tonight.
00:08:18.000 I agree.
00:08:19.000 Two, y'all need to make a thought crime t-shirt and Tyler's In God We Trust hat available for us.
00:08:25.000 We will do it.
00:08:26.000 And then she asks, can we reveal number three?
00:08:29.000 Is that okay?
00:08:30.000 So the hat, do you see it in the chat?
00:08:31.000 Oh, yes.
00:08:32.000 So we can reveal this.
00:08:32.000 Yes.
00:08:33.000 All right.
00:08:33.000 So she asks, when is Daisy's baby coming?
00:08:35.000 She's, of course, a member of the staff here.
00:08:37.000 The baby has come.
00:08:39.000 We even got her a little box of goodies.
00:08:42.000 Beautiful.
00:08:43.000 And the baby's healthy.
00:08:45.000 We actually were, okay.
00:08:46.000 Beautiful, beautiful.
00:08:47.000 She was kind of on the small side.
00:08:49.000 I was worried because Daisy likes to eat carrots and broccoli.
00:08:52.000 And I'm like, eat a steak, eat a hamburger.
00:08:55.000 No, Daisy doesn't do that sort of thing.
00:08:58.000 And the baby was like trending on the small side, but then it came out totally healthy, really good weight.
00:09:04.000 Baby's doing great.
00:09:04.000 Really cute baby.
00:09:05.000 Now, I will note, I have not personally confirmed the existence of the baby, so this could all be a psyop.
00:09:10.000 I've seen pictures.
00:09:10.000 I've seen pictures.
00:09:11.000 I've seen it.
00:09:12.000 Oh, you can fake those, which is what we're going to talk about.
00:09:14.000 Ah, they're fine.
00:09:15.000 I actually have a great Daisy deep fake baby.
00:09:18.000 We have to investigate this.
00:09:19.000 I have to go confirm the existence of the baby.
00:09:20.000 This is the problem.
00:09:21.000 He's actually the one that's blackpilling me on all of the AI slop because he's trying to find out, when we did the strike against Venezuela, if a bomb landed on, what was it, on Hugo Chavez's grave, basically, right?
00:09:35.000 But it didn't really.
00:09:36.000 We were seeing AI that suggested it was.
00:09:36.000 Well, that was a huge.
00:09:38.000 Yes.
00:09:39.000 But I don't think it was ever actually substantiated.
00:09:39.000 Yeah.
00:09:42.000 Yeah, it wasn't.
00:09:43.000 No.
00:09:43.000 But there were videos that people were sharing that were saying, this is it on fire.
00:09:48.000 But then the BBC went, took a photo, and the mausoleum was intact.
00:09:51.000 Listen, if Blake Neff cannot ascertain the veracity of a certain image that is not AI or is AI, can you imagine what our parents are dealing with right now on Facebook?
00:10:02.000 Oh, they're cooked.
00:10:03.000 I mean, they're getting bamboozled by Facebook slop about giant pumpkins.
00:10:07.000 Yeah.
00:10:08.000 You see that bad bumps?
00:10:10.000 We've got to make that.
00:10:12.000 You know what?
00:10:13.000 In ancient societies, they would go and bring their adult parents in to live with them.
00:10:17.000 It was very communal or whatever.
00:10:19.000 Or they would go live with their adult parents.
00:10:21.000 In today's day and age, it's going to be less about living with your elderly relatives and more about monitoring their social media behavior.
00:10:30.000 It's going to be endless.
00:10:31.000 It's bad.
00:10:32.000 It's bad.
00:10:33.000 And that's why we have to get to our first topic.
00:10:34.000 I think we've got to lead with this now is deepfakes are going to destroy Hollywood.
00:10:40.000 So we've reached the point where we can use AI programs to just essentially replace all actors because they've gotten good enough at making people resemble other people.
00:10:53.000 So we have a few highlight clips that are really representing this.
00:10:56.000 So first we have, this is a man using AI to become, I don't, I've never watched this show, so Jack's going to have to confirm, but apparently he's using AI to become different characters from Stranger Things.
00:11:06.000 Let's show 463 so we can see here he's gesturing and just it's all him just waving to the camera, but then it's constantly changing him to different people.
00:11:28.000 Were those accurate representations of the Stranger Things cast?
00:11:32.000 No.
00:11:33.000 So they're incredibly accurate, except for the second to last one, he seems to have race swapped one of them, the character Dustin, with the, he has the hat and the curly hair.
00:11:43.000 Not this one.
00:11:44.000 I think it's this one right here.
00:11:45.000 So he's a white character on the show, but this guy apparently has race swapped him because, hey, with AI, if you want to race swap someone, if you want to gender swap someone, you can do so with the touch of a button.
00:11:59.000 Are you sure that Dustin just doesn't have a tan?
00:12:01.000 It's tan.
00:12:04.000 I don't think they race swapped him.
00:12:05.000 He looks white to me.
00:12:06.000 No, that's definitely.
00:12:07.000 No, not this guy, the guy before this guy.
00:12:10.000 Dustin.
00:12:11.000 This is Hopper.
00:12:13.000 There, right there.
00:12:13.000 No, not this.
00:12:15.000 That's more than a tan, Jack.
00:12:17.000 No, no, no, no.
00:12:17.000 Have you ever seen it?
00:12:18.000 He's not.
00:12:19.000 See, I get this all the time.
00:12:20.000 You guys think I'm Mexican because I tanned.
00:12:22.000 You are Mexican.
00:12:25.000 Hey, Jack, in Philadelphia, they call this a spray tan.
00:12:30.000 Yes, they do.
00:12:32.000 No, that's a little bit of soul.
00:12:33.000 That's Jersey, too.
00:12:34.000 But here's my point.
00:12:34.000 That's Jersey.
00:12:36.000 But whether he did or not is not my point.
00:12:38.000 The point is with AI, you could get whatever you want.
00:12:41.000 You could do whatever you want.
00:12:42.000 And if you're a filmmaker, and Andrew, you have a Hollywood background, so maybe you could speak on this.
00:12:47.000 But if you're a filmmaker, you can literally just pick and choose whatever you want in your films.
00:12:52.000 You don't even need actors anymore.
00:12:54.000 I had a bunch of friends when I was living in Los Angeles that were like working at DreamWorks and that were working at Disney, you know, as animators.
00:13:03.000 One of my buddies had like, they had like this special card that he could get just as many people into the park as he wanted to.
00:13:08.000 So that was actually the first time I went to Disneyland.
00:13:10.000 I think I went when I was really little, but that was the first time that I could remember going.
00:13:15.000 And he, I keep thinking about him with all this stuff because he was really, really talented, like an actual artist.
00:13:21.000 But now it's all like what kind of job?
00:13:24.000 I mean, I guess he could direct.
00:13:25.000 I guess he could direct AI.
00:13:27.000 I have a really interesting take on this because I don't think that this is advanced enough where it could replace somebody for a full movie.
00:13:35.000 But I do think, just even off that clip, think about like Fox News and CNN and MSNBC.
00:13:41.000 I don't trust MSNBC or whatever they call it now at all.
00:13:44.000 They can basically swap out anyone that they want to come onto MSNBC.
00:13:50.000 So all they have to do is get a sign-off from that person probably to say, hey, we'll pretend like it's you.
00:13:58.000 Like you could get a Bill Craig.
00:13:59.000 They approve the text, and then they just have somebody else, an actor, just be like Hillary Clinton, for example.
00:14:06.000 It's really hard to get Hillary Clinton to go on MSNBC, but if Hillary Clinton approves it, maybe somebody goes on.
00:14:13.000 A surrogate, like a campaign surrogate is that person.
00:14:16.000 So the surrogate now becomes the person.
00:14:18.000 Dude.
00:14:19.000 Think about how that's going to screw up policy.
00:14:21.000 Think about other stuff you can do.
00:14:21.000 Or other things.
00:14:22.000 You figure out how Joe Biden could have used this.
00:14:24.000 Yeah.
00:14:25.000 They probably did.
00:14:26.000 They probably did.
00:14:27.000 Joe Biden was using this.
00:14:29.000 I'm pretty sure Joe Biden himself was fake.
00:14:32.000 That was all AI.
00:14:33.000 And like the first time we saw the real Joe Biden was on the debate stage because they couldn't figure out the tech to how to like.
00:14:38.000 Yeah, exactly.
00:14:39.000 They couldn't get it.
00:14:39.000 They couldn't get it first.
00:14:40.000 It was a live stream.
00:14:42.000 Wait, that's not actually Joe Biden.
00:14:44.000 I mean, there's so many other spin-offs.
00:14:46.000 It's not just movies.
00:14:46.000 So as an example, imagine if we had, so for example, let's say we had a movie.
00:14:52.000 We had, let's say a James Bond movie came out.
00:14:55.000 And you have an actor in it who's playing the villain, playing the love interest, playing someone.
00:14:59.000 And then they have a scandal.
00:15:01.000 They donated to the wrong defense fund for someone, or they have a sexual harassment.
00:15:01.000 They.
00:15:07.000 Didn't they do this to Kevin Spacey?
00:15:09.000 And then what if they just edit, they just literally swap them out of the movie so like their appearance isn't in the film anymore.
00:15:15.000 And like they're still in the movie.
00:15:17.000 I'm almost certain they did something like this with Kevin Spacey and they replaced him with like Chris Plummer when his scandal came out.
00:15:24.000 I don't know if it was technology was used, but they sort of like digitally inserted Chris Plummer into scenes.
00:15:29.000 But it was new footage.
00:15:30.000 It wasn't like an AI version.
00:15:33.000 But they've already done stuff like this already, where if you've got an actor who's associated with something, it's crazy.
00:15:39.000 But what I want to do is get to another level.
00:15:42.000 Yeah.
00:15:43.000 Oh, man.
00:15:43.000 I mean, that would be a relatively benign application, actually.
00:15:47.000 The Princess Leia.
00:15:50.000 They had Carrie Fisher because Carrie Fisher died when the One Star Wars came out.
00:15:54.000 And then they had a young Carrie Fisher come on.
00:15:57.000 I think they did it with Alec Guinness.
00:15:59.000 Star Wars has done this a couple of times now.
00:16:01.000 This happens more than people think.
00:16:03.000 And didn't it happen to James Gandolfini?
00:16:05.000 Didn't he die in production or something?
00:16:07.000 They had to kind of change the storyline of his last film or something.
00:16:10.000 Oh, of like the Saints of Newark or whatever?
00:16:12.000 I don't know.
00:16:13.000 That was a prequel.
00:16:14.000 Maybe he died before his son played him in that one.
00:16:17.000 You know, I was just watching a TV show where they dedicated the episode.
00:16:23.000 I was like, who is that?
00:16:24.000 And I looked it up.
00:16:24.000 I was like, oh, he died on episode four of a 10-part series.
00:16:27.000 So they just kind of rode him out.
00:16:29.000 But now you could.
00:16:30.000 I don't know.
00:16:31.000 That's a moral conundrum.
00:16:33.000 Yeah.
00:16:33.000 And or other ones.
00:16:34.000 So, for example, one thing that could happen: people like Indiana Jones movies, but they don't like 85-year-old Indiana Jones played by Harrison Ford.
00:16:45.000 And of course, Harrison Ford is gone.
00:16:47.000 What if we just got infinite Indiana Jones movies starring perpetually 40-year-old Harrison Ford?
00:16:53.000 Well, they do that.
00:16:55.000 That's the benefit of it all.
00:16:56.000 Yeah, they do at the very beginning.
00:16:58.000 What if we did it indefinitely?
00:17:00.000 We could get 20 Indiana Jones or something.
00:17:03.000 Yeah, half of de-aging.
00:17:07.000 I would do that 100%.
00:17:08.000 That's half of Indiana Jones would be good.
00:17:10.000 Yes.
00:17:11.000 Okay.
00:17:11.000 They just did it with Robert De Niro, Al Pacino, and Joe Pesci in The Irishman.
00:17:16.000 That was like a famous one that they did.
00:17:18.000 There was another one recently done.
00:17:20.000 Awful.
00:17:20.000 Curious case of Benjamin Franklin.
00:17:22.000 No, no, no.
00:17:22.000 Star Wars.
00:17:24.000 And what was the one?
00:17:29.000 Rogue One.
00:17:30.000 The one that was bad.
00:17:31.000 But people say it was good.
00:17:33.000 And they put the old guy in there from episode four.
00:17:35.000 Yeah, Grand Moff Tarkin.
00:17:36.000 Yeah, that's right.
00:17:37.000 Yeah.
00:17:37.000 No, we're not going to defend Rogue.
00:17:38.000 So now here's what I want to say.
00:17:40.000 So, Andrew, this is what I want to get into.
00:17:42.000 So, because we're talking de-aging, but I think we're going to go into a wholly different level here.
00:17:48.000 I think we're going to get to the point where people are just going to be sitting in front of a computer and they can just type it out.
00:17:54.000 I want this actor and this one to look this way and this one to look this way and this one to look this way.
00:17:58.000 There's not going to be any people at all.
00:18:00.000 And you might even get to the point.
00:18:01.000 So I think on Spotify right now, like the number one artist on Spotify is like an AI artist.
00:18:08.000 There was a worship song that was written by AI that was trending.
00:18:11.000 I saw that too.
00:18:12.000 Can the spirit of God be in a worship song written by a computer?
00:18:16.000 I'm sure it can.
00:18:16.000 I mean, God will use whatever he wants, but the Holy Ghost in the machine?
00:18:21.000 Ooh.
00:18:24.000 So I guess the question is, though, what does this do to that?
00:18:28.000 That whole industry is done.
00:18:29.000 I'm sorry.
00:18:30.000 They're just done.
00:18:31.000 Well, you know what's interesting?
00:18:32.000 So I was thinking about this because if you see some of these, I think crypto did this, right?
00:18:37.000 Where crypto has these, what do they call them?
00:18:39.000 NFT or NFPs?
00:18:41.000 NFTs, right?
00:18:42.000 NFTs.
00:18:43.000 But people will buy, like, cartoon digital memes, basically, and there'll be a value associated with them.
00:18:50.000 So you're not wrong that there is a marketplace, Jack, that would support even financially completely made-up images, and we call them NFTs.
00:19:00.000 Trump has done this, but you could do this with just about anything.
00:19:03.000 And crypto is kind of this first wave of this.
00:19:07.000 So if you created computer-generated characters that had unique personality types, and maybe they were just, maybe they just really hit gold by creating some character that really appealed to people.
00:19:21.000 In theory, you could own the trademark on the character you created, and then you could, as an agent or manager of this AI character, you could then cast this character in movies.
00:19:33.000 People could become enamored by a completely made-up AI movie star.
00:19:40.000 And that person that owns the rights to the AI would then be like owning Brad Pitt, but you don't have to feed him and you don't have to house him and you don't have to pay him, you just own it.
00:19:51.000 A person could recreate themselves as a dynamic popular political figure, such as we actually had, the team went and they made me record a video of myself with the AI that those people were just using earlier.
00:20:05.000 Put up 480, put up 480.
00:20:07.000 This could go really bad.
00:20:08.000 Just oh my gosh.
00:20:10.000 So that is me as Barack Obama making various facial gestures.
00:20:15.000 That's also me with a moderate amount of hair.
00:20:18.000 Did they get me with more hair?
00:20:19.000 No, it's just no.
00:20:20.000 I just asked them to look me.
00:20:22.000 I asked them to give me a luscious mane of hair.
00:20:24.000 So that's me as an 80s hair metal star.
00:20:26.000 That's kind of great.
00:20:27.000 Although they kind of gave me a DMV lady.
00:20:31.000 A DMV lady.
00:20:32.000 You're kidding me.
00:20:32.000 Letitia James.
00:20:34.000 What are you doing here on Buck Ryan?
00:20:35.000 Whoa.
00:20:36.000 Oh, that is a creepy, that is an uncanny valley for Halifax.
00:20:40.000 Some of them menu.
00:20:41.000 No, that's not good.
00:20:43.000 That was Uncanny Valley Trump for sure.
00:20:45.000 Obama's not bad.
00:20:47.000 Obama's pretty good.
00:20:48.000 We have our first response to that, which is eek.
00:20:51.000 Yeah.
00:20:52.000 Yeah, for real.
00:20:54.000 I would have been curious to see him do like Abraham Lincoln or something.
00:20:57.000 They could probably make it.
00:21:00.000 By the way, I want to ban people doing this to Charlie.
00:21:04.000 Candidly, that's where my head instantly goes.
00:21:06.000 I'm like, can you?
00:21:08.000 Can we pass a national law banning it if you mimic Charlie? The good news is that they always mess up Charlie's.
00:21:14.000 Whenever they do this, they mess up Charlie's facial features because he had kind of unique facial features and it kind of messes with it.
00:21:20.000 But it'll get to the point where it can do it.
00:21:23.000 It'll get to the point where this is the worst it's ever going to look.
00:21:26.000 You know, it's only going to get better from here.
00:21:29.000 So they will get to the point where they can do this with Charlie.
00:21:33.000 We're cooked.
00:21:34.000 Yeah.
00:21:35.000 You know, I guess they're, I guess, because it goes down to like who owns the rights to your likeness.
00:21:39.000 So I would imagine that that's like family.
00:21:43.000 And, you know, certainly hope that nobody would think to do something like that.
00:21:48.000 Like, you know, you could have like AI Charlie endorsements or stuff like that or just get him to say stuff.
00:21:54.000 It'd be disgusting.
00:21:55.000 It'd be completely disgusting.
00:21:58.000 My Pillow wants to say a heartfelt thank you to our listeners for your continued support.
00:22:04.000 To show their appreciation, they're offering an incredible after-Christmas sale with some of the best prices that they've ever had.
00:22:11.000 And all when you use promo code Kirk, K-I-R-K.
00:22:14.000 Right now, you can get their luxurious Giza Dream sheets for as low as $29.98.
00:22:20.000 That's pretty insane.
00:22:21.000 You'll also find cozy blankets, comforters, and duvet covers starting at just $25.
00:22:26.000 Six-pack towel sets are only $39.98, making it the perfect time to refresh your home.
00:22:31.000 But the savings don't stop there.
00:22:34.000 Everything is on sale from dog beds and socks to couch pillows and much more.
00:22:38.000 This is the best opportunity of the year to stock up on MyPillow favorites, take advantage of these unbeatable specials.
00:22:44.000 Don't wait.
00:22:44.000 Head to mypillow.com or call 800-875-0425 now.
00:22:47.000 And don't forget to use promo code KIRK.
00:22:50.000 These offers won't last long.
00:22:51.000 Call 800-875-0425 or visit MyPillow today and use promo code Kirk.
00:23:01.000 We have some Rumble Rants.
00:23:03.000 We do, we do.
00:23:04.000 Should we get into this?
00:23:05.000 Oh, yeah, let's do that.
00:23:06.000 So we have, Jack, you're the one who's from that land called Poland.
00:23:10.000 If you can see this, I think it'll be pronounced Shajuls for DJT.
00:23:16.000 That's like a sh sound in Polish, right?
00:23:16.000 K-R-Z.
00:23:19.000 Otherwise, it's Krezzuls.
00:23:21.000 I apologize.
00:23:22.000 I cannot read these Kajulis.
00:23:26.000 Kazulis.
00:23:27.000 Kajulis.
00:23:27.000 We'll go with that.
00:23:28.000 Kazulis for DJT.
00:23:30.000 Just received a copy of The Island of Free Ice Cream by Jack Posobiec for my granddaughter.
00:23:35.000 I've told so many people.
00:23:36.000 Thank you, Jack, for bringing back smart learning.
00:23:39.000 More of this, please.
00:23:41.000 And then Dylan Ivey, a warrior of the chat, he's here all the time, says, keep moving forward.
00:23:46.000 We appreciate all your efforts.
00:23:49.000 It's going to take all of us to prep for the 2026 midterms, but we have the 2026 energy.
00:23:54.000 God bless.
00:23:56.000 And then lastly, they didn't include the name on this one.
00:24:01.000 That is from Zuzu's Petals.
00:24:02.000 That's another one we see a lot of.
00:24:04.000 Howdy Zuzus.
00:24:05.000 No way to this AI craziness.
00:24:08.000 I would rather watch Doris Day movies in an old movie theater that only plays classic old movies before I support AI movies.
00:24:15.000 I will do high school plays before AI.
00:24:17.000 Hold on, Zuzu.
00:24:18.000 I completely agree.
00:24:19.000 I'm just trying to play it out down the line here a little bit.
00:24:23.000 Like, think down the timeline.
00:24:26.000 There will be people that own AI characters that then demand huge bucks because they know that their character that they created is going to be marketable.
00:24:35.000 And I think it's like, imagine like Steven Spielberg just created like some rando character with AI, cast him in a movie, and it's like it does big numbers at the box office, and then people want to see that person again.
00:24:48.000 That character, the uniqueness of that character, the storyline, the backstory, the intonation, the turns of phrases will all be trademarkable to this unique AI character.
00:25:00.000 And they're going to start marketing movies with this.
00:25:02.000 Because the only reason I think that this is true is because you got to not think like an Xer or a boomer or a millennial or even a Gen Zer candidly.
00:25:11.000 You got to think like an Asian teen.
00:25:14.000 Like go think of, like, put your head in like a Hong Kong 19-year-old girl.
00:25:18.000 They're already doing this stuff, like, on some level.
00:25:21.000 And so many other things.
00:25:22.000 You know, Zuzu says, I'd rather watch old movies in an old movie theater, but it's going to be crazy.
00:25:27.000 What if, what if someone, like, how many of us actually know every movie they made in the 1940s?
00:25:32.000 What if someone made an AI pretend Doris Day 1940s movie?
00:25:35.000 And they say, oh, you know, you hadn't heard of this one.
00:25:38.000 Yeah, the style of an old 40s movie.
00:25:40.000 And that's not even getting into, okay, this is spoofing.
00:25:44.000 This is just spoofing actors and actresses.
00:25:46.000 Spoof your family members.
00:25:48.000 And how many boomers are going to be like, oh, someone in, you know, some scammer in Karachi, Pakistan got audio recordings and videos of your granddaughter and then makes videos pretending to be your granddaughter live, like live action, pretending to be them.
00:26:04.000 And they use that to scam you for money.
00:26:07.000 Well, so bad stuff is going to go down.
00:26:10.000 The thing I wanted to add on to what Andrew was saying, though, is not only are they going to create these actors, but think of it.
00:26:15.000 They're going to have a whole team dedicated to like create, like playing that actor and actress.
00:26:21.000 So they'll have social media, they'll have TikToks, they'll have reels on Instagram.
00:26:26.000 And yet all of these things will be created.
00:26:29.000 It'll be totally written and scripted.
00:26:30.000 So that'll be part of it as well.
00:26:32.000 And the best part, Blake, I'm sure you can appreciate this too, is they're going to make sure that it has to be woke and it has to be like, it has to, you know, uphold all the right virtues as well and say all the right things.
00:26:43.000 Even if it's not even a real person, they'll make sure that, so will it be possible to cancel an AI actor?
00:26:51.000 And I would say yes, 100%.
00:26:55.000 Because that's how that stuff works.
00:26:56.000 It is a theology.
00:26:58.000 It is not, you know, it's not common sense.
00:27:01.000 It's not an ideology.
00:27:04.000 It's a theology.
00:27:05.000 And so.
00:27:06.000 Well, think about this too, Jack.
00:27:07.000 Think about this.
00:27:08.000 All of that is going to be scripted, though.
00:27:10.000 Yeah, but you have like Tomb Raider series, right?
00:27:12.000 Which started as a video game, then it becomes real life.
00:27:15.000 You get Angelina Jolie.
00:27:16.000 But just imagine instead of casting Angelina Jolie for it, you just create an AI version of the video game character that looks humanoid, right?
00:27:25.000 Flesh and bones, and it's not obviously not cartoon.
00:27:28.000 That becomes a piece of intellectual property.
00:27:31.000 A completely new actor.
00:27:32.000 Yeah.
00:27:32.000 Yeah.
00:27:32.000 Like a fake.
00:27:33.000 Yeah, that's my point.
00:27:34.000 That's entirely my point.
00:27:36.000 It could happen.
00:27:36.000 That's what they will do.
00:27:37.000 It's funny you mentioned that because they've never really been able to find a Tomb Raider, Lara Croft as the character.
00:27:43.000 And I don't think they've ever really been able to find one actor.
00:27:46.000 I think that series had been rebooted like three times or something.
00:27:49.000 Well, I mean, Angelina Jolie.
00:27:50.000 Angelina Jolie was pretty good.
00:27:53.000 She was good.
00:27:55.000 So then they rebooted it and then they rebooted it again.
00:27:58.000 Guys, breaking news that I just found out.
00:28:00.000 You guys are going to love this.
00:28:01.000 This is actually a legit study out of the UK.
00:28:03.000 And Poland, Jack.
00:28:05.000 Blue hair and the blues.
00:28:07.000 Dyeing your hair unnatural colors is associated with depression.
00:28:11.000 And one of the instances that they're studying is borderline personality disorder.
00:28:16.000 True story.
00:28:16.000 I just tweeted about it.
00:28:18.000 I had a whole show on borderline.
00:28:20.000 Wow, we're so surprised.
00:28:22.000 I said, color me shocked.
00:28:24.000 So I actually have a theory behind that too.
00:28:26.000 Sorry.
00:28:26.000 Angelo is probably like, guys, we have a show.
00:28:30.000 That goes into unnatural colors.
00:28:32.000 I think that when people change their hair color dramatically, even if it's like a more natural color, but dramatically, it also is a sign.
00:28:41.000 I think, yeah.
00:28:42.000 Jack, you said earlier today, I'm going to bring this image up.
00:28:46.000 Hold on, here we go.
00:28:47.000 Angelo's saying, not all.
00:28:48.000 He loves the combo.
00:28:49.000 Hey, throw this one up.
00:28:51.000 Jack, this is your ideology.
00:28:53.000 Got to throw it up, studio.
00:28:55.000 Well, that's one, but that's this one.
00:28:57.000 This one.
00:28:58.000 That's it.
00:28:59.000 This gave me chills today watching this.
00:29:01.000 This is a pink-haired jihadi in the snow in Minnesota.
00:29:06.000 Absolutely brand new.
00:29:08.000 It's so beautiful.
00:29:10.000 There's nothing wrong with that image.
00:29:13.000 What if we get psyops, though, where they just get us trapped in cocoons where they give us fake AI slop of based things happening?
00:29:20.000 And it lowers our base's energy to actually go do things.
00:29:24.000 No, imagine some sort of containment thing on Facebook or Instagram.
00:29:30.000 And people are, they're lobotomized.
00:29:33.000 It just gives them constant headlines like Trump elected president of Earth and like Trump awarded Nobel Peace Prize.
00:29:39.000 No, it depends what they do, though, because if they're feeding the AI slop of pink-haired jihadis getting like face planted in the snow, this is energy for me.
00:29:48.000 Jack will literally fire off 48 tweets.
00:29:52.000 This is gas in the tank.
00:29:53.000 Yeah, this is great.
00:29:53.000 This is fire.
00:29:54.000 No, this is incredible stuff.
00:29:55.000 This is not DV.
00:29:56.000 But however, changing gears just slightly.
00:30:00.000 So something we should mention, another breaking news, by the way, that I saw was that, you know, and we're writing it up over at The Post Millennial.
00:30:09.000 It's going to come out in a minute here, that the number one book on all of Amazon right now in all books, the entire website, is Reframe Your Brain, the User Interface for Happiness and Success by Scott Adams.
00:30:24.000 And for those who don't know, Scott Adams, the creator of Dilbert, the host of Coffee with Scott Adams, incredible author, multiple New York Times bestsellers, huge Trump supporter, day one member of the MAGA movement, did pass away this week.
00:30:38.000 And, you know, AI is something that he talked about a lot.
00:30:42.000 He talked a lot about AI.
00:30:45.000 And there were a few times where he was working with a number of people sort of in his community to create a sort of AI model of Scott Adams that could kind of live on online based on his work and based on these books that could live on beyond him.
00:31:05.000 No, I don't think we're quite at the level where it can be interactive, but he did make a couple of videos where they were taking, you know, chapters of his books, Reframe Your Brain, Loserthink, Win Bigly, How to Fail at Almost Everything and Still Win Big.
00:31:22.000 And they had this AI Scott Adams, and they would have him just reading to you from his book, but they made it look like he was on his podcast saying it to you.
00:31:32.000 And gosh, I should have grabbed one of these videos before the show today.
00:31:36.000 And if you watched this thing, you'd have no idea.
00:31:40.000 You'd have no idea.
00:31:41.000 You'd think it was exactly Scott.
00:31:43.000 You'd say, that's Scott.
00:31:45.000 And he would say, look, I didn't actually read this.
00:31:48.000 This is just, this is AI Scott reading from my book.
00:31:52.000 So it's something that he wrote himself, like his own words.
00:31:56.000 And then Joshua Lysak, who's my co-author, was the editor on that book, Reframe Your Brain, and some of the other ones.
00:32:02.000 And so I wouldn't be surprised if Scott Adams has a project like this that's in the works.
00:32:09.000 That's all I'm saying.
00:32:10.000 All right.
00:32:11.000 We might get a little gift from Scott from Beyond.
00:32:15.000 If he sanctioned it, it's way different than in Charlie's instance.
00:32:19.000 Yeah, totally sanctioned.
00:32:20.000 Right, right.
00:32:20.000 Charlie wouldn't have sanctioned it.
00:32:21.000 Charlie would never have greenlit something like that.
00:32:25.000 Never.
00:32:25.000 Never.
00:32:26.000 He was all about real.
00:32:28.000 All about real, and as we all know.
00:32:30.000 God's creation.
00:32:32.000 Yeah, you don't like you now.
00:32:34.000 Whenever people would say, like, oh, the AI Charlie, well, Charlie is not with us.
00:32:38.000 Charlie is somewhere else, and we cannot pretend otherwise.
00:32:41.000 I think it would be morally wrong.
00:32:42.000 Not just gross.
00:32:44.000 It is gross, but that is getting at the moral part of it.
00:32:48.000 All right.
00:32:48.000 Do we have our next topic?
00:32:50.000 He wasn't anti-AI, like some people are militantly anti-AI.
00:32:53.000 No, he loved using it.
00:32:55.000 But recreating it.
00:32:56.000 I think he loved using it.
00:32:56.000 He was really into it.
00:32:57.000 No, he was really into it.
00:33:00.000 Obviously.
00:33:00.000 No, no, no, exactly.
00:33:01.000 So, yeah, to be clear, no, that's a fair critique.
00:33:03.000 Charlie was very pro-AI.
00:33:05.000 Actually, he would use it on the show.
00:33:06.000 He would use it to research things.
00:33:08.000 He would use it on the fly, but he got really into getting good at prompts.
00:33:13.000 So he was always tweaking his prompts to get AI to do what he wanted it to do.
00:33:17.000 So he was good with it.
00:33:19.000 But yeah, just, I think, recreating human beings, that's sketchy.
00:33:24.000 And by the way, this just goes to show you, this whole OnlyFans models gaming the O-1 visas thing.
00:33:30.000 OnlyFans are done.
00:33:32.000 OnlyFans is done.
00:33:33.000 Yeah, like, you don't, like, OnlyFans, you know, I mean, this is a question.
00:33:40.000 Is their job secure?
00:33:43.000 Because, yeah, there's a lot of perverts, but like, you could have, I mean, some of these, you could, you could create women, OnlyFans, AI models out of this.
00:33:43.000 Right?
00:33:51.000 I mean, they already have.
00:33:52.000 They already have.
00:33:53.000 They already have.
00:33:54.000 We don't need to get into it.
00:33:57.000 How would you know, right?
00:33:58.000 So this, yeah, I think there's a good horror movie about this called, I think it's called Cam, where, you know, this, this girl gets like, she's one of these cam girls, but then she gets like some, I don't know, they don't really explain it.
00:34:12.000 There's some demon, I guess, takes over her social media.
00:34:16.000 Takes over it, and then she's inside the camera, basically controlling different things.
00:34:23.000 And the, you know, the real girl's dead or whatever.
00:34:25.000 Point being is, how would you even know?
00:34:28.000 Like, literally, how would you even know that the girl you're talking to is a real girl?
00:34:32.000 It's like catfishing, but I mean, it's the same thing.
00:34:35.000 If they can do it, it's 100%.
00:34:40.000 I think the most optimistic thing is it will have to revive in-person interactions because it's just the only thing you'll be able to trust.
00:34:46.000 That's the only way you, yeah.
00:34:47.000 Well, Blake, Blake, here's what I got to say, though, Blake.
00:34:51.000 Make sure you do the FaceTime because when you're on FaceTime, they can't run their filters.
00:34:55.000 They can't cast glamour.
00:34:56.000 Because, Blake, I would hate to see you get into a situation like you did last fall.
00:35:03.000 What?
00:35:04.000 The whole thing.
00:35:06.000 I mean, obviously, we don't need to get into it on air, but yeah.
00:35:08.000 That whole situation.
00:35:10.000 All right, Jack.
00:35:12.000 What about that joke?
00:35:13.000 We don't need to talk about Honduras.
00:35:16.000 With the AI catfish.
00:35:17.000 Blake is dismantled.
00:35:18.000 Alrighty, Jack.
00:35:19.000 Whatever you say, whatever you say, Jack.
00:35:22.000 The one with the AI catfish.
00:35:24.000 You don't remember?
00:35:25.000 Do we have an AI catfish?
00:35:27.000 We could probably turn Tyler into a giant catfish.
00:35:32.000 We paid for that by the end of this show.
00:35:33.000 We paid for it.
00:35:35.000 We paid for this.
00:35:36.000 Oh, and I have that by the end of the show.
00:35:38.000 Turn all three of these guys into Catfish by the end of the show, please.
00:35:38.000 All right, let's get to it.
00:35:42.000 Anyway, so we have to talk about Barbies.
00:35:44.000 So, Mattel.
00:35:45.000 Meanwhile, half the audience is like Barbies?
00:35:47.000 Yeah, we're talking about Barbie.
00:35:50.000 Millions of gay men play with Barbies, don't they, Jack?
00:35:55.000 Anyway, so they've made, there's a new Barbie doll, and it has come out, and it is the Autistic Barbie.
00:36:02.000 So, first off, let's set this up.
00:36:03.000 There's someone who's doing kind of a profile of it.
00:36:07.000 So, we have clip 466.
00:36:10.000 This is funny.
00:36:11.000 So, I was a little concerned when I heard that they were coming out with an Autistic Barbie because autism is a spectrum.
00:36:16.000 It affects everybody differently, and it's also an invisible disability.
00:36:20.000 So, she has an AAC device, which I think is one of the most important details about her.
00:36:25.000 I think AAC devices are really important to show.
00:36:28.000 That's representation that really, really matters.
00:36:31.000 Then she's wearing headphones.
00:36:33.000 She has a little fidget toy, and I really like her clothing.
00:36:36.000 It's very casual and cozy.
00:36:38.000 You know, a lot of Autistic people have sensory issues with clothes.
00:36:41.000 Her eyes are slightly looking sideways, like they're not looking straight.
00:36:47.000 And, you know, a lot of Autistic people have issues with direct eye contact, which I thought was a really cool little detail.
00:36:54.000 But the last thing I want to say about her is I'm really glad that they did not choose, like, a white, blonde hair, blue-eyed, standard Barbie.
00:37:03.000 I'm assuming that she is a person of color because whiteness is so overrepresented in autism spaces, and autism affects everybody.
00:37:12.000 So glad she wasn't a white girl.
00:37:13.000 Well, so as it happens, we sent staffer Emma, Emma Kate, on a saga across the Phoenix area.
00:37:21.000 And one, apparently, this is a hot item because we had to check three different stores to find this, but we have it.
00:37:31.000 Stop it.
00:37:31.000 I want to see it.
00:37:33.000 Take a look at it.
00:37:34.000 Wow.
00:37:35.000 Like, did you just, did you just buy an Autistic Barbie?
00:37:39.000 No, the show bought an Autistic Barbie.
00:37:41.000 So it's a good idea.
00:37:42.000 Do we have to write this off?
00:37:43.000 The seniormost person on the show who will have to approve this expense and therefore is responsible for it is probably Andrew.
00:37:50.000 Wait, how much was it?
00:37:51.000 Taking this out of somebody's paycheck.
00:37:53.000 Was Autistic Barbie more expensive?
00:37:55.000 I don't know if it was.
00:37:56.000 I don't think so.
00:37:59.000 It will tell you a lot if there was a premium on this.
00:38:01.000 I don't think we should put it in.
00:38:02.000 Oh my gosh, it comes with all these.
00:38:05.000 Wait, her eye line is.
00:38:06.000 It comes with all these vaccines, too, at the bottom.
00:38:08.000 Did you see this?
00:38:09.000 Her eye line.
00:38:10.000 It comes with her whole vaccine schedule.
00:38:14.000 Is there literally a COVID shot?
00:38:16.000 No, it's like the MMR.
00:38:18.000 What is it?
00:38:19.000 MNRA.
00:38:20.000 MNRA.
00:38:21.000 No, it's mRNA.
00:38:22.000 Now with more vaccinations than any other Barbie in American history.
00:38:27.000 Wow.
00:38:28.000 She's got bottled fluoride water.
00:38:31.000 Yeah, that's right.
00:38:32.000 She's got like all seed oils.
00:38:34.000 She's been drinking a bottle of time.
00:38:37.000 You can see anything.
00:38:38.000 She's just touching seeds everywhere.
00:38:41.000 She's been drinking straight from the tap.
00:38:42.000 She does have the fidget spinner on her hand here.
00:38:45.000 And so that's, I know that's a popular.
00:38:46.000 This is real.
00:38:47.000 Yeah, this is real.
00:38:49.000 She has IRL.
00:38:50.000 She has headphones on.
00:38:51.000 Emma found this?
00:38:52.000 Yeah.
00:38:52.000 It's cool.
00:38:53.000 Where did she find it?
00:38:54.000 I think a Walmart or something.
00:38:56.000 No kidding.
00:38:57.000 And then the AAC device.
00:38:59.000 So I guess she's presumably non-verbal because I think AAC is, if they have that, they can use it to communicate where they can point at letters or point at concepts.
00:39:08.000 Because a lot of them are actually literate or otherwise aware, but they're just non-verbal.
00:39:12.000 So you saw Autistic Barbie, the video on it.
00:39:15.000 Let's just contrast.
00:39:16.000 This is an important contrast in our culture.
00:39:20.000 Saw Autistic Barbie there.
00:39:22.000 Now we go back to 1971 Barbie, Malibu Barbie 468.
00:39:29.000 Malibu Barbie!
00:39:30.000 She's Mattel's super new suntanned Barbie.
00:39:33.000 Hey, Barbie's got a golden tan now.
00:39:36.000 My sunny surfaced hair.
00:39:38.000 Malibu Barbie has her own beach towel and sunglasses and Malibu friends.
00:39:42.000 All with that suntanned skin that makes them look great wherever they go in any of the groovy new fashions.
00:39:48.000 Groovy.
00:39:49.000 Malibu Barbie.
00:39:50.000 Yeah, we used to have a country.
00:39:53.000 I'm really like, I was literally just going to say that.
00:39:55.000 We used to be a proper country.
00:39:56.000 We used to be a country guy.
00:39:57.000 That girl's hair was like the Barbie hair in a way that I found disconcerting.
00:40:02.000 Like the girl playing with the Barbie.
00:40:04.000 What?
00:40:05.000 I don't know.
00:40:05.000 Someone in the chat.
00:40:06.000 The girl in the ad had the same hair as Barbie.
00:40:10.000 Maybe they cast her like that.
00:40:11.000 Maybe.
00:40:12.000 Edison in the chat says how long before Troon Barbie.
00:40:16.000 Do they have a Troon Barbie yet?
00:40:18.000 No.
00:40:19.000 We just got the Autist.
00:40:21.000 So they have on the side here some alternative Barbies that are.
00:40:23.000 Do we have OnlyFan Barbie?
00:40:25.000 Oh, wow.
00:40:25.000 I mean, to be maximally progressive, they probably need to.
00:40:28.000 But I think what's interesting here is they have a variety here.
00:40:31.000 They have three different wheelchair Barbies.
00:40:33.000 They have Wheelchair Ken, Wheelchair Normal Barbie, and Wheelchair Black Barbie.
00:40:38.000 2022.
00:40:40.000 There's been a Trans Barbie for three years, apparently.
00:40:42.000 There's literally three different wheelchair Barbies.
00:40:47.000 What?
00:40:48.000 But is there Autist Ken?
00:40:50.000 And I want to know what podcasts he listens to.
00:40:52.000 Is there like Paradox gamer Ken who comes with his computer that has his map painting video game on it?
00:40:57.000 Or he's conquered.
00:40:58.000 Autistic Ken.
00:41:00.000 If there's no Autistic Ken, this is sick.
00:41:02.000 There are three different Barbies in a wheelchair.
00:41:05.000 Wow.
00:41:05.000 Yeah.
00:41:06.000 I don't feel like that is proportional to the population.
00:41:12.000 And it's the old-timey, you know, push kind.
00:41:15.000 Do people mostly do that or do they mostly use like powered chairs these days?
00:41:21.000 I genuinely don't know.
00:41:22.000 No, I think if you can push yourself, you choose to transfer it.
00:41:25.000 It's good to stay in shape.
00:41:26.000 It's like your former best friend.
00:41:27.000 It's like Madison Cawthorn goes around on that thing.
00:41:30.000 I've seen both.
00:41:31.000 I've seen both.
00:41:32.000 It depends how disabled you are.
00:41:33.000 Yeah, it's a reference.
00:41:35.000 Someone says, where is...
00:41:36.000 That might be too dark of a joke to make.
00:41:39.000 By the way, I did pull up, guys.
00:41:41.000 It looks like Laverne Cox, who is a trans actor, actress, whatever, has had a Barbie since 2022.
00:41:50.000 So we got the first Troon Barbie in 2022.
00:41:54.000 The back of this box also has thick Barbie back here.
00:41:57.000 Thick Barbie?
00:41:58.000 Thick Barbie?
00:41:59.000 Is that what you just said?
00:42:00.000 There's thick Barbie on back here.
00:42:01.000 Did you notice that?
00:42:02.000 It's funny how they go.
00:42:04.000 So like thick Barbie.
00:42:05.000 That's the one that Andrew would like.
00:42:06.000 It's not named that.
00:42:07.000 She's just a little bit girthier.
00:42:10.000 What I think is funny, she's very sturdy.
00:42:13.000 She's hard to push over.
00:42:16.000 Actually, thick Barbie might be a good segue into one of our other topics.
00:42:21.000 Yeah.
00:42:23.000 So what I will say is interesting is the New York Post article.
00:42:28.000 I remember in the 90s, the controversy was all that Barbie wasn't a feminist figure, so they had to give Barbie all the different jobs.
00:42:36.000 So you got, like, physicist Barbie, scientist Barbie, and eventually astronaut Barbie.
00:42:44.000 We have not had a woman president.
00:42:44.000 I remember.
00:42:46.000 I can, because of things that might make me the target audience for toys like this.
00:42:54.000 I can remember specific ads from 1998.
00:42:58.000 I remember the Olympic figure skater Barbie inspired by Tara Lipinski.
00:43:01.000 I remember the song they played.
00:43:02.000 Go for it, Tara.
00:43:03.000 We're cheering for you.
00:43:05.000 Olympic skater.
00:43:06.000 Barbie, go for it.
00:43:08.000 I haven't seen that ad in 30 years, and I still remember it.
00:43:11.000 Guys, I'll say this.
00:43:12.000 I'm pretty sure this is an exploitation of the autistic community.
00:43:15.000 I realize in 1998.
00:43:17.000 Genuine Barbie.
00:43:18.000 I think this is genuinely a little bit of exploiting people.
00:43:21.000 But I would say this: this is better.
00:43:23.000 The chat just said this is better than having furry Barbies and OnlyFans Barbies.
00:43:29.000 I think it's fine.
00:43:30.000 I don't think it's going to be long before we get an OnlyFans Barbie.
00:43:33.000 No, we're going to get furry Barbies for sure.
00:43:34.000 Furry Barbies.
00:43:36.000 I think it's fine.
00:43:37.000 I don't think it's bad to say a kid can get a doll that resembles them.
00:43:42.000 I don't think it's exploitative.
00:43:42.000 I think it's fine.
00:43:44.000 No, well, it's approved by the autism self-advocacy people.
00:43:48.000 But like I'm saying, they're taking advantage of the idea, they're trying to make, I mean, this was trying to make money, who cares, like that.
00:43:56.000 We believe in making money.
00:43:58.000 No, I know, but they're doing it on the back of like people who are disabled.
00:44:03.000 I think the target audience is the disabled people.
00:44:06.000 Are they though on the spectrum?
00:44:06.000 Like buy the sounds.
00:44:08.000 Are they though?
00:44:08.000 Because you buy it.
00:44:09.000 I didn't buy it.
00:44:11.000 A staffer bought it.
00:44:12.000 No, technically, I think I bought it.
00:44:14.000 Andrew's.
00:44:14.000 Yeah, he bought it.
00:44:15.000 Oh, Andrew.
00:44:15.000 Andrew's the one who bought it.
00:44:17.000 So you manipulate it.
00:44:18.000 I am taking it.
00:44:18.000 Yeah, I had a great dunk of a response there if I just tried to be a little bit careful.
00:44:23.000 I didn't personally.
00:44:25.000 I want to take us to the New York Post.
00:44:28.000 Can we get?
00:44:29.000 Well, I just want to ask, if they make couples, could we get, like, since you're talking about Tara Lipinski?
00:44:34.000 I want to get Nancy Kerrigan and Tonya Harding.
00:44:44.000 I mean, the Olympics are happening right now.
00:44:46.000 And instead of talking about Olympic Barbies, Blake, earlier, you were talking about a certain type of person that's really into Barbie, even when they're older, and remembers Barbie commercials from even years ago.
00:44:58.000 I'm just, I'm just connecting dots here.
00:45:00.000 I'm just keeping it.
00:45:01.000 Yeah, I know.
00:45:02.000 And as we all, and then we establish that Andrew bought this doll, so he might be in that group.
00:45:07.000 Don't bring me into this.
00:45:08.000 I have an unblemished record of heterosexuality.
00:45:08.000 He might be in that group.
00:45:11.000 Here's the hold on.
00:45:13.000 Actually, so you guys can educate people who are in the business.
00:45:15.000 Who bought Barbie?
00:45:17.000 Emma.
00:45:18.000 I came in cold to this whole topic.
00:45:21.000 But on the orders of Blake.
00:45:23.000 Blake.
00:45:24.000 Well, okay.
00:45:25.000 Yeah, exactly.
00:45:26.000 So I guess you just.
00:45:26.000 Well, okay.
00:45:28.000 So responsibility for the purchase falls to.
00:45:31.000 Okay, hold on, hold on.
00:45:31.000 But no, this is actually important.
00:45:33.000 What do you call these Disney freaks that are 40-year-olds?
00:45:37.000 Disney adults.
00:45:38.000 Disney adults.
00:45:39.000 What is that about?
00:45:41.000 Well, the Disney adults are.
00:45:43.000 This feels similar, like a similar vein, to be 100% serious.
00:45:47.000 The men who collect Barbies, there's basically like gay men who really like Barbie.
00:45:51.000 Like from that dimension.
00:45:52.000 That should be fabulous.
00:45:53.000 And then just there's a whole collection of people.
00:45:55.000 My Little Ponies, too.
00:45:56.000 My Little Ponies.
00:45:57.000 That's a difference, yes.
00:45:58.000 The MLP guys... No, there's like a... I watched this whole, like, uh, showcase.
00:46:03.000 There are multiple documentaries on it.
00:46:05.000 On, like, weird men who are obsessed with My Little Pony.
00:46:10.000 Yeah, there was a shooter recently. Those are like school shooters, basically.
00:46:14.000 Yeah, that would, like, they found out that we.
00:46:16.000 Generate school shooters.
00:46:17.000 There's something like super connected with it.
00:46:18.000 It's van viewers.
00:46:20.000 It's very scary.
00:46:21.000 This is true.
00:46:22.000 It is true.
00:46:22.000 100%.
00:46:26.000 It's a brand new year and a brand new opportunity to change the world for the better.
00:46:30.000 This is one of our most important partners.
00:46:32.000 It's easier than you might think.
00:46:34.000 You can save babies by providing ultrasounds with pre-born.
00:46:38.000 Together during this Sanctity of Human Life month, we're going to save babies right here on the Charlie Kirk show to show the world that not only do we believe life is precious, but we're going to do something about it.
00:46:48.000 Your gift to pre-born will give a girl the truth about what's happening in her body so that she can make the right choice.
00:46:54.000 What better way to start this new year than to join us in saving babies?
00:46:58.000 And $28 a month will save a baby a month all year long.
00:47:05.000 A $15,000... and I know there's some of you out there that can do this.
00:47:05.000 A $15,000 gift will provide a complete ultrasound machine that will save thousands of babies for years and years to come and will also save moms from a lifetime of regret.
00:47:15.000 So start this year right by being a hero for life.
00:47:18.000 Call 833-850-2229.
00:47:22.000 That's 833-850-2229.
00:47:26.000 Or click on the pre-born banner at charliekirk.com today.
00:47:32.000 Wait, go to the New York Post.
00:47:34.000 This is, this is like... all I see is BBL implants from...
00:47:38.000 Okay, yeah, we had to get this.
00:47:40.000 There was a lot of hype for it.
00:47:41.000 So, all right, we'll go into this.
00:47:43.000 This is also about, I guess, body stuff.
00:47:45.000 And note that Andrew is the one who's really excited to read about it.
00:47:47.000 No, since Andrew wants to talk about this girl... so Jack, Jack was prefacing, or promoing, our thought crime on Bannon's War Room, and you said that Bannon just about spit out his coffee when you mentioned this topic.
00:48:01.000 That's right, he about lost it.
00:48:04.000 Wait, I don't have the actual article handy.
00:48:07.000 It's 471 here.
00:48:09.000 Well, it's the article.
00:48:09.000 All right.
00:48:11.000 Let's just pull it up, which is it.
00:48:13.000 Yeah, so let's throw it up.
00:48:15.000 But basically, what it is is, oh man, that text is extremely tiny.
00:48:20.000 I can't read that.
00:48:21.000 But basically, the people are getting the opposite of what this will do for you.
00:48:26.000 Yes.
00:48:27.000 So people are getting Brazilian butt lifts.
00:48:30.000 That is what a BBL is.
00:48:31.000 And breast implants from 11-year-olds.
00:48:35.000 Well, okay, but you're the one who wanted to talk.
00:48:35.000 From.
00:48:38.000 I didn't know what it was.
00:48:39.000 And they're from donated cadavers.
00:48:42.000 So they're taking corpses and people who need to get some.
00:48:46.000 It's off-the-shelf fat.
00:48:48.000 Yeah, off-the-shelf fat.
00:48:49.000 This is the Kardashian look, right?
00:48:50.000 So this is that Kardashian look that's like kind of the rage or has been the rage for a while.
00:48:57.000 You know, prior to Sydney Sweeney, like the Sydney Sweeney body taking back, you know, a lot of the spectrum, a lot of the airspace on this.
00:49:06.000 And so, yeah, so across the country, a growing number of patients are turning to injectable fillers.
00:49:11.000 So fillers are all over the place.
00:49:13.000 This also came up on Stranger Things, by the way, not a BBL, but the lip fillers. Made from the dearly departed's donated fat in order to lift, plump, and sculpt their bodies.
00:49:25.000 I feel like I need to read this in a different kind of voice.
00:49:27.000 Including for hot ticket procedures like Brazilian butt lifts and breast enhancements.
00:49:31.000 Many of us in New York City are very excited about this, particularly because our patients are sometimes very thin or maybe have already had liposuction.
00:49:38.000 Said Dr. Melissa Doft, a board-certified plastic surgeon in Manhattan in an Instagram video.
00:49:43.000 The injectable filler is made from donated tissues from human cadavers that's been specially processed for cosmetic use.
00:49:49.000 Can you sell?
00:49:50.000 So, like, can you sell your question?
00:49:53.000 You're like a family member that is like, hey, gotta make it.
00:49:58.000 Who gets paid for the butt fat?
00:50:00.000 Like, does my would my wife get paid for my butt fat?
00:50:04.000 Not that there's a lot, because I mean, you know, I've been working out a little bit lately, so there's not a lot, but you know, if we're selling, if we're talking about selling butt fat, you know, and then and then Tyler lost all his, so there's nothing there.
00:50:14.000 Yeah, Tyler's not a... but who gets paid for the butt fat?
00:50:18.000 The filler called Aloe Clay hit the U.S. market last year.
00:50:18.000 That's what I want to know.
00:50:22.000 Like, who at the FDA?
00:50:25.000 What is it?
00:50:26.000 Like, who, who approved this?
00:50:27.000 Aloe Clay Flat.
00:50:28.000 I feel like RFK doesn't know about this.
00:50:30.000 I guarantee.
00:50:31.000 No, you know what would be funny is if you asked him, like, why did you guys, you know, green light cadaver butt filler?
00:50:37.000 And he'd be like, did you green light the butt fat, Bobby?
00:50:41.000 Did you green light it?
00:50:42.000 I have alex.
00:50:46.000 I got to ask the Maha expert about the cadaver butt fat.
00:50:49.000 I'd say that less than probably 5% of board-certified plastic surgeons have it.
00:50:54.000 Said Dr. Sachin M. Shridharani, who began offering the procedure at his Manhattan clinic, LUXURGERY.
00:51:04.000 Gosh, these guys are such like luxury.
00:51:08.000 I mean, come on.
00:51:09.000 This guy is a grifter.
00:51:11.000 He's just like.
00:51:13.000 This stuff is so gross.
00:51:14.000 It looks like injecting someone with like a candle.
00:51:18.000 This is very, very icky looking.
00:51:20.000 You know what's crazy, by the way?
00:51:22.000 So, the thing they're using this for, the BBL, it's actually like one of the most high-risk cosmetic surgeries.
00:51:30.000 I think it's the most high-risk cosmetic surgery.
00:51:32.000 Why is it?
00:51:33.000 Apparently, you can get something called a fat embolism and die.
00:51:38.000 And there's like a death rate of like one in 3,000, which is pretty high for a cosmetic thing.
00:51:44.000 The technical term for it is gluteal fat grafting, which is a great name for any procedure.
00:51:51.000 Yeah, gluteal fat, butt fat, butt grafting onto the body.
00:51:54.000 And it is a very fast-growing aesthetic procedure in the United States.
00:51:58.000 I don't know.
00:51:59.000 Well, Kelly, there are several fatalities.
00:52:01.000 You know what this is?
00:52:03.000 Can I?
00:52:04.000 Yeah, but you know what's funny?
00:52:05.000 It's all these style.
00:52:06.000 There we go.
00:52:07.000 Probably skinny white chicks that do this.
00:52:07.000 Skinny chicks.
00:52:09.000 Can I inject something here?
00:52:11.000 So I guess.
00:52:12.000 Can I inject something?
00:52:13.000 Is it more butt fat?
00:52:14.000 Interject.
00:52:15.000 Yeah, inject something.
00:52:17.000 Inject something here.
00:52:18.000 Interject.
00:52:20.000 So after World War II, there was, like, a huge hubbub in America because there were all these rumors that human body parts were being used in common cosmetic products, just in general.
00:52:32.000 Like this was like a big, big deal where people got freaked out.
00:52:36.000 And everybody believed it.
00:52:37.000 Like everyone believed that the Nazis and other bad people were using human parts that went into cosmetics and that was debunked.
00:52:51.000 And there are even people today that still believe a lot of that.
00:52:54.000 But I think this is super weird that cadaver fat, like basically what everyone freaked out about in the 40s and 50s and maybe probably beyond that, is basically what's happening now with these injections.
00:53:08.000 That they're using cadaver fat.
00:53:10.000 They're using cadavers to inject into people.
00:53:13.000 That's pretty sick stuff.
00:53:14.000 You know what's ironic about our conversation?
00:53:16.000 In China with the forced organ harvesting of prisoners.
00:53:19.000 Falling off the bottom of the bottom of Brazil, too, by the way.
00:53:21.000 But you know what's ironic about our conversation thus far, the way it's traveled, is it's gone from complete elimination of need of humans in Hollywood, complete AI, to this weird insertion of humans in a way that shouldn't be inserted.
00:53:37.000 It's kind of like the one place you wouldn't want IRL humans is in your butt fat from a cadaver, and yet the one place you thought you would want humans is in a Hollywood movie, and yet we're getting rid of them.
00:53:37.000 Does that make sense?
00:53:50.000 I'm just saying, we're living in strange times.
00:53:52.000 Very strange times.
00:53:54.000 Blake doesn't seem convinced.
00:53:55.000 I still want to know who gets paid for the butt fat.
00:53:58.000 This is a butt fat dilemma.
00:54:02.000 I've been told they have created an important AI video that we should display.
00:54:07.000 So let's show it right now.
00:54:11.000 That is us as all a bunch of catfish, as requested.
00:54:14.000 I don't think those are really... they don't really show whiskers, so they don't seem to be catfish.
00:54:20.000 This looks like a Star Wars video.
00:54:21.000 Oh, that's amazing.
00:54:22.000 Yeah, yeah.
00:54:22.000 It's like, you know, Glup Shitto or whatever they name those Star Wars characters.
00:54:28.000 I definitely like mine the most, I have to say.
00:54:32.000 Based on, I guess.
00:54:34.000 Based on nothing.
00:54:36.000 Look at that vacant stare of the Jack catfish.
00:54:41.000 It's like there's nothing there.
00:54:42.000 Sounds about right.
00:54:44.000 Okay.
00:54:45.000 I think I'm done with butt fat.
00:54:47.000 Well, let's forge ahead.
00:54:47.000 All right.
00:54:49.000 We still have more fun stuff to get to.
00:54:50.000 So we have to talk about the HR game out of the UK.
00:54:55.000 So this is a very fun one.
00:54:56.000 I'm going to have to guide you guys through this a little bit.
00:54:59.000 But basically, the British government paid somebody, probably a very inflated amount of money, to make an interactive HR-style game about how you, as a young person, should not be entrapped by radical politics, because it could be illegal and you might go to jail.
00:55:16.000 This feels like that one movie that became a big deal.
00:55:19.000 What was that movie where it was like a white kid gets radicalized and stabs somebody or something?
00:55:25.000 Oh, Adolescence.
00:55:26.000 Adolescence.
00:55:27.000 And like everyone had to watch it, and they were like interrogating the politicians.
00:55:30.000 I was going to say Jumanji, but Jumanji falling out of the world.
00:55:35.000 That was Robin Williams.
00:55:37.000 Yeah, but this is what's crazy.
00:55:39.000 Everybody knows it's the like immigrant communities that are like raping the women that are like stabbing people on the subways or the tube or whatever.
00:55:46.000 And then they make this movie, Adolescence, and they try and tell that story, but then they race swap for a young white British kid.
00:55:54.000 Absolutely.
00:55:55.000 This stuff is infuriating.
00:55:56.000 It's intentional.
00:55:57.000 All right.
00:55:58.000 Psyop is real.
00:55:59.000 So this is a game.
00:56:00.000 This game is called Pathways.
00:56:02.000 I think we have to lead with the clip.
00:56:04.000 Yeah, well, so what happens is you play through the game.
00:56:06.000 And so we're going to do that.
00:56:07.000 So we need some setup here.
00:56:09.000 It's called Pathways.
00:56:10.000 It was funded by the British government.
00:56:13.000 I believe it was made for the north of England or something like that.
00:56:17.000 I think East Yorkshire or something made this.
00:56:21.000 But let's just dive into it.
00:56:23.000 This is the intro thing.
00:56:24.000 So when you play it, you choose to play as a boy or a girl, regardless of who you pick.
00:56:28.000 I'm not making this up.
00:56:29.000 The character is named Charlie, and he is a young adult.
00:56:33.000 So let's play clip 474.
00:56:38.000 Charlie was enjoying an online game with friends.
00:56:41.000 I like how this is starting.
00:56:43.000 Charlie had not long started attending a new college in East Riding.
00:56:48.000 And they were so relieved to have made new friends, having recently left.
00:56:51.000 Charlie's real happy right now.
00:56:53.000 Swimming in the middle of the day.
00:56:54.000 Charlie has started browsing new games and websites that some of the new friends use.
00:57:00.000 No adult sites, Charlie, don't worry about it.
00:57:02.000 Sometimes, though, the people on these websites say things that seem off, even slightly concerning.
00:57:08.000 Slightly concerning.
00:57:10.000 Someone on this website has encouraged Charlie to download a video, but Charlie is unsure.
00:57:17.000 It's thought crime.
00:57:20.000 A clip from this show.
00:57:21.000 How should Charlie react?
00:57:23.000 If you can't read it, the top result is tell a trusted adult.
00:57:25.000 This is a college student.
00:57:26.000 Download it.
00:57:26.000 Download it.
00:57:27.000 Tell it to do.
00:57:30.000 But they chose the radical option, which was to download and watch the video.
00:57:34.000 Let's go.
00:57:35.000 Charlie downloaded the video and shared it with different people online.
00:57:39.000 Different people online.
00:57:42.000 The access.
00:57:43.000 Charlie felt really good.
00:57:43.000 What is it?
00:57:44.000 It's Charlie Kirk talking about pilots.
00:57:44.000 Oh my gosh.
00:57:46.000 And also sharing it.
00:57:50.000 Deep down, Charlie wasn't sure if this was the right thing to do, as some of the ideas in the video were extreme and violent.
00:58:01.000 It's important to remember that downloading or streaming certain content can lead to a terrorist offense.
00:58:06.000 This is a video of a guy walking down the street in Minneapolis.
00:58:08.000 No, hold on.
00:58:09.000 You missed that.
00:58:10.000 It was like, it could result in a terrorist offense.
00:58:13.000 Yeah, they're like, if you download and watch certain videos, you can go to prison in the UK.
00:58:16.000 That is 100% real.
00:58:17.000 And so this goes through six different phases.
00:58:21.000 And what's making this amazing is what happens in the next part.
00:58:26.000 For the context, are they making kids play this in school?
00:58:28.000 Is this like a training thing?
00:58:30.000 I think the intent was that you could use it with high-school-age kids.
00:58:33.000 So before you go out into the world and start attending college, you have to be careful because you might watch illegal videos of Charlie Kirk that will cause you to go to prison, basically.
00:58:43.000 Because this reminds me of the compliance videos you have to watch once a year on different things.
00:58:53.000 And it was very similar.
00:58:54.000 Like you had to play a game and pick the right answer or go all the way through.
00:58:58.000 Anyone who's been in the military: Cyber Awareness Challenge, all that crap.
00:59:02.000 You'll know exactly what I'm talking about.
00:59:04.000 Where you'd have to take it every year, then every quarter.
00:59:07.000 And then it was like, oh, but does your training officer have the certificate?
00:59:11.000 Because you didn't do it yet or whatever.
00:59:13.000 And they would force you to do this.
00:59:15.000 And it just reminds me of that.
00:59:16.000 But of course, for all children.
00:59:19.000 Yeah, this is like HR commissars coming for your kids.
00:59:23.000 But like.
00:59:24.000 Yeah, literally.
00:59:26.000 I'm getting PTSD about that.
00:59:27.000 I'm trying to figure out what it is about the UK psyche that makes them so prone and like vulnerable to the worst excesses of this mind virus.
00:59:39.000 Freedom-loving people got on the Mayflower.
00:59:42.000 Well, that is part of it, truly.
00:59:44.000 Like part of it.
00:59:44.000 Well, I think they lost all of the good guys in World War II.
00:59:47.000 I think that's a huge part of it.
00:59:48.000 So many good people left.
00:59:49.000 I mean, even after World War II, like so many British people that were like freedom-loving people.
00:59:53.000 They came here, and everybody without an ounce of testosterone stayed, or something.
00:59:56.000 I mean, the Americanization of Western Europe definitely created a vacuum.
01:00:02.000 But I think more importantly, it's the whole commie concept, right?
01:00:06.000 It's like they've just built so tall in some of these places.
01:00:09.000 Like, even this is the problem in Europe and so many places.
01:00:12.000 Places that were once considered extremely, and this is happening in America too, extremely conservative, are building straight upwards.
01:00:19.000 Oh, you're talking about actual physical buildings.
01:00:21.000 I thought you were like symbolic.
01:00:24.000 Like a symbolic communist.
01:00:26.000 They've built so much communism on top of... you know, there's something that's tied to when people live on top of each other.
01:00:33.000 Oh, I totally agree with this.
01:00:34.000 Are closer together?
01:00:35.000 You can actually find where there is a... I remember doing this because Charlie came under all this scrutiny, because we were talking, Charlie said something, and then he got roasted by, like, Media Matters or something like that.
01:00:46.000 And then it got to the Daily Beast or whatever.
01:00:47.000 And he was talking about how urban density creates libs.
01:00:51.000 People that live far apart, not on top of each other and not in rentals, are conservatives.
01:00:56.000 And there actually is a density number, like people per square mile, at which you can watch it.
01:01:03.000 Because I did a whole deep dive.
01:01:04.000 I wish I remembered this.
01:01:05.000 But a density where people flip from Republican to lib, right?
01:01:09.000 There's like actually a statistical number at which you can figure out when people, how many people you can put in a square mile before they turn lib.
01:01:18.000 And that should be the guiding principle to go less than less dense than that.
01:01:23.000 There are right-wing societies that are denser than the United States.
01:01:27.000 And there are highly decent, like highly rural.
01:01:30.000 That's actually a fair thing.
01:01:31.000 I don't know.
01:01:33.000 It's multivariate.
01:01:34.000 You're right.
01:01:35.000 You're right.
01:01:36.000 What are the other variables?
01:01:37.000 In America, though, it's like a thing.
01:01:39.000 It's completely a thing.
01:01:41.000 Well, I mean, super blue.
01:01:43.000 Why are cities so freaking lib?
01:01:46.000 Why?
01:01:47.000 Well, I think cities kind of attract lib type people.
01:01:50.000 Also, we've turned cities.
01:01:51.000 You think it's self-selecting then?
01:01:53.000 Well, yeah, I think there's a lot of self-selection.
01:01:55.000 I think who actually lives in cities, you have like urban underclasses that we subsidize to live there.
01:02:01.000 And then you often have...
01:02:03.000 Curious.
01:02:04.000 So like in the 70s when cities were much whiter, were they voting more conservative?
01:02:09.000 I mean, there has been eras where New York City would sometimes vote Republican in elections.
01:02:13.000 I think the last time they did it was the 20s.
01:02:15.000 Nancy Pelosi's father was the Republican mayor of Baltimore.
01:02:21.000 Fascinating.
01:02:22.000 Yeah, I think that's a good idea.
01:02:24.000 He may have actually been a Democrat mayor, but he was a white mayor of Baltimore.
01:02:27.000 Everything about how cities vote today is downstream of the fact that, like, in the 60s we blew up...
01:02:34.000 We did a giant demographic change.
01:02:36.000 We basically did like ethnic cleansing of cities where there would be a riot and everyone would have to leave and all of that.
01:02:41.000 Yeah, so if you know if you don't talk about the white flight and the soft ethnic cleansing of the 1960s through the 1990s in the urban areas, I don't think you can explain this properly because it's not just density.
01:02:54.000 It's about who's actually there.
01:02:55.000 So it's not just demographic replacement.
01:02:58.000 I think there's a qualitative function to this as well.
01:03:01.000 Yeah, reliant.
01:03:02.000 But to that part, for example, Miami is one of the most dense of major American cities.
01:03:07.000 It's all high-rises right along the ocean.
01:03:09.000 And Miami was one of the most Republican cities in the last election.
01:03:12.000 Yeah, but not where the high-rises are.
01:03:13.000 But no, look at our downtown Miami precincts.
01:03:17.000 New York, the only precincts that vote for Trump, that's true, are on Staten Island or Orthodox Brooklyn. But that's the thing, it's the people that live there.
01:03:27.000 I mean it's fair enough.
01:03:28.000 I mean because even in Miami, they're all like Venezuelan, you know, diaspora or Cuban refugees.
01:03:34.000 We can't overlook.
01:03:35.000 The greater concept here, though, and I want to just say this with Jack too, is that commies love people on top of each other, because something happens with the mind.
01:03:46.000 You enable the hive mind, culture and concept.
01:03:50.000 It actually is far more maneuverable when you have people all living right on top of each other, with each other.
01:03:58.000 I'm telling you, it's just.
01:03:59.000 There's a reason why communists always have that happen.
01:04:03.000 They build up.
01:04:04.000 This is like a tragedy of the commons story too, right? Where, like, you get...
01:04:08.000 You get this in, like, Russia and stuff, plus lack of ownership. China, lack of ownership.
01:04:12.000 What about it?
01:04:14.000 Well, in China, Mao was not able to get support in the cities, so he famously went on the Long March to the rural areas, and it was in the rural areas where he recruited, basically, the poor, you know, for the, uh, Red Army. And then it was really a conflict of the rurals versus the urbans, and Chiang Kai-shek had more support in the urban areas.
01:04:35.000 I mean, I get what you're saying.
01:04:37.000 That's what I mean, that's what Lenin did too. I mean, that was like the Russian Revolution to a certain extent too.
01:04:44.000 But I mean, yeah, look, the peasants... but that was more the haves versus have-nots.
01:04:50.000 That entire concept.
01:04:52.000 My point is, after they've constructed communism, they want to control people. And to that point, you know, and everything can be right simultaneously, but we've injected more poor people into the cities.
01:05:05.000 Yeah, right, and in that injected...
01:05:08.000 Well, because, no, but hold on. From the 60s, you're right.
01:05:11.000 So, no, part of this is ownership.
01:05:13.000 This is lack of home ownership, lack of land ownership.
01:05:16.000 Well, this is Blake...
01:05:17.000 Blake could give us a history lesson on what drove it.
01:05:20.000 Because, like, you had the Southern California scenario.
01:05:24.000 Basically, rumors were going around in the southern United States that in California there was no racism.
01:05:29.000 So all of these, uh, black communities from the South, and maybe urban poor centers even in the North, came, and they went to...
01:05:37.000 They went to South LA, and so South LA used to be kind of just this, like, suburban area.
01:05:43.000 Then all the blacks moved in, and this led to, like, the Watts riots.
01:05:47.000 It led to the dynamic that you ended up seeing in the 60s and 70s.
01:05:50.000 Uh, because you had a militarized police force of a bunch of, like, World War II vets; that's how they dealt with stuff.
01:05:55.000 But then what you also had was the 90s.
01:05:59.000 They got regentrified.
01:06:01.000 So in the 90s you had Mayor Richard Riordan in LA, then you had Mayor Giuliani in New York, so then they had all these police flooding in.
01:06:07.000 And then you had regentrification in the late 90s, early 2000s.
01:06:11.000 So then the cities got safer, crime dropped.
01:06:15.000 I just don't know what happened demographically in those cities.
01:06:18.000 I mean, overall the country was becoming less white, was more uh, more mixed.
01:06:23.000 But i'm just curious, like I haven't actually studied that.
01:06:25.000 I'm curious, I mean, it's complicated because cities are different.
01:06:29.000 When they got blown up, it happened at different times.
01:06:32.000 Some of them weren't actually blown up.
01:06:33.000 They were fine or they were growing.
01:06:35.000 That's where you get a lot of, you know, Tampa.
01:06:37.000 Yeah, and but it's also you see things like Phoenix.
01:06:42.000 Phoenix was a city that was booming.
01:06:43.000 Phoenix didn't get blown up in this period.
01:06:45.000 That's when Phoenix explodes.
01:06:46.000 Phoenix, people move to Phoenix from cities that are going down.
01:06:49.000 Austin, Austin is a city that exploded.
01:06:50.000 Austin's a modern one.
01:06:52.000 And I would think Austin's kind of a weird one because it's gotten so liberal, but it was always kind of known.
01:06:57.000 It was always liberal.
01:06:58.000 It's always liberal.
01:06:58.000 It's just gotten bigger.
01:06:59.000 That's a self-selection issue.
01:07:01.000 Because it's a lot of self-selection.
01:07:02.000 It was like Keep Austin weird, so all the weirdos moved there and kept it.
01:07:05.000 But I feel like a lot of this is self-selected.
01:07:06.000 It's self-selection.
01:07:07.000 It's cultural intensification.
01:07:09.000 So cities are bluer and rural areas are redder.
01:07:12.000 That's just also happened in the world.
01:07:13.000 But you've seen this in Dallas, where Dallas was kind of this conservative urban place.
01:07:17.000 It wasn't.
01:07:18.000 Dallas has always been gay.
01:07:19.000 It has to be known.
01:07:20.000 Fort Worth is still conservative.
01:07:22.000 Fort Worth is less conservative than it was.
01:07:25.000 It's less.
01:07:26.000 Houston is now liberal, but that's a lot of immigrants have moved in.
01:07:30.000 Houston, demographically, is completely.
01:07:32.000 Houston got a lot of people fled Katrina to Houston and never left.
01:07:36.000 And yeah, like large, like tens of thousands of people.
01:07:40.000 That was like a bigger.
01:07:41.000 So you got a bunch of these poor communities that like left.
01:07:46.000 I didn't know that.
01:07:46.000 Okay, interesting.
01:07:49.000 This is Lane Schoenberger, Chief Investment Officer and Founding Partner of YReFi.
01:07:54.000 It has been an honor and a privilege to partner with Turning Point and for Charlie to endorse us.
01:07:59.000 His endorsement means the world to us, and we look forward to continuing our partnership with Turning Point for years to come.
01:08:05.000 Now, here Charlie, in his own words, tell you about YReFi.
01:08:09.000 I'm going to tell you guys about whyRefi.com.
01:08:11.000 That is YREFY.com.
01:08:13.000 WhyReFi is incredible.
01:08:14.000 Private student loan debt in America totals about $300 billion.
01:08:18.000 WhyReFi is refinancing distressed or defaulted private student loans.
01:08:22.000 You can finally take control of your student loan situation with a plan that works for your monthly budget.
01:08:26.000 Go to whyrefi.com.
01:08:27.000 That is whyrefi.com.
01:08:29.000 Do you have a co-borrower?
01:08:30.000 WhyReFi can get them released from the loan.
01:08:32.000 You can skip a payment up to 12 times without penalty.
01:08:35.000 It may not be available in all 50 states.
01:08:37.000 Go to whyrefi.com.
01:08:39.000 That is yrefy.com.
01:08:41.000 Let's face it, if you have distressed or defaulted student loans, it can be overwhelming.
01:08:45.000 Because of private student loan debt, so many people feel stuck.
01:08:48.000 Go to yrefy.com.
01:08:50.000 That is yrefy.com.
01:08:53.000 Private student loan debt relief: yrefy.com.
01:08:58.000 I want to continue in this game because it actually gets amazing with the next bit because it goes for several segments.
01:09:02.000 And this next one is great.
01:09:03.000 So this is, we didn't clip the whole part.
01:09:05.000 So the segment is this: this guy, your Charlie.
01:09:08.000 He's going to class at the community college and he's studying for something, and he's about to get an important grade, but it's not a good one.
01:09:15.000 And it leads to something interesting.
01:09:16.000 Let's play 475.
01:09:19.000 Charlie is receiving an important grade on a piece of work they submitted for their hospitality course at college.
01:09:27.000 Charlie put in a lot of effort for this work and is excited to receive good feedback.
01:09:34.000 Charlie takes a seat in class and waits to get their grade.
01:09:40.000 To their disappointment, Charlie doesn't do as well as they expect.
01:09:44.000 They got 60 out of 100 for their work, but they wanted at least 75 out of 100.
01:09:50.000 To make matters worse, somebody else got 80 out of 100.
01:09:55.000 And the teacher said that this person has received a job offer.
01:09:58.000 For those who can't see it, this person is shown as like a Charlie has applied to dozens of jobs, but hasn't had any luck yet.
01:10:05.000 I love how they refer to Charlie as a class tells Charlie that this is proof that immigrants are coming to the UK and taking our jobs.
01:10:15.000 And then Charlie has the choice.
01:10:16.000 Does he agree with what this person said?
01:10:18.000 And it's this woman, Amelia.
01:10:19.000 Charlie approached the classmate angrily.
01:10:22.000 He agreed with the ideas and began shouting about them in class.
01:10:28.000 The teacher let Charlie know that the school has a zero tolerance on hate speech.
01:10:35.000 The teacher was concerned by Charlie's outburst and tried to get to the bottom of it.
01:10:42.000 Charlie became more agitated and ended up having to sit alone for the duration of the week's lessons because of the hurtful things they said.
01:10:51.000 Charlie has to go to community college detention.
01:10:53.000 Did you not notice that they kept referring to Charlie as "they" now?
01:10:57.000 I will know.
01:10:59.000 I'm so confused about that.
01:11:01.000 Well, so they do in the game, you can choose to be a boy or a girl, and in both of them, your name is Charlie, and I think they just recorded it once, so I don't think it's a super duper pronoun police thing.
01:11:11.000 I think it's mostly laziness.
01:11:13.000 I don't know.
01:11:14.000 I don't know.
01:11:14.000 Whatever.
01:11:15.000 But now we have only a couple more, but I want to do this one.
01:11:19.000 This is 476.
01:11:21.000 Let's continue.
01:11:23.000 This is the next appearance of Amelia.
01:11:25.000 Amelia, Charlie's close friend, has made a video encouraging young people in Bridlington to join a political group that seeks to defend English rights.
01:11:36.000 Amelia encourages Charlie to join a secret group on an app Charlie hasn't heard of before.
01:11:42.000 Charlie isn't sure whether to join, explore further, or ignore.
01:11:47.000 And of course, we have to choose to join this group defending English rights.
01:11:52.000 Based. The first video the friend posted was so funny.
01:11:57.000 They couldn't believe how many likes Amelia's memes were getting.
01:12:00.000 It was inspiring.
01:12:01.000 Amelia's memes.
01:12:04.000 Charlie joined the secret group on this new platform.
01:12:07.000 Their phone wouldn't stop buzzing with messages of support and invitations to participate from several people.
01:12:12.000 Amelia's a fed, Charlie.
01:12:13.000 She's a fed.
01:12:14.000 It's not true.
01:12:15.000 Amelia.
01:12:16.000 Charlie's mom was not so pleased and grew suspicious of all this new activity.
01:12:20.000 I will fight.
01:12:21.000 I will fight for Amelia.
01:12:22.000 Like, she's a fed.
01:12:23.000 She's not a fed.
01:12:23.000 You gotta stop following me.
01:12:25.000 Amelia is an English.
01:12:26.000 Honduras all over again.
01:12:27.000 So for those who can't see it, Amelia is shown she has purple hair and like a choker on.
01:12:31.000 She looks like a goth chick, basically.
01:12:34.000 She's a right-wing anti-immigration English patriot.
01:12:37.000 She's literally the AFD or like reform.
01:12:43.000 100%.
01:12:43.000 That's all it is.
01:12:44.000 They're literally like, oh, who likes Nigel Farage?
01:12:47.000 We're going to stereotype them and put them in a like, see what videos they're showing.
01:12:51.000 And so I don't want to show them all, because the next one's like... So in the next one, she recruits you.
01:12:56.000 Now, in Jack's argument, in the next clip, if you did it, she recruits you to go to a protest that she is not allowed to attend herself.
01:13:03.000 And then Charlie attends the protest and he gets arrested because he gets in a fight with some people.
01:13:09.000 And she's essentially.
01:13:11.000 No, no, it's not true.
01:13:12.000 He's like a sock.
01:13:14.000 So this is really funny.
01:13:15.000 Stop white knighting for Amelia.
01:13:16.000 No, I will white knight for Amelia forever because then she's totally fine.
01:13:22.000 And this, if you choose all the radicalized options, this is one of the endings you could get in the game.
01:13:28.000 It seems they took it out, but it was still accessible if people downloaded the game.
01:13:32.000 Let's do 481.
01:13:33.000 Charlie was furious that the teacher felt they needed support with their political views.
01:13:38.000 Charlie was so insulted that they stormed out and went to see their friend Amelia.
01:13:43.000 Together, the pair increased the amount of content they shared, attracting the attention of not just the teacher, but their parents and police too.
01:13:52.000 And police.
01:13:54.000 By not accepting help in time, Charlie had given themselves an opportunity to break the law with the things they were saying and the actions they chose to do.
01:14:04.000 Then Charlie gets arrested.
01:14:05.000 The cops came in and they stopped.
01:14:07.000 They shut down him and Amelia.
01:14:09.000 Yes, just like online.
01:14:11.000 Just like Winston gets set up by Julia in 1984.
01:14:15.000 All right, it's literally the same plot.
01:14:17.000 He's getting set up.
01:14:18.000 He should not have been talking so openly online.
01:14:21.000 This is just ridiculous how they were talking online.
01:14:24.000 Yeah, and that's why we have to liberate the UK.
01:14:27.000 And so this was made by the British government and was available till yesterday online.
01:14:31.000 You could go play this.
01:14:32.000 They have taken it down, but just like, you know, the lessons made by the British government.
01:14:38.000 They funded this.
01:14:39.000 Kier Starmer.
01:14:40.000 They said East Riding, East Riding of Yorkshire.
01:14:42.000 So, like, East Yorkshire is a region of the UK, and they were using this.
01:14:45.000 But they cannot kill an idea.
01:14:47.000 So people have already generated heroic amounts of AI slop of our new waifu, Amelia.
01:14:52.000 Let's roll the 479 B-roll.
01:14:55.000 So people have been making AI clips of Amelia protesting.
01:14:59.000 That's her with the Union Jack.
01:15:01.000 The Union Jack.
01:15:02.000 These are my gods.
01:15:04.000 Wait, so Amelia is like the based right-wing meme now?
01:15:07.000 Yeah, that's like Joan of Arc Amelia there, except not fighting the British.
01:15:10.000 She's got the English flag on her shield.
01:15:12.000 That's all I'm saying.
01:15:14.000 We've got more.
01:15:15.000 We've got smoking Amelia.
01:15:17.000 If you are a woman watching this right now and you resemble this female in some way, shape, or form, email freedom at charliekirk.com and Blake is going to date you.
01:15:28.000 Men want just one thing and it's disgusting.
01:15:31.000 It's Blake. You're the connector for all the stories now.
01:15:35.000 This connects all the stories to AI.
01:15:38.000 There you go.
01:15:40.000 I will accept you if you're into.
01:15:42.000 Watch this last one here.
01:15:43.000 This is a great one here.
01:15:44.000 It turns into invaders.
01:15:47.000 Look, we just, we have to defend.
01:15:49.000 We have to defend the West here.
01:15:51.000 Yeah.
01:15:52.000 Woo!
01:15:53.000 Oh, man.
01:15:54.000 If you're listening on podcasts, you got to check this clip out.
01:15:56.000 Jewel.
01:15:57.000 Yeah, Amelia.
01:15:58.000 Amelia, Amelia. Amelia is based. So the British government tried to make a game about how you shouldn't be offensive on the internet. That's amazing.
01:16:05.000 Instead, they have made an unkillable idea.
01:16:08.000 You know what's crazy, though?
01:16:09.000 That's a really good point.
01:16:09.000 I hope this meme has life because this is exactly who you want to see come up through the ranks in British culture and be amazing.
01:16:20.000 There might be an Amelia party in the UK.
01:16:22.000 Although that outfit, that's not like a German outfit, right?
01:16:24.000 It wasn't like a German.
01:16:25.000 I have no idea.
01:16:27.000 I'm not endorsing that.
01:16:28.000 I'm not controlling what the people do with their memes.
01:16:31.000 So all I'm saying is, you know, reform is probably going to take back England.
01:16:36.000 Who knows how successful they would be if they get control back.
01:16:39.000 But I do think that this national populist rise uprising across Western civilization is a really, really positive thing.
01:16:48.000 And the fact that you have the whole government apparatus, machinery, trying to fight it with this terrible Big Brother... it's like a wet blanket of a simulation of a game.
01:17:04.000 Which they probably paid way too much for through the government contracting process.
01:17:08.000 Someone got paid like $100,000 to make it.
01:17:11.000 You know, this is like, you know, you've got Data Republican.
01:17:14.000 You've got Mike Benz that have unearthed a ton of this stuff with a transatlantic.
01:17:18.000 I mean, this is a really like hilarious version of it.
01:17:20.000 And it's so on the nose that it's easy to mock.
01:17:22.000 But there's some people that are very sophisticated about how they undermine a country's love of itself, a country's pride in its own heritage.
01:17:31.000 And it's really disgusting.
01:17:32.000 And we've gotten slammed with it in the West.
01:17:34.000 And we're fighting, but we're like building immunities to it.
01:17:37.000 That's why this is such a fun story, because we're building immunities.
01:17:41.000 Go ahead, Jeff.
01:17:41.000 Go ahead.
01:17:42.000 I guess I was going to say that, you know, and this just, you know, my take on it.
01:17:46.000 You know, I'm not British.
01:17:47.000 I don't think any of us are here are British.
01:17:49.000 Tyler might have some British.
01:17:50.000 I don't know.
01:17:51.000 Oh, no.
01:17:52.000 No, Andrew, you've got British heritage.
01:17:53.000 I'm a quarter British.
01:17:54.000 I'm like, you're like three-quarters British heritage, but we came over on the Mayflower.
01:17:58.000 I'm like, no, no, no, I'm a British and Irish.
01:18:04.000 I am American.
01:18:05.000 So there's a huge cultural affinity for, obviously, rule following and procedure in the UK.
01:18:16.000 Queuing and lining up is really big, just having visited there a few times.
01:18:20.000 There's also a lot of obsession around like health and safety, risk assessments, and compliance.
01:18:26.000 So like with those, with those risk assessments.
01:18:29.000 So the problem, I think, is that you start crossing and blurring the line between what is in the good of the nation, what is in the good of the health and safety of the people, and things that are bad, right?
01:18:45.000 So you cross that line into tolerance.
01:18:47.000 So the British system then will force you into tolerance more than any other possible system.
01:18:54.000 Like, you know, the bureaucracy. C.S. Lewis, of course, in The Screwtape Letters famously writes that a demon is a bureaucrat, right?
01:19:01.000 Hell is a bureaucracy with civil servants.
01:19:04.000 And so it's just something that's very culturally British: rules, order, doing things properly, you know, that you see a lot there.
01:19:16.000 Avoiding fuss, chaos.
01:19:18.000 They really hate that stuff. And so unfortunately... You got a license for that meme? You've got a license for that meme? So, like, this is a...
01:19:26.000 This is a place where, like, you know, fairness and hate speech and feelings get kind of caught up with your traditional British cultural mores of wanting to follow the rules, be fair, and prioritizing health and safety.
01:19:45.000 Well, I can't wait for whatever this regime that is ruling the minds and pocketbooks of the British government funds next.
01:19:55.000 Obviously, Kier Starmer's a wildly unpopular figure even in the UK.
01:20:00.000 I feel like all of their politicians are unpopular.
01:20:03.000 That's its own funny thing.
01:20:04.000 Like there is a British sort of.
01:20:06.000 There is a kind of a tendency to just be.
01:20:09.000 They're kind of doomers, like the whole culture is doomers.
01:20:12.000 It's a.
01:20:13.000 It is a civilization that seems to have given up on itself in a very disturbing way.
01:20:17.000 I'm telling you, they lost all their good dudes in World War I.
01:20:19.000 And they did.
01:20:20.000 Although there's also countries that didn't even have a World War. Hmm, like the countries that weren't in the World War; like, Sweden was in neither World War, and they still hate themselves. Sweden's kind of...
01:20:31.000 They're finding a backbone.
01:20:32.000 I'm hoping they were affected.
01:20:32.000 I like that.
01:20:34.000 Some of these groups are finding a backbone, but I think the, the Scandinavians though, there is something.
01:20:39.000 There's a bit of a pushover.
01:20:41.000 I don't know man, these are the dudes who used to go around in boats and like pillage and conquer Ireland and all that stuff.
01:20:47.000 And now it's like, how did they go from all their warriors, how did they go from the Vikings to that, to what they are now?
01:20:53.000 Or maybe maybe just the, the Swedes that are there now are the ones who stayed.
01:20:57.000 You know, maybe, but the ones who became the Minnesota Vikings all left.
01:21:02.000 I don't know.
01:21:03.000 These are conundrums that we're gonna have to ask AI to help us solve.
01:21:07.000 No, we're not gonna ask AI, we're going to ask ourselves.
01:21:10.000 No, but this is it in Minneapolis, because you're surrounded by these Scandinavians. You're sitting around and... like my brother Kevin, go follow him.
01:21:17.000 Kevin Posobiec, he's down there on the ground.
01:21:19.000 He's been in Minneapolis all week.
01:21:21.000 He was standing next to the FBI truck as it was being looted last night, and he's filming all this, and you know, it's like...
01:21:29.000 And then he went down to the state capitol, though, for this, uh, high school walkout, you know, "ICE out" thing they were holding yesterday with Keith Ellison, and he goes in and all the kids in the high school are Somali, and the flag is Somali.
01:21:41.000 So it's like, what is wrong with the Scandinavians?
01:21:45.000 Why will they not wake up and understand that they are being invaded?
01:21:48.000 And they have a lot of people, a lot of Scandinavians, in Seattle too. And it's like, well, you just got to be welcoming, eh? You just got to be welcoming, eh? You just got to be good to your neighbor.
01:21:58.000 Racist! Yeah. Um, Irish now, I don't know, I can't do accents.
01:22:02.000 I can do like a...
01:22:03.000 I can do a Scottish accent because I watched enough Braveheart. To make people aware, the Irish don't get enough flak for how unbelievably left-wing they are now.
01:22:13.000 They're just letting themselves get warmed up.
01:22:15.000 They have a very bad there's some harder stuff.
01:22:20.000 There's some rumblings of a switch in Ireland, hopefully, soon.
01:22:25.000 Conor McGregor is trying to rise up, right?
01:22:27.000 Yeah, huge rally that was in, I think it was Cork last year about this.
01:22:32.000 They are starting to push off because the Irish defined themselves for so many years as being anti-colonial because they were anti-British.
01:22:40.000 And then so they were like, oh, we'll just take the side of like everyone else who's anti-British, like the Palestinians and everyone in Africa and everyone in the Middle East.
01:22:50.000 And we'll let them all in.
01:22:51.000 And it's like, oh, shortening the Irish code everywhere.
01:22:55.000 Who are we to say you can't come to Ireland?
01:22:57.000 Then, you know, Shorn, Shorn, Shorn.
01:22:59.000 Jesus, Jesus, Land Six.
01:23:02.000 And so it's like, did you just use the Lord's name in vain?
01:23:05.000 We don't do that.
01:23:07.000 What I do love is Ireland. So, you mentioned that, Jack, and Ireland got very attached to the idea that they are, like... That is starting to shift, though. But Ireland got very attached to this.
01:23:17.000 So the thing is, Ireland is like pickled the country.
01:23:20.000 And they got very attached to the idea that because they were pro-third world, like pro-Palestine, that they were this like moral superpower in the world.
01:23:27.000 They had so much like credibility.
01:23:29.000 And then recent events have happened.
01:23:31.000 And this is a headline in the Irish Times.
01:23:33.000 Was Ireland's reputation as a tiny diplomatic superpower just a flash in the pan fantasy?
01:23:41.000 So they actually took like pride in this.
01:23:43.000 They apparently believed that Ireland was this like country people listened to.
01:23:49.000 Yeah, because, you know, when I spent time in Europe, and I remember hearing this multiple times... I actually lived over there for a bit.
01:23:57.000 Everybody would always say that, oh, the Irish are the nicest, ranked as the nicest country in Europe.
01:24:04.000 And I kept going like, well, that's, you know, it's funny that I hear this.
01:24:08.000 So many people would tell me this, that it was obviously kind of like a known thing.
01:24:11.000 And I, I think if you internalize the fact that you are nice, then you will like culturally start, you know, opting to be nice as opposed to any other attribute, and you just get walked over.
01:24:24.000 I think if you think of yourself as this diplomatic superpower, you're just... Remember, nice is the lowest of the virtues.
01:24:31.000 Yep.
01:24:31.000 That's a lot.
01:24:32.000 So I can explain this from an East Coast perspective: that East Coast people are not nice.
01:24:39.000 We are, you know, like, like, definitely not nice.
01:24:42.000 Like, that's Philly, New York, like Boston.
01:24:44.000 You will not find nice on the list of our attributes.
01:24:49.000 However, however, there's a difference between nice and kind.
01:24:56.000 And I was actually talking to Libby Emmons about this yesterday.
01:24:56.000 And the difference between that is nice is sort of the way you carry yourself, the way you talk, the way, and you see this with Trump all the time, by the way, right?
01:25:04.000 Trump is not nice, but he actually is kind, right?
01:25:07.000 Kind means you follow things through with what you say you're going to do.
01:25:11.000 You help people.
01:25:12.000 You put people's best interest first.
01:25:14.000 You try to do what you can to actually help others.
01:25:17.000 That's being kind.
01:25:18.000 Being nice is like being obsessed with, you know, words, or did you say something in a nice way?
01:25:23.000 Or, oh, did you have a mean tweet?
01:25:25.000 You know, no, Trump doesn't care about mean tweets.
01:25:28.000 He cares about getting the job done and actually helping people.
01:25:31.000 That's being kind.
01:25:33.000 And so I think people mistake being nice and being kind.
01:25:37.000 And by the way, you want to go all the way back to it.
01:25:40.000 The man himself, JC, Jesus Christ in the Bible is not always nice, right?
01:25:45.000 You get out, you pit of vipers, you den of vipers, overturning the money lenders and all the rest of it.
01:25:53.000 There's so much there.
01:25:56.000 And the difference, but is he being kind?
01:25:58.000 Of course he is.
01:25:59.000 He's being kind by rebuking the sinner.
01:26:02.000 I agree with all that.
01:26:03.000 Yeah.
01:26:04.000 I think nice and kind is a super important distinction to make because actually a lot of Americans, we think of ourselves as nice.
01:26:10.000 HR ladies are always nice.
01:26:12.000 They are rarely kind.
01:26:13.000 And they're vicious.
01:26:14.000 Yeah, what's the one in Harry Potter?
01:26:16.000 Like everybody meets her?
01:26:18.000 Yeah, Umbridge was not kind, but she's often superficially nice.
01:26:18.000 Umbridge.
01:26:22.000 Yeah.
01:26:23.000 That's what I'm saying.
01:26:24.000 Nice is superficial.
01:26:26.000 You don't want nice.
01:26:27.000 I mean, yes, nice is good to be in general.
01:26:29.000 Like you want to be polite, but there are times where nice should not be a priority.
01:26:35.000 Being kind should be a priority.
01:26:37.000 And I think, I just really think that a lot of people get this wrong.
01:26:41.000 By the way, I'm getting a little bit of breaking news on the ATF, speaking of the FBI firearms that were stolen last night.
01:26:51.000 I'm just getting some word in that ATF has arrested the man who stole those firearms last night from the United States.
01:26:57.000 Consequences.
01:26:59.000 Accountability.
01:27:01.000 Don't steal federal weapons, because those are actually really easy to track in federal weapons lockers.
01:27:01.000 I am worried.
01:27:06.000 Like, don't do that.
01:27:08.000 Like, just... don't do that. But, like, don't be stupid, because that's really stupid.
01:27:14.000 I can tether this to the Ireland topic.
01:27:16.000 So an interesting problem the British had in Ireland late in their ownership of it is there would be people who would do crimes against British like authorities in Ireland or they might attack police and they couldn't get convictions from Irish juries.
01:27:33.000 Irish juries would just do jury nullification on things.
01:27:36.000 And so the British had to start, I can't remember the name of the law, but they basically had to start essentially saying in these areas where this is a common problem, we basically have to suspend the right to a jury trial and allow magistrates to basically act, you know, have judicial rulings on this because it's the only way to have actual criminal justice.
01:27:36.000 Yeah.
01:27:54.000 And I wonder if we have to worry about that.
01:27:55.000 Like, what are you going to do in Minneapolis if you just can't impanel a 12-member jury?
01:28:00.000 You don't even need.
01:28:01.000 It's not Somali jurors I'm worried about.
01:28:03.000 It's Eastern Virginia.
01:28:05.000 You know who.
01:28:06.000 Renee Goods on a jury.
01:28:07.000 Yeah.
01:28:08.000 Yeah.
01:28:08.000 And they're the ones who are just going to say category.
01:28:11.000 We have this in D.C. already.
01:28:13.000 Of course.
01:28:13.000 Yeah, you do.
01:28:14.000 And so you might need to say, we're going to need to move these jury trials to new locations, or you're going to have to find other ways to make people fear the law.
01:28:22.000 The more tribal we become, the less useful juries become.
01:28:25.000 And it's bad because the jury is a great thing.
01:28:28.000 Yeah, but not every country has to be a lot of people.
01:28:30.000 But not even the jury system is, again, I believe, a British system that comes from British common law.
01:28:38.000 And again, it, like so many other American traditions, derives itself from a specific group of people.
01:28:45.000 And it's like, oh, well, we follow these procedures.
01:28:47.000 I was just talking about procedure, rules.
01:28:50.000 There are other groups of people in other cultures around the world.
01:28:53.000 And go watch a Nick Shirley video if you want to learn more about those, that don't care about rules and don't care about honor and don't care about stealing and theft from another tribe.
01:29:05.000 This feels like a very good place to leave us.
01:29:09.000 Leave the show.
01:29:12.000 Stop importing people that hate us.
01:29:14.000 Please, politicians, vote for a moratorium.
01:29:17.000 And by the way, one of my favorite things that happened this week, Trump blocked visas from 75 different countries.
01:29:24.000 So that third world travel ban, and it included places like Brazil, which was fascinating.
01:29:29.000 Yeah, there were countries in there that weren't really third world.
01:29:32.000 Yeah, but I'm okay with it.
01:29:34.000 I mean, listen, the more the... I mean, I would do an immigration moratorium.
01:29:38.000 So I'm just like, I don't care which country gets added to the list, really.
01:29:42.000 I would do a net zero, though.
01:29:43.000 You know, 200,000 to 300,000 people leave the United States every year.
01:29:47.000 And it's like, okay, well, if somebody leaves and they indicate that they are relocating somewhere else, then I will take somebody to replace them.
01:29:55.000 But I don't need extra.
01:29:56.000 So I would do that for 10 years.
01:29:58.000 That would be my vote.
01:29:59.000 But anyway, absolutely.
01:30:01.000 And that happened.
01:30:02.000 And Trump is saying that he's going to be defunding Sanctuary City starting February 1st.
01:30:07.000 So we're going to give a clap.
01:30:09.000 Can we get a clap from the studio?
01:30:12.000 So that's the whole point.
01:30:14.000 Stop importing cultures that don't assimilate, that you have zero compatibility with.
01:30:21.000 And that's just how I feel about that.
01:30:23.000 The Irish need to get there.
01:30:25.000 The Brits need to get there soon.
01:30:26.000 The Germans maybe are getting there.
01:30:28.000 No, it's looking bleak in Germany, man.
01:30:31.000 Poland, on the other hand, does not have this problem at all.
01:30:34.000 Based city.
01:30:35.000 Based city.
01:30:37.000 You go east of Berlin, and people are like, yeah, why would we want people who are not like us to come to the country? We're not interested in that at all.
01:30:46.000 Thank you.
01:30:47.000 And Hungary. What other countries are there?
01:30:50.000 Denmark, actually has gotten pretty good.
01:30:52.000 Denmark's kind of based.
01:30:53.000 Denmark's kind of based on immigration.
01:30:55.000 We shouldn't bully Denmark too much.
01:30:56.000 There's been a little bit of trouble in Hungary, by the way.
01:30:59.000 That election.
01:31:00.000 He's been in power 20 years.
01:31:02.000 It's hard to be in power almost 20 years.
01:31:04.000 Yeah, eventually you're going to make enough enemies.
01:31:06.000 You have to tell enough people no over the years.
01:31:09.000 They got bones to pick with you.
01:31:11.000 All right.
01:31:11.000 Well, this was an amazing episode.
01:31:13.000 Jack, well done.
01:31:15.000 Thank you for zooming in.
01:31:17.000 Tyler, thank you as always.
01:31:19.000 You know, you're making a lot of time for us, even though you're running Turning Point Action.
01:31:23.000 Got a lot of news on that.
01:31:24.000 We should do like a.
01:31:25.000 We've got a lot coming out this next week, actually, with some big announcements happening in New Hampshire and Nevada.
01:31:30.000 So you should go on the Charlie Kirk show on Monday.
01:31:32.000 Monday.
01:31:33.000 Let's talk about it.
01:31:33.000 We're going to announce it on Monday.
01:31:34.000 Oh, good.
01:31:34.000 Let's do it.
01:31:35.000 All right.
01:31:36.000 In the meantime, Jack, you know how to do it.
01:31:39.000 Keep committing.
01:31:40.000 Ladies and gentlemen, go out there and commit more thought crime.