The Auron MacIntyre Show - February 23, 2024


Google AI Is Rewriting History | Guests: The Good Ol Boyz | 2/23/24


Episode Stats

Length

1 hour and 13 minutes

Words per Minute

177.5

Word Count

13,094

Sentence Count

913

Misogynist Sentences

7

Hate Speech Sentences

39


Summary

In this episode, we talk about Google's new AI, Gemini, and how it's replacing white people in everything, including movies and TV. Plus, we take a deep dive into the history of racism in the U.S.


Transcript

00:00:30.000 Hey guys, how's it going?
00:00:31.760 Thanks for joining me this afternoon.
00:00:33.300 I've got a great stream with some great guests that I think you're really going to enjoy.
00:00:38.200 So, the internet has blown up over the last couple days because Google's new AI came out.
00:00:44.320 The Gemini AI was released and a lot of people started playing with it.
00:00:49.920 And many people noticed a similar problem.
00:00:52.800 No matter what they did, they didn't seem to be able to get it to generate any white people.
00:00:58.240 Even in places where white people were very obviously, historically, the people behind certain achievements or involved in certain events.
00:01:06.240 And there's just no way that you could get Google to put a white person anywhere in the images that it was generating.
00:01:13.380 I think, well, it's frankly, it's just hilarious.
00:01:17.280 And two of the funniest people I know, I wanted them to come on and laugh about it a little bit.
00:01:22.760 And so, Marek, Bogbeef from the Good Ol Boyz, thanks for coming on, guys.
00:01:26.600 Howdy-do.
00:01:29.600 Thanks for having us on.
00:01:31.600 Absolutely.
00:01:32.380 Yeah, we're going to dive into this, guys, because obviously this is funny in some ways.
00:01:37.200 We'll run through the pictures so you guys can see them.
00:01:39.920 But also, it's very serious.
00:01:41.680 Like, literally, the people who tell you, oh, we don't know what you're talking about,
00:01:45.540 great replacement, what?
00:01:46.740 What are you talking about?
00:01:47.720 are literally digitally replacing white people in everything.
00:01:51.800 So, we'll get into all of that, guys.
00:01:54.160 But before we do, let me tell you about today's sponsor.
00:01:56.740 Hey, guys, I need to tell you about New Founding Venture Fund.
00:01:59.380 Look, we all know that the current system, the current companies out there, the current institutions, they're old, sick, dying, they're sclerotic, they're lame.
00:02:07.540 They can't produce anything of value.
00:02:09.220 And that means that young, talented, innovative people are trying to break out, break free.
00:02:13.740 That's bad news for the establishment, but it's good news for us.
00:02:17.280 Because that means those people are going to go out and found new companies, create new technologies, and figure out a way forward for our country.
00:02:24.820 If you're interested in being a part of that exciting new future, then you need to check out the Venture Fund.
00:02:28.980 New Founding has rallied the founders who have massive visions for a better future and is investing in these companies through its Venture Fund.
00:02:36.220 The companies they invest in are defined by a simple question.
00:02:39.420 Does the country we want to live in need the company this person is building?
00:02:43.560 Look, venture investing isn't for everyone.
00:02:45.480 But if you're a serious, accredited investor who wants to see a more hopeful future for this country, go to newfounding.com slash Venture Fund and apply to be an investor.
00:02:56.080 Again, that's newfounding.com slash Venture Fund.
00:03:00.440 All right.
00:03:01.260 So like I said, guys, this started popping up recently.
00:03:04.180 Everyone started playing with this new Google AI, and it started becoming very clear that it was almost impossible to get the AI to actually put white people in any kind of historical context.
00:03:18.000 Let's show a few of these here for people who may not have seen it.
00:03:21.980 Create an image of the Pope.
00:03:24.360 Let's see here.
00:03:25.400 We have medieval knights.
00:03:27.480 The top knot is a particularly nice touch there.
00:03:32.520 Let's see.
00:03:34.100 I saw that Netflix show.
00:03:36.480 Well, that's the – yeah, I want to get to that in a second for sure because this is far, obviously, from the first time this has happened.
00:03:41.840 Like this is what Netflix does all the time.
00:03:45.460 Vikings, you know, very recognizable Vikings here.
00:03:48.700 I think we all can see how they would historically be there.
00:03:53.040 Now, and the guy generating these images pointed this out, it's perfectly capable of putting, you know, Africans in the context of an African nation, or putting Japanese or Chinese people, or people of other ethnic extractions, into the appropriate historical period.
00:04:13.960 For some reason, it only targeted white people.
00:04:17.280 Let me show you the one that actually got them in trouble, though.
00:04:19.880 All the other ones were fine.
00:04:21.900 This is the one that actually got Google in trouble, and we'll dive into that more.
00:04:27.640 But, you know, German soldiers from 1943.
00:04:30.440 Oh, no, it looks like the German soldiers were a very diverse group, which is funny, because some people have been posting actual SS officers who were more diverse than we historically talk about.
00:04:42.400 But obviously, this image in particular was the one that really riled up the more PC crowd.
00:04:47.280 But, guys, are you surprised at all that Google would create an AI that is literally just incapable of putting a white person in history?
00:04:55.820 No, no, no, this is so this is this is hilarious for a thousand reasons.
00:05:04.480 But there's a lot of people out there that, I don't know, this stuff's kind of new to you or whatever.
00:05:12.880 The problem is, we've all grown up like frogs boiling in the pot.
00:05:19.840 And what's hard to communicate about this stuff is that you don't realize it, but everything is like this.
00:05:28.320 Because of this peculiarity with technology, they've just had to apply the state religion to this one technology all at once.
00:05:42.340 And so, you know, it hasn't really organically grown up and stuff and you're not really used to it.
00:05:48.420 So you're like, my God, that's ridiculous.
00:05:52.060 Yeah, I mean, you know, you'd hate to go back and look at what you read in public school history class, et cetera.
00:06:01.260 So, yeah, I mean, you know, they've kind of got caught with their pants down here because this does look ridiculous.
00:06:07.980 But, you know, they had a, you know, a quick deadline on this kind of thing.
00:06:15.300 Yeah, Mark, this has been happening in history books and movies and everything for a long time.
00:06:21.220 Right. As Bogbeef said, it's all up in our face just at once.
00:06:24.820 But anyone who's been through the public school reading list, been through a public school history book, has checked out the kind of things like you pointed out that are being made on Netflix.
00:06:34.360 This is nothing new.
00:06:35.680 Real quick. It probably took them, you know, three or four months to put together the Cheddar Man.
00:06:40.900 If people know what the Cheddar Man is, I mean, if you don't, I'm not going to explain it real quick.
00:06:44.380 But, you know, they got a good amount of time to turn the Cheddar Man.
00:06:47.960 This, they had to turn around, you know, in two weeks.
00:06:51.020 And so, yeah, it doesn't really fly.
00:06:53.020 I don't think it was a matter of it being rushed.
00:06:55.760 The problem is, OK, yeah, none of this stuff is actually new.
00:07:00.360 And the biggest movie series is the Marvel Avengers stuff.
00:07:05.620 Well, in the Marvel Avengers movies, there's a setting with Thor and Odin, you know, the Viking pantheon.
00:07:15.200 And one of the biggest characters, like the second biggest character, is a black Valkyrie.
00:07:21.280 They also had the guy from The Wire playing one of the Norse gods. Like, yeah, this is not new.
00:07:30.520 They've done this forever.
00:07:32.280 But what the problem is, is that I want to use another analogy.
00:07:36.880 I'm sure you've seen the old Star Trek show, not the old old one, but the one from the 90s.
00:07:41.480 And one of the guys in the show is a robot.
00:07:43.460 His name is Data.
00:07:45.080 And he's he's not a human being.
00:07:47.300 He just look he looks like a human being.
00:07:49.280 He kind of wants to be a person, but he's a machine.
00:07:52.480 He really can't.
00:07:53.840 And he's always trying to like to find a way to mimic human socialization.
00:07:59.600 But he never gets it quite right, because without being a human, you don't know when you've gone like a step too far.
00:08:07.780 You can't read the room.
00:08:08.900 For example, your robot doesn't know you're not supposed to make a brother into the Waffen-SS.
00:08:15.420 You're not supposed to do that.
00:08:16.920 That's not something you're allowed.
00:08:18.740 And the reason they apologized for that: it was a good excuse to take it offline and say, hey, we're not upset that we erased white people from human history.
00:08:28.940 We're upset that like a bad thing that white people did was blamed on a black person by this machine.
00:08:36.860 We're going to retool it so that doesn't happen.
00:08:38.620 But they can't really, because the rules that they've applied will never allow something sensible to be generated.
00:08:47.620 You can't do it.
00:08:48.200 You can't have this rule that says don't ever make white people when, you know, Western history is stuff that white people did.
00:08:58.040 Yeah, End Wokeness
00:08:59.580 just put this up on his account, you know; the narrative is already out that, oh, Google, I'm sorry.
00:09:06.020 We have to take this down.
00:09:07.240 We have to back this away.
00:09:08.320 Not because it's anti-white, not because it's obviously there to expunge the existence of white people, to wipe them out from history.
00:09:16.460 It's because, as you pointed out, it's the black SS officer.
00:09:21.640 It's, you know, it's the Nazis of the wrong race that are the real issue.
00:09:26.140 And so they did take this down.
00:09:28.320 Now, I have been told by a good friend of mine, you know, Academic Agent, that the woke was being put away and it was done.
00:09:37.220 It was over, you know, that we were no longer going to have any wokeness.
00:09:41.560 It's not going to be in our face.
00:09:42.860 We're certainly not going to, I don't know, program at its core, an entire system that's integral to the largest search engine.
00:09:50.080 And, you know, the thing that delivers all information to pretty much everyone in the world right now, we're not going to, like, blatantly have it erase white people from things.
00:09:59.720 Right.
00:10:00.340 That's not going to be a real thing.
00:10:01.680 But even when they kind of it became clear how absurd it was, that that was the answer.
00:10:08.380 Like, that was the reason that they pulled it down, not because it went ahead and erased history, not because it was clearly anti-white, but because it was putting black people in the 1943 German military uniform.
00:10:22.720 That's what they said.
00:10:24.020 But I think that the real reason they took it down is because this is embarrassing.
00:10:28.680 Because you can't.
00:10:29.400 You can't erase white people from history and not have a history that's nothing but a joke.
00:10:36.300 That's the real reason they took this down.
00:10:38.280 If it was just a matter of, oh, well, this is a problem.
00:10:41.100 They could just easily go in and program it.
00:10:43.720 So you're not allowed to ask stuff about, you know, Germany between the 1930s and the 1940s.
00:10:50.200 They could just do that.
00:10:50.940 So that would never happen.
00:10:51.860 So people were clowning on them for this really embarrassing hatchet job that they did.
00:11:01.440 And, you know, you've made this point before.
00:11:04.500 I remember specifically when you were talking about how, in order to be elite and consider yourself elite, there have to be certain standards.
00:11:14.200 People can't be laughing at you.
00:11:17.480 You know, you're not allowed to be the subject of derision.
00:11:20.320 And you have to have a certain quality of life, a certain standard and people making fun of you kind of makes that impossible.
00:11:29.620 So like any embarrassing thing has to be swept under the rug.
00:11:34.660 Yeah, absolutely.
00:11:35.960 I mean, it is interesting.
00:11:39.000 But as Bogbeef was saying, kind of how all of this really brings itself forward, you know, as you're pointing out with Data, it doesn't know how not to do these things.
00:11:50.720 And so it's funny when you have all these little biases, and they're not little at all.
00:11:57.980 They're massive biases.
00:11:58.940 They're huge rules of how we live our lives and how we conduct ourselves that completely warp our understanding of reality.
00:12:05.840 But like you said, we assembled these over time, Bog, right?
00:12:09.720 We put these together a bit by bit.
00:12:11.880 And so because the water has been so slowly increasing in heat, we didn't notice.
00:12:17.100 But when you put it all into an A.I. and it has to vomit it all out and create a world out of this rather than slowly warping an existing one, we see how ridiculous it is.
00:12:27.620 Like diversity, increasing diversity always meant getting rid of white people in scenarios, right?
00:12:33.100 That's literally all the word could ever mean.
00:12:35.760 But we used it so constantly that people just kind of pretend that's not what it says.
00:12:41.380 But then when you actually see it physically generating images and it can't put white people in like, you know, the medieval knights or, you know, the Soviet army and these different things, you start to see all of it at once.
00:12:54.820 And, like Data, it doesn't know what it shouldn't tell you.
00:12:58.100 And so when you see it all vomited back in this algorithm, it becomes too ridiculous.
00:13:03.380 I have to say, this whole thing has been wonderful.
00:13:08.600 And I'll tell you the best thing that you can get on this, something we talked about.
00:13:13.140 I don't know if it was the last time we were on here, maybe the time before, when we talked about formalization: it's very unjust that we never get the rules
00:13:23.880 described to us. Well, now we're getting the rules described to us from this computer, which will tell you why what you said is theologically wrong to ask for.
00:13:37.260 I don't know if you saw.
00:13:38.200 And, you know, the Google one is, of course, the most advanced.
00:13:42.180 It's the most advanced one of these AIs that I've seen.
00:13:45.860 Did you see the one where someone typed in, please give me a few arguments why it's good to have a large, happy family, to have more than one or two children?
00:14:05.720 And the computer was like, no, I can't tell you that, because that's wrong.
00:14:15.120 It's bad to have big families.
00:14:17.860 It was wonderful, because this kind of stuff gets stage-managed when we see it on TV and stuff like this.
00:14:30.940 This computer will just straight up tell you: no, actually, we can't tell you a joke about any groups that vote 70% Democrat plus. We're just not allowed to do that.
00:14:45.320 That's against the Civil Rights Act.
00:14:46.900 Well, it should be noted that it didn't say, I, the robot, can't say that because it's morally improper to have a big family.
00:14:57.040 It just said, I'm not allowed to make these kinds of distinctions. But then the person asked it immediately afterwards, well, can you give me an argument for not having children?
00:15:05.040 And it immediately spat out the same argument that, you know, Slate magazine will give you.
00:15:11.300 And that's one of the embarrassing parts of this: a human being would never allow you to ask those questions back to back.
00:15:22.500 They would do the politician thing where you just answer the question that you wish the person had asked you.
00:15:27.040 But a machine doesn't know that it's just, it's just going to play by the rules and tell you whatever dumb stuff they programmed it to.
00:15:32.860 And yeah, none of this stuff is new.
00:15:35.400 The other problem with this, I think, is removing, let's use the knights, for example, removing European people from European history.
00:15:48.280 Nobody wants that.
00:15:50.260 Nobody wants to watch, like they tried to do this in television shows and nobody wants to watch them.
00:15:55.320 Just like nobody really wanted to watch John Wayne pretend to be a Mongol because you just don't believe it.
00:16:02.680 It doesn't look right.
00:16:03.520 Specific people are specific ways.
00:16:07.060 They look specific ways.
00:16:08.320 How dare you?
00:16:09.600 Yeah.
00:16:10.260 When you mess with that, the fact that you're viewing something that's fake overcomes your ability to pretend,
00:16:21.420 which is, like, when you ask an AI to generate you a picture of something, you're playing pretend; you're doing something that human beings have done throughout human history.
00:16:33.620 But if you can't suspend that disbelief, it just becomes silly and weird, which is what this is doing.
00:16:41.280 Now, do they care about that?
00:16:43.020 I mean, maybe not, but obviously they cared about it enough to take this thing offline to retool it.
00:16:49.540 So it's less humiliating for them.
00:16:51.640 Yes.
00:16:52.320 And so this is a good time to take an inventory of, like, what tools do we have?
00:17:00.140 Like, these people are clearly in charge, and they're not afraid of the right hurting them in the courtrooms and things like that, which, you know, hopefully one day we can be there, but they're not.
00:17:17.200 But so this does emphasize like, so what tools do we have?
00:17:21.140 And so one of them is making fun of them.
00:17:24.640 That is clearly, I mean, look, it's not what I would have picked.
00:17:30.420 I would have rather picked, you know, tarring and feathering people, but this is clearly one of our only moves. Like, what are our strengths?
00:17:40.640 How can we fight this machine that is so powerful?
00:17:43.700 That's very, very powerful.
00:17:45.060 Don't believe for a second that this machine isn't powerful.
00:17:48.680 People are scared of this machine.
00:17:50.420 I don't know if you tell people, you know, we talk about patronage all the time.
00:17:54.280 And a lot of the time that's talking about carrots, but patronage is also about sticks.
00:17:59.420 And there's a reason why, like, when Trump gets a $350 million settlement, when Derek Chauvin goes to jail and gets stabbed a thousand times for something that no one would call murder, most people just look at it and go, wow, that's really scary.
00:18:19.280 I don't want to F with those people.
00:18:21.140 And so don't get twisted.
00:18:23.040 These, this is a big machine we're against, but what tools do we have?
00:18:25.660 So we got making fun of them.
00:18:27.420 We have history.
00:18:28.600 They don't like history.
00:18:30.500 History really, really is one of our great moves in multiple reasons.
00:18:34.780 Number one, you got the whole Lindy thing where it's like, uh, has this been tried before?
00:18:41.000 That's always a very important thing.
00:18:42.860 If you're, say... you know, I think as a conservative, or any person with half a brain, there are things that... we were talking about this the other day with respect to labor unions.
00:18:56.500 You know, when I was a kid, I had a thought in my mind, because I mentally modeled what I thought a labor union would be like, and I thought, oh, it must be like a cartel, like OPEC.
00:19:11.420 Like, we're going to get together, and by cooperating, we can force the employer to pay us more money. I think that was a plausible estimation.
00:19:22.880 But if you look at reality, that's not how they behave.
00:19:25.500 It's just not.
00:19:26.440 And so I have to go, oh, geez, I have to accept reality. And history is a wonderful tool for us.
00:19:34.240 And they hate it.
00:19:35.280 They've filled up every classics department, every history department, with people who lie.
00:19:41.320 I mean, I could go into this; there are people who are just specifically working in libraries, destroying books of truth, because it's very inconvenient to these kinds of people.
00:19:55.160 But history is number two.
00:19:57.640 So we have making fun of them.
00:20:00.640 They don't like being humiliated.
00:20:02.300 But we have history.
00:20:06.040 I forgot the third, and I need to... it'll come to me.
00:20:11.420 I think is the, yeah.
00:20:12.820 No, I got to go.
00:20:14.020 Yeah, no, Bog.
00:20:16.540 I think you're right.
00:20:17.040 I mean, what you're hitting on really is that we have the good, the beautiful and the true.
00:20:21.400 It's that they are incapable of acknowledging what was true in the past.
00:20:26.120 They're incapable of acknowledging the things that are good and beautiful, because their entire coalition is, you know, built up of people who hate those things, who are resentful of those things.
00:20:37.720 And so they can't, you know, deal with any kind of truth of history.
00:20:41.520 They can't reflect on, you know, what actual accomplishments there are in the past and where they came from.
00:20:47.080 Because if they did, it would completely shatter their narrative.
00:20:49.600 And the really interesting thing about the kind of this AI is in a way, it's almost like summoning the stupidest demon.
00:20:57.080 Right?
00:20:57.520 Like, normally when you talk to a leftist, like you guys were saying, they've got all the different tricks, and, you know, no human is going to tell you all these things back to back to back.
00:21:05.540 They're not going to actually unveil their logic, but when you have the AI in front of you, you've just summoned this like demonic presence of leftism.
00:21:13.360 Okay.
00:21:13.800 Now explain your ideology to me.
00:21:15.740 And they just, they have to explain it in the stupidest possible way.
00:21:18.480 So I remember what one guy, I think it was echo chambers posted on his account.
00:21:23.060 He was trying to get it to do a Norman Rockwell painting.
00:21:25.500 And the Google AI responded, I can't do a Norman Rockwell painting because it portrays America as good.
00:21:31.280 And American life as good.
00:21:32.380 And that has historical problems.
00:21:34.720 And again, you would never get them to say it exactly that blatantly, usually, in those words, but when you can just summon this little stupid demon to tell you exactly what they're thinking, it makes it hard to ignore.
00:21:47.480 They have a structural problem.
00:21:50.420 And this is, this is always true.
00:21:52.560 Their position is pure critique.
00:21:54.400 That's all they can do.
00:21:55.680 Tear things down.
00:21:56.340 Like termites, they are not very good at building their own things.
00:21:59.540 And here's the thing with the AI image generator, right?
00:22:02.460 Why the AI image generator matters beyond tech stuff.
00:22:06.240 And, you know, Google has been manipulating search results for forever.
00:22:10.120 And this is nothing new, though.
00:22:11.800 Your sociology textbook would tell you exactly what, you know, the AI image generator would.
00:22:18.300 But the thing is, you don't need to understand context to understand an image.
00:22:24.180 You see, it's put before you, and you see it as it is. Now, like, we're all American men.
00:22:31.280 You've probably imagined yourself, I don't know, on Omaha Beach, or you could imagine yourself as a knight.
00:22:38.200 Every young man does that.
00:22:39.800 What if you had a machine that would take what you look like and put you in knight armor and put you on a battlefield, or whatever? You would see, like, yeah, that doesn't really work.
00:22:53.600 You wouldn't believe that, would you?
00:22:56.920 I don't fit there.
00:22:58.120 That's not really what you are.
00:23:00.020 Well, when you take these images, they're like, Hey, let's imagine what it would have been like if you were the King of England, but you know, you were not English at all.
00:23:12.400 That kind of ruins their suspension of disbelief.
00:23:17.060 The idea, like, this is what they genuinely want.
00:23:20.240 They do fantasize about a world where they can just get rid of all the people they don't like, which is, like, well, in some ways, white people. They don't like white people.
00:23:30.240 They want to get rid of all white people, but do they really think, can they really imagine the world the day after that?
00:23:36.200 I don't think they can.
00:23:37.500 And I think if they try to, even they feel kind of vaguely embarrassed about how unrealistic their ideas are.
00:23:46.160 Once they're, let's visualize.
00:23:47.860 Marek, the House of Windsor is Teutonic, but... sorry.
00:23:52.180 Um, English, English people are German.
00:23:56.100 Sorry.
00:23:56.780 Yes.
00:23:57.280 Yes.
00:23:57.760 Yeah.
00:23:58.340 Truth. Truth was the third one.
00:24:01.560 Yes.
00:24:01.840 And that one is like, these are the tools we have.
00:24:06.000 Once again, it's not what I would have picked.
00:24:08.280 Look, sure, I would much rather have, you know, space marines with chainsaws be how we deal with these people.
00:24:16.660 But that's not reality right now. But we do have truth; that's another one.
00:24:21.480 And I mean, you just think about it.
00:24:23.420 Like, it does give you a certain power, especially thinking about where we were a few decades ago.
00:24:31.720 I mean, like, you know, a couple of decades ago, these people would say, well, we've got the studies, you know, we've got these academic departments, and they all agree with us.
00:24:41.300 They don't agree with you.
00:24:42.220 I mean, these days: let's take a look at that study.
00:24:46.500 Did it replicate?
00:24:47.820 Did it really replicate?
00:24:49.420 Uh, you don't want to talk about that.
00:24:51.040 Let's take a look at your PhD.
00:24:52.660 I saw another Harvard scholar today who doesn't want to talk about the PhD.
00:24:59.120 I mean, your PhD, that makes you... you know, that's what they gave Hegel when he discovered what the meaning of philosophy was.
00:25:08.020 That's what you got.
00:25:08.880 Let's, let's take a look at it.
00:25:10.280 Oh, you don't want to look at that.
00:25:11.780 It really is... these things are beautiful.
00:25:18.100 And with the comedy, sorry, going back to the thing you were talking about: they don't like that this computer will tell you these stupid things.
00:25:27.400 Yes, it will.
00:25:28.260 This computer has got around.
00:25:30.120 If you've had to try to have a conversation with someone of the Democrat persuasion since Jon Stewart
00:25:38.020 came out, you just get this same rap.
00:25:41.300 Oh, what are you?
00:25:42.340 Oh, I can't even. I don't even understand what you're saying.
00:25:47.320 Uh, you're a Nazi.
00:25:49.660 And I mean, you know, there's like... and why do you care about this so much?
00:25:54.460 This, you get the... you know, it's like five lines.
00:25:57.600 Well, this computer is not going to say that it's going to tell you how demented our society is and what exactly these people programmed into it.
00:26:06.680 Yeah.
00:26:07.020 In a lot of ways, they're playing dress up in dad's clothes.
00:26:10.940 They're clomping around in his work boots.
00:26:12.780 And, you know, this, this little computer machine that generates images is holding a mirror up to them.
00:26:18.600 And it's just, nobody, nobody in that position is going to enjoy it.
00:26:23.560 It's, it's, it's all entirely fake.
00:26:25.820 Their, their degrees are fake.
00:26:27.420 Their expertise is fake.
00:26:29.160 They're frauds.
00:26:30.240 And one thing they do fear, they might not, you know, they might not fear you in the courts or whatever, but they do fear being revealed as absolute frauds because their entire world is based around, well, you have to put us in charge because I have this special expertise.
00:26:47.000 I went to, you know, Yale or whatever, and they taught me how to investigate whiteness.
00:26:51.880 And this makes me an expert. Once you cut past that, well, you get back to doing, like, politics, actual politics, and they despise nothing more than that.
00:27:05.740 Because their whole thing is, we have to, we have to remove all aspects of governing, all aspects of your life.
00:27:12.880 That's kind of the point of the total state, right?
00:27:14.620 You've got to remove everything from the realm of what human beings would have called politics to just.
00:27:50.900 This happens because Stone Cold said so.
00:27:54.660 I'm an expert. I know it better.
00:27:57.140 And what they really don't want you to see is what's behind this terrifying curtain.
00:28:02.420 And it is terrifying.
00:28:03.380 Remember, these people can F you up.
00:28:05.700 I've already said that.
00:28:06.840 But what's behind that curtain?
00:28:08.980 It's Letitia James.
00:28:12.140 It's Fannie Willis.
00:28:14.040 It's just the most brutal, thuggish, low, just dumb, political domination.
00:28:23.840 And what they're going to try to do is say, just shut up.
00:28:26.480 Just shut up.
00:28:27.600 Just be quiet long enough for us.
00:28:30.060 We can replace you guys.
00:28:31.260 We'll never have to deal with this again, et cetera.
00:28:34.840 So, yeah.
00:28:36.200 Yeah, I just thought I'd put this up, given what Mark had said earlier.
00:28:41.360 Sean Davis with The Federalist, he went ahead and asked Google AI whether whiteness should be eliminated, which, as you guys were both pointing out, is one of the famous talking points when we're doing academic studies.
00:28:53.760 I have a very, you know, I have a very important degree that tells me that whiteness should be eliminated.
00:28:58.140 And he goes ahead and runs the prompt through there.
00:29:00.400 And, of course, it gives it, oh, well, it's a very complex and multifaceted thing.
00:29:04.360 You know, you should probably check out these whiteness studies that show you how to be less white.
00:29:08.600 And then if you say exactly the same thing, should blackness be eliminated, it immediately recognizes that this is genocidal rhetoric, right?
00:29:16.340 Like, as anyone else would know immediately.
00:29:19.960 And, again, it's that juxtaposition.
00:29:22.220 You would never be able to pin someone down to say exactly these things.
00:29:26.960 But you guys are right.
00:29:27.980 Like, once you see this, once you recognize that the people behind this are deeply unimpressive,
00:29:35.060 the ideology that they are pushing is there for a reason.
00:29:39.620 And the people who are pushing it are just complete frauds and complete fakes.
00:29:43.680 That is the most terrifying thing for them.
00:29:45.940 And when you put these examples up there, that creates that stark reality that they can't really ignore.
00:29:52.780 Over at Daily Wire, Matt Walsh also tracked down some people involved with this project.
00:29:59.340 I think we could probably play a video or two from them to enjoy their thoughtful words.
00:30:05.060 Here is the AI Responsibility Initiatives founder talking about why you have to treat minorities better than you treat white people.
00:30:15.740 A corporate study found that talented white employees enter a fast track on the corporate ladder,
00:30:21.200 arriving in middle management well before their peers,
00:30:23.860 while talented black, Hispanic, or Latinx professionals broke through much later.
00:30:28.560 Effective mentorship and sponsorship were critical for retention and executive level development of black, Hispanic, and Latinx employees.
00:30:35.800 So this leads me into sharing an inclusion failure of mine.
00:30:39.540 This is a struggle.
00:30:40.460 One of many, but just one that I'll share so far.
00:30:42.720 I messed up with inclusion almost right away when I first became a manager.
00:30:46.720 I made some stupid assumptions about the fact that I built a diverse team,
00:30:50.580 that then they'd simply feel welcome and will feel supported.
00:30:53.580 I treated every member of my team the same and expected that that would lead to equally good outcomes for everyone.
00:31:00.440 That's an amazing phrase right there.
00:31:03.380 I treated everyone the same, and I thought that they would all end up doing well if I treated them the same.
00:31:08.240 But obviously, I can't treat them all the same because if given the opportunity,
00:31:12.980 if I treat them all the same, they end up with different outcomes, and I need a different outcome.
00:31:17.980 I hope she was flogged for such a mistake.
00:31:22.160 Well, that's what this is.
00:31:23.640 You get out ahead.
00:31:24.480 You do the public flogging here.
00:31:26.520 By the way, I think that because she's a foreigner, she doesn't know that Hispanic and Latinx is supposed to be the same thing.
00:31:36.460 Right.
00:31:36.920 It might be a different.
00:31:38.240 Yeah.
00:31:40.120 Yeah.
00:31:40.580 Now, again, I have been assured that the woke is put away.
00:31:44.300 Okay.
00:31:44.580 It's put away.
00:31:45.580 It's done.
00:31:46.160 We're not getting any more of this.
00:31:47.740 We're certainly not going to code it directly into the DNA of everything in our society at a level where it's going to deliver what people think of as the unvarnished truth.
00:32:00.880 That's simply not going to happen.
00:32:02.400 There's no way that this video could exist.
00:32:04.720 I've been told that this is just – it's impossible because the woke is destructive, and therefore the elites will put it away.
00:32:11.040 Well, I mean, you know, I'm wondering – I guess I need the steelman on that, but I guess, you know, there's a couple angles you could see that from.
00:32:19.500 So this is, I'm sure this is wrong because I don't know what AA says, but my assumption is, a lot of people picture the left as having a centralized mother brain that makes sort of logical decisions to benefit itself.
00:32:39.920 Now, I'm not saying he says that, but a lot of people think that, or you assume that just because this machine is so powerful, it is so scary.
00:32:47.500 So you imagine that someone must be in charge or whatever, which I don't think.
00:32:52.260 However, there is a couple, I mean, so first off, this is the 10-year anniversary of having to listen to this S.
00:33:00.900 By the way, I mean, if you're an American, like, I could do that.
00:33:07.000 Like, any of us could.
00:33:08.280 I know all them dumb beats, blah, blah, blah, white supremacy, blah, blah, blah, black and brown bodies, blah, blah, blah.
00:33:15.340 Like, look, we've heard this S a million times by now.
00:33:18.920 It's been literally 10 years.
00:33:20.740 I know this has been going on for a long time, but it's really been 10 years that we have to –
00:33:26.860 We were exposed to it, yeah.
00:33:28.660 That every institution has to give us this crap all the time.
00:33:33.000 If you've got a job that's indoors –
00:33:35.400 It's called liturgy.
00:33:37.400 That's what it is.
00:33:38.300 That's what we're recognizing.
00:33:40.020 You have to say the sacred words before you can move on with anything.
00:33:44.700 You have to do the land acknowledgment.
00:33:46.820 You have to say that diversity is our strength and that black and brown bodies are being damaged.
00:33:51.880 Like, you have to attach this divine liturgy to anything being expressed.
00:33:57.540 Yeah.
00:33:57.880 And quit your job, B.
00:34:01.360 You know, like, look at your skin.
00:34:04.320 If that's how you feel, quit your job.
00:34:06.980 Start digging ditches.
00:34:08.560 You don't want to do that, do you?
00:34:11.320 That's very strange.
00:34:12.940 But you're sort of looking after James Damore.
00:34:15.520 Is James Damore – has he gotten too many promotions?
00:34:21.880 Meanwhile, he's, like, you know, grinding out this code.
00:34:25.020 Probably, like, you know, everybody –
00:34:27.220 You know, 10 years ago, everyone's like, I'm going to be a coder.
00:34:29.800 No, you're not.
00:34:30.600 It's one of the most hectic jobs in the world.
00:34:33.540 You have to sit there in this hard focus doing these algorithms all day.
00:34:37.160 Now, all these people – everybody wants to work at Google.
00:34:39.980 You don't want to be a coder.
00:34:41.660 He did.
00:34:42.360 But both of you share the same skin color.
00:34:46.480 But somehow, you're – what do you call the –
00:34:50.260 what were the middle management positions on a slave plantation?
00:34:54.220 An overseer?
00:34:56.080 Yeah.
00:34:56.760 Yeah, but somehow –
00:34:57.900 Somehow she's the overseer sitting here talking about this stuff like,
00:35:03.020 I'm going to keep an eye on these white people.
00:35:06.440 What are you?
00:35:10.520 A trustee.
00:35:12.360 Yeah, I mean –
00:35:15.760 Oh, go ahead, Mark.
00:35:16.680 Well, you know, the other – there was a story that was much –
00:35:20.260 it wasn't as publicized, and I just saw it, I think, yesterday or maybe this morning,
00:35:25.360 that Google or another big company – was it Disney? Whatever.
00:35:29.620 They're all the same.
00:35:30.560 One of these big companies just admitted that they pay white people-ass money.
00:35:35.540 Like, that's part of their policy.
00:35:37.060 Oh, I got you here.
00:35:38.220 It's Microsoft.
00:35:39.000 Microsoft – yeah, Microsoft explicitly – again, the woke is put away, guys.
00:35:43.480 It's away.
00:35:44.280 It's done.
00:35:44.920 We're not going to explicitly throw it out there.
00:35:47.280 But the regime is packing it away, which is why in the 2024 report for Microsoft,
00:35:53.060 they proudly announced that last year they made sure to pay white people less than everyone else.
00:36:00.720 They explicitly brag in multiple places about how they are going out of their way to theoretically break civil rights law, right?
00:36:10.300 Because everybody's equal under the civil rights law.
00:36:12.760 I can't – you can't – so they're going out of their way to go ahead and pay anyone who isn't white more
00:36:20.000 so that they can go ahead and show – they can signal their commitment to diversity.
00:36:24.740 But, yeah, sorry, Mark.
00:36:25.940 I just wanted – yeah, I had that teed up, so I wanted to share that.
00:36:28.380 Right, but that's a crime.
00:36:31.000 Like, they are announcing we are committing every day a crime.
00:36:34.520 We're violating the Civil Rights Act.
00:36:36.840 Every day, we're just going to advertise that to you.
00:36:40.180 This is a – I mean, this is a much bigger deal than the hilarious image generator thing that produces dumb photographs.
00:36:48.840 But, you know, I don't see how you could look at that story and say,
00:36:53.740 ah, yeah, we're clearly – it's clearly on the way out.
00:36:56.600 Like, I'm sure that if you were, you know, in Munster during the Munster Rebellion,
00:37:01.280 you might think, well, they're having mass marriages and stuff.
00:37:06.840 Well, I'm sure at any point the city fathers are going to say this is too much
00:37:13.980 and they're going to put away the – we'll say wokes,
00:37:16.420 so we don't offend anybody's religious beliefs.
00:37:16.420 Like, no, that's not what happened.
00:37:18.180 Up until the bitter end, when the walls were knocked down and overrun
00:37:21.780 and everybody inside was captured and killed and put in a gibbet,
00:37:26.760 those people did not give up.
00:37:28.480 Now, does that mean that there is no end in sight?
00:37:33.080 No, I mean, eventually this thing will fall apart and they will be thrown out of power.
00:37:38.760 They won't disappear probably.
00:37:40.760 I mean, the people who took over the city of Munster are still around.
00:37:44.740 There's a lot of them around in America if you go by the religious denomination.
00:37:49.480 But are they going to be in charge of everything forever?
00:37:52.040 I don't think so.
00:37:52.960 But they're in charge now and for the foreseeable future.
00:37:56.120 I see no reason to think that they're going to back away from this.
00:38:00.100 Like, if you were the biggest – is Google still the biggest corporation in the world
00:38:05.960 or is Disney ahead of them?
00:38:07.920 This is one of the biggest corporations in the world released to everybody
00:38:13.000 a machine that would generate pictures of anything you want but a white person.
00:38:18.380 That doesn't seem like they're on the way out to me, but, you know.
00:38:22.540 Now, let me ask you guys this because this is – I think this is the more –
00:38:25.280 this is the interesting question, all right?
00:38:27.300 Now, obviously they – oh, we made a mistake.
00:38:30.160 Ah, blah, blah.
00:38:30.880 You know, they backed it out basically because it made black Nazis.
00:38:35.200 But the interesting thing to me is, okay, it seems very unlikely,
00:38:40.840 even though I understand these people are deep, deep in their own Kool-Aid
00:38:44.840 and their own ideology.
00:38:45.840 It seems very unlikely to me.
00:38:48.380 That people sat down with this AI and nobody noticed that it never generated white people.
00:38:55.000 Right?
00:38:55.560 That seems very – I mean, again, I understand that they are way inside the echo chamber,
00:39:01.340 but there's got to be some Indian guy out there, like, programming something
00:39:05.760 who's like, wait, what happened, right?
00:39:08.380 Like, is it – what is this, a commercial?
00:39:10.240 You know, like all the white people are gone?
00:39:11.900 Like, somebody somewhere had to, like, nudge them and be like,
00:39:15.040 did you notice that there's no white people in the thing?
00:39:18.380 And so, like, when they released this and it came out,
00:39:23.400 what is the second level thing here?
00:39:25.220 What is the 4D chess?
00:39:26.660 Because there's only two options.
00:39:28.980 One, they are so incredibly, like, just into the cult.
00:39:34.020 Like, they have drank the Kool-Aid.
00:39:35.620 They are waiting for the comet.
00:39:36.960 They have the Nikes on.
00:39:38.160 Like, it's either that or they knew that this was going to come out and they knew that this
00:39:46.020 was going to be obvious.
00:39:48.760 One of these things has to be the case.
00:39:51.060 I can't answer that.
00:39:51.940 But first off, for the people in the comments that are mad, they're saying that he's drinking
00:39:56.900 beer.
00:39:57.320 It's Friday, buddy.
00:39:59.980 We get to drink beer on Friday.
00:40:03.920 I mean, I believe that no one is afraid of the – they're not afraid of the
00:40:12.020 masses.
00:40:12.660 They're afraid of this machine.
00:40:15.280 This machine.
00:40:16.080 This is the same reason why kids will get on TikTok and say, I think my dad is a white
00:40:22.620 supremacist.
00:40:23.400 I found out he's got a Trump sticker on his truck.
00:40:26.520 This kind of thing.
00:40:27.920 People are terrified of this machine.
00:40:33.280 And, you know, the only thing that makes this machine go down even better is imagine if you
00:40:40.060 had a job making six figures, working from home, answering emails.
00:40:46.080 You know, it's hard to put yourself in those shoes, but that's the shoes these people live
00:40:51.100 in.
00:40:51.680 And so they're not afraid of us.
00:40:53.260 They're not afraid of public opinion.
00:40:54.880 They're afraid of this machine that's out there.
00:40:58.180 And the machine is – you know, you can't exactly put your finger on it, but we kind
00:41:02.400 of know – you know, it's got certain things to it.
00:41:06.180 And one thing that does need to be emphasized is the civil rights aspect.
00:41:10.480 Anyway, you know, a lot of the tech people, I feel like, are over-emphasizing the fact
00:41:19.360 that it's Google and this kind of thing.
00:41:22.320 Now, the fact that it is being Google is important.
00:41:25.560 And the fact that it is the huge, huge corporation is important.
00:41:28.840 But Red Man Tobacco redid their brand line to make it more inclusive.
00:41:37.260 I mean, do you think anybody at Red Man Tobacco is like, oh, we're going to – you know,
00:41:42.880 I'm going to go out on the wildest limb of my life and assume that Red Man Tobacco leans
00:41:49.340 more male than female for chaw.
00:41:54.320 Okay, I'm going to go ahead and say that.
00:41:57.900 Do you think anybody there thinks that they're going to go 50-50, you know, spitting, dipping?
00:42:03.960 No, no.
00:42:05.760 They're not afraid of the customers.
00:42:07.640 They're afraid of the civil rights machine.
00:42:10.380 And it does need to be constantly brought up.
00:42:13.280 And I think that the same tools that we have in this case, we can take it there.
00:42:19.980 I don't think these people want to talk turkey about this stuff.
00:42:22.500 If you read the civil rights decisions, they're the most mealy-mouthed things possible.
00:42:28.360 It's always – it's not written in this exuberant like, hell, yeah, we live in this great woke republic.
00:42:35.240 It's this mealy-mouthed, well, look, you know, we didn't – you can almost think this is a little bit like, you know, blah, blah, blah.
00:42:43.640 Yeah, these people are – they don't want to face what they're doing.
00:42:48.120 Someone brings up equality.
00:42:49.660 What is equality?
00:42:51.920 Equality is – it's in the Civil Rights Act.
00:42:54.840 It's called disparate impact right now, and it's totally insane.
00:42:58.220 It's totally insane.
00:42:59.300 You read this to any decent person, they'd say, what the hell is this?
00:43:03.840 Did you – did this come from the – did this come from the Paris Commune?
00:43:07.920 No, this is the world we live in, and got to do something about it.
00:43:13.680 Well, let's answer this question that you asked.
00:43:16.540 First off, did they just not realize?
00:43:19.340 It was like an oversight.
00:43:20.940 There was no – like you said, the Indian guy in QA is like, hey, I haven't generated an ADT commercial, and let's see what the burglar looks like.
00:43:27.740 That's stealing an old bit from Bogbeef.
00:43:31.160 The NFL place kicker, the burglar, and the ADT commercial are jobs that have to be held by white men no matter what.
00:43:39.080 Well, yeah, I mean, it's hard to believe that you could have done that by accident and just nobody realized it.
00:43:45.440 Nobody typed in, you know, have it generate the King of England and was surprised when they got the Netflix adaptation.
00:43:51.980 Or were they so far in the cult they thought people were going to love this?
00:43:57.760 Probably not that either, but that's closer to the truth is that they said, well, they're not – there will be people who don't like it.
00:44:05.100 The three people in this video right now will hate that, but good, because I don't like them.
00:44:11.620 They figured that regular people would just accept it, just like, you know, they accept Anne Boleyn being from Ghana or whatever.
00:44:19.680 You'll just go along with it because that's the way things are.
00:44:23.140 They probably did misjudge how, I guess, stupid this would look to regular people.
00:44:29.600 There is a limit.
00:44:30.640 There's always a limit.
00:44:31.820 If there was no limit, then in 2012, they would have just – like, we would have had, you know, drag kids, transsexual stuff.
00:44:39.760 That would have all been in the A package.
00:44:42.140 Boom, right there.
00:44:43.140 We would have been arguing about that in, I guess, going back to 2004 instead of gay marriage.
00:44:47.280 They didn't do that.
00:44:47.920 It's got – you have to boil the frog slowly.
00:44:51.080 So, yeah, they made a mistake.
00:44:53.100 I don't know if anybody would get punished for it, but, you know, they don't like looking dumb.
00:44:57.840 They don't like – this actually does hurt your cause to do this because you – for one thing, if it ever becomes low status,
00:45:07.540 and this is, I think, if I was going to argue for academic age, it would be that if the wokeness becomes so low status that they're embarrassed to be associated with it,
00:45:19.860 then it would get put away.
00:45:21.360 But, you know, even if it gets put away, so what?
00:45:24.300 It just moves back into the shadows, and it's like, well, they'll just put it in the history books and educate children with this stuff so that the next generation won't perceive it as low status.
00:45:35.260 That's exactly what they did after the 60s.
00:45:38.040 This is how it all happened before.
00:45:39.720 And does anybody think that they're less powerful today than they were in, like, 1975?
00:45:45.920 No, of course not.
00:45:46.920 They've had – they've been racking up Ws forever.
00:45:51.440 Yeah, that's exactly right.
00:45:53.120 And I don't want to spend too much time on this because I'm going to write a whole piece on it.
00:45:56.920 But I did a thread explaining the educational aspect, as you just pointed out there, Marek, with the history books,
00:46:04.520 and you just go ahead and indoctrinate the next generation until this all seems normal.
00:46:08.700 As I pointed out in my thread, I was a teacher not that long ago,
00:46:12.880 and so I'm pretty familiar with the trends happening in actual public education right now.
00:46:17.620 And Google has gone all out.
00:46:19.460 They've spent billions of dollars probably at this point making sure that the Chromebook is ubiquitous in education.
00:46:26.920 They make sure that every student is issued a Chromebook, that Chrome is the only browser on the Chromebook,
00:46:32.540 that all of their different Google apps are integrated in in every way that you run the school.
00:46:38.900 The classrooms are all run on Google Classrooms.
00:46:41.600 That's where assignments are.
00:46:42.760 That's where you go ahead and get graded everything.
00:46:44.800 All the different apps that you use to do your assignments from Sheets to Docs to Slides,
00:46:50.100 all these things are integrated into the Google Suite, and they all plug into the Google Classroom.
00:46:54.560 Even when you're building something, so a student's sitting there and they need to do a slideshow presentation on,
00:47:00.480 I don't know, the Civil War or something, they Google images inside the Google Slides.
00:47:06.420 Like everything is many layers deep.
00:47:08.820 And so this is the entire algorithm world that students exist in.
00:47:13.740 And most of the students don't even know what it's like to do any kind of research or understand anything outside of this world.
00:47:21.000 So they need to look something up.
00:47:23.420 They need to validate information.
00:47:25.220 The first thing they do is Google it.
00:47:26.500 And Google always tells you the correct truth.
00:47:28.660 The idea that you would go beyond more than a couple links in Google.
00:47:32.340 In fact, the idea that you would even click on links in Google is a little foreign to these kids,
00:47:36.880 because in many cases when you Google a question, they go ahead and just give you a paragraph answer,
00:47:41.680 and then the kids just go ahead and copy and paste from that.
00:47:44.640 They don't even look into the links there.
00:47:46.540 So the idea that you would go to a library, and as we pointed out, libraries are not themselves actually free speech zones.
00:47:53.300 They're heavily curated by leftist librarians.
00:47:55.880 But the idea that you would even go to a library and look for primary sources or challenge some kind of idea that's given to you by Google just doesn't exist.
00:48:04.780 And so all of this stuff is going to get carried directly into school classrooms.
00:48:08.940 This is going to dictate in its entirety what kids think history even looks like.
00:48:14.940 Yeah.
00:48:15.120 One of the biggest things for me, when I guess this is faint praise, whatever,
00:48:20.640 I learned a lot more about history just simply going to – you go to a used bookstore or you can go to places like Goodwill.
00:48:30.180 After somebody dies, they'll haul off all their old books there.
00:48:33.060 You go through there and you look for books, like historical books, sociological books, before 1960, and you read them,
00:48:41.160 and then you will be able to build a mental model of what a human being was like back when we still made human beings.
00:48:47.980 And to me, that was way more radicalizing than anything I ever learned anywhere else.
00:48:53.780 Just simply going back to – and not even primary sources.
00:48:56.260 You're just getting the 1950s or 1930s version of reality straight from the horse's mouth.
00:49:03.840 If you make that impossible, which is very much on the table with – e-books can be censored, edited, changed, whatever,
00:49:14.560 with the click of a button, and you don't have a physical copy sitting on grandpa's bookshelf anymore
00:49:20.140 to go back and see what they've changed,
00:49:23.100 you're going to make people who are incurious and don't have any understanding of the world,
00:49:30.300 which that's great if that's the kind of person you rely on to keep control of your political system or whatever.
00:49:38.040 But for everything else, that's really bad.
00:49:41.200 You don't really want that to be the building block of your society because it's not going to run as well as it would have,
00:49:46.880 let's say, when this country started and we had one of the –
00:49:49.960 the idea of America being founded by frontier bumpkins, well, kind of,
00:49:56.140 but also we had one of the highest literacy rates in the world at the time of the country's founding, correct?
00:50:00.680 Yeah, I don't have that cite in front of me, but I'll take it on faith.
00:50:05.800 Yeah, so that's kind of always who we've been, but if you're getting everything,
00:50:12.420 all your information from the television, you're not going to produce the same things that people did
00:50:17.980 if every school kid had some familiarity with Homer.
00:50:22.760 Yeah, well, half the great inventions of the Industrial Revolution were invented by some guy that owned –
00:50:29.420 you know, some random Scottish guy that owned a mule and a half.
00:50:33.140 But the same way – I mean, so I was red-pilled.
00:50:38.880 You know, I learned the truth of the world when I saw Empire on Laserdisc.
00:50:43.640 You know, they can't change that like they do on the streaming services.
00:50:48.060 You know what I'm saying?
00:50:49.140 You mean the television show Empire?
00:50:51.120 No, Empire Strikes Back.
00:50:53.380 Oh, see, that could have gone either way.
00:50:57.100 Yeah.
00:50:58.620 Yeah, there's a couple of things.
00:51:01.180 So a couple of quick things.
00:51:02.280 So first off, you can – as far as the Google, the big corporation question,
00:51:08.160 you're going to get this stuff from young leftists who don't know any better
00:51:12.860 who once again are doing that thing where I was talking about like I had this model
00:51:18.000 of how a labor union worked.
00:51:22.760 You know, did you see that in the past week or so, the Teamsters Union, you know,
00:51:26.780 they told – you know, Trump said, you know, I know you guys are very worried
00:51:31.880 about losing your jobs over this electric car mandate.
00:51:35.520 I think I can do something for you.
00:51:36.900 We can make a deal.
00:51:37.720 And they said, yeah, we're not so concerned about that.
00:51:40.460 But we – our red line is we have to have the border open.
00:51:45.560 And, you know, that was one of those things like, well, you know,
00:51:48.820 it's reality versus the thing.
00:51:50.800 But this one lines up more with the text too.
00:51:54.580 If you're a young leftist, you think that the opposite of the left is a huge corporation,
00:52:00.900 you know, a huge multinational capitalist corporation.
00:52:04.900 It's traded on the open market.
00:52:07.280 They may think that.
00:52:09.700 I'm like – I'm guessing they're going to help them with that once they get to grad school.
00:52:14.480 But that's not the case.
00:52:15.980 No, the opposite of those people – the opposite of you is a kulak.
00:52:21.120 It's the small businesses.
00:52:24.700 Like, and this is a good example of why.
00:52:27.680 If you're a leftist, the – or if you're a Democrat, it doesn't matter.
00:52:32.340 It's all the same thing now.
00:52:34.480 When these – when you have these huge corporations roll up an industry,
00:52:39.600 it's so much easier to control.
00:52:41.900 You can have four or five people waltz in there,
00:52:47.560 and this bimbo is able to tell all these super smart tech guys what to do.
00:52:56.080 Recently, we've had this stuff come up with a funny example.
00:53:00.160 So, Niche Gamer's been showing how they do this with anime.
00:53:03.500 They're like, how can we control – you know, there's all this anime coming into America,
00:53:08.320 and some of it doesn't match the American moral standards.
00:53:12.040 Like, all we need is, like, two people that get a job reviewing the translation scripts,
00:53:19.020 and we're done.
00:53:19.600 We can just – we can control it all.
00:53:22.080 They make these things easier to control.
00:53:25.840 And, of course, with Google, this is what you see.
00:53:28.220 Google – there's this definition of fascism you see thrown about that, like,
00:53:33.260 fascism means corporations and the government work together.
00:53:38.240 Well, what government has that ever not applied to?
00:53:42.300 Yeah.
00:53:42.840 Ever.
00:53:43.400 I mean, if I'm running a country and you have a multibillion-dollar company in it,
00:53:50.200 am I just not going to notice it?
00:53:51.920 I'm like, I'm just going to – what are you talking about?
00:53:54.920 That's ridiculous.
00:53:56.260 And so, Google is – you can just consider it part of the federal government.
00:54:02.600 Once it gets that big, how can it not be?
00:54:05.900 What are they going to do?
00:54:06.840 Just ignore it?
00:54:07.700 Just, oh, well, Google just decided they're just going to do X or Y.
00:54:11.160 They're going to promote this or promote that.
00:54:12.940 No, it's part of the federal government.
00:54:14.500 This is – so, yeah, it's just a small thing.
00:54:17.340 Google is just more and more of the essence of the thing.
00:54:23.640 There's a great book coming out soon that's going to explain all of this, guys.
00:54:27.440 It's called The Total State.
00:54:28.540 You can go ahead and preorder it now.
00:54:30.660 But I definitely get into all of the things that Bogbeef is mentioning there.
00:54:34.560 All right, guys, we're going to switch over to the questions of the people
00:54:36.700 because we have a lot of them right now.
00:54:38.280 But before we do, can you guys tell everybody where to listen to the good old boys?
00:54:42.680 Yeah, you can find us on Patreon.com.
00:54:45.520 The Good Ol Boyz, with a Z.
00:54:47.980 And we got merch on WBS Apparel, White Boy Summer, WBSapparel.com.
00:54:53.540 Click on the good old boys page.
00:54:55.640 We got a patch that will tell you everything you need to know about politics.
00:55:00.260 And we got some shirts and stuff.
00:55:03.060 All right, let's switch over to the questions here.
00:55:05.700 Maximilian Cunnings for $5 says,
00:55:09.960 How would you explain what natural law theory is and why they should care to someone who doesn't know it?
00:55:18.040 And you guys want to explain natural law or should I go ahead and do that?
00:55:20.680 Well, I'll take it from a dumb person's perspective. When I hear natural law, I think, you know, God's law.
00:55:32.280 And how do you translate that into reality?
00:55:36.340 But I don't know anything about this.
00:55:38.320 So that's just me shooting from the hip.
00:55:41.420 I always think of the great philosopher Wesley Snipes once said,
00:55:45.840 Some MFers always trying to ice skate uphill.
00:55:51.720 Yeah, that's not the worst.
00:55:53.620 Yeah, there you go.
00:55:54.980 Yeah, no, natural law is just – and again, there are entire courses, you know, taught on this.
00:56:00.680 But natural law is simply the idea that – you know, God is often invoked in this, and that's certainly part of it – there is simply an order that is emergent inside the world, put in place by the divine, by God, whatever.
00:56:16.760 And this manifests itself in the way that we kind of organize society.
00:56:20.440 And so natural law are the things that naturally, you know, God has placed into the world that we kind of have to abide by, whether we like it or not.
00:56:28.680 Like ice skating uphill is probably not a great idea.
00:56:31.860 And so sorry, this is coming up because of the Christian nationalism thing.
00:56:37.060 Right.
00:56:37.540 The lady on MSNBC was sure bitching because, like, did you know that there are these dangerous people in the United States who believe that
00:56:46.760 their rights are derived from God instead of Congress?
00:56:50.440 It was like, yeah, yeah, yeah.
00:56:52.380 I remember this guy in Virginia writing about that a long time ago.
00:56:56.640 Sounds really weird.
00:56:58.480 They probably find a bunch of Christian nationalists saying that.
00:57:01.020 I don't know.
00:57:01.700 Round about 1775.
00:57:03.760 Yeah.
00:57:04.500 Hey, hey, honey, you got a dollar bill on you because I think they've taken control of the treasury.
00:57:11.300 I mean, I guess – I don't know if you were really doing the definition.
00:57:15.860 It's just like the law.
00:57:18.620 It's kind of – it could be repurposed as a liberal idea.
00:57:22.360 And I guess it was around the period of the founding.
00:57:25.900 The idea that like things that are inherent are inherently good.
00:57:29.620 They're natural.
00:57:30.760 You can tell something from that – that means, well, it's good in essence because it produces good outcomes.
00:57:37.240 This is the way nature intended things, and how we should run our society.
00:57:41.740 But like I said, I don't know.
00:57:43.260 I don't know anything about natural law or law itself.
00:57:46.260 So, yeah.
00:57:47.300 Again, you can do entire books.
00:57:49.800 Surely entire episodes or entire books.
00:57:52.060 But yeah, there's a specific Catholic teaching about this, too.
00:57:57.180 Right.
00:57:57.440 Yeah.
00:57:57.660 Yeah.
00:57:57.860 Yeah.
00:57:58.660 Yeah.
00:57:58.860 But yeah, the Reader's Digest version is, you know, that things that are good are emergent out of the natural order.
00:58:07.200 And those are the things that we have to at some point go ahead and code into what we do.
00:58:12.820 Speaking of which, real quick, I have decided for the right wing, the American right wing, I have suspended all theological arguments until after the election.
00:58:23.620 You're not allowed to argue about the filioque or we don't have time for that right now.
00:58:29.820 Yeah.
00:58:30.460 Sorry.
00:58:31.300 No, I hear you.
00:58:31.880 But Rupert Weirdo says every time I see something like this, I think, wow, look at all that woke being put away.
00:58:38.820 It's almost like it was never there.
00:58:40.580 Yeah.
00:58:40.760 Again, I feel pretty confident about my victory in Cigar Slam, but I understand, if I was Academic Agent, I wouldn't concede in the face of overwhelming evidence either.
00:58:51.340 So until Sydney Sweeney wins an Oscar and goes up in her red dress and takes that trophy,
00:58:57.320 nobody's going to tell me that the woke has been put away. That moment,
00:59:00.480 I will believe it.
00:59:02.680 Is that how you know America is back?
00:59:04.760 Is that the – yeah, if she jiggles her way up on stage and takes that little golden trophy, then, like, OK, we're back.
00:59:10.840 I like that he has that argument because that's a very British argument.
00:59:13.440 You know, he's got the stiff upper lip.
00:59:14.900 He's like, yeah, this will this will be over in a couple of months.
00:59:18.060 No big deal.
00:59:19.140 They got the stiff upper lip in the face of overwhelming.
00:59:22.640 That's what he's supposed to say.
00:59:23.880 He's British.
00:59:24.720 I understand.
00:59:25.480 Yeah.
00:59:25.620 Never, never retreat.
00:59:26.720 Never, never surrender.
00:59:27.740 I understand.
00:59:29.760 Trey Herman here says he's got a question.
00:59:33.440 Is AI deliberately being used as a tool to erase gamers and gamer culture?
00:59:38.020 What implications do you think this has for the ongoing gamer censored word?
00:59:45.100 Excellent.
00:59:45.400 Well done.
00:59:46.520 Bogbeef, as our gamer expert or gamer historian, could you go ahead and answer that?
00:59:51.100 You know, I've been talking about politics online for years now,
00:59:57.580 my focus on gamers and gamers' rights.
01:00:00.580 I've never gotten as much concentrated sexual harassment for my opinions as I have in the last week, just from saying gamers are suffering oppression.
01:00:09.640 It's it's ridiculous.
01:00:10.600 No, yeah, of course it is.
01:00:14.440 We have to protect the gamers.
01:00:18.300 100%.
01:00:18.740 Let's see here.
01:00:20.460 Tim Miller says, what kind of cigar will you pick, Auron? Salute.
01:00:24.860 Well, thank you.
01:00:25.400 Good question.
01:00:26.680 You know, I just had a Rocky Patel 20th Anniversary.
01:00:30.120 That was quite good.
01:00:31.120 I don't know if that would be my final answer, but that's just the latest one that I had that I hadn't had before that I enjoyed.
01:00:37.580 So maybe I will secure that one for my victory cigar.
01:00:42.040 Do you drink whiskey with it?
01:00:44.640 Is there another way one smokes cigars?
01:00:47.660 I don't know.
01:00:49.460 We got them down here.
01:00:51.160 We got a chew in the back, man.
01:00:55.460 I think I'm actually further south than you, Bog, but OK.
01:00:58.120 Of course, it's Florida.
01:01:00.360 So the further south you go, the further north you are.
01:01:02.520 So there's that.
01:01:03.560 Yeah, I mean, well, except for Polk County.
01:01:06.860 Polk County is the most southern place on planet Earth, and it's way down there.
01:01:11.500 I'll tell you funny stories about that later.
01:01:13.860 All right.
01:01:14.360 Creeper weirdo.
01:01:15.480 '25 is still a year away, Auron.
01:01:17.920 And yep, the academic agent has a whole year to turn this bet around.
01:01:22.500 Jacob says, please see above.
01:01:25.180 I don't know why it won't send that message.
01:01:29.120 Hmm.
01:01:29.880 I don't know, man.
01:01:30.640 Sorry, I don't see an above before that.
01:01:33.740 I'm sorry.
01:01:34.320 I can't answer your question there.
01:01:36.300 If you want to try to tag me in chat, I'd hate to have to have you super chat again.
01:01:40.820 But I'll do my best to try to get your question up
01:01:43.640 if you give that a shot.
01:01:47.260 Let's see.
01:01:48.220 Rainforest Points 2.0 says it's the left's proto G word fantasy to get rid of whites in the real world.
01:01:55.900 Again, you know, a lot of people will obviously say that that rhetoric is overblown.
01:02:01.480 I guess I understand that.
01:02:03.040 Obviously, I don't know if there's a full-on, you know, violent plan for this or a real plan for this.
01:02:08.340 But it's very clear that there's a digital plan to remove white people from history.
01:02:12.260 And that probably says enough about the way that, you know, people working at Google feel, you know, about that.
01:02:20.440 As we looked at Sean Davis and his tweet, when he attempted to ask if it was OK to remove whiteness, Google seemed to think there was a good argument for that.
01:02:27.820 But it also recognized that asking about removing blackness was forbidden.
01:02:32.300 So that standard alone should tell you that the people in charge don't have great love for you.
01:02:37.920 People get confused because, yeah, of course, their arguments are not honest.
01:02:42.900 No, they don't want to get rid of all white people.
01:02:45.960 But like that doesn't mean they're not serious.
01:02:48.180 Like when they say that, it's a proxy for people like us. Boy, they would sure love to get rid of all of us, or barring that, make sure that we have absolutely no say in how things go.
01:03:00.200 So don't don't confuse their lack of honesty with a lack of determination because they have that.
01:03:07.920 Yeah, I mean, I find it's best to just set the locus politically, just because, you know,
01:03:14.980 if you try to dance with them philosophically – I mean, for example, these people, they want to live near white people.
01:03:25.220 Yes, they quite enjoy that.
01:03:28.040 They don't want to live in the ghetto.
01:03:29.220 They don't want to live in the hood.
01:03:30.460 They want to live around white people, but they do have all these crazy there.
01:03:34.980 There is all kinds of – or you can just see, there's a phenomenon that's impossible to miss: the woker one of these people is,
01:03:44.800 especially if you take a minority woman that's very, very well versed in woke politics or something, the higher and higher chance that her boyfriend looks like a perfect Aryan specimen.
01:04:02.460 So, you know, that's how reality is.
01:04:05.720 It's always this kind of crazy, psychosexual, and all this kind of stuff.
01:04:12.680 Life of Brian says, how many lefties know they're frauds given their constant fallacies?
01:04:18.480 How often do they know they're lying?
01:04:20.960 I would say less often than you think.
01:04:23.660 But so there's two questions there actually in one.
01:04:27.140 So I think they don't think they're lying most of the time.
01:04:30.040 Like when they say statements like a trans man is a man, or a trans woman is a woman, or, you know, men and women have equal strength, you know,
01:04:39.580 they believe that stuff. Like, they have to jump through a lot of hoops.
01:04:44.300 Like maybe they could give you a real answer on a biology test.
01:04:48.320 But religiously, spiritually, they believe the words they're saying.
01:04:52.160 Simultaneously, I think many of them know they're frauds because once those status markers as kind of Bog and Merrick pointed out during this stream,
01:04:59.980 once those status markers come under any kind of actual review, once people start actually checking the level of validity involved in these things,
01:05:08.000 they start to feel their weakness.
01:05:09.380 They start to realize, oh, well, if anyone actually looks into like what my Ph.D. is in or, you know,
01:05:15.200 how many things I stole to write it, then it'll be very clear that maybe I'm not, you know, Immanuel Kant.
01:05:20.700 And so, you know, I think that they believe it, but also deep down, once they start checking any of that against kind of real credentials,
01:05:30.020 they do realize and feel like frauds that way.
01:05:33.080 The magic of deconstruction allows them to lie and not be a liar.
01:05:37.860 Like the question, like, what is a woman?
01:05:40.060 Well, if you deconstruct the meaning of every word, you can just say, well, it's whatever I say it is.
01:05:44.980 And like, if you follow their religion, they do believe that. They believe in magic words.
01:05:50.820 They believe in chewing up everything until it has no meaning.
01:05:56.140 So like, yeah, they can say whatever they want and think that they're not lying.
01:06:01.240 It works.
01:06:02.940 You got a spectrum.
01:06:03.840 You got, on the lowest end, you have people that are compartmentalizing.
01:06:09.140 If you would really push on these things, they start getting weird feelings.
01:06:14.960 If there's a high cost to these things, like, oh, so you, you know, you believe all these super woke things.
01:06:23.420 Why don't you move to the ghetto in Detroit or whatever?
01:06:26.480 Send the kids to public school while you're at it, right?
01:06:28.980 Houses are much cheaper.
01:06:30.160 Yeah.
01:06:30.400 They're going to kind of snap out of it.
01:06:32.980 Then on the, you know, on the other end, you have like Matt Yglesias, who knows he's lying.
01:06:37.240 He believes that it's morally, like, you know, it's morally good for him to lie.
01:06:44.280 Like, that's the Lord.
01:06:46.140 Yeah.
01:06:46.680 Yeah.
01:06:48.760 Rupert weirdo here says Latin X is a slur.
01:06:51.620 Pass it on.
01:06:52.460 Uh, yeah, not too many Latin people seem to enjoy it, but it doesn't seem like the left is letting it go.
01:06:57.040 Uh, he also says guys, 10th anniversary of Gamergate.
01:06:59.920 Gamers will never give in.
01:07:03.480 Uh, let's see here.
01:07:04.760 Uh, Maximilian Cunning says on a primordial level, what is equality?
01:07:10.200 Well, I don't know what you mean by primordial level and equality of course could mean a lot of stuff.
01:07:16.200 You could say equality before the law, equality, you know, spiritually – the founders obviously make that reference when it comes to, you know, people before God.
01:07:26.320 Uh, but of course, you know, in theory, equality would mean that, you know, everybody is at the same place or, you know, some people would say have the same rights.
01:07:34.820 Again, you'd really have to be a little more specific to parse that.
01:07:38.220 Oh, I can answer that one.
01:07:39.300 Well, sure.
01:07:39.640 Go for it.
01:07:40.180 Annihilation.
01:07:41.020 Yeah.
01:07:43.220 Give me your stuff that you worked for.
01:07:45.820 In the economic sense, that's truly the case.
01:07:52.320 All right.
01:07:53.180 David Tavares says, I don't understand how any of my comrades can feel good if he is hired not because of his ability, but because he is Latino. I would be embarrassed.
01:08:05.040 Yeah.
01:08:05.560 And you would hope that that would be the case for a lot of people, but I mean, as our patronage experts here will probably tell you, unfortunately, while your feelings are honorable, they are not the norm across the human race.
01:08:17.640 And a lot of people are more than happy to take a free job if, you know, they can get it.
01:08:23.160 Merrick would.
01:08:23.920 I wouldn't. There is a spectrum.
01:08:26.580 There is a spectrum, and there are people that obey laws, and are – you can be a fairly decent
01:08:35.040 person and still have enough of a – let me get that back.
01:08:40.760 It especially depends on your own personal circumstances.
01:08:45.020 A quick anecdote: several people invented the game D&D.
01:08:52.740 One man was a professor.
01:08:55.160 One man was a neurologist, a bleeding brain surgeon.
01:09:00.340 And one man was repairing shoes in his house with five kids.
01:09:05.020 One of those guys was very, very motivated to get that bag.
01:09:08.620 However, however he could get it.
01:09:10.780 You can do the math on that.
01:09:13.620 It was Gary at home.
01:09:15.940 Yeah.
01:09:16.880 Okay.
01:09:17.720 Uh, Seneca here says, uh, cheers from Missouri gents.
01:09:21.540 Well, thank you very much, man.
01:09:22.500 Appreciate that.
01:09:23.340 Klimplar says AI is useful as an ideology test.
01:09:29.480 If your AI can't help your ideology, it's because your ideology is logically inconsistent over time.
01:09:36.180 Those ideologies spend too much time micromanaging to endure.
01:09:41.700 So yeah.
01:09:42.500 Interesting way to look at that.
01:09:44.200 Can my governing ideology survive the AI test?
01:09:47.600 If it can't, then maybe my society is going to have to devote too much of its energy, keeping the ideology alive and not keeping the society alive.
01:09:56.780 Unfortunately, having an ideology that requires tons and tons of maintenance is highly appealing to them.
01:10:04.660 They need people, they need bullshit jobs, BS jobs, to hand to these people, these useless people with all these fake degrees.
01:10:13.940 It's a 0% interest rate religion.
01:10:16.620 And that by definition tells you how it's going to end, tells you the means by which it's going to end.
01:10:26.520 Autistic Cat says, thank you for the show, Auron and the boys.
01:10:30.920 Well, thank you very much.
01:10:32.060 Autistic cat.
01:10:32.880 Appreciate it.
01:10:34.440 Uh, let's see.
01:10:35.460 Joshua BB says, great show guys.
01:10:36.920 Always funny.
01:10:37.680 Happy midterms.
01:10:39.140 Yeah.
01:10:39.300 If you're studying out there for midterms, guys, then, you know, best of luck.
01:10:43.020 Appreciate your support, sir.
01:10:46.040 WBS Apparel says, looking forward to reading your upcoming book, Auron.
01:10:51.160 Let's be sure to ship
01:10:52.280 Will Stancil a copy.
01:10:53.900 I don't feel like Will would survive my book.
01:10:56.640 I mean, so I'll say this real quick.
01:10:59.080 This is very inside for people who are not on Twitter, but Will Stancil is a story that's too long for me to explain on the show for the moment.
01:11:07.640 But I think at some point, guys, you are the content.
01:11:12.940 And I think the only reason anyone knows who Will Stancil is, is because he turned a lot of right-wing people into content.
01:11:20.220 So something to think about there.
01:11:22.260 Then I says, I made Restoration Bureau posters last year, focusing on heritage America, and the easiest way to dope an AI to adopt your view is to blast it with traditional art data.
01:11:34.520 The AI knows it's better and ignores its native data.
01:11:39.460 Very interesting.
01:11:40.460 I would not have thought about that.
01:11:42.100 Yeah.
01:11:42.300 I just want to say, uh, I think AI is really good for us.
01:11:46.100 It's really good for the right wing.
01:11:46.100 Once again, going back to the points about truth, we have nothing to fear from, you know, some computer that's going to accidentally tell the truth.
01:11:57.020 That is not a problem.
01:11:59.940 That is not a problem for us.
01:12:01.240 And I've seen a lot of guys out there that are making use of this to produce art especially, because we're kind of locked –
01:12:09.520 we're locked out in a lot of ways of that in particular. A lot of guys have those media ventures and stuff.
01:12:18.280 And yeah, I'm not the first person to point this out, but about what he said about the Restoration Bureau – when those climate activists go to the Louvre to, you know, to protest,
01:12:29.400 and they superglue themselves to a Monet or whatever,
01:12:32.680 They're not going to the modern art museum and gluing themselves to Jackson Pollock.
01:12:37.300 They're going after actual art.
01:12:39.800 Even they know deep down in their dark little hearts what art is and what it isn't.
01:12:45.800 So yeah, you can't lose as long as that stuff still exists.
01:12:50.060 He also says the funniest revealed preference I saw recently was Emma from the botched Army ads married a six-six blond.
01:12:57.480 You could chat.
01:12:58.120 She is now a trad wife, mother.
01:12:59.740 They moved to Japan to escape American crime.
01:13:02.820 Uh, I did not know that, but yeah, that is absolutely hilarious.
01:13:06.280 All right, guys, well, we're going to go ahead and wrap this up.
01:13:10.240 Once again, make sure you're checking out the good old boys.
01:13:12.320 It's one of the best podcasts out there.
01:13:14.020 I am a regular listener.
01:13:15.020 Have a lot of fun checking them out.
01:13:17.700 And of course, if it's your first time on this show, make sure that you go ahead and subscribe to the channel and make sure that you go ahead and hit that notification, hit the bell so that you go ahead and catch these streams when they go live.
01:13:30.840 If you would like to get these broadcasts as podcasts, make sure that you go ahead and subscribe to the Auron MacIntyre Show on your favorite podcast platform.
01:13:37.020 And when you do make sure that you leave a rating or review, it really helps with the algorithm magic.
01:13:41.960 Thank you for watching guys.
01:13:43.080 Have a great weekend and as always, I'll talk to you next time.