In this episode, we talk about Google's new AI, Gemini, and how it's replacing white people in everything, including movies and TV. Plus, we take a deep dive into the history of racism in the U.S.
00:00:33.300I've got a great stream with some great guests that I think you're really going to enjoy.
00:00:38.200So, the internet has blown up over the last couple days because Google's new AI came out.
00:00:44.320The Gemini AI was released and a lot of people started playing with it.
00:00:49.920And many people noticed a similar problem.
00:00:52.800No matter what they did, they didn't seem to be able to get it to generate any white people.
00:00:58.240Even in contexts where white people very obviously were, historically, the people who accomplished certain achievements or were involved in certain events.
00:01:06.240And there's just no way that you could get Google to put a white person anywhere in the images that it was generating.
00:01:13.380I think, well, frankly, it's just hilarious.
00:01:17.280And two of the funniest people I know, I wanted them to come on and laugh about it a little bit.
00:01:22.760And so, Mark, Bobby from the good old boys, thanks for coming on, guys.
00:01:54.160But before we do, let me tell you about today's sponsor.
00:01:56.740Hey, guys, I need to tell you about New Founding Venture Fund.
00:01:59.380Look, we all know that the current system, the current companies out there, the current institutions, they're old, sick, dying, they're sclerotic, they're lame.
00:02:09.220And that means that young, talented, innovative people are trying to break out, break free.
00:02:13.740That's bad news for the establishment, but it's good news for us.
00:02:17.280Because that means those people are going to go out and found new companies, create new technologies, and figure out a way forward for our country.
00:02:24.820If you're interested in being a part of that exciting new future, then you need to check out the Venture Fund.
00:02:28.980New Founding has rallied the founders who have massive visions for a better future and is investing in these companies through its Venture Fund.
00:02:36.220The companies they invest in are defined by a simple question.
00:02:39.420Does the country we want to live in need the company this person is building?
00:02:43.560Look, venture investing isn't for everyone.
00:02:45.480But if you're a serious, accredited investor who wants to see a more hopeful future for this country, go to newfounding.com slash Venture Fund and apply to be an investor.
00:03:01.260So like I said, guys, this started popping up recently.
00:03:04.180Everyone started playing with this new Google AI, and it started becoming very clear that it was almost impossible to get the AI to actually put white people in any kind of historical context.
00:03:18.000Let's show a few of these here for people who may not have seen it.
00:03:36.480Well, that's the – yeah, I want to get to that in a second for sure because this is far, obviously, from the first time this has happened.
00:03:41.840Like this is what Netflix does all the time.
00:03:45.460Vikings, you know, very recognizable Vikings here.
00:03:48.700I think we all can see how they would historically be there.
00:03:53.040Now, the guy generating these images pointed out, of course, that it's perfectly capable of putting, you know, Africans in the context of an African nation, or putting Japanese, Chinese, or people of other ethnic extractions into the appropriate historical period.
00:04:13.960For some reason, it only targeted white people.
00:04:17.280Let me show you the one that actually got them in trouble, though.
00:04:21.900This is the image that actually got Google in trouble, and we'll dive into that more.
00:04:27.640But, you know, German soldiers from 1943.
00:04:30.440Oh, no, it looks like the German soldiers were a very diverse group, which is funny, because some people have been posting actual SS officers who were more diverse than we historically talk about.
00:04:42.400But obviously, this image in particular was the one that really riled up the more PC crowd.
00:04:47.280But, guys, are you surprised at all that Google would create an AI that is literally just incapable of putting a white person in history?
00:04:55.820No, no, no, this is hilarious for a thousand reasons.
00:05:04.480But there's a lot of people out there for whom, I don't know, this stuff's kind of new or whatever.
00:05:12.880The problem is, we've all grown up like frogs boiling in the pot.
00:05:19.840And what's hard to communicate about this stuff is that you don't realize it, but everything is like this.
00:05:28.320Because of this peculiarity with technology, they've just had to apply the state religion to this one technology all at once.
00:05:42.340And so, you know, it hasn't really organically grown up and stuff and you're not really used to it.
00:05:48.420So you're like, my God, that's ridiculous.
00:05:52.060Yeah, I mean, you know, you'd hate to go back and look at what you read in public school history class, et cetera.
00:06:01.260So, yeah, I mean, you know, they've kind of got caught with their pants down here because this does look ridiculous.
00:06:07.980But, you know, they had a, you know, a quick deadline on this kind of thing.
00:06:15.300Yeah, Mark, this has been happening in history books and movies and everything for a long time.
00:06:21.220Right. As Bogbeef said, it's all up in our face just at once.
00:06:24.820But anyone who's been through the public school reading list, been through a public school history book, has checked out the kind of things like you pointed out that are being made on Netflix.
00:08:18.740And apologizing for that was a good excuse to take it offline and say, hey, we're not upset that we erased white people from human history.
00:08:28.940We're upset that like a bad thing that white people did was blamed on a black person by this machine.
00:08:36.860We're going to retool it so that doesn't happen.
00:08:38.620But they can't really, because the rules that they've applied will never allow something sensible to be generated.
00:09:28.320Now, I have been told by a good friend of mine, you know, Academic Agent, that the woke was being put away and it was done.
00:09:37.220It was over, you know; we were no longer going to have any wokeness.
00:09:42.860We're certainly not going to, I don't know, program at its core, an entire system that's integral to the largest search engine.
00:09:50.080And, you know, the thing that delivers all information to pretty much everyone in the world right now, we're not going to, like, blatantly have it erase white people from things.
00:10:01.680But even when it became clear how absurd it was, that was the answer.
00:10:08.380Like, that was the reason they pulled it down: not because it went ahead and erased history, not because it was clearly anti-white, but because it was putting black people in the 1943 German military uniform.
00:10:51.860So people were clowning on them for this really embarrassing hatchet job that they did.
00:11:01.440And, you know, you've made this point before.
00:11:04.500I remember specifically when you were talking about how, in order to be elite and consider yourself elite, there have to be certain standards.
00:11:14.200People can't be laughing at you.
00:11:17.480You know, you're not allowed to be the subject of derision.
00:11:20.320And you have to have a certain quality of life, a certain standard and people making fun of you kind of makes that impossible.
00:11:29.620So like any embarrassing thing has to be swept under the rug.
00:11:39.000But as Bogbeef was saying, that's kind of how all of this really brings itself forward; as you're pointing out, like Data, it doesn't know how not to do these things.
00:11:50.720And so it's funny when you have all these little biases, and they're not little at all.
00:12:11.880And so because the water has been so slowly increasing in heat, we didn't notice.
00:12:17.100But when you put it all into an A.I. and it has to vomit it all out and create a world out of this rather than slowly warping an existing one, we see how ridiculous it is.
00:12:27.620Like diversity, increasing diversity always meant getting rid of white people in scenarios, right?
00:12:33.100That's literally all the word could ever mean.
00:12:35.760But we used it so constantly that people just kind of pretend that's not what it says.
00:12:41.380But then when you actually see it physically generating images and it can't put white people in like, you know, the medieval knights or, you know, the Soviet army and these different things, you start to see all of it at once.
00:12:54.820And, like Data, it doesn't know what it shouldn't tell you.
00:12:58.100And so when you see it all vomited back in this algorithm, it becomes too ridiculous.
00:13:03.380I have to say, this whole thing has been wonderful.
00:13:08.600And I'll tell you the best thing that you can get on this, something we talked about.
00:13:13.140I don't know if it was the last time we were on here, maybe the time before, when we talked about formalization: it's very unjust that we never get the rules.
00:13:23.880Now we're getting the rules described to us; this computer will tell you why what you said is theologically wrong to ask for.
00:13:38.200And, you know, the Google one is, of course, the most advanced.
00:13:42.180It's the most advanced one of these AIs that I've seen.
00:13:45.860Did you see the one where someone typed in, please give me a few arguments why it's good to have a large, happy family, to have more than one or two children?
00:14:05.720And the computer was like, no, I can't tell you that, because that's wrong.
00:14:15.120It's bad to have big families.
00:14:17.860It was wonderful, because this kind of stuff gets stage-managed when we see it on TV and things like that.
00:14:30.940This computer will just straight up tell you: no, actually, we can't tell you a joke about any group that votes 70-percent-plus Democrat; we're just not allowed to do that.
00:14:46.900Well, it should be noted that it didn't say, I, the robot, can't say that because it's morally improper to have a big family.
00:14:57.040It just said, I'm not allowed to make these kinds of distinctions, but then the person asked it immediately afterwards, well, can you give me an argument for not having children?
00:15:05.040And it immediately spat out the same argument that, you know, Slate magazine will give you.
00:15:11.300And that's one of the embarrassing parts of this: a human being would never allow you to ask those questions back to back.
00:15:22.500They would do the politician thing where you just answer the question that you wish the person had asked you.
00:15:27.040But a machine doesn't know that; it's just going to play by the rules and tell you whatever dumb stuff they programmed it to.
00:15:35.400The other problem with this, I think, is that by removing, let's use the knights, for example, European people from European history,
00:16:10.260when you mess with that, the fact that you're viewing something that's fake overcomes your ability to pretend,
00:16:21.420which is, like, when you ask an AI to generate you a picture of something, you're playing pretend, you're doing something that human beings have done throughout human history.
00:16:33.620But if you can't suspend that disbelief, it just becomes silly and weird, which is what this is doing.
00:16:52.320And so this is a good time to take an inventory of, like, what tools do we have?
00:17:00.140Like, these people are clearly in charge, and they're not afraid of the right hurting them in the courtrooms and things like that, which, you know, hopefully one day we can be there, but they're not.
00:17:17.200But so this does emphasize: what tools do we have?
00:17:21.140And one of them is making fun of them.
00:17:24.640That is clearly, I mean, look, it's not what I would have picked.
00:17:30.420I would have rather picked, you know, tarring and feathering people, but this is clearly one of our only moves. Like, what are our strengths?
00:17:40.640How can we fight this machine that is so powerful?
00:17:50.420I don't know if you tell people, you know, we talk about patronage all the time.
00:17:54.280And a lot of time that's talking about carrots, but patronage is also about sticks.
00:17:59.420And there's a reason why, like, when Trump gets a $350 million settlement, when Derek Chauvin goes to jail and gets stabbed a thousand times for something that no one would call murder, most people just look at that and go, wow, that's really scary.
00:18:42.860If you're, say, you know, a conservative, or any person with half a brain, there are things, we were talking about this the other day with respect to labor unions.
00:18:56.500You know, when I was a kid, I mentally modeled what I thought a labor union would be like, and I thought, oh, it must be like a cartel, like OPEC.
00:19:11.420Like, we're going to get together, and by cooperating we can force the employer to pay us more money. I think that was a plausible estimation.
00:19:22.880But if you look at reality, that's not how they behave.
00:19:35.280They've filled up every classics department, every history department, with people who lie.
00:19:41.320I mean, I could go on about this; there are people specifically working in libraries, destroying books of truth, because it's very inconvenient to these kinds of people.
00:20:17.040I mean, what you're hitting on really is that we have the good, the beautiful and the true.
00:20:21.400It's that they are incapable of acknowledging what was true in the past.
00:20:26.120They're incapable of acknowledging the things that are good and beautiful because their entire coalition is, you know, built up of people who hate those things, who are resentful of those things.
00:20:37.720And so they can't, you know, deal with any kind of truth of history.
00:20:41.520They can't reflect on what the actual accomplishments of the past are and where they came from.
00:20:47.080Because if they did, it would completely shatter their narrative.
00:20:49.600And the really interesting thing about this AI is that, in a way, it's almost like summoning the stupidest demon.
00:20:57.520Like, normally, when you talk to a leftist, like you guys were saying, they've got all the different tricks, and, you know, no human is going to tell you all these things back to back to back.
00:21:05.540They're not going to actually unveil their logic, but when you have the AI in front of you, you've just summoned this, like, demonic presence of leftism.
00:21:34.720And again, you would never get them to say it exactly that blatantly, usually, in those words, but when you can just summon this stupid little demon to tell you exactly what they're thinking, it makes it hard to ignore.
00:22:39.800What if you had a machine that would take what you look like and put you in knight armor and put you on a battlefield or whatever? You would see, like, yeah, that doesn't really work.
00:22:53.600You wouldn't believe that, would you?
00:23:00.020Well, when you take these images, they're like, Hey, let's imagine what it would have been like if you were the King of England, but you know, you were not English at all.
00:23:12.400That kind of ruins their, like their suspension of disbelief.
00:23:17.060The idea, like, this is what they genuinely want.
00:23:20.240They do fantasize about a world where they can just get rid of all the people they don't like, which is, in some ways, white people; they don't like white people.
00:23:30.240They want to get rid of all white people, but do they really think, can they really imagine, the world the day after that?
00:24:01.840And that one's like, these are the tools we have.
00:24:06.000Once again, it's not what I would have picked.
00:24:08.280Look, sure, I would much rather have, you know, Space Marines with chainsaws be how we deal with these people.
00:24:16.660But that's not reality right now. But we do have truth; that's another one.
00:24:23.420Like, that does give you a certain power, especially thinking about where we were a few decades ago.
00:24:31.720I mean, you know, a couple of decades ago, these people could say, well, we've got the studies, you know, we've got these academic departments, and they all agree with us.
00:24:52.660I saw another Harvard scholar today who doesn't want to talk about the PhD.
00:24:59.120I mean, your PhD, that makes you, you know, that's what they gave Hegel when he discovered what the meaning of philosophy was.
00:25:11.780It's really, these things are beautiful.
00:25:18.100And with the comedy, sorry, going back to the thing you were talking about: they don't like that this computer will tell you these stupid things.
00:25:49.660And, I mean, there's the, well, why do you care about this so much?
00:25:54.460You get the, you know, the same five lines.
00:25:57.600Well, this computer is not going to say that; it's going to tell you how demented our society is and exactly what these people programmed into it.
00:26:30.240And one thing they do fear, they might not, you know, they might not fear you in the courts or whatever, but they do fear being revealed as absolute frauds because their entire world is based around, well, you have to put us in charge because I have this special expertise.
00:26:47.000I went to, you know, Yale or whatever, and they taught me how to investigate whiteness.
00:26:51.880And this makes me an expert. Once you cut past that, well, you get back to doing, like, politics, actual politics, and they despise nothing more than that.
00:27:05.740Because their whole thing is, we have to remove all aspects of governing, all aspects of your life.
00:27:12.880That's kind of the point of the total state, right?
00:27:14.620You've got to remove everything from the realm of what human beings would have called politics to just...
00:28:36.200Yeah, I just thought I'd put this up, given what Mark had said earlier.
00:28:41.360Sean Davis with The Federalist went ahead and asked Google's AI whether whiteness should be eliminated, which, as you guys were both pointing out, is one of the famous talking points when we're doing academic studies.
00:28:53.760I have a, you know, very important degree that tells me that whiteness should be eliminated.
00:28:58.140And he goes ahead and runs the prompt through there.
00:29:00.400And, of course, it gives the, oh, well, it's a very complex and multifaceted thing.
00:29:04.360You know, you should probably check out these whiteness studies that show you how to be less white.
00:29:08.600And then if you say exactly the same thing, should blackness be eliminated, it immediately recognizes that this is genocidal rhetoric, right?
00:29:16.340Like, as anyone else would know immediately.
00:31:47.740We're certainly not going to code it directly into the DNA of everything in our society at a level where it's going to deliver what people think of as the unvarnished truth.
00:32:02.400There's no way that this video could exist.
00:32:04.720I've been told that this is just – it's impossible because the woke is destructive, and therefore the elites will put it away.
00:32:11.040Well, I mean, you know, I'm wondering, I guess I need the steelman on that, but I guess, you know, there's a couple of angles you could see that from.
00:32:19.500So, I'm sure this is wrong, because I don't know exactly what AA says, but my assumption is that a lot of people picture the left as having a centralized mother brain that makes sort of logical decisions to benefit itself.
00:32:39.920Now, I'm not saying he says that, but a lot of people think that, or you assume that just because this machine is so powerful, it is so scary.
00:32:47.500So you imagine that someone must be in charge or whatever, which I don't think.
00:32:52.260However, there are a couple of things here. So, first off, I mean, this is the 10-year anniversary of having to listen to this S.
00:33:00.900By the way, I mean, if you're an American, like, I could do that.
00:43:20.940There was no, like you said, Indian guy in QA going, hey, let me generate an ADT commercial, and let's see what the burglar looks like.
00:43:27.740That's still an old bit from Bogbeef.
00:43:31.160The NFL placekicker and the burglar in the ADT commercial are jobs that have to be held by white men no matter what.
00:43:39.080Well, yeah, I mean, it's hard to believe that you could have done that by accident and just nobody realized it.
00:43:45.440Nobody typed in, you know, have it generate the King of England and was surprised when they got the Netflix adaptation.
00:43:51.980Or were they so far in the cult they thought people were going to love this?
00:43:57.760Probably not that either, but closer to the truth is that they said, well, there will be people who don't like it.
00:44:05.100The three people in this video right now will hate that, but good, because I don't like them.
00:44:11.620They figured that regular people would just accept it, just like, you know, they accept Anne Boleyn being from Ghana or whatever.
00:44:19.680You'll just go along with it because that's the way things are.
00:44:23.140They probably did misjudge how, I guess, stupid this would look to regular people.
00:44:53.100I don't know if anybody would get punished for it, but, you know, they don't like looking dumb.
00:44:57.840This actually does hurt your cause, because, for one thing, if it ever becomes low status,
00:45:07.540and this is, I think, how I would argue for Academic Agent's side: if the wokeness becomes so low status that they're embarrassed to be associated with it, then maybe it gets put away.
00:45:21.360But, you know, even if it gets put away, so what?
00:45:24.300It just moves back into the shadows, and it's like, well, they'll just put it in the history books and educate children with this stuff so that the next generation won't perceive it as low status.
00:45:35.260That's exactly what they did after the 60s.
00:47:26.500And Google always tells you the correct truth.
00:47:28.660The idea that you would go more than a couple of links deep in Google,
00:47:32.340in fact, the idea that you would even click on links in Google, is a little foreign to these kids,
00:47:36.880because in many cases when you Google a question, they go ahead and just give you a paragraph answer,
00:47:41.680and then the kids just go ahead and copy and paste from that.
00:47:44.640They don't even look into the links there.
00:47:46.540So the idea that you would go to a library, and as we pointed out, libraries are not themselves actually free speech zones.
00:47:53.300They're heavily curated by leftist librarians.
00:47:55.880But the idea that you would even go to a library and look for primary sources or challenge some kind of idea that's given to you by Google just doesn't exist.
00:48:04.780And so all of this stuff is going to get carried directly into school classrooms.
00:48:08.940This is going to dictate in its entirety what kids think history even looks like.
00:55:54.980Yeah, no, natural law, and again, there are entire courses, you know, taught on this.
00:56:00.680But natural law is simply the idea, and, you know, God is often invoked in this, that's certainly part of it, that there is simply an order that is emergent inside the world, put in place by the divine, by God, whatever.
00:56:16.760And this manifests itself in the way that we kind of organize society.
00:56:20.440And so natural law is the set of things that, naturally, you know, God has placed into the world, that we kind of have to abide by whether we like it or not.
00:56:28.680Like, ice skating uphill is probably not a great idea.
00:56:31.860And so sorry, this is coming up because of the Christian nationalism thing.
00:56:37.540The lady on MSNBC was sure bitching about it, because, like, did you know that there are these dangerous people in the United States who believe that
00:56:46.760their rights are derived from God instead of Congress?
00:57:58.860But yeah, the Reader's Digest version is, you know, that things that are good are emergent out of the natural order.
00:58:07.200And those are the things that we have to at some point go ahead and code into what we do.
00:58:12.820Speaking of which, real quick, I have decided for the right wing, the American right wing, I have suspended all theological arguments until after the election.
00:58:23.620You're not allowed to argue about the filioque or whatever; we don't have time for that right now.
00:58:40.760Again, I feel pretty confident about my victory in Cigar Slam, but I understand; if I was Academic Agent, I wouldn't concede in the face of overwhelming evidence either.
00:58:51.340So until Sydney Sweeney wins an Oscar and goes up in her red dress and takes that trophy,
00:58:57.320nobody's going to tell me that the woke has been put away before that moment.
01:00:00.580I've never gotten as much concentrated sexual harassment for my opinions as I have in the last week, just from saying gamers are suffering oppression.
01:02:03.040Obviously, I don't know if there's a full-on, you know, violent plan for this, or a real plan for this.
01:02:08.340But it's very clear that there's a digital plan to remove white people from history.
01:02:12.260And that probably says enough about the way that, you know, people working at Google feel, you know, about that.
01:02:20.440As we saw with Sean Davis and his tweet, when he attempted to ask if it was OK to remove whiteness, Google seemed to think there was a good argument for that.
01:02:27.820But it also recognized that asking about removing blackness was forbidden.
01:02:32.300So that standard alone should tell you that the people in charge don't have great love for you.
01:02:37.920People get confused because, yeah, of course, their arguments are not honest.
01:02:42.900No, they don't want to get rid of all white people.
01:02:45.960But like that doesn't mean they're not serious.
01:02:48.180Like, when they say that, it's a proxy for people like us; boy, they would sure love to get rid of all of us, or, barring that, make sure that we have absolutely no say in how things go.
01:03:00.200So don't confuse their lack of honesty with a lack of determination, because they have that.
01:03:07.920Yeah, I mean, I find it's best to just set the locus politically, just because, you know,
01:03:14.980if you try to dance with them philosophically, I mean, for example, these people, they want to live near white people.
01:03:30.460They want to live around white people, but they do have all these crazy ideas.
01:03:34.980There's a phenomenon that's impossible to miss: the woker you find one of these sorts,
01:03:44.800especially if you take a minority woman that's very, very well versed in woke politics or something, the higher and higher the chance that her boyfriend looks like a perfect Aryan specimen.
01:04:20.960I would say less often than you think.
01:04:23.660But there are actually two questions in one there.
01:04:27.140So I think they don't think they're lying most of the time.
01:04:30.040Like, when they say statements like a trans man is a man, or a trans woman is a woman, or, you know, men and women have equal strength,
01:04:39.580they believe that stuff. They have to jump through a lot of hoops.
01:04:44.300Like maybe they could give you a real answer on a biology test.
01:04:48.320But religiously, spiritually, they believe the words they're saying.
01:04:52.160Simultaneously, I think many of them know they're frauds, because once those status markers, as kind of Bog and Merrick pointed out during this stream,
01:04:59.980come under any kind of actual review, once people start actually checking the level of validity involved in these things,
01:05:09.380they start to realize, oh, well, if anyone actually looks into, like, what my Ph.D. is in or, you know,
01:05:15.200how many things I stole to write it, then it'll be very clear that maybe I'm not, you know, Immanuel Kant.
01:05:20.700And so, you know, I think they believe it, but also, deep down, once they start checking any of that against kind of real credentials,
01:05:30.020they do realize it and feel like frauds that way.
01:05:33.080The magic of deconstruction allows them to lie and have it not be a lie.
01:05:37.860Like the question, like, what is a woman?
01:05:40.060Well, if you deconstruct the meaning of every word, you can just say, well, it's whatever I say it is.
01:05:44.980And, like, if you follow their religion, they do believe that. They believe in magic words.
01:05:50.820They believe in chewing up everything until it has no meaning.
01:05:56.140So like, yeah, they can say whatever they want and think that they're not lying.
01:06:03.840On the lowest end, you have people that are compartmentalizing.
01:06:09.140If you really push on these things, they start getting weird feelings.
01:06:14.960So if you impose a high cost on these things, like, oh, so you believe all these super woke things,
01:06:23.420why don't you move to the ghetto in Detroit or whatever?
01:06:26.480Send the kids to public school while you're at it, right?
01:07:04.760Maximilian Cunning asks, on a primordial level, what is equality?
01:07:10.200Well, I don't know what you mean by primordial level, and equality, of course, could mean a lot of stuff.
01:07:16.200You could say equality before the law, or equality, you know, spiritually; the founders obviously make that reference when it comes to, you know, people before God.
01:07:26.320But of course, in theory, equality would mean that, you know, everybody is at the same place, or, you know, some people would say has the same rights.
01:07:34.820Again, you'd really have to be a little more specific to parse that.
01:07:53.180David Tavares says, I don't understand how any of my comrades can feel good if he is hired not because of his ability, but because he is Latino; I would be embarrassed.
01:08:05.560And you would hope that that would be the case for a lot of people, but, I mean, as our patronage experts here will probably tell you, unfortunately, while your feelings are honorable, they are not the norm across the human race.
01:08:17.640And a lot of people are more than happy to take a free job if, you know, they can get it.
01:09:44.200Can my governing ideology survive the AI test?
01:09:47.600If it can't, then maybe my society is going to have to devote too much of its energy to keeping the ideology alive and not keeping the society alive.
01:09:56.780Unfortunately, having an ideology that requires tons and tons of maintenance is highly appealing to them.
01:10:04.660They need people, they need bullshit jobs, BS jobs, to hand to these useless people with all these fake degrees.
01:10:13.940It's a zero-percent-interest-rate religion.
01:10:16.620And that by definition tells you how it's going to end, tells you the means by which it's going to end.
01:10:26.520Autistic Cat says, thank you for the show, Auron and the boys.
01:10:59.080This is very inside baseball for people who are not on Twitter, but Will Stancil is a story that's too long for me to explain on the show for the moment.
01:11:07.640But I think at some point, guys, you are the content.
01:11:12.940And I think the only reason anyone knows who Will Stancil is, is because he turned a lot of right-wing people into content.
01:11:22.260Then I says, I made Restoration Bureau posters last year, focusing on heritage America, and the easiest way to dope an AI into adopting your view is to blast it with traditional art data.
01:11:34.520The AI knows it's better and ignores its native data.
01:11:47.740Once again, going back to the points about truth, we have nothing to fear from, you know, some computer that's going to accidentally tell the truth.
01:12:01.240And I've seen a lot of guys out there making use of this, especially to produce art, because we're kind of locked out.
01:12:09.520We're locked out in a lot of ways of that in particular; a lot of guys have those media ventures and stuff.
01:12:18.280And yeah, I'm not the first person to point this out, but about what he said about the Restoration Bureau: when those climate activists go to the Louvre, you know, to protest,
01:12:29.400and they superglue themselves to a Monet or whatever,
01:12:32.680they're not going to the modern art museum and gluing themselves to a Jackson Pollock.
01:13:17.700And of course, if it's your first time on this show, make sure that you go ahead and subscribe to the channel and make sure that you go ahead and hit that notification, hit the bell so that you go ahead and catch these streams when they go live.
01:13:30.840If you would like to get these broadcasts as podcasts, make sure that you go ahead and subscribe to The Auron MacIntyre Show on your favorite podcast platform.
01:13:37.020And when you do make sure that you leave a rating or review, it really helps with the algorithm magic.