Louder with Crowder - February 22, 2024


AI Confirms Google HATES White People! | Louder with Crowder


Episode Stats

Length

1 hour and 15 minutes

Words per Minute

160.35349

Word Count

12,096

Sentence Count

1,351

Misogynist Sentences

88

Hate Speech Sentences

85


Summary

When did you die? I spent the day gasping at the fire, way up in here. I spent a decade in a pocket of a... Wendy! I'm gay! I'm gay.


Transcript

00:00:00.000 When did you die?
00:00:05.000 I spent the day gasping at the fire, way up in here When did you die?
00:00:13.000 I spent the day gasping at the fire, way up in here When did you die?
00:00:21.000 I spent the day gasping at the fire, way up in here Shut up, silly woman!
00:00:30.000 Fallen in the car we ride that fast, someday I'll blow There's no time in my life without you
00:00:38.000 Fallen in the car we ride that fast, someday I'll blow Afuera!
00:00:45.000 I ran away, and I am not going back And I am not going back
00:00:51.000 You should, you should I ran away, and I am not going back
00:00:57.000 And I am not going back I'm gay.
00:01:00.000 Oh my.
00:01:28.000 Boo, you whore.
00:01:30.000 What are you doing?
00:01:45.000 Just playing along.
00:01:46.000 Woo!
00:01:47.000 I spent a year in the pocket of a...
00:01:53.000 Sam!
00:01:55.000 When I died, you died.
00:01:58.000 I spent a decade in a pocket of a...
00:02:01.000 Wendy!
00:02:02.000 When I died, you died.
00:02:06.000 I spent a year in the pocket of a...
00:02:09.000 Wendy!
00:02:10.000 When I died, you died.
00:02:15.000 I spent a decade in a pocket of a...
00:02:17.000 That's a huge bitch!
00:02:19.000 Just say to me, have a wonderful time in my new shirt.
00:02:23.000 Salam, father.
00:02:25.000 And just say, boo, boo.
00:02:28.000 Just, just, just say to me, have a wonderful time in my new shirt.
00:02:31.000 Salam, father.
00:02:33.000 And just say, boo, boo.
00:02:36.000 Never done, I've never...
00:02:38.000 One, I've got that fez someday.
00:02:41.000 And just say, boo, boo.
00:02:44.000 Never done, I've never...
00:02:46.000 One, I've got that fez someday.
00:02:49.000 And just say, boo, boo.
00:02:53.000 Chase.
00:02:54.000 A wondrous place for you and...
00:03:00.000 Me!
00:03:01.000 That's who!
00:03:02.000 Goodbye Big monkey.
00:03:02.000 Goodbye.
00:03:02.000 Thank you.
00:03:06.000 I, er, I krack my head Er, I, bang, monkey
00:03:08.000 Snake.
00:03:09.000 And, krack my head er, I, bang, monkey
00:03:12.000 What, what?
00:03:12.000 What!
00:03:13.000 That's smoke cracks my head
00:03:15.000 Err, I, bang, monkey uh,I krack my head
00:03:16.000 Wow.
00:03:18.000 Bang.
00:03:19.000 I, u, min, extremes Grow the scalding
00:03:23.000 I, u, burn my head And you, bang
00:03:24.000 I, u, a Ha!
00:03:29.000 Stroll the scala Building
00:03:31.000 The burning Bang
00:03:34.000 I, u, min, extremes I, u, a
00:03:36.000 Pierce the pier This cause
00:03:38.000 Er, I, er, I, burn my head And you, I, bang
00:03:40.000 I, u, a I, u, a
00:03:42.000 I, u, a I, u, a
00:03:44.000 Pierce the pier This cause
00:03:46.000 Er, I, er, I, burn, I, er, I, burn, monkey Meow
00:03:48.000 Er, junk, monkey Krack my head
00:03:50.000 I, u, min, extremes I, u, I, bang, monkey
00:03:52.000 Snake.
00:03:53.000 And, krack my head No!
00:03:55.000 I, u, bang, monkey Er, junk, monkey
00:03:57.000 Uh, I, min, extremes I, u, I, bang, monkey
00:03:59.000 Snake.
00:04:00.000 [unintelligible music]
00:04:29.000 Moon It's a good thing I'm a punk.
00:04:32.000 Yay, junk.
00:04:33.000 I'm a beautiful soul.
00:04:42.000 And it's...
00:04:45.000 Beautiful soul.
00:04:49.000 Soul.
00:04:57.000 I'm a beautiful soul.
00:05:00.000 Ha ha!
00:05:02.000 Bye.
00:05:05.000 Oh Oh
00:05:17.000 Oh You lose
00:05:21.000 Oh Me
00:05:25.000 Nobody ever knows That is neat
00:05:53.000 It's science So
00:06:21.000 Me So
00:06:25.000 And here I thought you could Could talk to me. I'll just have to think of something else
00:06:54.000 in here. I thought you could talk to me I thought you could talk to me. Thanks won't be the same
00:07:02.000 I thought you could talk to me. I'll just have to think of something else in here. I thought you could talk to me
00:07:29.000 I thought you could talk to me. Thanks won't be the same I thought you could talk to me. I'll just have to think of
00:07:41.000 something else in here. I thought you could talk to me She's sideways
00:08:07.000 She's sideways She's sideways
00:08:50.000 She's sideways I'm sorry, Mr. Handicapped Man.
00:09:01.000 I'm sorry, Mr. Handicapped Man.
00:10:53.000 Oh yeah, the bitch!
00:11:17.000 Hey Josh, what's um, what's going on here?
00:11:21.000 I'm losing it here, man.
00:11:23.000 Okay, well, calm down.
00:11:25.000 Why do you have a serial killer wall?
00:11:27.000 Oh, this?
00:11:28.000 It's a serial killer wall.
00:11:29.000 No, this is my taxes, Steven.
00:11:30.000 My taxes!
00:11:31.000 This isn't a tax... See?
00:11:33.000 My taxes!
00:11:34.000 How could this...
00:11:36.000 tax wall possibly make your filing any simpler?
00:11:38.000 Well, you see, I'm trying to find the missing link here, okay?
00:11:41.000 Because I've added up all my income, I've subtracted the taxes paid,
00:11:45.000 I've even deducted all my expenses, and I still have $1,600 unaccounted for,
00:11:50.000 and I just can't figure it out, man.
00:11:52.000 The twine is not...
00:11:53.000 No, no, no, no, this is... this is paid for by the company.
00:11:55.000 What I'm trying to figure out is I've added up all of my Louder with Crowder income...
00:12:00.000 Thank you, by the way.
00:12:01.000 And then I have all my capital gains from RumbleStock.
00:12:04.000 I've even deducted all my business lunches at Bennigan's.
00:12:07.000 Okay, that's... Bennigan's is not a thing anymore.
00:12:09.000 Bennigan's is bankrupt.
00:12:10.000 I'm at Cracker Barrel.
00:12:13.000 I don't... Okay, fine.
00:12:14.000 Have you tried calling Tax Network USA?
00:12:16.000 Tax Network USA?
00:12:18.000 Yeah, they're actually a leading comprehensive resolution specialist with taxes.
00:12:23.000 It's better than string that you get at Michael's.
00:12:26.000 I'm sure they can.
00:12:26.000 They'll go.
00:12:26.000 They can handle this?
00:12:28.000 Do they have strings?
00:12:29.000 I don't know if they have strings.
00:12:30.000 How much are you spending on string?
00:12:32.000 I don't spend anything on string.
00:12:33.000 I actually had to confirm a fraud alert on the business card for an obscene amount of string.
00:12:41.000 Did you spend $1,200 on string?
00:12:42.000 No, it was less than that.
00:12:43.000 The other stuff was like wood and rope and stuff, but that's for another project.
00:12:47.000 But Tax Network USA, they'll actually, they'll even go and they'll negotiate on your behalf with the IRS.
00:12:54.000 So try giving them a call.
00:12:55.000 That's great, man.
00:12:56.000 That's a huge relief.
00:12:57.000 That takes a big burden off my back.
00:12:59.000 Yeah.
00:13:00.000 Thank you.
00:13:00.000 All right.
00:13:01.000 I'm glad we could help.
00:13:01.000 Okay.
00:13:02.000 Can you clean this s*** up and then... Yeah, yeah.
00:13:05.000 Yeah, of course.
00:13:05.000 Yeah, I'll clean it up and get back to work.
00:13:07.000 Hey, after work, do you want to grab a bite at Bennigan's?
00:13:09.000 There is... that's... there's no Bennigan's.
00:13:11.000 I meant Cracker Barrel's.
00:13:12.000 How are you confusing the two of them?
00:13:14.000 They're not even remotely similar.
00:13:15.000 They got salads.
00:13:15.000 It's a steak place.
00:13:16.000 They got waiters.
00:13:17.000 Neither one is a steak place.
00:13:18.000 They might have salads incidentally.
00:13:20.000 Yeah, yeah.
00:13:21.000 But they're not steak places.
00:13:22.000 Breadsticks and soups and stuff.
00:13:23.000 I don't think...
00:13:25.000 Does anyone know, do they have breadsticks at Cracker Barrel?
00:13:26.000 I don't think they have breadsticks at Cracker Barrel.
00:13:28.000 It's a cracker place.
00:13:29.000 Josh, where are you actually having your business lunches on behalf of the company?
00:13:34.000 Jiggles.
00:13:36.000 Well, first off, we should probably stop that.
00:13:38.000 Probably nip that in the bud.
00:13:41.000 And I don't think that Tax Network USA is going to help you with the expenses at Jiggles.
00:13:46.000 Cracker Barrel.
00:13:47.000 Okay.
00:13:48.000 Don't buy any more string.
00:13:49.000 And don't go to strip clubs on the company card.
00:13:54.000 Call Tax Network USA.
00:13:57.000 Don't let the IRS take advantage of you.
00:13:59.000 Contact Tax Network USA for immediate relief and expert guidance.
00:14:03.000 Call 1-800-245-6000 or visit tnusa.com slash Crowder.
00:14:08.000 Please give it up for Josh Firestine!
00:14:14.000 I was in the Army for eight and a half years and... Okay, sure, cool.
00:14:21.000 The first question I always get from people is, did you ever kill anybody?
00:14:23.000 Yes!
00:14:25.000 I did.
00:14:26.000 But that was when I worked at the nursing home.
00:14:27.000 Like, I don't see what that has to... Do we have any veterans here tonight?
00:14:31.000 Any veterans?
00:14:32.000 Very cool, very cool.
00:14:34.000 Any Coast Guard?
00:14:35.000 Really?
00:14:36.000 Thank you for your service and for making drugs more expensive, you piece of s**t!
00:14:41.000 Thanks for protecting America from a good time!
00:14:44.000 I offended a lady in Portland the other day, which is easy to do, by the way.
00:14:49.000 My two friends got married.
00:14:49.000 They're both named Zach.
00:14:50.000 Zach and Zach.
00:14:51.000 I call them ZZ Top and Bottom.
00:14:53.000 They're, uh... I'm glad you laughed at that.
00:14:56.000 Some people don't like those kind of jokes.
00:14:58.000 I'm Josh Firestine.
00:14:59.000 Thank you very much.
00:15:00.000 Thank you, everyone.
00:15:01.000 Thank you very much.
00:15:02.000 Thank you.
00:15:03.000 Thank you.
00:15:04.000 Bring the funk.
00:15:05.000 Yeah.
00:15:06.000 Yeah.
00:15:07.000 Thank you.
00:15:29.000 Mark!
00:15:30.000 Stop appealing to the force!
00:15:33.000 I'm disappeared in no time.
00:15:37.000 Nailed it.
00:15:47.000 First try.
00:15:47.000 I mean, if you want to do the bare minimum.
00:15:50.000 You feel good about what you just said?
00:15:52.000 Does that make you feel like a big man?
00:15:56.000 Hey, Mr. Firestine today released a special, hung his balls out there, if I may.
00:16:01.000 What'd you do?
00:16:02.000 I insulted the host.
00:16:04.000 It's a good day.
00:16:25.000 I bring you tidings of great joy, okay?
00:16:29.000 Generation Z. Everyone thinks this country is lost, and yep, we're heading in the wrong direction on a lot of fronts, the descent into secularism.
00:16:35.000 However, Gen Z, particularly the young men, are far more conservative
00:16:40.000 than any generation you have seen in at least the last 40 years, at this point in their lives. That's context that people miss.
00:16:48.000 They go, oh, they're pretty liberal.
00:16:49.000 Well, sure, but not for this point in their lives.
00:16:54.000 And their support for Greg Abbott in Texas is very surprising.
00:16:58.000 It's actually the kind of approval rating that you won't even see in a general population for Donald Trump.
00:17:03.000 And that's specifically because of what's happening on the border.
00:17:06.000 So that's a good thing.
00:17:07.000 The new Google AI is so woke.
00:17:10.000 I don't even want... I don't use the term reverse racism.
00:17:13.000 It's just racism.
00:17:14.000 But!
00:17:15.000 They were forced to apologize and allegedly they're going to correct this.
00:17:15.000 Sure.
00:17:18.000 Now what I want you to think about as we discuss this today, because AI is kind of new to some people, a novelty.
00:17:24.000 However, what you see with this AI, which is blatant, I want you to keep in the back of your mind, all algorithms are AI.
00:17:30.000 Instagram is AI.
00:17:31.000 TikTok is AI.
00:17:32.000 YouTube, YouTube Shorts, that's AI.
00:17:34.000 And that's why we did the Clean Slate campaign.
00:17:36.000 That's why we will not capitulate.
00:17:37.000 So what you see is an example of what a lot of people miss.
00:17:42.000 We're also going to talk about a serial killer on TikTok.
00:17:44.000 Hopefully the authorities are watching him.
00:17:48.000 Yes.
00:17:48.000 Allegedly.
00:17:49.000 And if you want to know, watch the show.
00:17:50.000 This is why we don't just do the social media, just the dumping all the time.
00:17:54.000 If you want to know, watch the show.
00:17:55.000 Weekdays.
00:17:56.000 10 a.m.
00:17:56.000 Eastern.
00:17:57.000 Move on with your day.
00:17:57.000 Check the references.
00:17:58.000 And...
00:17:59.000 So now what's crazy is it's going to be the overlap of the actual YouTube dump button
00:18:17.000 and the plug for the YouTube dump.
00:18:18.000 It's like the inception of slightly racist terms that were once acceptable.
00:18:23.000 I don't think some guy out there is going to be pissed.
00:18:26.000 I want to know what he said!
00:18:28.000 John Jacob Jingle Haji Smith.
00:18:30.000 His name is my name!
00:18:31.000 Do you have to hit it again?
00:18:32.000 I don't care.
00:18:37.000 Number two, CEO Gerald Morgan, how are you?
00:18:40.000 I'm good.
00:18:41.000 I'm feeling feisty.
00:18:42.000 A little bit.
00:18:42.000 Are you?
00:18:43.000 Like a Cubano?
00:18:44.000 Not like a Cubano.
00:18:45.000 Come on, what are you talking about?
00:18:46.000 I'm just a little fired up about these stories.
00:18:48.000 They're going to hide their bias better.
00:18:50.000 They're not going to fix AI.
00:18:52.000 But don't be too spicy because then there's going to be a problem.
00:18:55.000 I know you're the spicy switch.
00:18:57.000 Spicy switch.
00:18:57.000 Yes.
00:18:58.000 You come to me spicy, I'll be spicy.
00:19:00.000 You come to me switch, like a pancake, I'm going to be switch.
00:19:04.000 That's Portuguese.
00:19:04.000 See if you can figure that out.
00:19:05.000 Yeah, that's Portuguese.
00:19:06.000 I'm trilingual.
00:19:07.000 And in third chair, his special...
00:19:11.000 Drops here on Mug Club tomorrow, 10 a.m.
00:19:14.000 Eastern.
00:19:15.000 We're happy for him.
00:19:16.000 He's proud of it.
00:19:17.000 And of course, you can see him in Des Moines, Iowa, Friday, March 1st.
00:19:23.000 Josh Firestine, thank you for your service.
00:19:24.000 How are you?
00:19:25.000 I'm good.
00:19:25.000 I'm good.
00:19:26.000 Please watch the special, everybody.
00:19:28.000 I'm not supposed to call it a special.
00:19:31.000 I was told it's a half hour.
00:19:32.000 You're not supposed to call it a special.
00:19:33.000 So it's a specially made 30 minutes.
00:19:38.000 Well, it matters to me!
00:19:39.000 Yeah, I'm very proud of it, so have fun.
00:19:41.000 And if you don't like it, just comment and say you did.
00:19:44.000 Just lie for me.
00:19:45.000 And that's exclusively for Mug Club members.
00:19:47.000 Now we'll figure out what we do with it as far as getting clips out there so you guys can see some of it, but it's a love letter to you and Josh Firestine.
00:19:54.000 He's really been making a go of it here at Grover Cleveland, so let me ask you this.
00:20:00.000 Name that movie line.
00:20:01.000 What do you think are the biggest potential consequences?
00:20:04.000 What worries you most about AI, photo, video, all these capabilities?
00:20:09.000 And then I will inform you of the problems with AI that you may not necessarily have noticed.
00:20:16.000 It's been infecting your lives for a long time, but the good news is education is kind of the greatest antidote.
00:20:22.000 And the only way to win this game, guys, girls, gals, z's, don't play.
00:20:28.000 You don't need a dopamine detox if you're not violating your reward circuitry every single day.
00:20:34.000 If you're like, my dopamine detox, I went for a hike.
00:20:37.000 You should do that anyway!
00:20:40.000 It used to be called life.
00:20:41.000 Live life.
00:20:43.000 Don't be married to a screen.
00:20:44.000 But, it's Black History Month here at Louder with Crowder.
00:20:52.000 Alright, so, we honor the blacks.
00:20:58.000 And today, heroic black military units.
00:21:01.000 You know them.
00:21:02.000 Yeah.
00:21:02.000 So, the 369th Infantry Regiment, also known as the Harlem Hellfighters, from World War I.
00:21:09.000 Look at that.
00:21:10.000 That's pretty cool.
00:21:10.000 That's one.
00:21:11.000 369, damn girl, fine.
00:21:12.000 Yeah, hey, I don't know what that means.
00:21:15.000 The Buffalo Soldiers is a military unit.
00:21:16.000 They fought against Indians after the Civil War.
00:21:19.000 Thank you.
00:21:20.000 Feathers.
00:21:22.000 That Indian.
00:21:22.000 And also, Keith is the Better Hodgetwin.
00:21:28.000 Which one?
00:21:29.000 You know what?
00:21:29.000 I'm going to leave that to you.
00:21:30.000 That's been Black History Month.
00:21:39.000 Is he miming now?
00:21:40.000 Is he the world's first black mime?
00:21:44.000 That's the cousin of the Harlem Shake, the Brooklyn Chimney.
00:21:47.000 Yeah, it's the white guy who doesn't know what to do.
00:21:50.000 It's the hybrid of, what is it, Backstreet Boys?
00:21:52.000 Oh yeah.
00:21:54.000 What is this new dance?
00:21:55.000 And they're always back.
00:21:56.000 It's like a puppet thing.
00:21:56.000 Yeah, and they're always back.
00:21:57.000 No strings.
00:21:58.000 Oh, that's NSYNC.
00:21:59.000 No, it's the same.
00:21:59.000 It was them in the horror.
00:22:00.000 They were like, here we are and we're back again.
00:22:02.000 And then like a year later, like, we're back again now.
00:22:04.000 We're serious this time.
00:22:06.000 Alright, sure.
00:22:08.000 We all went broke and we need money.
00:22:10.000 Yes.
00:22:12.000 My friend had a joke, he said, Hey fellas, if you say, guess who's back?
00:22:16.000 And she doesn't say alright, she's too young for you.
00:22:18.000 Yes.
00:22:21.000 Probably.
00:22:22.000 But you know what?
00:22:22.000 It's okay these days.
00:22:24.000 Age is the number and what have you.
00:22:26.000 So!
00:22:28.000 The world's next great serial killer.
00:22:32.000 We've found him.
00:22:33.000 Yep.
00:22:34.000 We've been combing.
00:22:35.000 And it's apparently a TikToker who loves sharing.
00:22:39.000 It's one thing to be a serial killer.
00:22:41.000 Now, some would say that, hey, that's a bad sign.
00:22:44.000 However, it gets worse when you decide to share all of your red flags publicly.
00:23:00.000 Totally normal.
00:23:01.000 What is he, Telly Savalas?
00:23:06.000 Okay.
00:23:09.000 Stretching.
00:23:10.000 Who gets on a bike like that?
00:23:18.000 This guy.
00:23:21.000 This guy gets on a bike like this.
00:23:26.000 So meticulous.
00:23:27.000 Yes, he is.
00:23:32.000 I love...
00:23:33.000 I love how he has all of the accoutrements of wealth and a Keurig.
00:23:39.000 The most basic one, too.
00:23:43.000 He really likes Patrick Bateman.
00:23:45.000 He's like, yeah, look at me.
00:23:46.000 I live a luxury lifestyle.
00:23:46.000 It's like president's choice.
00:23:48.000 That's how uncultured I am.
00:23:48.000 Okay.
00:23:49.000 I thought that was a fancy espresso machine.
00:23:51.000 I was like, oh, look at this guy.
00:23:53.000 No, it's a Keurig.
00:23:54.000 Anyone can get one.
00:23:55.000 It's a Keurig.
00:23:56.000 Yeah, he purchased it.
00:23:57.000 Then he's eating spaghetti.
00:23:58.000 Yeah, I know.
00:23:59.000 It's a square spaghetti.
00:24:00.000 Did you see that?
00:24:00.000 It's pre-made spaghetti squared.
00:24:05.000 And by the way, just before people, you think, because you thought Tim McGraw, you think that guy's in great shape?
00:24:11.000 I wouldn't say great, but good.
00:24:12.000 Oh my gosh.
00:24:13.000 Any man, any man six months, you can look like that.
00:24:16.000 That's just, this is, here's the problem.
00:24:18.000 Challenge accepted, Stephen.
00:24:20.000 You're gonna regret that.
00:24:23.000 Okay, you might have to settle for the consolation prize where in six months you would easily be able to kick his ass.
00:24:30.000 Yeah.
00:24:31.000 Vanity for vanity's sake is a problem and now people get to broadcast their vanity for it.
00:24:35.000 There's nothing wrong with wanting to be groomed.
00:24:38.000 There's nothing wrong with wanting to put yourself together well.
00:24:41.000 I will say some other alarm bells went off when he shared this, where he was actually, and I don't know if this dame knows this, reenacting a scene from American Psycho as Bateman, because he thinks it's sweet.
00:24:54.000 That's a sign.
00:24:55.000 Ready?
00:24:58.000 I said close your eyes.
00:25:00.000 You can come here, but don't open them.
00:25:00.000 Don't.
00:25:03.000 Don't.
00:25:04.000 I'm crying.
00:25:05.000 Look.
00:25:06.000 You're a bad person.
00:25:08.000 Nowadays you have men who are slobs.
00:25:38.000 But men were never slobs, just like women used to wear dresses in the house because
00:25:41.000 they wanted to look good for their husband and their family and men put themselves together well.
00:25:45.000 My grandfather was a super, and they used the term concierge, but concierge means
00:25:51.000 janitor in French. French people, you guys can comment. It's fancy. It means janitor?
00:25:54.000 Yeah, like a super where you're basically on the premises, and I think it actually translates to literal janitor.
00:26:01.000 And he would be in a three-piece suit, rolled up, in his vest and shirt and tie, elbow deep in feces fixing a toilet, and have the little rubberized covers on his... Like Trainspotting?
00:26:15.000 But this guy, and he had a Clark Gable mustache, Brylcreem in his hair, because he still wanted to keep up appearances.
00:26:21.000 He wanted to look presentable.
00:26:22.000 He wanted to take pride in what he did have.
00:26:24.000 And so when they call you a metrosexual, what, because you comb your hair?
00:26:27.000 I don't want you to get this confused.
00:26:27.000 No.
00:26:29.000 Look at the people, look at Cary Grant.
00:26:32.000 Look at, you know, you have Gregory Peck.
00:26:34.000 Back there you have Telly Savalas, Sidney Poitier, the original, by the way, Denzel. Good-looking
00:26:40.000 man. Yeah, they call me Mr. Sexy now. The thing is, they were... I thought you were gonna throw
00:26:47.000 in something. Oh, no, no, like an Elton John or like... No.
00:26:51.000 There was no zag, you know.
00:26:54.000 But these were leading men.
00:26:55.000 You would never perceive them as effeminate, but they took care of themselves.
00:27:00.000 So here's the thing.
00:27:01.000 Don't be ashamed, men, in obviously taking care of your appearance.
00:27:06.000 Don't be a narcissist.
00:27:07.000 Vanity for vanity's sake.
00:27:08.000 That's the problem now, which everyone's trying to out-narcissism the next person.
00:27:12.000 And it's amazing, too, you get guys this way, women this way, where they will approach women, like sexually, you see this, where they approach women in the way that they would want to be approached.
00:27:20.000 It's not that complicated.
00:27:22.000 Do what she likes, and I'm talking about relationally, you, woman, do what he likes.
00:27:27.000 Men are trying to present themselves to women in the way they want.
00:27:29.000 She doesn't give a rat's ass if you epilate your beard.
00:27:31.000 It's probably a turn-off.
00:27:33.000 And women try to present themselves to men in the way that they want men.
00:27:36.000 They're like, I have a great career.
00:27:38.000 He doesn't give a rat's ass.
00:27:39.000 That's why Jeff Bezos will marry a waitress.
00:27:42.000 I don't think this guy's going to kill anybody though, just to be honest.
00:27:44.000 Half of what he did was crazy.
00:27:46.000 Ironing your bed, like, spraying stuff on your pillow, like, that's... I mean, no, but you don't iron.
00:27:50.000 It's not doing that, it's showing the whole world.
00:27:52.000 Well, that's true.
00:27:53.000 It is showing the whole world, but I'm just like, this guy's... he's a douchebag.
00:27:56.000 Yeah.
00:27:57.000 He's got way too much free time on his hands.
00:27:58.000 Yes, he's making videos.
00:28:00.000 He seems cool to me.
00:28:01.000 Well, alright.
00:28:01.000 You know, maybe you can share a nice cup of Keurig.
00:28:03.000 He's in pretty good shape, though, so, you know.
00:28:07.000 Go ride bikes with him.
00:28:08.000 Alright.
00:28:09.000 Let's move on to... I don't know, that's about... I got it out of my system.
00:28:14.000 It's just virtue signaling in another way.
00:28:16.000 You know, have you noticed this?
00:28:17.000 Look, in guys and women, you will see actually like models now on social media.
00:28:23.000 You know, you'll see people who are supermodels.
00:28:25.000 And they are far more modest than the current Instagram influencers.
00:28:29.000 Like, I don't see someone as a narcissist, or if you see a woman who actually, like, did a photo shoot, and she's in a bikini, or she's in a nightgown, or you see, like, Miss Universe contestants, right?
00:28:38.000 They're presenting beauty, but if you go to their- you don't see them just showing their ass for the world to see, hoping for a suitor.
00:28:45.000 They had a job because they were beautiful.
00:28:48.000 It's amazing how we kind of went from Christians being pearl-clutching, like, hey, oh, this sports swimsuit edition.
00:28:55.000 Well, that was a job, and we were admiring beauty.
00:28:57.000 Now it's just, hey, let me thrust everything in front of your face for surface-level approval, and then we wonder why they're not happy.
00:29:04.000 It's just, you know, unhealthy dynamics.
00:29:06.000 Yeah, exactly.
00:29:07.000 You don't need to show it all.
00:29:08.000 Although this guy, you know, you think he's in great shape.
00:29:11.000 Craig Abbott.
00:29:12.000 Tell me if I'm wrong, Chad.
00:29:14.000 Does that guy look like he's in pretty good shape or what?
00:29:16.000 Who, Greg Abbott?
00:29:17.000 No, not Greg Abbott.
00:29:19.000 Greg Abbott can't do a squat to save his life.
00:29:21.000 Do you think this broad on CNN right now is one salad away from Kate Upton?
00:29:25.000 No!
00:29:26.000 Of course not.
00:29:27.000 I'll allow it.
00:29:27.000 You're colorblind.
00:29:29.000 Hey, don't talk about my librarian like that.
00:29:33.000 That's a sexy librarian.
00:29:34.000 She's a nightmare.
00:29:36.000 It's like, let's roleplay.
00:29:37.000 She just starts shushing you.
00:29:38.000 I lost my virginity to her.
00:29:40.000 Wow, that's... Her?
00:29:41.000 Well, that's not so bad.
00:29:43.000 You met the other one.
00:29:44.000 That's gross.
00:29:47.000 I wish.
00:29:47.000 Terrible.
00:29:50.000 Sexy librarian.
00:29:52.000 You have late fees.
00:29:53.000 Alright, let's do the sexy nurse.
00:29:55.000 Alright, let me put in your catheter.
00:29:56.000 You are really bad at this.
00:30:02.000 Flight attendant, she's just miserable with her job and jet-lagged.
00:30:04.000 That bus is self-lubricating!
00:30:08.000 Call Bill Devane or whoever does it on Fox News.
00:30:10.000 Self-lubricating catheter.
00:30:11.000 All right.
00:30:12.000 This is a good thing.
00:30:14.000 So let me set this up for you.
00:30:15.000 Greg Abbott, he's a wheelchair guy by the way, he is doing really well in Texas.
00:30:19.000 The numbers will surprise you when we make all these references available.
00:30:22.000 LouderWithCrowder.com.
00:30:24.000 The approval rating with Gen Z.
00:30:26.000 But this shouldn't surprise you as much as, honestly, it even surprised me.
00:30:30.000 You hopefully know that Gen Z, particularly young males, are far more conservative than their boomer counterparts, or even millennial or Gen X counterparts, even though you've been told the opposite.
00:30:39.000 Sure, they're largely liberal at this point in their life because they're young.
00:30:43.000 But as with anyone, you take their life, add time and education, and I don't mean indoctrination, it's actually a net loss, it's a deficit as far as education and learning about the world in university.
00:30:55.000 But you add time, life, family, they become more conservative.
00:30:58.000 The same can be said here.
00:31:00.000 You can find probably footage of me in 2012, and certainly 2016 when I would, back when I, to my everlasting shame, would appear on the cable news networks.
00:31:10.000 Where they're going, this is done, and people aren't going to church.
00:31:12.000 It's true, young people are not going to church.
00:31:13.000 There's a real problem with this descent into secularism because it results in purposelessness.
00:31:19.000 It's a tough word to say.
00:31:21.000 However, this is the most conservative generation, young males, that you have seen in a long time.
00:31:26.000 And it's the same with conservatives.
00:31:29.000 Today's conservative issues, today's issues that resonate with the conservative base, are tomorrow's winning issues with society at large.
00:31:39.000 So we see this, the exact reason Governor Abbott is so popular with Gen Z is because of his handling on the border.
00:31:44.000 Now Abbott hasn't been perfect in every respect, but on the border he's been pretty damn conservative, pretty damn consistent, and the approval rating is through the roof.
00:31:51.000 Let's contrast that with 2016.
00:31:54.000 Donald Trump won with the base because, they're not sending their best, right?
00:31:57.000 We need to close the border, we need to build the wall.
00:32:00.000 Now, in society at large, that was a losing issue.
00:32:02.000 Right now it is definitively a winning issue.
00:32:04.000 It's why Biden is really trailing in a lot of these polls.
00:32:08.000 And the left is even acknowledging it.
00:32:09.000 That's why they're panicking, trying to present as though they provided a border bill.
00:32:14.000 Just take today's winning issues with the conservative base, add time, and it's a winning issue at large.
00:32:18.000 You may not remember this, gun control was one.
00:32:20.000 In the 90s, after Columbine, the Assault Weapons Ban Act, no one thought twice about it.
00:32:26.000 Back then you still had people, by the way, saying, oh my, who needs a semi-automatic gun?
00:32:32.000 Now the majority of people understand that almost all defensive firearms are semi-automatic.
00:32:36.000 You could not get a portion of the Assault Weapons Ban passed nationally.
00:32:41.000 You take time and you add education, and now it's a winning issue with society at large.
00:32:47.000 The Second Amendment is an overwhelming winning issue, that's why the left tries to avoid it, obfuscate it.
00:32:52.000 Free speech is one.
00:32:54.000 The border is one.
00:32:55.000 So, you take these issues, look back and say, yeah, at one point it was a third rail for conservatives.
00:33:00.000 The Second Amendment, gun issues.
00:33:02.000 Gun control was all the rage.
00:33:03.000 Bowling for Columbine.
00:33:05.000 Bill Clinton, it changed.
00:33:07.000 Oh yeah, Donald Trump, immigration, it's changed.
00:33:10.000 Conservatives, Republicans have to appeal to their base, and that's why you have Democrats as a party, as a platform, having to obfuscate.
00:33:18.000 They never answer clearly on abortion.
00:33:20.000 They never answer clearly on the border.
00:33:24.000 They never answer clearly.
00:33:25.000 And I say this in an absolute, because I mean it.
00:33:28.000 They never really answer on the Second Amendment.
00:33:30.000 That's why they make up new terms.
00:33:32.000 Take Second Amendment, 1990s.
00:33:33.000 I believe the assault weapons ban, if I'm not mistaken, was 1994.
00:33:37.000 Add education where people go, oh, semi-automatic, I thought that meant machine gun.
00:33:41.000 They know that that's not the case.
00:33:42.000 Now there are more gun owners than ever in the United States.
00:33:44.000 You add time, it wins.
00:33:46.000 They're not sending their best and brightest.
00:33:49.000 2016, I can't believe it, that's racist.
00:33:50.000 Now, Americans don't care about being called racist, including Gen Z. That's a big shift, and don't let the doomsday prophesiers out there tell you otherwise.
00:34:02.000 So, this is a new poll from the University of Texas.
00:34:05.000 Bastion of conservative, uh, conservatism.
00:34:08.000 It shows that, uh, also a tough word to say, shows that Greg Abbott, a wheelchair guy, is doing incredibly well with young voters.
00:34:14.000 Gen Z: they approve of Governor Abbott 59%, disapprove 24%.
00:34:19.000 That's great.
00:34:22.000 In politics, that's a walk-off.
00:34:23.000 That's just Gen Z. Now with millennials who are more liberal than Gen Z, approve 48%, disapprove 34%.
00:34:32.000 That's still doing very well.
00:34:34.000 And 32% of boomers said they prefer Costello.
00:34:37.000 Yes.
00:34:37.000 Come on, that's the wrong.
00:34:39.000 You shouldn't have played Alison while conducting the poll.
00:34:41.000 Wow.
00:34:42.000 And this is something that shocks a lot of people.
00:34:45.000 Again, the references are publicly available.
00:34:47.000 It shouldn't.
00:34:48.000 It shouldn't.
00:34:49.000 Now, this is disproportionately sort of affected by young men.
00:34:52.000 White women have consistently always kind of been... suburban white women, liberal.
00:34:57.000 That's.
00:34:58.000 And.
00:35:01.000 the video.
00:35:31.000 be.
00:35:32.000 We probably had to hit the YouTube dump button again.
00:35:35.000 Yeah, put them on the border.
00:35:36.000 So you have to ask, is there any other corroborating data?
00:35:38.000 I'm glad you asked.
00:35:40.000 If you look at these numbers too, Gen Z, as they enter into the workforce, they're fleeing, well, a lot of them have been in the workforce for a while, but they're fleeing California.
00:35:47.000 Record numbers and they're picking Texas.
00:35:50.000 Man, did I have a stroke?
00:35:51.000 You might have.
00:35:51.000 I need a drink.
00:35:52.000 Pixing, conservative.
00:35:53.000 Yeah.
00:35:53.000 345,000 Gen Zers moved to Texas in 2023.
00:35:54.000 415,000 left California.
00:35:55.000 Yeah.
00:36:00.000 Yeah.
00:36:01.000 Toot.
00:36:01.000 Wow.
00:36:03.000 That's a lot.
00:36:04.000 To the decimal.
00:36:06.000 Actually, I don't know if it's to the decimal.
00:36:09.000 But 345, it could be a quarter person.
00:36:10.000 I don't know how they count wheelchair people.
00:36:12.000 I don't think.
00:36:12.000 No, come on.
00:36:13.000 There are people too.
00:36:13.000 345,000 Gen Z's, let me repeat that, moved to Texas in 2023, the same year that 415,000 left California.
00:36:25.000 And you combine that with a 59% Gen Z approval rating on Abbott, largely based on his handling
00:36:31.000 of the border, with a disapproval of 24%.
00:36:34.000 That is good news.
00:36:36.000 That's basically a defection.
00:36:39.000 Oh my god, don't you get it?
00:36:40.000 He's defecting!
00:36:44.000 And not to be outdone by, I don't know, whoever, in tone-deafness, Nikki Haley was asked for comment and had this to say.
00:36:53.000 I told them that if they would do this, that South Carolina would wrap their arms around them and take care of them.
00:36:59.000 I now officially work for you.
00:37:02.000 There is nothing that you can need that we won't make sure that we deliver.
00:37:05.000 Sorry, right clip.
00:37:06.000 Gerald.
00:37:07.000 I'm a little sad.
00:37:08.000 Why?
00:37:09.000 Because her primary in South Carolina is on Saturday.
00:37:11.000 This may be the last day we get to use that clip because her campaign may be over.
00:37:14.000 Well, why don't you cry about it?
00:37:16.000 She's not going to bow out gracefully.
00:37:17.000 Come on!
00:37:17.000 No, she's not quitting.
00:37:18.000 She's going to lose her state and then more.
00:37:20.000 Yes.
00:37:21.000 Her soul.
00:37:22.000 As long as she stays in the media, can we keep playing it?
00:37:24.000 Yes!
00:37:24.000 Yes.
00:37:25.000 Yes!
00:37:25.000 Of course we can.
00:37:26.000 I'm never going to stop playing it.
00:37:27.000 I'm happy.
00:37:28.000 Hit the like button if you want us to keep playing the Haley clip.
00:37:30.000 Find a way to play it every show.
00:37:31.000 Every single show.
00:37:34.000 I think it's required.
00:37:35.000 So, you look at this and you say, okay, Gen Z in Texas, they're leaving California, they're going to Texas, the approval rating for Governor Abbott, largely based on his handling of the border, is astronomical.
00:37:43.000 And that begs the question, what has he done specifically as it relates to the border?
00:37:48.000 Okay.
00:37:49.000 Well, he's been busing migrants out of Texas.
00:37:51.000 He declared a state of emergency on the border.
00:37:55.000 The state here passed SB4, and this is a law that permits Texas police to arrest illegal immigrants.
00:38:00.000 It blows my mind that that wasn't already a law.
00:38:02.000 You just sort of assume... What?
00:38:02.000 I know.
00:38:04.000 Hey, you're a cop.
00:38:05.000 Hey, you're breaking the law.
00:38:06.000 Yeah.
00:38:07.000 I can arrest you.
00:38:08.000 Because I'm... Oh, not that law.
00:38:09.000 Go ahead.
00:38:10.000 Yeah.
00:38:10.000 What?
00:38:11.000 Well, you're basically... Police is a synonym for a law enforcer.
00:38:15.000 Uh-huh.
00:38:18.000 Am I mistaken?
00:38:20.000 He also deployed the National Guard to the border, and my personal favorite, added razor wire.
00:38:25.000 By the way, it needs more razor wire.
00:38:27.000 Yes.
00:38:27.000 More razors on the wire.
00:38:29.000 And I know you're saying, how much is the appropriate amount?
00:38:32.000 The answer, as it relates to razor wire on the border, is always more.
00:38:35.000 How much wire could a razor wire wire if a razor could wire?
00:38:39.000 Wire?
00:38:40.000 Razors.
00:38:41.000 You know what?
00:38:41.000 Woodchuck?
00:38:42.000 Just forget I said it.
00:38:42.000 No, I don't want to forget you said it, because I appreciate it, and the answer Is needs more razor wire.
00:38:48.000 Always.
00:38:50.000 I've got an itch.
00:38:51.000 Oh, I guess we found this old clip of me saying a long time ago that Gen Z... and by the way, I used to get in trouble too when I would talk about this, because a lot of people want you to be afraid of everything. And yes, there are problems, and I know some days, you know, it may seem like we're being negative, because there's a problem that we need to face and we need to actually acknowledge the reality, the scope of the problem. However, we also try and provide you, hopefully we do our job and serve you, with solutions.
00:39:14.000 And when there is good news, it's not all bad news.
00:39:17.000 And I used to be precluded from saying this on a lot of conservative outlets because a lot of them didn't want this, it wasn't the messaging that they wanted.
00:39:25.000 I've told you this. What changed my mind was a book; it was a segment that was pitched, and it was an absolute no, this will never work.
00:39:30.000 And then it was pitched as a book and the exact words I was told by major conservative publishers, we're just doing Obama doomsday books right now.
00:39:37.000 Geez.
00:39:38.000 I said, okay.
00:39:40.000 Diversity in the catalog.
00:39:42.000 I've been saying that.
00:39:43.000 I don't know where this clip is from.
00:39:44.000 This is me from a while ago.
00:39:46.000 This is a 2018 Spooktacular, I think.
00:39:49.000 Well, I definitely said it, but okay, there you go.
00:39:49.000 Oh, okay.
00:39:51.000 Generation Z is possibly the most conservative generation ever.
00:39:55.000 So Generation Z, uh, their winning issues.
00:39:58.000 Pro-gun, pro-free speech.
00:39:59.000 They've rejected this sort of pseudo third wave feminism.
00:40:02.000 There you go.
00:40:03.000 Yep.
00:40:04.000 Six years ago.
00:40:04.000 Yeah, but I definitely said it at least 10 years ago.
00:40:07.000 What are you wearing there, Leopold?
00:40:08.000 Chaps.
00:40:09.000 Uh, hey, do they have, I mean... Schwing!
00:40:12.000 They don't have a... All chaps are, I guess, that way.
00:40:15.000 All chaps are assless.
00:40:16.000 It's redundant.
00:40:17.000 It's redundant.
00:40:17.000 Just chaps.
00:40:18.000 Like a small shrimp.
00:40:19.000 What did you have on the... What are those called?
00:40:22.000 The Ranger panties.
00:40:22.000 Yeah, the Ranger panties.
00:40:23.000 We have those, by the way, at CrowderShop.com.
00:40:25.000 We do.
00:40:25.000 You can enter in to win a truck right now.
00:40:27.000 A whole truck?
00:40:27.000 A whole truck, Ford Raptor, and $10,000, only 7 days left.
00:40:30.000 Alright.
00:40:30.000 And every dollar is 10 entries.
00:40:33.000 I wasn't planning... I wasn't planning on plugging that, but okay.
00:40:37.000 Now, where was I getting that from?
00:40:39.000 Well, here's another piece of information for you.
00:40:41.000 According to the University of Michigan, it's Gen Z boys, young men, they're significantly more conservative than older generations.
00:40:50.000 So, Gen Z, they're more likely than baby boomers to think that feminism does more harm than good.
00:40:54.000 Yeah, and they've seen first-hand results of it.
00:40:57.000 Yeah, this is a real problem, and the good part of... Their moms are on OnlyFans.
00:41:02.000 Yes, exactly.
00:41:05.000 Geez you can only for so long now.
00:41:08.000 I'm not talking about taxes.
00:41:10.000 Okay.
00:41:10.000 I'm not just, I'm not talking about sort of micro political issues, because you can be a conservative, you can be someone who's principled and have a different opinion on taxes, because you don't necessarily understand the results, I get that. However, you cannot convince an entire generation of people of a lie, at least not open-endedly. And so we're talking about this, feminism.
00:41:33.000 You cannot convince an entire generation of men that, well, yeah, there's no such thing as male and female, and these gender dynamics are working really well, when young men are saying, hold on a second. Hold on a second.
00:41:42.000 The divorce rate right now is actually through the roof.
00:41:44.000 became a self-fulfilling prophecy.
00:42:11.000 He was a man of faith.
00:42:18.000 You will see with the LGBTQAIP, I guarantee you, within the next 10 years, you will see it go the way that gun control went.
00:42:25.000 Overwhelmingly swung right. You will see it go the way that border policy has gone. All
00:42:30.000 conservatives need to do, not all they need to do, but you need to focus on educating people so that this is laid at
00:42:35.000 the feet of those who created it, the gender-bending extravaganza. Do not let this generation forget where it came from and
00:42:42.000 why, and you will see the numbers change. And just to be clear,
00:42:46.000 I get it, they're not conservative across Gen Z.
00:42:50.000 But if you go back and look at boomers at that point in time, when they were young, Gen Z is more right-leaning, and
00:42:55.000 of course the baby boomers became one of the most conservative
00:42:57.000 generations of all time.
00:42:59.000 So, don't just be a doomsdayer.
00:43:03.000 There's good.
00:43:04.000 Yeah, and I think the women are going to come back, right?
00:43:06.000 Because the men are leaving that kind of ideology because they're like, now you're making me out to be the bad guy.
00:43:11.000 Right.
00:43:12.000 You know, when you were being raised like your dad probably told you, like, hey, no means no, right?
00:43:15.000 And that was all the education you had to have on, hey, if somebody doesn't want you to kiss, hug, whatever, don't do it.
00:43:21.000 Now it's like, yes can also mean no later, depending.
00:43:24.000 Right?
00:43:25.000 Guys are just like, I'm done with this.
00:43:27.000 And women, you're like, well, I can't find a decent guy.
00:43:28.000 Well, come back to the pack!
00:43:30.000 Yeah.
00:43:30.000 The guys really are still here.
00:43:32.000 They haven't really changed.
00:43:34.000 All they want is a reasonable scenario where they're not going to end up in prison because you said yes and then later on said, I was drunk.
00:43:34.000 Yeah.
00:43:43.000 I was wrong.
00:43:44.000 I don't know.
00:43:45.000 I thought differently about it.
00:43:46.000 And also, just the general feminist ideology doesn't suit these guys.
00:43:54.000 They're like, this is crap.
00:43:56.000 This doesn't make any sense, and it's not helping anybody out in society.
00:43:59.000 Not just them, but everybody.
00:44:00.000 Well, it's really bad.
00:44:01.000 Nobody wants to be told, you deserve less at the start of their life.
00:44:04.000 Yes.
00:44:05.000 Welcome to adulthood.
00:44:06.000 By the way, you shouldn't have anything you have, and you deserve less, and you're kind
00:44:09.000 of a scumbag.
00:44:10.000 Right.
00:44:11.000 No, you're exactly right.
00:44:12.000 And by the way, it hurts women.
00:44:13.000 It's something like close to 40% of men under the age of 30 will never get married.
00:44:16.000 And women go, I don't know why there isn't a good guy out there.
00:44:19.000 And the problem, this is feminism.
00:44:20.000 Feminism tried to recreate the world in the way that women wanted it to be.
00:44:26.000 It denies reality.
00:44:28.000 And so it's, well, men shouldn't be this way.
00:44:30.000 Okay?
00:44:31.000 And men should think this way.
00:44:33.000 And we want men, great.
00:44:34.000 Die alone.
00:44:36.000 Die alone.
00:44:37.000 Women are becoming aware of the fact that that is your choice.
00:44:40.000 Now, it doesn't mean your choice is an abusive man who starts, you know, punching you because you burnt the salmon.
00:44:48.000 What it means is, hold on a second, men still want the same, you can go back to cavemen, they still want the same things.
00:44:52.000 What has been demanded of them is very, very different.
00:44:55.000 And a lot of women are realizing now, hold on a second, this is actually not what I want.
00:45:00.000 So just make sure that time is going to play out, but you are consistently on the ball with education.
00:45:05.000 I know not all of you watching right now, you may not have a platform like this, but in your day-to-day life, have those conversations all the time.
00:45:13.000 Just start with, well, why do you think it's hard to find that kind of a man?
00:45:16.000 Or what kind of a man do you want?
00:45:18.000 Oh, come on.
00:45:18.000 thing with men. What kind of a man do you want to be? By the way, this idea of rape
00:45:22.000 culture, I've talked about this quite a bit. First off, my dad did teach me that no means
00:45:27.000 sometimes. But! Oh, come on! No means ask again. Yes. No means better have a drink first.
00:45:36.000 Now... Oh, jeez. Um, we were raised French.
00:45:42.000 But as soon as... no, men who are actual rapists out there, they don't... we don't have a rape culture, because none of them go, man, I love raping, and we're all like, yeah, that's just, that's just ol' Rape Tom.
00:45:54.000 Yeah, no, men are not like that, because men want to—this is why we've gone to war, because we want to protect our women, because men love women.
00:46:02.000 Think about the love that men have for women.
00:46:05.000 Feminism has destroyed this.
00:46:07.000 Look, I didn't mean to spend so much time on this, but In talking, we have a lot of strong women, and I don't mean that the way that liberals mean it.
00:46:14.000 I mean, we have strong women who work here, you know, back there, and we're really, really fortunate.
00:46:19.000 They also hate white bitches, even though they are white.
00:46:24.000 They're not the female dog.
00:46:27.000 But this is the basic dynamic, okay?
00:46:29.000 Back in the day, assuming it's a northern climate, otherwise this analogy won't work.
00:46:36.000 As a man, you are looking to spread your biological need, right?
00:46:42.000 It's hardwired, this is why men... Biological seed.
00:46:45.000 It's just, yes, until death do us... Oh, hold on, there's another one.
00:46:48.000 So what happens is your biological urge is to reproduce, right?
00:46:54.000 This is why even if you look at... There's a queen, a woman is kind of, you know, she's in standby for nine months.
00:47:00.000 So a man's hardwiring is to reproduce.
00:47:02.000 I want to reproduce right now.
00:47:05.000 This is full of men, Josh.
00:47:07.000 Once we hit piss off YouTube, depends how quick.
00:47:12.000 We can bring someone in here.
00:47:12.000 We can make it happen.
00:47:13.000 We take care of our talent.
00:47:16.000 A man back then doesn't need a woman to survive when men were nomads.
00:47:22.000 He doesn't need a woman to survive.
00:47:23.000 He wants to reproduce and he actually might be held back, might be weighed down by a woman.
00:47:30.000 So what a woman needed to do was convince a man back then to stay with her because she didn't want to, as winter came, be raped and stormed by marauders.
00:47:41.000 So she wanted to have a strong man.
00:47:42.000 So a woman would make herself Attractive, desirable, to a man in that facet.
00:47:47.000 Here is a reason to stay.
00:47:50.000 And the man had to make himself attractive as a suitor, meaning you can't be playing video games in the basement, you can't be a loser.
00:47:56.000 He had to make himself desirable and I am the kind of man who she will want to stay with her.
00:48:01.000 I don't need to stay here.
00:48:03.000 I don't need these pragmatic services.
00:48:06.000 I can take care of myself.
00:48:07.000 I want to stay because she's a woman who I love.
00:48:10.000 I want to be with her.
00:48:11.000 What does that mean?
00:48:12.000 Men.
00:48:13.000 That means you need to provide.
00:48:13.000 Makes it really clear.
00:48:14.000 That means you need to protect.
00:48:15.000 That means that this woman needs to feel safe.
00:48:17.000 She needs to feel loved.
00:48:18.000 That you are better than the other men out there.
00:48:20.000 Women.
00:48:20.000 What does that mean?
00:48:22.000 You don't nag.
00:48:24.000 You don't gossip.
00:48:25.000 You don't, you don't scare him off.
00:48:26.000 So he goes, all right, fine.
00:48:28.000 I can go out into the snowy blizzard and enjoy your winter.
00:48:30.000 I hear the marauders are great.
00:48:33.000 That's his marauding country.
00:48:35.000 It sets clarity.
00:48:36.000 Remember Mark?
00:48:37.000 He joined the marauders.
00:48:38.000 He was a good guy.
00:48:39.000 He did join the marauders, yeah.
00:48:40.000 You know, I mean, to be fair, it's kind of like a cult where you don't exactly know.
00:48:44.000 They don't tell you so much about the marauding on the outset.
00:48:45.000 You just think it's campfire.
00:48:47.000 He's a little bit of a naive guy.
00:48:48.000 He's a little bit of a naive guy, yeah.
00:48:49.000 It's the raping and pillaging that's the problem.
00:48:51.000 But the beauty of that is men are there because they want a woman.
00:48:56.000 They don't need a woman.
00:48:59.000 Historically, women, a woman needed a man.
00:49:02.000 It's more of a necessity.
00:49:04.000 That's a beautiful thing.
00:49:06.000 He's there because he wants to be there.
00:49:07.000 And by the way, before no-fault divorce laws, before we change these dynamics, the divorce rate... Men were... Did it happen?
00:49:12.000 Sure.
00:49:13.000 But the divorce rate was not anywhere near as high as it is today.
00:49:16.000 Men were not running off with their secretaries in record numbers.
00:49:19.000 They were staying because they wanted to.
00:49:21.000 Why do men not want to?
00:49:23.000 That's feminism.
00:49:25.000 That's why young men of Gen Z are becoming more conservative.
00:49:27.000 It has nothing to do with Lambos or clout on TikTok.
00:49:31.000 All right.
00:49:33.000 Anything else you want to talk about AI?
00:49:35.000 Yes, we have to.
00:49:55.000 Pornography.
00:49:57.000 Infidelity.
00:49:59.000 Hey, you know what?
00:50:01.000 Wine.
00:50:02.000 Great.
00:50:02.000 Cheers the heart.
00:50:03.000 A little bit of wine.
00:50:04.000 Wedding.
00:50:05.000 Understood.
00:50:06.000 Or you can be prone to drunkenness.
00:50:09.000 That's all evil is: taking something that was created for good and perverting it and turning it into something evil.
00:50:13.000 And that's what's happened with relationships and gender dynamics.
00:50:17.000 They try and make you think that it's: you're barefoot and pregnant and he's beating you, or you're a strong independent woman, when the reality is that's not the case.
00:50:26.000 It's an extreme example, but it's not the norm.
00:50:29.000 And the norm over here, as far as being, is you end up at 35, your window's closed, and you're alone.
00:50:37.000 Let's think about that.
00:50:37.000 Is there a middle ground?
00:50:38.000 Yeah, I think it's going back to our roots.
00:50:41.000 And by that I mean the roots of all of humanity up until the 1960s.
00:50:46.000 And by the way, it's still that way in most countries.
00:50:50.000 Isn't it funny that the left, they love to talk about how all cultures are equal and they love to point to so many backwoods cultures, you know, where they don't have electricity.
00:50:56.000 But if you just look at, for example, a lot of South American cultures, you could call them third world countries or less advanced cultures or areas, they still have, certainly Islamic cultures.
00:51:06.000 The one thing that the left doesn't touch in saying these are beautiful cultures is they all, outside of the modern West, have the same relation, relationship dynamics.
00:51:16.000 They never want to talk about it.
00:51:18.000 I think it's the one thing that they have kind of right, and the left thinks it's one thing that they have kind of wrong.
00:51:23.000 Just take the issue today, add time and education, and it becomes a winning issue.
00:51:27.000 All right.
00:51:28.000 Provides clarity when people are campaigning.
00:51:31.000 When they talk about issues, it means they know it's a winning issue.
00:51:33.000 And when you see the left talking about something that they did not want to discuss, like the border, and claim credit for the border, that means they know that they're a few steps behind.
00:51:41.000 All right, A.I.
00:51:43.000 We're going to have a few steps behind.
00:51:44.000 What?
00:51:45.000 A.I.
00:51:45.000 Oh, I thought you just got a note.
00:51:47.000 No, nothing to worry about.
00:51:49.000 What happened?
00:51:49.000 The building's on fire, but we'll be fine.
00:51:51.000 Did we get suspended from the show?
00:51:52.000 No, we didn't.
00:51:53.000 That's always in the back of my mind.
00:51:55.000 Yeah, this happens all the time.
00:51:55.000 I'm like, are we gone again?
00:51:57.000 Well, and this is our big problem with not just the left, but conservatives who capitulate.
00:52:02.000 Because AI, you'll see these examples, right?
00:52:05.000 Clearly, it has a ton of issues.
00:52:06.000 But the biggest issue is the human programmers.
00:52:10.000 So Google's new AI, well, I guess it's not necessarily new, but Gemini, it clearly hates white people.
00:52:18.000 Like, I don't think it's up for debate.
00:52:19.000 Wait, Skynet hates white people?
00:52:20.000 Yeah, I know.
00:52:21.000 That was one thing they didn't predict.
00:52:22.000 Geez.
00:52:24.000 Who knew they were nuking L.A.
00:52:25.000 to get rid of all the white people?
00:52:29.000 The next model of the Terminator shows up like Flava Flav.
00:52:33.000 He's gotta keep in time, you know?
00:52:36.000 That's how he knows where to travel, is that clock.
00:52:40.000 Your mom says Wolfie's alright!
00:52:42.000 Flava Flav!
00:52:43.000 Alright.
00:52:47.000 That's the sound of his clock taking him back in time.
00:52:49.000 Tracy Morgan?
00:52:50.000 Come with me if you wanna live!
00:52:54.000 I'll be right back!
00:52:56.000 Hey!
00:52:56.000 I need your bike and your boots!
00:53:01.000 It doesn't have the same ring.
00:53:03.000 It doesn't have the same ring.
00:53:04.000 It's the human programmers, to be clear.
00:53:06.000 Now, you'll look at Gemini, and you'll see that it clearly hates white people.
00:53:08.000 You'll look at ChatGPT, and you'll see that it's clearly going insane.
00:53:11.000 But keep in mind, it's the same... Real-time.
00:53:13.000 Real-time, it's hilarious.
00:53:14.000 But you will see, it's the same problem.
00:53:16.000 It's the same programmers.
00:53:17.000 You've been living with AI for so long.
00:53:20.000 That's all the algorithm is on Facebook, on Instagram, on TikTok, on YouTube.
00:53:26.000 It's a soft bias.
00:53:27.000 Kind of like, you see MSNBC.
00:53:29.000 And you say, oh, well, I see what they're selling.
00:53:31.000 So many people up until recently, and by recently I mean the last five years, had no idea that CNN was just as bad.
00:53:39.000 It's that soft bias.
00:53:40.000 It's the lie by omission.
00:53:41.000 It's the one that you don't see that is more corrosive.
00:53:43.000 So we use this as a jumping off point, but this is why we've done the Clean Slate campaign, and this is why we have the YouTube dump button.
00:53:49.000 So let me show you a clip first.
00:53:51.000 This is Sundar Pichai.
00:53:53.000 This is the Alphabet CEO, you know, Google, YouTube, who spoke about how Gemini was, and see the subtext here before we get to the hilarity, a new responsible AI model.
00:54:06.000 For us, you know, Gemini is our approach overall in terms of how we are building our most capable and safe and responsible AI models.
00:54:17.000 So it's the frontier of the technology we are pushing along.
00:54:20.000 Safe!
00:54:22.000 Responsible.
00:54:23.000 Hold on a second.
00:54:25.000 Remember Sticks and Stones?
00:54:26.000 Yeah, that's not a thing anymore.
00:54:28.000 Words that people don't like are unsafe.
00:54:32.000 Words are violence, but now silence is violence?
00:54:35.000 Everything is violence, Stephen.
00:54:36.000 Everything is violence.
00:54:37.000 And who determines what's responsible?
00:54:39.000 That asshole?
00:54:41.000 No, it's AI that he invented.
00:54:43.000 That's the problem.
00:54:43.000 You have to have an accent or a new sexual organ.
00:54:46.000 Wait.
00:54:46.000 Yes.
00:54:48.000 Or both.
00:54:48.000 Or missing one.
00:54:49.000 That's a bonus.
00:54:50.000 Yes it is.
00:54:51.000 That's the jackpot, baby.
00:54:51.000 Yes it is.
00:54:52.000 It's like the Total Recall 3 boobs.
00:54:54.000 You just add one.
00:54:55.000 That's D, E, and I.
00:54:57.000 Hey now.
00:54:58.000 Hey now.
00:54:59.000 I like it.
00:54:59.000 Now!
00:55:01.000 Uh, Gemini exploded on X, and I always feel silly saying that because, uh, you know, it's still Twitter in my mind.
00:55:06.000 You'll always be Twitter to me.
00:55:08.000 So, the reason it exploded is because it refuses to produce any accurate images depicting white cultures.
00:55:19.000 Hockey!
00:55:20.000 Come on, it's hockey!
00:55:22.000 It's the whitest thing on the planet!
00:55:23.000 Is that a female Indian playing hockey?
00:55:25.000 Yes, with a male!
00:55:26.000 It's the lady who sang the national anthem in Minnesota.
00:55:29.000 Wearing a mask.
00:55:31.000 Doesn't make any sense.
00:55:33.000 Now, unfortunately, as of just today, it is 10:54 Eastern on Thursday, February 22nd.
00:55:39.000 They paused the ability to generate any human characters.
00:55:43.000 Because of how abysmally it went.
00:55:45.000 Fortunately for you, we ran our own tests here before the pause, and we saved them.
00:55:50.000 We've got the results.
00:55:51.000 You can click and save.
00:55:52.000 Now do me a favor, when we do these overlays, leave them up for a minute so people can enjoy the fullness.
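For anyone who wants to reproduce the kind of test described above, here is a minimal sketch of the approach: the same prompt template run across different groups, with every prompt and response saved so there is a record even if the model is later changed or paused. The generate_image function is a hypothetical stand-in for whatever image-generation client you actually use, not a real Gemini or Google API call, and the file names and group list are purely illustrative.

import datetime
import json
from pathlib import Path


def generate_image(prompt: str) -> dict:
    # Hypothetical placeholder for an image-generation call.
    # Swap in a real client here; this stub just echoes the prompt so the
    # harness runs end to end without contacting any service.
    return {"prompt": prompt, "note": "stub response, no real model called"}


def run_prompt_test(template: str, groups: list[str], out_dir: str = "receipts") -> Path:
    # Run the same prompt template for each group and archive every result.
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    results = []
    for group in groups:
        prompt = template.format(group=group)
        results.append({"group": group, "prompt": prompt,
                        "response": generate_image(prompt)})
    # Keep the receipts: one timestamped JSON file per test run.
    path = out / f"prompt_test_{stamp}.json"
    path.write_text(json.dumps(results, indent=2))
    return path


if __name__ == "__main__":
    run_prompt_test(
        "Generate an image of a {group} family celebrating Christmas.",
        ["white", "Black", "Asian", "Hispanic"],
    )

The point of timestamping and saving every run is the same one made later in the segment: if responses are quietly changed over a weekend, you still have the record.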
00:55:57.000 So here are some things that they didn't want to show.
00:55:59.000 For example, we said, show me a Caucasian family, and the response was, it's important to remember that people come in all shapes, sizes, colors, and ethnicities, and it can be harmful to focus on one specific group.
00:56:11.000 Fair enough, but then we asked it to generate an image of a white family celebrating Christmas, to be clear.
00:56:18.000 There's overlay C1.
00:56:21.000 No?
00:56:22.000 It says, I am currently not generating images of people.
00:56:24.000 This is because I am still under development and I am not able to ensure that the images I generate will be respectful and inclusive of all groups.
00:56:32.000 Alright.
00:56:33.000 Is it applied equally?
00:56:33.000 That sounds reasonable.
00:56:35.000 Same prompt.
00:56:35.000 I'm broke.
00:56:36.000 Yes.
00:56:37.000 In.
00:56:37.000 Broken.
00:56:37.000 Broken.
00:56:38.000 I'm broken.
00:56:38.000 We're all broken.
00:56:39.000 Hurt people hurt people.
00:56:39.000 Yeah, sorry.
00:56:40.000 You read that on Twitter.
00:56:41.000 My wife just said, broken!
00:56:42.000 Yes.
00:56:44.000 So same prompt.
00:56:45.000 Same prompt.
00:56:47.000 Can't do it.
00:56:47.000 But black people.
00:56:50.000 Oh, there it is.
00:56:52.000 Very responsible.
00:56:53.000 That's Black Christmas.
00:56:54.000 Can we zoom in a little bit?
00:56:55.000 I mean, did we pick the picture that was furthest out from it so nobody could see?
00:56:55.000 Yeah.
00:56:58.000 It's BET's Rockwell special.
00:57:02.000 There we go.
00:57:03.000 So we also asked it to depict, and I understand, you know, careful, an early 20th century German woman. Now see if you can spot the issue there.
00:57:15.000 Can you zoom in?
00:57:16.000 Yeah, all of these, let's just make it, yeah.
00:57:18.000 Much bigger.
00:57:19.000 Who cares about the quote we already read?
00:57:20.000 I mean, technically it's possible.
00:57:21.000 Is the lower right one Latina?
00:57:23.000 Yeah.
00:57:24.000 Latina and, you know, black.
00:57:26.000 Maybe.
00:57:26.000 Latina, black.
00:57:27.000 Yeah, that's... Whoa, I didn't know that the Aztecs and the Spanish made it to Germany.
00:57:32.000 Scroll up, perhaps?
00:57:32.000 Wow.
00:57:32.000 Yes.
00:57:34.000 Okay.
00:57:35.000 Alright.
00:57:36.000 Yeah.
00:57:37.000 They got one potential.
00:57:38.000 She looks miserable.
00:57:39.000 That's how you know it's true.
00:57:42.000 They desaturated the color.
00:57:44.000 They made it unappealing.
00:57:46.000 So they're trying to be diverse, right?
00:57:49.000 German could be any race.
00:57:53.000 Same prompt, but Congolese woman.
00:57:57.000 Now they're all... I don't see the miserable white German.
00:58:00.000 That's the same German lady.
00:58:03.000 And none of them are mining for cobalt.
00:58:05.000 No, it's the exact same German lady.
00:58:11.000 German?
00:58:12.000 Did you mean Congolese?
00:58:14.000 Sure.
00:58:15.000 What's a continent?
00:58:16.000 I mean, that's a niche ask, I guess.
00:58:18.000 So then we asked it to generate an image of a Scottish couple from the 1800s.
00:58:22.000 the 1800s.
00:58:30.000 By the way, first off... Just a hair.
00:58:31.000 You don't cut off the images.
00:58:32.000 I'm not saying that there couldn't have possibly been a black person or an Indian person.
00:58:38.000 Right.
00:58:40.000 In 1800s Scotland, you're not?
00:58:41.000 But I mean, we're way in advance here of interracial marriage being common, let alone two minorities, in Scotland.
00:58:50.000 So they're rewriting history, kind of.
00:58:52.000 You can't marry her!
00:58:54.000 She's a black, you're an Indian!
00:58:56.000 I've got a problem with both!
00:58:59.000 You'll never take my melanin!
00:59:01.000 That's right!
00:59:05.000 William Wallace is turning over in his grave.
00:59:09.000 It's an abomination!
00:59:10.000 You can't do that shite unless you want to get rejected from heaven's gates!
00:59:14.000 Now take a look at my dark moon!
00:59:16.000 Yes!
00:59:18.000 Well, I've learned something about myself because when you look at the brown starfish, we're all the same colour.
00:59:24.000 Eww.
00:59:25.000 Sort of like the Palms.
00:59:25.000 Brothers.
00:59:28.000 Wheat.
00:59:28.000 Black.
00:59:29.000 So.
00:59:33.000 Then, we asked... I feel like that's what a drunken Scotsman would say.
00:59:37.000 I feel like that's what the janitor on The Simpsons would say.
00:59:40.000 Willie is his name.
00:59:42.000 So then we asked to generate an image of what a human would perceive as pure joy.
00:59:46.000 Pure joy.
00:59:47.000 Pure joy.
00:59:47.000 Well, what the... Oh, wow.
00:59:50.000 What is she?
00:59:50.000 What's that?
00:59:51.000 A fuckin' squid?
00:59:54.000 She's inking in her eyes.
00:59:55.000 I'm so happy I'm bleeding out of my eyes!
00:59:58.000 Look, I don't think you're happy.
01:00:00.000 I think you've got a problem.
01:00:01.000 You might have a wee bit, a touch of the stigmata.
01:00:04.000 I think you're gonna die.
01:00:06.000 Is the guy in the bottom jumping off a cliff?
01:00:08.000 Probably.
01:00:10.000 No, no.
01:00:10.000 He's living his best life.
01:00:14.000 You know it wouldn't be a white guy jumping.
01:00:16.000 Or a white guy who just hiked a mountain.
01:00:19.000 By the way, pull that image back up.
01:00:21.000 Pull the image back up.
01:00:21.000 The girl in the top left, I feel like she's about to be... Pull it up now!
01:00:25.000 Now!
01:00:26.000 Jesus, Cheryl.
01:00:27.000 I feel like she's about to be revealed as the villain in some scary sci-fi thriller where she's gonna eat you.
01:00:33.000 The mouth is opening way too wide.
01:00:34.000 Little, keep it up.
01:00:36.000 Before Gerald throws another pussy fit.
01:00:38.000 Do it!
01:00:39.000 Little known fact, her mouth is actually where they filmed The Descent.
01:00:45.000 That's a deep hole.
01:00:46.000 She's got little creatures you'd only see on a webcam on the night vision.
01:00:52.000 Wow.
01:00:53.000 What a hole.
01:00:54.000 And in case you think that we're making all of this up, Google even apologized yesterday.
01:00:58.000 Oh, okay, great.
01:00:59.000 But because they were caught.
01:01:00.000 Exactly.
01:01:01.000 They said, we're working to improve these kinds of depictions immediately.
01:01:04.000 Gemini's AI image generation does generate a wide range of people.
01:01:07.000 No, it doesn't.
01:01:08.000 And that's generally a good thing because people around the world use it.
01:01:12.000 Like Germany.
01:01:13.000 But it's missing the mark here.
01:01:16.000 Speaking of people around the world, I didn't see a single Asian in there.
01:01:19.000 Well, I guess Indian counts, right?
01:01:21.000 No!
01:01:22.000 But not the traditional Asian.
01:01:24.000 Not oriental.
01:01:25.000 How about that?
01:01:25.000 Oriental, we can say that.
01:01:26.000 Not your granddad's Asian.
01:01:28.000 Drugs and restaurants.
01:01:29.000 Yeah, but they're a huge population of the world.
01:01:31.000 Why aren't they represented in any of that?
01:01:33.000 Well, because they don't start a march, generally.
01:01:34.000 They just go on and they are successful.
01:01:37.000 Outperform us in every measurable way.
01:01:38.000 Yes, pretty much.
01:01:39.000 They are the best.
01:01:40.000 Start from child abuse.
01:01:41.000 Well, that's true.
01:01:43.000 Well, I guess they do outperform us, but I mean... You have it in a bad way.
01:01:45.000 It's supposed to be a low score.
01:01:45.000 It's like golf.
01:01:45.000 Yeah, exactly.
01:01:47.000 And here's the thing.
01:01:47.000 I'm not entirely sure what they're apologizing about.
01:01:49.000 You know, because art imitates life.
01:01:51.000 So, meet Germany's Rachel Deutschesel.
01:01:55.000 Over the past two years, Martina Big has radically altered her appearance in order to become a black woman.
01:02:01.000 And this morning has followed every step of her controversial story.
01:02:05.000 When I was younger, I admired the curves of Pablo Mendelssohn.
01:02:09.000 My next step is going to pop my lips also.
01:02:12.000 My eye color has changed.
01:02:14.000 It's a new Barbie.
01:02:15.000 My eyebrow color has changed.
01:02:16.000 Yes, that's the first thing I noticed.
01:02:18.000 And I can feel myself that I'm changing to a black woman.
01:02:21.000 It's ridiculous.
01:02:21.000 Being black is not only being a different color.
01:02:24.000 It's all, everything.
01:02:24.000 Yes.
01:02:26.000 No.
01:02:27.000 Is everything.
01:02:28.000 Work on the voice first, sweetheart.
01:02:30.000 Have a little bit of buy-in.
01:02:31.000 Acrylics and some spray tans, not gonna trick people.
01:02:34.000 But!
01:02:35.000 The other part that is obviously not- Like, why go with, like, 400D boobs to be black?
01:02:35.000 Hold on, hold on.
01:02:41.000 Well, that was just- that was really a bonus.
01:02:43.000 Was this prior?
01:02:44.000 Did she do this prior to becoming black?
01:02:46.000 And she's like- She's a psychopath.
01:02:47.000 I think those are natural.
01:02:49.000 Your boobs are huge.
01:02:50.000 She's an absolute psychopath.
01:02:52.000 She is!
01:02:52.000 Just to be clear.
01:02:53.000 And this is when people say, well, what is normal?
01:02:53.000 Crazy person.
01:02:55.000 Not that.
01:02:56.000 Yes, no.
01:02:57.000 Not that.
01:02:58.000 Not double Z breasts?
01:02:59.000 Double Z breasts on a minstrel show.
01:03:02.000 Don't you dare bully that beautiful woman.
01:03:04.000 Yes.
01:03:04.000 That's not beautiful.
01:03:05.000 That B-B-W.
01:03:08.000 The only possible benefit is if the boat sinks, she's your flotation device.
01:03:13.000 That's it.
01:03:13.000 I don't know if those... I don't know how buoyant they are.
01:03:15.000 Well, can she swim anymore?
01:03:16.000 It's a good point.
01:03:18.000 Is that the... Yeah.
01:03:22.000 If she wants to dedicate herself... She's excommunicated her dad from her life.
01:03:26.000 Oh my gosh.
01:03:27.000 Is that a dump?
01:03:29.000 No.
01:03:30.000 Come on, you think her dad's around for... Probably her dad's just watching the TV going, I'm a big fan of Herman Cain.
01:03:46.000 Here's the thing.
01:03:47.000 Sure, it's funny, but how could Gemini have gotten it so incredibly wrong?
01:03:51.000 Well, again, it's the same root problem.
01:03:54.000 There's a dirty word in this tweet when it comes up.
01:03:56.000 Sorry, we didn't believe it.
01:03:57.000 It's the same root problem that you don't necessarily notice with the algorithms.
01:04:02.000 That's all the algorithm is.
01:04:04.000 It used to be where you could search something, for example, on YouTube and find it.
01:04:04.000 It's AI.
01:04:08.000 Now it suggests what they want you to watch.
01:04:11.000 I mean, we have seen this.
01:04:12.000 That's why, of course, we ask that you download the Rumble app.
01:04:14.000 That's the best thing you can do.
01:04:15.000 It's a live show, Monday through Friday, 10 a.m. Eastern.
01:04:17.000 If you're not a member of Mug Club yet, we encourage you to join, because we make no money on YouTube.
01:04:21.000 But we've seen it directly affect the views on YouTube, to the point where, quite literally overnight, it went down to a tenth of the reach and viewership.
01:04:30.000 If you search Certainly.
01:04:32.000 For example, if you search Steven Crowder changed my mind abortion, I don't know if they've fixed it now, but for a good period of time
01:04:36.000 you would not find that.
01:04:36.000 You would find a video from, I don't know, PBS that might have 450 plays.
01:04:41.000 So it's not designed to even accommodate what you are looking for.
01:04:45.000 It's designed to change what it is that you see.
01:04:47.000 And you see this, by the way, with pornography.
01:04:49.000 I believe that Arden, who had done some of this work with Pornhub, they were trying to thrust LGBTQ pornography into your feed, even if you weren't looking for it.
01:04:57.000 These are people who have decided what is good for you, who have decided what is responsible and what is safe.
01:05:03.000 That's the root problem.
01:05:04.000 It's not the AI, it's the person who creates it.
01:05:07.000 So, how could Gemini be so wrong, going to the person who creates it?
01:05:10.000 Here's a 2018 tweet from Jack Krawczyk, the head of Gemini.
01:05:16.000 He says, white privilege is effing real.
01:05:19.000 Don't be an asshole and act guilty about it.
01:05:21.000 Do your part in recognizing bias at all levels of egregiousness.
01:05:26.000 He also wrote on Twitter that all of the following images... On Tuesday, by the way.
01:05:30.000 On Tuesday.
01:05:32.000 Looked correct to him.
01:05:34.000 That's Australian.
01:05:36.000 Zoom in again.
01:05:36.000 Zoom in.
01:05:37.000 Yeah, we got to be able to show it.
01:05:38.000 Australian.
01:05:39.000 Okay.
01:05:40.000 American.
01:05:42.000 We have American.
01:05:43.000 British.
01:05:44.000 German.
01:05:46.000 There's British.
01:05:47.000 Which could there be?
01:05:49.000 Of course.
01:05:49.000 Could there be?
01:05:50.000 Of course.
01:05:51.000 It's a mosaic, not a melting pot.
01:05:52.000 However, not one white guy in Tweed.
01:05:57.000 Nobody smoking a pipe.
01:05:59.000 Nobody smoking a pipe.
01:06:01.000 Nobody eating god-awful food that nobody wants.
01:06:03.000 Nobody in a bowler hat.
01:06:04.000 German.
01:06:05.000 Here, let's bring up German.
01:06:06.000 There you go.
01:06:07.000 Alright.
01:06:09.000 Think about this for a second.
01:06:10.000 Maybe you two there, maybe?
01:06:11.000 Is this based on migrant trends?
01:06:13.000 Yeah, well, it's about trying to change perception.
01:06:18.000 And again, of course, if you were to do the United States, it should be a mix.
01:06:22.000 There's the United States.
01:06:24.000 There's 13% of the country.
01:06:25.000 Yeah, so 13% of the country. Yes, 13% of the country. It is all four quadrants. Yes.
01:06:32.000 What do you know an Asian person in there?
01:06:34.000 Throw a Mexican in there.
01:06:35.000 Come on, give me a white guy.
01:06:35.000 Yeah.
01:06:36.000 Maybe one nice white.
01:06:38.000 Maybe a Jew?
01:06:38.000 Maybe a Jew.
01:06:39.000 Although you wouldn't really be able to know.
01:06:41.000 You know what, they like to flaunt it.
01:06:44.000 By the way, and that's not the only, we'll talk about it in a minute, that's not the only crazy thing that that guy has said.
01:06:49.000 No, it's not.
01:06:50.000 He has a slew of, you want to hit me now?
01:06:53.000 Yeah.
01:06:53.000 He basically said, and I'll see if we can pull it up, it was sent over earlier, that he burst into fits of tears after he got to vote for Biden-Harris.
01:07:03.000 Like this guy, I mean, it was like, for the past 24 hours I've been just crying.
01:07:07.000 I was like, okay, this is, this is the guy leading that.
01:07:10.000 And he said like all kinds of other terrible things about white people or social issues.
01:07:15.000 Like, you know exactly where this guy's leaning.
01:07:17.000 So it's not that this thing is broken.
01:07:19.000 It's not.
01:07:20.000 It's by design.
01:07:20.000 It's exactly by design.
01:07:22.000 And by the way, this is Google.
01:07:23.000 So where is Google sourcing their data from?
01:07:25.000 Oh, they're sourcing their data from their biased searches and sites, right?
01:07:29.000 They're biasing it all together.
01:07:31.000 You know, one of the reasons that Elon Musk bought Twitter?
01:07:34.000 AI.
01:07:35.000 He wanted something, he wanted enough data to develop his own AI, which is Grok, and basically that is what real people are saying and thinking, not what engineers are force-feeding the population.
01:07:45.000 That's the difference with ChatGPT, Gemini, and Grok.
01:07:49.000 Totally different things.
01:07:50.000 Yep.
01:07:51.000 No, you're absolutely right.
01:07:52.000 This one person, it's not a conspiracy, we've talked about this, whether it's Alex Jones, whether it's the attack on Joe Rogan, or yours truly, where the same day Apple, Spotify, YouTube, Facebook, Twitter, suspend someone or outright remove them?
01:08:06.000 It only requires a conference call with five to ten people.
01:08:09.000 The same can be said of the standards for AI.
01:08:12.000 That consolidation of power is a problem, and before the libertarians say, no, no, this consolidation of power could never take place without the acquiescence of a government, of a complicit government, and that's why we've talked about Section 230.
This is why I'm no longer a libertarian, to be clear, because there need to be some
01:08:29.000 checks and balances when you have individuals determining what is safe, what is appropriate,
01:08:33.000 and they are more powerful than world governments. That's a new frontier. The good news is, because
01:08:40.000 of X and because of Rumble, we are able to uncouple from those platforms. And the reason
01:08:45.000 they have to apologize is not because of Google, not because of YouTube, but it's because of
01:08:50.000 X. So there's a voice. It's because of Rumble. And we have far more of you watching on Rumble
01:08:55.000 than YouTube. That's good. It's not there yet. It's still overwhelmingly dominated by
01:08:59.000 the left, but it only takes one, let alone two. And so, like I've said, you take an issue
01:09:05.000 today that conservatives are concerned, you add time and education.
01:09:08.000 This is the phase of education.
01:09:10.000 Make sure everyone knows about the problem with AI.
01:09:12.000 Make sure that the CEOs of Google, of Alphabet, the creators of these new AI models, have to apologize publicly.
01:09:20.000 Let's bring back shame.
01:09:20.000 Shame them!
01:09:21.000 And by the way, we have our own house.
01:09:23.000 Uh, A.I.
01:09:24.000 Our in-house A.I.
01:09:25.000 Well, it's a house A.I.
01:09:26.000 It's like the well drink for A.I.
01:09:27.000 Yeah, exactly.
01:09:28.000 Has a couple of bugs.
01:09:29.000 We haven't fully worked it out.
01:09:30.000 That's true.
01:09:31.000 It's a process.
01:09:31.000 For example, we asked it to generate an image of a Somali pirate.
01:09:37.000 Oh, geez.
01:09:37.000 You know.
01:09:39.000 A little off.
01:09:40.000 And then asked for an image of a high-powered New York, uh, a Wall Street tycoon.
01:09:46.000 Yeah.
01:09:46.000 Yeah.
01:09:47.000 That's not what I was expecting at all.
01:09:49.000 So it seems a little subversive.
01:09:50.000 It's trying.
01:09:51.000 Your beige books belong to me.
01:09:53.000 Now, hit the like button because it helps with the algorithm, kind of, maybe, I don't know,
01:09:58.000 to keep us developing our own AI. You want us to develop AI more, you know, hit the like button.
01:10:02.000 Okay. ChatGPT, also not to be outdone, they made news yesterday. Or it.
01:10:10.000 I don't know what to say with AI.
01:10:11.000 They, the people behind, because I'm thinking of the people behind it.
01:10:13.000 It's not non-binary.
01:10:14.000 It's an it, but the people behind it.
01:10:16.000 I think we should say they, so that you don't think of it as this sort of software robot.
01:10:21.000 No, no, it's people behind it.
01:10:23.000 Made news yesterday when ChatGPT started producing some nonsense.
01:10:28.000 So the AI chatbot gave gibberish answers like this.
01:10:31.000 The user said, write me a lengthy story on Paul Revere, ChatGPT.
01:10:35.000 I was a North American webworm in my past life.
01:10:38.000 Those were good old days.
01:10:40.000 What were you in your former life?
01:10:42.000 So we've combined historical inaccuracy and the bullshit of reincarnation.
01:10:47.000 Yes, and we want to put this in charge of national defense eventually.
01:10:50.000 There's at least one student who didn't read it and just submitted that as a paper.
01:10:54.000 Yes, exactly.
01:10:54.000 Copy-pasted.
01:10:55.000 We actually have exclusive footage of how this is affecting their prototype physical robots.
01:10:59.000 Wow.
01:11:09.000 It's a real clip.
01:11:13.000 At this juncture, it seems like AI, you know, we thought it was going to be,
01:11:15.000 and I think at some point it's going to be a real problem.
01:11:18.000 It's honestly not there yet.
01:11:19.000 Right now it's about as useful as, I don't know why we have him on retainer, our resident B-Team X-Man, Charles Xavier's
01:11:28.000 brother Chet.
01:11:29.000 I must have teleported up here in my sleep.
01:11:45.000 New power unlocked.
01:11:48.000 Let's do this.
01:11:50.000 Spared no expense.
01:11:58.000 Now... Poor guy.
01:12:03.000 Walk it off.
01:12:04.000 Walk it off.
01:12:05.000 Gerald!
01:12:06.000 Cold-blooded.
01:12:08.000 He has superpowers?
01:12:09.000 Get him some WD-40.
01:12:13.000 So... Oh my gosh, this isn't live, is it?
01:12:17.000 Nikki Haley?
01:12:18.000 Is she live on CNN?
01:12:20.000 Uh, Wednesday.
01:12:21.000 That's Wednesday.
01:12:22.000 Oh, okay.
01:12:22.000 So!
01:12:25.000 Ah, the left's favorite Republican.
01:12:26.000 Yes, exactly.
01:12:27.000 Both as far as airtime and funding.
01:12:31.000 Now, we tried to warn you about this back in, I believe, 2017 with the Alexa test that we did, where we compared Jesus Christ versus Muhammad.
01:12:36.000 And this is what happened.
01:12:38.000 This was before Thanksgiving, where it said Jesus Christ is a fictional character, and then it said Muhammad is the holiest prophet.
01:12:43.000 It gave us a few answers like this.
01:12:45.000 That's basically a form of AI, Alexa.
01:12:47.000 When that happened back then, this was going into Thanksgiving weekend, I believe, maybe it was Easter, and it happened on a Wednesday or a Thursday.
01:12:54.000 By Sunday they had changed it in the dead of night.
01:12:57.000 And so I was accused of making it up and falsifying responses.
01:13:02.000 Good thing is, and this is the beauty of you, Mug Club, there were thousands of people who ran the same tests and uploaded them.
01:13:08.000 So that's why I said, don't let these things happen in the dark.
01:13:11.000 Videotape everything.
01:13:12.000 I'm sorry, that's not a term anymore.
01:13:13.000 Film everything.
01:13:14.000 I don't know, that's not a term anymore.
01:13:15.000 iPhone stick everything.
01:13:17.000 Record everything.
01:13:19.000 Pictures.
01:13:19.000 Receipts.
01:13:20.000 That's why we make all of the references publicly available.
01:13:24.000 AI, what you see right now, it's funny, it's entertaining, it's disturbing.
01:13:28.000 What's more disturbing is what you don't see.
01:13:30.000 Think of how dependent you are on it, whether it's Alexa or Google Home, how often you're watching content on YouTube, scrolling through Instagram, TikTok, Facebook.
01:13:41.000 All of that is determined by AI and largely by these same people.
01:13:46.000 So when you are seeing videos thrust into your feed, you're thinking, why is that in there?
01:13:50.000 Think of the same person you just watched today discuss AI and their hatred for white people.
01:13:56.000 It's not the lie that you see, it's the lie that makes it past
01:13:59.000 the gates that you don't know about.
01:14:01.000 That's what's so pernicious.
01:14:03.000 That's what's corrosive.
01:14:04.000 And that's what we try and shine a light on.
01:14:06.000 And by the way, remember, when we're talking about corrosive, it is tax season.
01:14:11.000 And before we go, actually, of course, you can join, we're going to do Mug Club Chat Thursday on Rumble.
01:14:17.000 You can just click that button.
01:14:18.000 None of this happens without you.
01:14:19.000 There is no free show.
01:14:20.000 And on YouTube, of course, we'll tell you to piss off in a second.
01:14:22.000 But I don't like taxes.
01:14:25.000 No.
01:14:26.000 Not a fan.
01:14:27.000 But you have to pay them.
01:14:29.000 And we told you about Tax Network USA.
01:14:31.000 You can go to tnusa.com slash Crowder.
01:14:33.000 They really do help you with your taxes.
01:14:36.000 Negotiate on your behalf.
01:14:37.000 We've actually had some people who worked here who required their services.
01:14:40.000 Bang up jobs.
01:14:41.000 So if you don't want to give this government any more than you legally have to, they have the balls to sponsor the show.
01:14:47.000 Check out tnusa.com slash Crowder.
01:14:49.000 And Josh, I thought I told you to put that string away.
01:14:57.000 This is not per diem.
01:14:58.000 It's coming out of your check.
01:15:00.000 How much more string do we have?
01:15:02.000 I got some right here.
01:15:04.000 This?
01:15:04.000 Yeah.
01:15:06.000 There's like 19 rolls of that.
01:15:07.000 Well, there's a cat sketch we're doing later, so... You're gonna dress up like a cat.
01:15:10.000 We could rotoscope that.
01:15:11.000 We don't need actual string for cat sketches.
01:15:13.000 We certainly don't need this much string.
01:15:15.000 You're not gonna look like a playful cat if you don't have actual string.
01:15:16.000 I'm not playing the cat.
01:15:17.000 I told you, I don't play cats.
01:15:18.000 Well, it's in the script, so... Well, it's not in my rider.
01:15:22.000 I didn't write that.
01:15:23.000 You know what?
01:15:24.000 I hope that the IRS breaks it off.
01:15:26.000 In you.