Dems' "Dark Brandon" Scare Tactics, And Reality of AI Facial Recognition Tech, with Jesse Kelly and Kashmir Hill | Ep. 696
Episode Stats
Length
1 hour and 36 minutes
Words per Minute
189.5
Summary
12 Democratic lawmakers crossed the aisle to side with Republicans in New Hampshire on a bill banning gender affirmation surgeries for minors. Plus, Vivek Ramaswamy once again puts on a masterclass on how to deal with the media.
Transcript
00:00:00.680
Welcome to The Megyn Kelly Show, live on SiriusXM channel 111 every weekday at noon East.
00:00:12.040
Hey everyone, I'm Megyn Kelly. Welcome to The Megyn Kelly Show and happy Friday.
00:00:16.820
It's so great to make it to Friday, isn't it? I especially feel it for my kids.
00:00:20.760
They're just so happy. Here where we live, it's a shorter day on Fridays,
00:00:24.980
which makes it even better for them. And I can feel their relief. I feel secondhand relief.
00:00:29.740
Although I got to tell you, when I was off for two weeks, I was missing you guys. I was missing you.
00:00:34.360
I was missing my team and I was missing the news and I was missing you and doing the show.
00:00:38.240
So, you know, vacation's good in the right amount. Meantime, we begin the show with sanity prevailing
00:00:44.820
in New Hampshire. God bless the lawmakers in New Hampshire. This actually gave me hope for the
00:00:51.960
future of our country. Twelve Democratic lawmakers crossed the aisle to side with Republicans,
00:00:58.640
two of whom voted the wrong way. The Republicans were not unanimous; shame on you two. The vote last night was to
00:01:04.600
pass a bill banning so-called gender affirming surgeries for minors. That's not what they are.
00:01:11.760
They're not gender affirming. They cut off children's body parts, their genitals before they can even
00:01:18.460
legally smoke a cigarette because they suffer from some gender confusion. What sane society allows this?
00:01:27.260
Um, and yet they passed a ban on it in the New Hampshire House. It still has to go to the Senate.
00:01:34.000
We believe they have the votes and then we'll see what Sununu does in New Hampshire. This is Chris
00:01:39.220
Christie's big state. This is where he's polling between 10 and 11%. He's against these bans.
00:01:44.860
Pay attention, New Hampshirites. Your favorite at 10% is against you on this issue. If New Hampshire
00:01:54.340
passes this ban, it'll be the 21st state in the union to do so. And these bans should be unanimous.
00:02:01.000
They should be in all 50 states. Unfortunately, they didn't go so far as to include a ban
00:02:06.680
on puberty blockers and cross-sex hormones, which sterilize children. I mean, do they know that?
00:02:13.920
Are they paying attention? Because what, in what world is it okay to sterilize an 11-year-old?
00:02:19.380
Okay. Um, one step at a time. Uh, I have nothing but praise today for those Dems who crossed the
00:02:26.140
aisle to make this happen. Wait until you hear what happened to one progressive lawmaker
00:02:30.520
when he dared to stand firm against the mob. Plus, Vivek Ramaswamy once again puts on a masterclass
00:02:39.100
on how to deal with the media. Joining me now to discuss it all, Jesse Kelly, host of the Jesse
00:02:47.020
Kelly Show and I'm Right over on The First TV. Jesse, welcome back to the show. Great to have you.
00:02:52.760
It is great to be here, Megyn, although I'm going to push back on you on something there.
00:02:56.680
You said it's great that the kids are almost out of school on Friday. I like when my kids are in
00:03:01.900
school, Megyn. My house is so quiet. My wife and I were talking this morning and there was nobody
00:03:06.100
interrupting. There was no screaming. There was no messes anywhere. It was just like a real
00:03:10.440
conversation. I love when my kids are in school. You know, you have a good point. I was just talking
00:03:15.600
about this with friends in the summer, you know, when they're off full time. Yeah. Let's just say
00:03:20.540
it's hard to find private adult time. And when you have, like, my youngest is 10, and you get this knock
00:03:26.020
on the door, like, what's going on in there? What are you doing? So thankfully, I take your point.
00:03:37.920
Our kids have been scolded enough to know if the door is closed,
00:03:45.900
walk away. OK, so maybe we won't have a school day on Monday, by the way, because snow day,
00:03:52.140
because we're expecting snow here in the Northeast. Remember snow? Remember how we used to get snow?
00:03:58.180
I am going to join you in the Northeast actually next week. Well, not join you, but I'll be in the
00:04:02.340
Northeast next week, Megyn. I'm used to living in Houston now. So basically, I'm Charmin soft
00:04:07.240
when it comes to the cold. I grew up in Montana; I used to be all about that life. And now
00:04:12.160
the second it hits 40, I'm throwing on winter clothes and hats. It's my wife and me bundled up. So I'll be
00:04:19.380
dying. Oh, no. I have to say, like, I really miss snow. I as a kid who spent her first 10 years in
00:04:24.980
Syracuse and then moved to Albany, then went back to Syracuse for college. I miss snow. I used to have
00:04:31.500
winters, like, where I couldn't go outside because the snow was over my head. You know, my
00:04:35.900
parents had to be there holding my hand so that I could breathe. Now it's like every snowstorm gets
00:04:43.040
reduced from eight inches to three inches and then comes out at a half an inch or passes you by
00:04:48.260
altogether. It just feels so lame now. Megyn, you know why you miss snow and I don't miss snow
00:04:54.560
because you were a daughter and daughters get spoiled by their parents. Not that I'm saying
00:04:58.840
you were spoiled, but daughters don't get woken up at 5 a.m. to go shovel the driveway like my dad
00:05:04.300
did with me in Montana before school. 5 a.m. He'd walk in and he'd laugh. He'd hand me a snow
00:05:09.580
shovel and say, better bundle up. It's cold out there. I was just shoveling the snow. That didn't
00:05:14.460
happen to young Megyn Kelly. That's why you miss it and I don't. You're right. Never,
00:05:18.840
never once. I've never had to do it. And now it's fun because I have a little like a walkway
00:05:24.740
to get to my studio. And we even had it heated, because Doug was like, we'll make the kids
00:05:30.960
shovel the snow off of that. I was like, they're never going to do it. It's going
00:05:36.160
to be pre-school; they're not going to go out there. I am not mean enough to them. So I'm like,
00:05:41.540
we're putting the heaters underneath the tiles. So I get out there because you're right. We're
00:05:45.520
raising them soft these days. That's right. All right. Well, anyway, we may be having snow this
00:05:51.480
weekend and my fingers crossed that it actually happens. Something else that's happening today
00:05:55.500
going into the weekend is not snow but the opposite: darkness. Not the light of the beautiful
00:06:00.500
snowflakes, but the darkness of Brandon, who returns to the stump today as Joe Biden officially
00:06:06.440
starts doing 2024 campaign events and is going reportedly to deliver the message that democracy
00:06:14.820
depends on him, on his reelection. And now we see the strategy unfolding. The New York Times did a
00:06:21.920
long piece on it this week, and we're going to hear more of it today. The Biden strategy of running
00:06:26.860
on Bidenomics and his record is failing. You've seen Trump is leading him in all seven of the swing
00:06:32.160
states among likely voters, among registered voters. He's just crushing. So they've got to
00:06:38.140
do something different. And so they're going to go back with a strategy, Jesse, that let's face it,
00:06:42.560
has worked for the Democrats in '20, in '22, and even to some extent in the '23 special elections,
00:06:49.820
which is make it all about Trump, Trump, Trump, Trump, Trump, Trump, Trump, Trump, Trump,
00:06:53.540
Trump. "Democracy dies in darkness," that kind of thing. What do you make of it?
00:06:56.080
As much as I despise it and despise him, I think it is brilliant politics. It's very effective.
00:07:02.600
It was earlier this week, Megyn, Brandon Johnson, that idiot mayor of Chicago, that commie piece
00:07:07.640
of trash who's just wrecking the city. He gave a speech and he's very unpopular there, including
00:07:13.220
in the areas that elected him, because the city's full of illegals. It's full of crime. Now everyone's
00:07:17.860
looking around. Oh my gosh, how could this happen? Anyway, so he gets up and he gives a speech,
00:07:21.580
Megyn. And he starts talking about the Confederacy. Well, this is Jefferson Davis and the Confederacy.
00:07:27.220
And everyone kind of rolled their eyes and mocked him, but it was brilliant. Yeah. You hate me. You
00:07:31.520
hate me. You hate me. But look over there. They're even worse. That's it. There's the demons over there.
00:07:36.720
It's really all Joe Biden has. He's going to do the exact same thing for Trump. There's no record to
00:07:41.440
run on. Joe Biden is not popular at all, but they understand because of media poisoning and messaging,
00:07:47.760
which they've been very effective at. Donald Trump is not popular either. And so they're going to make
00:07:52.780
sure the election is about Trump and not Joe Biden. And I guess we'll see what happens.
00:07:58.480
Yeah. I mean, whenever they talk about Trump, Biden's numbers do go up. And I don't know that
00:08:03.260
Trump's exactly go down, but Democrats and independents get reminded of the drama that's
00:08:07.340
around Trump that they didn't really enjoy. Maybe they enjoyed his policies, but they didn't enjoy his
00:08:12.060
behavior around J6. And just when he was in office, there was a lot of drama where you kind of want
00:08:18.500
to be thinking about your own life, not the life of the guy sitting in the oval, right? That's kind of
00:08:24.060
the ideal government. That's one benefit of Joe Biden: you don't hear from him that often. He's
00:08:28.980
the shrinking executive because he has to be. Sadly, he's actually running a lot of crazy ass left
00:08:34.200
wing policies out of that Oval thanks to the people around him. But in any event, I do agree with
00:08:39.380
you. I think it's a smart strategy, whether it's going to overcome the deficit that already exists.
00:08:44.860
I don't know. We've got, what, 11 months now, and those are going to include four criminal trials
00:08:49.380
for Trump and so on and so forth. So, I mean, how do you see it now? A lot of Republicans I talked to
00:08:53.640
out in Montana over their vacation weekend, they're feeling bullish. They're feeling like
00:08:58.100
Trump's going to get it and Trump's going to win. Yeah, I don't think I echo that. I'm not saying
00:09:04.480
he's not, but I definitely don't echo the optimism because this is what I see out there right now,
00:09:10.540
Megyn. There's so many parts of this evil system we have in this country that are going to go all
00:09:16.400
in to try to make sure Donald Trump can never be elected president again. People talk about the
00:09:21.140
ballot. They're removing Trump from the ballot in Colorado. They're removing him from the ballot in
00:09:25.540
Maine. We have the indictments. We have this and that. And people ask me all the time, hey, Jesse,
00:09:29.460
what are they planning? What are they planning? What are they planning? The answer is everything.
00:09:33.780
Everything. Every different commie in this system, from secretaries of state to the commie street
00:09:38.740
trash to the senators, doesn't matter which one they are. They're all going to go all in with whatever
00:09:45.380
it takes to try to smear Trump, stop Trump, everything else. Are they going to be successful?
00:09:51.260
I don't have any idea. I don't have a crystal ball. But they were successful last time, Megyn,
00:09:56.400
when Donald Trump was president of the United States of America. That's before he was convicted
00:10:01.600
of any felonies, which he will be, as unjust and stupid as that is. He's going to be convicted of
00:10:06.520
felonies. The media is going to make him out to be a disastrous insurrectionist felon. And is that
00:10:13.500
too much for the Norms and Normas of this country to vote for? I don't have the answer to that question,
00:10:18.820
but I don't feel near as confident as everyone else right now. Honestly, I roll my eyes, Megyn,
00:10:24.300
when I see people on the right, talk about the poll numbers. Look at the polls. The polls look
00:10:28.860
great. The right does this thing where the left will make us eat 10 pounds of cow crap. And the
00:10:35.480
second they hand us a mint to wash the taste out of our mouth, we celebrate it like it's some kind
00:10:39.940
of a victory. I remember when Trump got arrested in Atlanta and they put his mugshot up there.
00:10:44.780
They arrested the former president over ridiculous charges, and they're going to send him to prison for
00:10:49.300
that. He's going to go to state prison in Georgia if he gets convicted of that. No ifs,
00:10:52.640
ands or buts because of the appeals process there. And the right spent an entire day celebrating how
00:10:57.600
cool the mugshot was. We're not even playing the same game here. They're all in on their game. They're
00:11:04.540
all in. They are going to do everything they can to destroy him, destroy his people and everything
00:11:08.980
else. And we still pretend like it's a game on our side. Wow, did you see the new AP
00:11:14.580
poll? I find it to be childlike and ridiculous, to be honest. Well, and one bad thing about, you know,
00:11:20.200
those polls is, you know, the polls look pretty good before those 2022 midterms too. And we all
00:11:26.200
know how that worked out, and that was without the last-minute conviction of any of those candidates and so on.
00:11:32.700
So you're right there. It's fraught with peril. People ask me all the time, same thing, like,
00:11:36.360
what's going to happen? And I think it's like obvious at this point that Trump's going to get
00:11:39.940
the nomination. Something absolutely catastrophic would have to happen for him not to get the nomination.
00:11:44.540
But am I that confident he's actually going to win? I'm not, because the Democrats are very good
00:11:50.140
at what they do and they're very disciplined. And, you know, I was talking to somebody today,
00:11:55.460
like friends with a diehard Democrat, and that Democrat was saying, what do you mean? He's not
00:12:00.500
too old. What do you mean? He's totally competent. What do you mean? What's happening at the border?
00:12:04.280
So there are a lot of these Democrats who are maybe well-educated, but low information voters
00:12:09.700
who don't have the problems with Joe Biden that you might or I might, and are fully prepared
00:12:16.260
to rush to the polls and to get the caravans to go vote for him. And God knows what else will happen
00:12:21.040
on election day. I don't know. It's dark. Okay. There are other candidates in the race. He's not.
00:12:26.940
Yeah, go ahead. No, no, sorry. I just, that reminded me of a story. Speaking of Chicago,
00:12:31.460
sorry to interrupt. What you just said reminded me of a story. I was on vacation myself a couple weeks
00:12:36.140
ago and ended up at a big table. And there were a couple of liberal white women there, you know,
00:12:39.720
the most evil creatures on the planet. And they were from Chicago, all these
00:12:44.440
rich white women, and they were bragging about how they voted for Brandon Johnson. And one of them
00:12:47.800
called him BJ. Like they were buddies. Oh, I love BJ. We love BJ. And other people
00:12:52.320
at the table. It wasn't really a political talk, but they started talking about the crime situation
00:12:56.400
in Chicago, how bad it was. People were getting shot and robbed. And verbatim, Megyn, on my life,
00:13:00.880
on my life, cross my heart and hope to die. This is what she said. Well, yeah, you might get robbed,
00:13:05.200
but you won't get targeted. What? That was honestly what she said, Megyn. I know it. Well,
00:13:13.060
you might get robbed, Megyn. If you go to Chicago, they might stick a gun in your face and take your
00:13:17.280
wallet, but they're not going to seek you out to murder you. So what's the big deal? What are you
00:13:21.160
complaining about? That's how far gone these people are. Some of these people, Megyn,
00:13:26.880
some of these people are so far gone. Their entire worldview has been built up by this. You can't
00:13:32.640
pull it out, like a game of Jenga, or their entire world comes crumbling down.
00:13:37.420
That woman could get mugged tonight, Megyn, tonight on her way home. That woman could get
00:13:42.020
assaulted and mugged, and she would wake up tomorrow and vote Democrat. I don't know how
00:13:45.680
you fix that. Well, this brings me to my second topic, which is the downfall of Claudine Gay.
00:13:51.920
And I've heard many different takes on what, if you zoom out, it all means.
00:13:56.700
Um, our friends over on the Ruthless program, they were saying, this is great
00:14:02.500
news, because it's a win. It's finally, like, a win for the right, which never has its shit
00:14:07.760
together. And they never band together to get anything done. And this is one instance in which,
00:14:13.440
you know, you had the Free Beacon and you had Chris Rufo, you had all these other commentators online,
00:14:18.420
and then you had the help of some centrist and left of center, very well-known folks like Bill
00:14:24.620
Ackman pushing for it. And it was a win to get this charlatan removed from what should have been
00:14:31.660
a prestigious position, at one point anyway; whether it is today, serious doubts. Okay. So I can see that.
00:14:38.100
Then you've got the leftist woke crowd, absolutely melting down that this was racist to remove her,
00:14:45.360
that this is just part of white people's rage in seeing black women elevated to the positions of
00:14:55.120
power that we deserve. We whites are very angry about black women like Claudine getting
00:15:01.580
elevated. And that's what this is really about. Like our anger at her ascent to position of power.
00:15:06.760
Obviously you and I don't agree with that shit. But, you know, black women are still with Joe
00:15:11.560
Biden big time. And I do wonder whether they're more in that second camp. Like, yeah, you know,
00:15:18.900
they tend to be more woke. They tend to be more Democrat. So how does it all shake out, though?
00:15:23.780
The right feels energized. The center left is migrating to us because they see wokeism has
00:15:28.960
corrupted our nation. Or the core woke left is totally activated, because, as Rufo put it,
00:15:37.420
a scalp was taken, one of their beloved token black women atop the positions of power in America.
00:15:46.380
First of all, it's pretty emblematic that our first scalp is one that doesn't have hair.
00:15:51.460
That's one. But it wasn't our scalp. Yes, we helped. Christopher Rufo did great
00:15:57.800
work, and the Free Beacon, they did great work on the plagiarism and things like that.
00:16:01.200
Claudine Gay is gone because, as you pointed out, some of the center left came for her and
00:16:07.200
the donors came for her. This reminds me of when Andrew Cuomo got sacked in New York.
00:16:11.860
Everyone celebrated on the right because Andrew Cuomo is a piece of trash. The right didn't take
00:16:16.220
out Andrew Cuomo. Letitia James and the Democrat machine there knifed Andrew Cuomo in the ribs. The
00:16:22.120
right had nothing to do with that whatsoever. They were unable to take him out. We celebrate when the
00:16:27.580
commies kill each other, but that's what commies have always done. So look, I'm not saying it's a
00:16:31.820
bad thing. Frankly, Claudine Gay being fired at Harvard is just a good start. You'd be better off
00:16:37.420
if you fired everybody on the campus, bulldozed the buildings to the ground and made it into an
00:16:41.540
orphanage or something like that. That would actually be better for the country. As far as what it means
00:16:46.200
for Democrats, you're going to see a ton of something in this next year. And Joe Biden actually gave the
00:16:52.840
game away with his opening ad. He ran his opening ad for 2024.
00:16:57.580
And one of the main issues, if not the only issue, if I remember right, he cited in there was voting
00:17:02.080
rights. Voting rights? What is it, 1950? What is he talking about voting rights? What they're going
00:17:09.100
to do is an endless amount of Black outreach in 2024, because one of the things they know
00:17:15.640
in Democrat circles, that most people do not, is they must have 92, 93 percent of the Black vote to win
00:17:22.640
elections. If that number even drops down to 80, Democrats cannot win national elections.
00:17:27.600
Their party is based on getting virtually every Black vote in the country. They are losing some
00:17:33.340
of them right now because of the brilliant GOP stunt of shipping illegals into places like Chicago
00:17:39.140
and New York. They're shipping these people into the poor Black communities. Poor Black communities
00:17:43.320
already had crappy schools. Now they're overrun with a bunch of kids who don't even speak English,
00:17:47.300
and they're mad about the whole thing. But what I'm saying is Democrats are going to have to spend
00:17:51.700
an unusual amount of time and money doing Black outreach right now. And things like this
00:17:57.060
Claudine Gay stuff hurt that cause. That's why Obama was behind the scenes pushing to keep her.
00:18:03.940
They have to maintain that base, that Black woman base, the Black voting base, or they're not going
00:18:09.200
to be able to keep power. And you're going to see a lot of that in the 2024 election season.
00:18:13.740
It's amazing to me. It's like they want to say, oh, she was fired because she's a Black woman.
00:18:20.080
And meanwhile, the truth is the only reason she had the job is because she was a Black woman. That's
00:18:24.600
what people are objecting to. If she had been qualified, if she had done her job well, if she'd
00:18:28.700
been a true scholar, even if she'd been a leftist, that would have been fine. I mean, look at like,
00:18:33.460
and I'm not just picking conservatives, but look at Ayaan Hirsi Ali. She's a Black woman.
00:18:37.380
She's totally brilliant. I guarantee you Ayaan Hirsi Ali has never plagiarized anybody in her life.
00:18:42.100
I'd be thrilled if she got elevated to the presidency of Harvard. Or, she's at the Hoover
00:18:46.000
Institution, let's make it Stanford tomorrow. As would most people who are objecting to
00:18:51.640
Claudine Gay's behavior. It's not about race or her gender for people who want her gone.
00:19:04.380
It's the fact that she's a fraud. It's the fact that she doesn't belong in the position. And we
00:19:04.380
all know it. And even the people at Harvard have known it for a long time.
00:19:07.920
Yeah. But this is what the communist does, Megyn. The people who are using that,
00:19:13.700
they use these shields forever because they've always been effective on the right, because the right
00:19:19.720
has some sort of a moral founding, even though the right can get crazy, too. But they have some
00:19:24.140
sort of a moral fabric and the communist does not. He has no moral fabric. So he uses your values
00:19:29.000
against you. If someone actually thought you were a racist, Megyn, if they genuinely
00:19:34.120
thought that, that would bother you. Because you're not, and because you're a human, it would bother you
00:19:37.720
if somebody thought that. The communist knows you're not, but they know it would bother you.
00:19:42.880
So they understand a great way to maybe attack Megyn Kelly or destroy her arguments is simply
00:19:47.900
call her a racist. They do this all the time. You're a misogynist. You're a racist. You're a Nazi.
00:19:52.500
You're a white supremacist. And what it does is it gets you off the topic at hand. I see the right
00:19:57.700
play this game all the time. It gets you off the topic at hand. And now you're talking about things
00:20:02.360
they want to talk about. Racist. I've got black friends. And now you're not even talking about
00:20:07.160
the issue at hand. I watched it happen during the final couple of years of Trump's presidency
00:20:11.280
when he would give an interview. And every time the reporter would have Trump denouncing white
00:20:16.000
supremacy, every 18 minutes of every single hour. Well, yes, I denounce white supremacy. Do you denounce it?
00:20:21.120
Yes, I denounce it. Playing their game on their field with their refs, enforcing their rules.
00:20:26.060
And we wonder why we lose the messaging battle, the race, misogyny, all that crap,
00:20:30.840
anti-gay stuff. All that stuff's just a shield they use to shut you up. We have to stop letting
00:20:35.940
it work. You are so right. Oh, my God. What you said is so right. And I'll give you an example
00:20:44.120
today of it in our current presidential politics. Nikki Haley was asked over the Christmas break
00:20:51.820
by, yes, it was obviously a Democrat plant, what started the Civil War, which she didn't answer
00:20:58.000
well. She did not mention slavery. But who the fuck, sorry, is talking in 2024 presidential race
00:21:05.080
about, gee, what led to the Civil War? Is that an issue? We're worried about the border. We're
00:21:09.800
worried about the economy. No one's worried about what started the Civil War right now.
00:21:15.520
So this is the Democrats launching a bomb into the campaign of someone they perceive as a threat
00:21:22.100
because the polls show she would beat Joe Biden by 11 points. Trump is beating Joe Biden by some four
00:21:29.120
to six points. She would beat him by 11. So they're terrified that she would get it. She's not
00:21:34.040
looking like she's going to get it. Let's be honest. But it'd be very helpful to have her kneecapped.
00:21:38.740
So she gets asked this question. She doesn't answer it well. And now here's the follow up on a CNN town
00:21:45.940
hall that happened last night and the CNN moderator throws the bomb back in her face again. And in
00:21:53.400
answering it, she makes yet another misstep, which then Van Jones and Abby Phillip on CNN freak out on
00:22:02.560
Nikki Haley about again when the town hall ends. Listen to SOT 10. I should have said slavery right off
00:22:11.540
the bat. But if you grow up in South Carolina, literally in second and third grade, you learn
00:22:18.360
about slavery. You grow up and you have, you know, I had black friends growing up. It is a very talked
00:22:25.000
about thing. I was thinking past slavery and talking about the lesson that we would learn
00:22:31.700
going forward. I shouldn't have done that. Okay. Wait, before I get you to respond, I love that you're
00:22:37.900
laughing. Wait, here's Van Jones after the fact. She was cleaning it up with a dirty rag. I mean,
00:22:44.920
it wasn't a cleanup at all. Um, it's painful. I don't get it. Um, I, I think it says something
00:22:52.640
about her. I think it says something about the Republican base. It's literally what you just said.
00:22:58.760
On my life, Megyn, I had not seen that clip before I said it, but they all do it. I had
00:23:05.040
black friends. See, I want everyone to understand, because normal people will run into this. It's not
00:23:11.260
just Nikki Haley. Your liberal aunt Peggy, when she shows up at the Christmas party screaming
00:23:15.500
about her 15th abortion, you're going to run into this with her as well. When they sit down with
00:23:20.580
you and they talk about the civil war, what about the Confederacy? What about Nazis? They're trying to
00:23:25.340
associate a term with you. They're trying to marry that term to you. They do this all the time,
00:23:30.580
masterfully. And offense: you want to play offense against these people. All the GOP knows how
00:23:30.580
to do is play defense. Well, I have black friends. Offense is, are you a Nazi? Yeah. Are you a pedophile?
00:23:42.640
Now that's a horrible thing to say, right? That's a horrible thing to say to somebody. Is it not? Well,
00:23:46.760
is it not horrible to associate me with Nazism? If we're going to associate words that have nothing to
00:23:51.760
do with me and you're going to try to attach them to me, then I'm going to attach them back. I'm
00:23:56.420
going to attach horrible words back to you. Jesse, do you denounce white supremacy? Do you denounce
00:24:01.380
pedophilia? Are you pro pedophilia? Prove that you're against pedophilia right now. That's actual
00:24:06.740
offense in changing the conversation. But the GOP is so scared of its own shadow, so scared of the media,
00:24:14.300
so scared of how they're going to be framed. They actually get themselves talking about the civil war
00:24:19.080
at all, Megyn. As if Van Jones or any of those boobs on CNN actually have a single bit of emotional
00:24:28.940
attachment to the civil war. I'm a history freak. I love the civil war. I'm not emotional about the
00:24:34.620
whole thing. There's nothing you could say that would offend me about the civil war because it
00:24:38.100
was like 170 years ago or whatever it was. I don't do math very well. None of these people are emotional
00:24:43.280
about it. They're trying to attach something ugly to you. And all the GOP does is know how to meekly
00:24:48.960
back up. It's, I'm not racist, my black friends. It makes me want to vomit, honestly.
00:24:54.380
I know. I distinctly remember it was after, I don't know which controversy it was, but it was
00:25:00.140
one of the ridiculous controversies that the Media Matters crew was making up about me and
00:25:04.200
allegedly being a racist. And a couple of my friends who happened to be black were like,
00:25:09.480
should we go out and do a photo op together? And I was like, it's a hard no.
00:25:13.520
So we're good. Uh, no, but Nikki Haley just fell into the trap and you know, maybe she's too green.
00:25:21.660
Maybe it hasn't been done to her enough. I have to give credit. You and I have both ripped on the
00:25:25.520
Vivek when he's deserved it in the past, but he nailed it this week. It was a very good week for
00:25:30.160
him when Dasha Burns of NBC got after him. Um, first the Washington Post came for him, which I'll get
00:25:37.800
to in a second, but this just happened, I think yesterday and it's gone viral today. Dasha Burns
00:25:42.140
of NBC tried to turn him into, um, a racist, or like racist-adjacent, because of his positions on
00:25:50.880
various things. And truly it was a masterclass in how to handle this nonsense. Here it is in part.
00:25:55.500
Do you believe punctuality is a vestige of white supremacy, Dasha? Look, because if you don't,
00:26:02.460
then you have a disagreement with many of the people who are defining those terms,
00:26:04.620
or the written word, or the nuclear family. These aren't my words. These
00:26:09.440
are the words of intellectual proponents from Ibram Kendi to the Ayanna Pressleys to BLM that have
00:26:14.260
said these are vestiges of white supremacy. So we can't have it both ways. We have to have an
00:26:18.380
honest discussion. You brought up Jussie Smollett as the best example of white supremacy
00:26:25.500
news, in the wake of a fake, not actual, attack on him that we have to contend with.
00:26:29.640
And yet, and yet you have examples like the Buffalo shooter in New York, just in 2022.
00:26:36.240
But you are also cherry picking when you bring up Jussie Smollett.
00:26:38.480
I'll look at all of the statistics. More black on black crime. If you really care about actual
00:26:42.800
crime against black Americans, let's get to the root causes of it in the inner cities of
00:26:46.960
The Anti-Defamation League tracked a 38% increase in white supremacist propaganda last year.
00:26:54.460
Yeah, the ADL, I don't think is a particularly credible source.
00:26:57.600
I think the media did not hold the police accountable.
00:27:00.480
The Republicans are actually starting to gain ground, gain traction with the black community,
00:27:04.740
Do you not worry that your rhetoric is pushing them away?
00:27:08.160
There are folks in the GOP right now who are concerned about your rhetoric.
00:27:12.580
Well, you know what? I'm concerned about their corruption.
00:27:16.960
If I may just finish this, if I may finish my point, Dasha, I think I will be better
00:27:25.560
We're getting close to the promised land that Martin Luther King envisioned.
00:27:28.980
We're as darn close to it as we ever have been.
00:27:31.700
And so what bothers the heck out of me is it's right when we're close to that promised
00:27:36.840
I may not get there with you, and he didn't get there with us.
00:27:39.140
But I think it desecrates the legacy of our civil rights movement.
00:27:41.740
It desecrates the legacy of Martin Luther King that right when we get closest to the
00:27:45.280
point of having racial equality and gender equality and equal opportunities for people
00:27:55.300
To then obsess over systemic racism, to then obsess over white guilt and otherwise, we're
00:28:00.500
creating new waves of racism, Dasha, that we otherwise would have avoided right when
00:28:05.120
we're closest to having achieved what even the proponents of the civil rights movement
00:28:19.340
And like I've said, I don't trust Vivek at all.
00:28:22.820
I find him to be extremely untrustworthy and snake-oily, but I want to make sure I give him credit.
00:28:28.860
He's become a chaos agent, which we need on the right: someone who's smart enough
00:28:33.860
and charismatic enough to change the conversation and make these people look and feel stupid.
00:28:38.440
And he gets all the credit in the world for that.
00:28:41.120
I will push back on just one thing he said there, but it's a very minor criticism.
00:28:45.720
We're not as close as we've ever been to some kind of racial harmony.
00:28:49.000
We were as close as we've ever been to some kind of racial harmony, probably in the 80s
00:28:55.260
And then the communists in this country decided they could use the civil rights thing to really
00:29:06.620
All this race stuff, the gay stuff, all this stuff, this is all about just blowing the
00:29:12.100
If tomorrow every single position of power was occupied by a black person in this country,
00:29:17.120
they wouldn't slow down or back off for even a second because it has nothing to do with
00:29:20.960
black people or gay people or women or whatever it is.
00:29:25.220
When you understand it's all just about destruction.
00:29:27.780
That's why they want to destroy the nuclear family.
00:29:29.680
That's why they want to cut your kid's penis off.
00:29:39.560
These are evil, dirty, demonic communists who are out there to destroy everything, and
00:29:44.580
they're being very successful at it at this point in time.
00:29:46.660
So what happened in that clip was just, I mean, he saw her coming from a mile away, and
00:29:57.440
Because he's, I mean, he's literally written the book on wokeness and what they're trying
00:30:01.560
to do on the left, similar to what you were just saying.
00:30:04.020
And one of the things that struck me was here she is clearly trying to perform for her
00:30:10.880
And you can see, like, the plaintive whining, what about this?
00:30:17.020
What about white supremacy and the Anti-Defamation League?
00:30:22.400
And it was, it's so nice to hear a politician who's done his homework, who knows that the
00:30:27.600
ADL is a joke of an institution that only ever criticizes people on the right.
00:30:34.600
Go Google what they've said about Tucker Carlson.
00:30:36.820
I mean, and by the way, they've completely strayed from the mission where they originally began.
00:30:42.020
They've started to sound a little bit more like a policing organization for anti-Semitic
00:30:47.000
comments against Jews in the wake of the Israel attack.
00:30:49.680
But really their favorite cause for the past 10 years has just been anything a conservative
00:30:53.300
says, anything a conservative says that's not woke.
00:30:56.180
So good for Vivek for knowing that there's absolutely no stock to be put into this group and shoving
00:31:02.020
it back in her face, her whiny little unprofessional face.
00:31:08.080
I think you got shamed after your John Fetterman interview because you told the truth about what
00:31:16.320
And ever since you've been trying to make it up to your leftist base to prove you're one of them.
00:31:25.760
Let me squeeze in a quick break and then we'll come back.
00:31:28.060
And we have so much more fun to do, Jesse Kelly.
00:32:05.100
I mean, is there a bigger race hustler in America? He decided to punish Bill Ackman, the billionaire
00:32:10.780
investor who's been pushing to get these three women whose abominable testimony on Capitol
00:32:16.940
And he's outside of Bill Ackman's office with like the people he met on the subway that day.
00:32:25.940
He's like, we are going to storm Bill Ackman's building.
00:32:31.440
I could fall asleep in the middle of this tapioca pudding fest.
00:32:38.160
People like this guy don't have the power they used to.
00:32:43.400
And Al Sharpton, I've always thought he was an odd character, a very odd character.
00:32:53.400
Now he looks like a lollipop and it weirds me out every time I see the guy and he's kind
00:33:00.560
He looks all shrunken in, like he's off to go get a donut or something like that.
00:33:06.000
He's clinging to something that has worked for him.
00:33:08.780
Only now it's kind of old and pathetic a little bit.
00:33:12.840
Have you ever, I'm 42 now, so I'm getting older.
00:33:15.440
Megyn, you ever seen that guy get up and play pickup basketball?
00:33:18.260
And you can tell he used to be okay when he was younger.
00:33:20.580
And now he's out of breath after one time up the court and he just can't really do it
00:33:26.600
That's Al Sharpton when he shows up to all these civil rights protests now.
00:33:31.080
That's all you've known, but go to the Caribbean with more tax money you didn't pay.
00:33:39.260
I mean, truly he's, he's out of magic acorns and you can see it, but he's still playing the
00:33:46.580
He's a White House correspondent for The Nation.
00:33:48.640
Listen, listen to this racist stuff.
00:33:55.480
All of this is happening, Claudine Gay, because racist white folks had to chew with their mouths
00:34:01.420
closed for two months after George Floyd was murdered.
00:34:08.020
They'll keep roasting any black people they can get their hands on until they satiate their
00:34:12.780
bloodlust while people from apartheid states call for colorblind societies.
00:34:21.040
Racist whites just do this whenever they feel their positions of power are threatened.
00:34:26.440
The best advice I can give to any black person is hard to follow.
00:34:31.480
But the trick is to not ever rely on white folks for anything.
00:34:35.480
Because if you do, then that means they can take it from you the moment they get in their
00:34:44.100
Can you imagine if a white person tweeted anything like this about a person of color?
00:34:50.520
Yeah, look, we joke a lot, Megyn, you and I, and I'm glad we do.
00:34:55.520
But this is one of the things that we really, we should talk about more.
00:35:00.620
There's a dangerous situation happening in this country, and history books say it's a
00:35:06.020
Whenever you take any group of people, whether it be a religion or skin color or whatever,
00:35:14.480
And othering them becomes sanctioned at the highest levels.
00:35:18.480
It's not, you know, one dirt ball on the street corner.
00:35:21.620
When you hear that kind of rhetoric from the president, from media figures, from academics,
00:35:26.460
Harvard, all the others, when it is universal across the board, white people suck, white
00:35:34.980
They learn about white colonizers and all these things all over the country.
00:35:38.680
What you're doing is you're creating a very dangerous situation for white people in this
00:35:43.540
And I know it's very hard to see this now because we live in the United States of America.
00:35:50.540
And this can manifest itself in some really, really ugly, really violent ways.
00:35:55.620
And it would be nice if one political party in this country had the balls to actually step
00:36:05.180
They're all going to still try to play the commie game.
00:36:07.380
But it would be really nice if we could talk about the systemic racism that is taking place
00:36:14.660
But if one guy on the street corner hates me for the color of my skin, OK, that sucks.
00:36:20.080
If it's sanctioned by the DOJ, by the president, by academics, by everything else, sanctioned racism
00:36:26.580
by the institutions of a nation is what ends up killing people.
00:36:30.900
And there's an anti-white racism in this country that's despicable and should be talked about
00:36:44.020
Can you imagine sending out a tweet about black folks just pissed off they had to chew with
00:36:50.960
their mouths closed for two months and talking about black bloodlust?
00:36:56.400
The blacks will keep roasting any white people they can get their hands on.
00:37:00.300
That's me just reversing the races in what he put out, what he put in writing and posted
00:37:07.120
and said the trick is, can you imagine it reversed again, to not ever rely on black folks for
00:37:18.160
And as I point out, he's a White House correspondent for a major publication.
00:37:31.480
I left NBC because I talked about Halloween costumes.
00:37:37.760
And then he's like, they're out for the blacks.
00:37:46.180
The one comfort I have, Jesse, is that some people are talking about it now.
00:37:49.720
So today, at the beginning of 2024, unlike five years ago, when it seemed like nobody was saying
00:37:57.760
anything about this racism, you know, by people like Elie Mystal.
00:38:04.000
Now they've been stewing in it for five plus years.
00:38:12.380
The question is, how much have they had it and how much more are we going to allow this
00:38:15.920
to go on, because it's stirring up terrible racial tensions?
00:38:35.760
And so when it happens to them, they're really tempted to just kind of shrivel up or ignore
00:38:41.080
it or they don't want to talk about it because our social shame system is so upside down in
00:38:45.560
this country, you're not allowed to push back on that.
00:38:48.180
But people have to start getting a lot more vocal about naming this and attacking these
00:38:53.280
people, because, again, I can't stress this enough.
00:38:55.740
Like you just pointed out, this is not some person.
00:38:58.520
This is not some random dude sitting in his mom's apartment putting out something stupid
00:39:04.040
This stuff has been institutionalized at the highest levels.
00:39:07.900
The head of the DHS, CIA, FBI, presidents, Harvard presidents, people and everyone in
00:39:18.680
Now, white people suck, white this, white that.
00:39:22.980
And I wish I really wish there was a much bigger movement pushing back against it.
00:39:29.020
Megyn, you and I have talked about this before. You were talking about Montana.
00:39:33.240
We used to go hiking in Montana and you'd see eventually these huge boulders, bigger than
00:39:37.780
a car, and you'd go hiking and you'd find one that had been split in two or split in
00:39:43.140
And you're thinking to yourself, was it God himself that came down with an axe?
00:39:48.220
What split it was this: over time, rocks, like societies, develop cracks.
00:39:56.740
The freeze comes, the water expands, the boulder, boom, splits in two.
00:40:06.460
We were on the way to having a relatively harmonious society a few decades ago.
00:40:18.620
They dig in and they split us all apart from each other.
00:40:22.400
But people are scared to discuss it because no one wants to be called a racist.
00:40:28.220
To correct myself, he's the justice correspondent for The Nation.
00:40:37.000
I am not ready to reenter white society after the pandemic.
00:40:42.260
As the pandemic wanes and I have to leave the safety of my whiteness-free castle, I know
00:40:47.600
that racism is going to come roaring back into my daily life.
00:40:50.740
Over the past year, I have, of course, still had to interact with white people on Zoom or
00:40:54.720
watch them on television or worry about whether they would succeed in re-electing a white
00:40:59.740
But white people aren't in my face all the time.
00:41:01.780
I can more or less only deal with whiteness when I want to.
00:41:07.120
I've just been able to limit my exposure to them.
00:41:09.920
This man is gainfully employed and appearing on MSNBC every other night.
00:41:23.620
And while we may not be able to defeat it, we can certainly call it out.
00:41:26.700
Okay, before I move on from the Claudine Gay thing, I do think it's interesting, speaking
00:41:32.720
of the disgusting media, now they're doing hit pieces on Bill Ackman's wife.
00:41:37.520
I mentioned Bill Ackman, the billionaire who's been fighting back against anti-Semitism and
00:41:41.680
led the charge to get rid of Liz Magill and now Claudine Gay, and now is looking at the
00:41:45.660
MIT lady, Business Insider comes out with a hit piece on his wife, who used to be at
00:41:51.460
MIT and I think did her PhD at MIT, and went back and dissected her dissertation and found
00:42:00.100
some paragraphs that they say should have had quotation marks around them.
00:42:05.120
Like she did cite the author, you know, but she didn't put the quotes on right before,
00:42:15.700
But she didn't actually put the quotation marks in there.
00:42:20.400
Like, take that, Bill Ackman will humiliate your wife if you stay on this tear.
00:42:35.680
But they understand, they create these organizations, the ADL, like you were talking about earlier.
00:42:44.640
And what they do is they whip up mobs that intimidate good people from coming out.
00:42:51.860
But if you're not on their side, they at least want you to shut up and be afraid.
00:43:02.740
But two, so the next billionaire doesn't get quite so outspoken about it, because he doesn't
00:43:07.560
want a new hit piece in the New York Times or whatever, ADL, whatever it may be.
00:43:14.980
They're very good at making you feel like the heat of a thousand suns is on you.
00:43:25.320
This is what the commies have always done everywhere.
00:43:28.740
I mean, they used to stand in front of your business in Mao's China and scream at anybody
00:43:33.820
who came inside of your shop because you were one of the bad people.
00:43:38.900
You didn't want to be marked as the person who was walking in to buy wontons from Jim because
00:43:44.480
And so eventually people stopped going and you had to leave the country.
00:43:48.660
They do the exact same thing in this country with the various little lefty organizations.
00:43:53.560
And as you know, Megan, lots of these organizations are nonprofits.
00:43:57.240
Our nonprofit industry is flat out criminal in this country.
00:44:00.480
So many of these nonprofits are funded by these big commie billionaires, and you're not allowed to know who the donors are,
00:44:04.240
and they do blatantly political things.
00:44:08.800
Very nonpartisan report on why Megyn Kelly is an evil misogynist and a racist.
00:44:14.640
And then the other parts of society will cite the nonprofit as it's somehow legitimate.
00:44:20.320
Well, you see Joe Biden got up and he said the ADL said Megyn Kelly's a racist.
00:44:26.980
And it's it's very effective, to be honest with you.
00:44:28.920
Yeah, they've been working very hard to do this to Tucker Carlson for quite some time,
00:44:32.520
like completely diminishing this raging racist misogynist, put him on the front page of the
00:44:39.340
That's why the ADL got involved in Tucker Carlson's alleged misogyny and racism is supposed to be
00:44:48.200
Good news out of New Hampshire, which I have to say makes me feel very happy because it's
00:44:58.200
And here the New Hampshire House has voted for sanity, saying we approve a ban on these
00:45:17.020
Two Republicans defected, but they got 12 Dems to vote in favor of the ban.
00:45:25.020
I would have liked puberty blockers and cross-sex hormones banned too, because they sterilize kids.
00:45:32.500
Take a listen to Representative Jonah Wheeler, Democrat, on why he did it.
00:45:38.340
I rise today, despite the uncomfortability of this vote, because for me, it comes down to
00:45:46.260
whether or not kids should be able to get these surgeries.
00:45:49.280
And despite the fact that I am a liberal, despite the fact that I believe in non-discrimination
00:45:55.400
for trans people, for gay people, for queer people, and that I will fight until my very
00:46:01.720
last day, until they are recognized as human beings.
00:46:06.120
The question before us is whether or not children under the age of 18 should be able to get these
00:46:22.500
I honestly feel like divine intervention went into New Hampshire last night and made this
00:46:30.320
No, it is a step in the right direction, Megyn.
00:46:37.480
I applaud those Democrats because that takes guts.
00:46:40.820
At the same time, I will just say, just to close out with this, we are in a lot of trouble
00:46:47.020
as a society because these bans are even something that has to happen.
00:46:53.080
But you shouldn't have to ban doctors from cutting off a 13-year-old girl's breasts.
00:46:58.800
That's not a thing that should ever have to come up before the law because it should
00:47:03.660
And even if it did occur to one of them to do that, he should be so afraid of society that
00:47:11.720
But it goes to show we don't have a politician problem.
00:47:14.420
We have a people problem, like I talk about on my show all the time.
00:47:17.500
By the way, the Jesse Kelly Show podcast, go subscribe to it, everybody.
00:47:24.060
I think you'll really enjoy my brother Jesse's program.
00:47:27.400
He's just a brother in ideology and spirit, not a blood brother.
00:47:37.820
But you're absolutely right, the fact that it's a problem to begin with.
00:47:40.060
And I'll just say, in the couple of seconds we have left, it's
00:47:44.620
already like the problem is starting to percolate up.
00:47:46.940
I'm sorry to say when it comes to pedophilia and the attempt to normalize, quote, minor
00:48:00.720
There was a publication on Vice yesterday that was raising.
00:48:06.180
Oh, they're pushing, you know, like the pedophilia, the crazy right in response
00:48:12.520
OK, no, he actually was accused of wanting underage girls and having.
00:48:17.240
So there's there should be no normalization of it.
00:48:21.000
And anybody who deigns to actually do it should be ostracized.
00:48:41.560
I've been wanting to talk to this woman for a long time.
00:48:44.760
We are hearing a lot these days about artificial intelligence, of course, or AI.
00:48:49.060
But here's one way it's already changing our lives, even if you don't know it.
00:48:52.940
There's one good piece of it and there's one potentially very disturbing piece of it, at
00:48:56.920
least. Joining me now to explain what I'm talking about is New York Times journalist Kashmir Hill.
00:49:02.440
She specializes in, quote, looming tech dystopia.
00:49:08.720
And she is the author of the national bestseller, Your Face Belongs to Us: A Secretive Startup's Quest to End Privacy as We Know It.
00:49:18.540
It is the riveting story of a small AI company that advanced facial recognition technology
00:49:36.580
So first of all, your name is based on a Led Zeppelin song.
00:49:47.540
I talked to Krystal Ball when she first became a public person and I was like, what's the
00:49:52.960
And her dad was like some sort of, he was a nuclear physicist or an astrophysicist.
00:50:03.460
So this, this is a great book because it's got something for everyone, Kashmir.
00:50:08.980
It's like, not to reduce everything to politics, but I think
00:50:12.920
the left is generally concerned about AI and like where it's going.
00:50:15.580
And I know the right is very concerned about giving government more power to spy on us and not just
00:50:23.680
government, but even third party agencies or anybody.
00:50:26.500
And for me, it's interesting for both of those reasons, but it's also interesting
00:50:31.080
because I really do care about like women who are the victims of domestic violence or
00:50:37.640
stalkers, which has happened to me and like the number of things you have to go through in
00:50:43.640
And look, let's face it, I've got some money so I can do that with relative ease these days.
00:50:49.120
But most women who are subjected to domestic violence or stalkers have no money and just
00:50:55.420
the hoops that they have to jump through to try to protect themselves are already too great.
00:50:58.480
And this technology that you wrote about doesn't work to their advantage at all.
00:51:05.760
So tell us just how you sort of got started on this.
00:51:09.700
Because I know, I think you were in my friend, Meryl Gordon's journalism class, right?
00:51:20.340
Well, no, because I mean, like you, I'm just curious, what made this your beat once
00:51:26.960
So journalism for me was kind of a second career.
00:51:34.380
And I was in my late 20s when I started on the journalism journey and was in Meryl
00:51:39.920
Gordon's class at NYU and was thinking about, you know, what should my beat be?
00:51:45.480
Uh, and at the same time, I was thinking about how invasive the practice of journalism is
00:51:51.580
that you're writing about people who sometimes don't want to be written about, uh, you're
00:52:03.700
And I just was thinking a lot about what privacy was in the modern age with all of this new
00:52:09.540
And so at NYU, I pitched a beat called the not so private parts about this kind of intersection
00:52:18.600
And it was supposed to be a year long project, but it's what I've been writing about in the
00:52:25.340
Your most recent piece I saw was about how our cars are spying on us and are being used
00:52:32.420
in some circumstances when people get a divorce.
00:52:35.300
If one spouse is the registered owner, he or she can spy on the spouse who may get the
00:52:49.620
I mean, the world that we live in now is just so difficult in so many ways because, you know,
00:52:55.920
the, all these objects are internet connected things in your home, you know, your TV, uh,
00:53:01.040
your coffee pot, maybe, and cars now are collecting a lot of data.
00:53:06.180
Um, it's, it's concerning because most people don't understand how much information is being
00:53:13.380
And this particular issue I was writing about in this story is that many modern cars have
00:53:18.360
apps that you can use to see where they are, to unlock them, to make the horn honk.
00:53:24.520
They're convenient features when, you know, you park somewhere in the parking lot and you
00:53:27.860
can't remember where, um, but I was talking to domestic violence experts who say that these
00:53:33.700
convenient features are being weaponized in kind of abusive relationships and, uh, women
00:53:40.120
it was only women I talked to, who were, you know, separating from husbands and finding
00:53:46.440
that their husbands were tracking where they were going by firing up the car app and looking
00:53:51.700
at where the car was, even harassing them, you know, by making the horn honk and making
00:53:57.600
the lights turn on, making the car start in the middle of the night in their garage. Um, and they
00:54:03.460
would contact the car manufacturer and say, Hey, like, stop giving my husband, my ex-husband
00:54:08.540
access to the car. Um, and the car manufacturers just were not able to help them. They said, because
00:54:16.140
the, you know, the car was also in the husband's name or maybe only in the husband's name,
00:54:20.680
even though the woman had a protective order, um, or had been awarded the car during divorce
00:54:26.800
proceedings. It's amazing when you look around and I'm going to get to the book and what you
00:54:31.520
revealed about this company and their product, but it is amazing when you look
00:54:35.620
around you and realize how, how much of your privacy you've already sacrificed to live in the
00:54:41.920
modern world. You know, we know, uh, that Facebook and the social media companies are tracking
00:54:47.500
everything about us. And even now, like when you try to opt out of cookies or anything like that,
00:54:51.700
it's so hard. They make you jump through so many things and your email address gets sold
00:54:56.300
to so many companies. And every day you get a new email from a new company you didn't ask for,
00:55:00.560
and then to unsubscribe, you know, like they want you to enter your email to unsubscribe.
00:55:04.760
You're like, wait a minute. Is this a dummy account? I'm like, what am I, who am I doing a
00:55:07.980
relationship with now? There's just, I complained on the show a couple months ago about,
00:55:11.780
I was trying to buy a coat in Chicago. And the woman was like, what's your email address? I'm
00:55:17.180
like, why do you need to know that? Just give me my, here's my credit card. It works. Give me that
00:55:21.920
and give me the receipt. Nope. Need your email address. What we had an argument, you know,
00:55:27.060
just at every turn. Even we have Life360 on our phones, right? So, like, you can see your kids
00:55:33.700
now that a couple, two of my kids have phones. Well, Doug and I went on it. Okay, fine. They can see
00:55:39.400
where I am on Life360. Did you know, if you press something on Life360, you can go back and see
00:55:43.340
every single spot you've visited over the last 30 days? It's all right there. Like your
00:55:50.360
entire life. It's very disconcerting. The amount of privacy we've already sacrificed.
00:55:57.440
Yeah. I mean, I think there's a lot of benefits, right. That have come from the way that we live
00:56:01.840
today. The fact that with our, you know, smartphones, you can land anywhere in the world and you can call
00:56:07.720
an Uber, you know, you can figure out which restaurants to eat at. Uh, technology has
00:56:12.720
benefited us in many ways, but increasingly there's this kind of constant, you know, background data
00:56:18.420
collection, and it's not always being used in ways that benefit us. You know, there's these apps
00:56:24.020
on your phones. They have third party, you know, ad networks that are keeping track of where you're
00:56:30.880
going and creating that same kind of list of places you've been, like the one you've seen created by
00:56:36.420
Life360 in an app that you have chosen to use. And so that's what I kind of try to track in my
00:56:42.500
journalism is, you know, what is happening? How is the data being collected?
00:56:49.120
Who is using it? And when is it being used in ways that really harm you? Um, because that's what I
00:56:55.140
get concerned about is, you know, what's the harm here? Um, how is this coming back to haunt people and
00:57:00.780
how can we prevent those kinds of uses of the technology? So that's the perfect setup for
00:57:07.740
Clearview AI, this company that you found out about and wrote an article about, wrote a book about,
00:57:15.400
and really they've given you a lot of access. So they tried to prevent it at first, but ultimately
00:57:21.040
they submitted because they realized it's not great to not cooperate with the New York Times when
00:57:25.580
they're doing an in-depth piece on you. Um, and this company is emblematic of everything we just
00:57:32.820
discussed. They're doing some stuff that is great that most people would say, right on, go get them.
00:57:39.420
We need a lot more just like this, but this has the potential to, and is most likely going to veer
00:57:46.140
into a lane that many of us would find very disturbing. So let's start with, uh, your initial
00:57:53.440
encounter with this company and what kind of turned you on to them and their initial stiff
00:57:59.200
arming of you. Yeah. So it started for me in the fall of 2019. I had just become a reporter at the
00:58:06.120
New York Times and I got a tip from a source, somebody I knew from the privacy and security world
00:58:11.820
who had been doing public records requests to police departments about what facial recognition tools they
00:58:17.720
were using. And he'd gotten this 26-page PDF back from the Atlanta Police Department. Um, and it
00:58:25.640
included a legal memo written by Paul Clement, a very high profile lawyer now in private practice,
00:58:31.880
but used to be solicitor general for George W. Bush. Exactly. Uh, and he was describing this tool
00:58:39.960
called Clearview, how it had, um, I think, uh, billions of photos at that point that had been
00:58:47.840
scraped from the internet, you know, without anyone's consent to build this facial recognition tool
00:58:52.780
where you could take a photo of somebody, a stranger, upload it to the app, and it would return all the
00:58:59.340
other places on the internet where their photo appeared, revealing, you know, their name, their social
00:59:04.420
media profiles, um, maybe details about their life, maybe photos they didn't even know were on the
00:59:09.500
internet. He said he had tested it with, you know, lawyers at his firm, and it returned very
00:59:14.880
fast and accurate results. Um, and, you know, it had scraped Facebook, Instagram, Venmo, LinkedIn,
00:59:21.920
you know, basically all your favorite social media sites, as well as kind of the wider web.
00:59:25.720
And he had written this legal memo for police departments who might be interested in using it
00:59:31.020
to reassure them that they could use the app, um, without violating any state or federal privacy laws.
00:59:38.480
And I am reading this and I am just astounded. I mean, I've been, I've been covering privacy at
00:59:44.060
that point for more than 10 years. And I had never heard of the kind of technology that could do
00:59:49.920
this. And it was being offered by this company I had never heard of before called Clearview AI.
00:59:55.860
And the more I started looking into them, the stranger it got.
00:59:59.400
So just as an overarching theme, they're using what you refer to as one's face print.
01:00:09.100
People are familiar with fingerprints and generally would be reluctant to place their fingerprints
01:00:14.620
online as a record associated with them. You know, you wouldn't want that out there.
01:00:20.360
An iris scan, which they're also familiar with, seems very intrusive,
01:00:24.260
but a face print also exists for every person. And while, yes, your face can be seen in various
01:00:33.740
photos. Your face print would be much more widely and easily detected by this technology.
01:00:40.200
And it will collect photos of you that you didn't even know existed. Half, half photos,
01:00:46.300
three quarter photos. You're in the background on something, not even posing.
01:00:49.240
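The matching idea described here can be sketched in a few lines. This is purely an illustration, not Clearview's actual system: it assumes each photo has already been reduced to a small embedding vector (a "face print") by some face-recognition model, and the URLs and numbers below are made up.

```python
import math

# Toy illustration of face-print matching: each scraped photo has a
# fixed-length embedding vector, and matching an uploaded probe photo
# reduces to nearest-neighbor search by cosine similarity.

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_matches(probe, database, threshold=0.9):
    """Return (url, score) pairs whose embedding is close to the probe,
    best match first."""
    hits = [(url, cosine_similarity(probe, emb)) for url, emb in database.items()]
    return sorted(
        ((url, s) for url, s in hits if s >= threshold),
        key=lambda pair: pair[1],
        reverse=True,
    )

# Hypothetical database: URL of a scraped photo -> its (fake) embedding.
database = {
    "https://example.com/profile1.jpg": [0.1, 0.9, 0.2],
    "https://example.com/party_photo.jpg": [0.12, 0.88, 0.21],  # same face, different pose
    "https://example.com/stranger.jpg": [0.9, 0.1, 0.4],
}

probe = [0.11, 0.9, 0.2]  # embedding of the uploaded photo
for url, score in best_matches(probe, database):
    print(url, round(score, 3))
```

Because a good embedding model maps half-profile or background shots of the same face to nearby vectors, even photos you never posed for can land above the match threshold, which is the point being made above.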
It's extremely sophisticated and good at what it does. And you make the point in the book that
01:00:55.820
the people who put this company together, they're not some geniuses; this has been considered
01:01:02.020
and rejected by basically all the big tech companies who are already out there collecting
01:01:08.100
our data. But this was a bridge too far for all of them. Yeah. I mean, when I first started looking
01:01:14.180
into Clearview AI, there was very little out there about them. They've kind of taken pains to
01:01:19.240
hide who was behind the company, which was ironic given they were putting all this information out
01:01:24.700
there about all of us. I reached out to them. I reached out to Paul Clement. I reached out to
01:01:29.680
anyone I could find kind of attached to the company and no one would respond to me. So I thought, well,
01:01:35.740
maybe I can. I even actually had an address on their website and it was only a couple blocks
01:01:41.300
away from the New York Times office in Manhattan. And I walked over there and discovered that there was
01:01:46.600
no such address. I was kind of looking for it. I compare it in the book to Harry Potter because
01:01:50.360
I was like looking for a platform that wasn't there. So they were very secretive when I first
01:01:54.620
discovered them. And so I went to police officers who I thought had used the app based on it kind of
01:01:59.860
showing up, um, on budgets, or because it was appearing in these public records requests.
01:02:06.420
And I ended up talking to a, uh, financial crimes detective in Gainesville, Florida named Nick
01:02:12.840
Ferrara. And he was telling me, wow, I, I love Clearview AI. I would be their spokesman if I could.
01:02:19.820
The tool works so well. It's way more powerful than anything I've ever used before. Um, he said that
01:02:26.780
he had a pile of, um, unsolved cases on his desk. He'd run them through the state
01:02:33.340
facial recognition system in Florida, not gotten any matches. And so, um, he ran them through Clearview
01:02:39.300
AI. And he said, I got match after match after match. He said, it was really incredible. And so
01:02:43.840
I said, well, this sounds great. I'd love to see how it works. And he said, well, send me your photo
01:02:47.920
and then I'll screenshot the results for you and send them your way. So I sent him my
01:02:53.500
photo and then he ghosted me and stopped responding to any of my messages. This basically
01:02:59.140
happened again with another officer, though before he stopped talking to me, he did run my photo and
01:03:04.820
he said, I didn't have any results, which we both thought was very strange because I have a lot of
01:03:08.900
photos on the internet. Eventually I'd find out that Clearview AI had, uh, put an alert on my face
01:03:15.280
so that they were being notified when I was being searched for. They were reaching out to these
01:03:20.380
officers and saying, you know, uh, don't talk to her. You're violating the terms of our app by running
01:03:26.160
her picture. And so this was alarming to me because it showed me that this company could,
01:03:31.900
you know, see who law enforcement was searching for, could control whether they could be found.
01:03:37.280
And it just shows the power of facial recognition technology, this idea that you can be searched for,
01:03:42.740
there can be an alert on your face and people might be reacting to you and you might not even realize
01:03:49.640
it. And we kind of have seen that play out at Madison Square Garden, where James Dolan,
01:03:54.920
the owner of Madison Square Garden, decided he wanted to
01:04:00.980
ban certain people from coming into his venues: lawyers who had sued,
01:04:08.200
uh, you know, Madison Square Garden or any of their other companies. And so they went to their law firm
01:04:14.160
websites and scraped their photos, you know, from their own biographies on their websites, and created
01:04:19.760
this big ban list. And when those lawyers tried to go to Madison Square Garden, um, to see a Knicks game
01:04:25.580
or the Rockettes at Radio City Music Hall or a Mariah Carey concert, they would be stopped at the door
01:04:31.860
and turned away and told, you're not welcome here until your firm, you know, drops this suit or settles
01:04:37.980
this suit, um, until the suit comes to an end. And so it just shows you how powerful this technology
01:04:42.820
could be in the hands of corporations, for example, companies who want to know who you are the minute
01:04:49.460
you're walking through the door. Yes. It's like, I mean, you think about, like, you could go
01:04:56.360
either way with it, but you know, after January 6th, all the proposed bans on anybody associated,
01:05:02.560
not just with the protesters, the rioters, but with Team Trump, like anybody with the last name
01:05:07.560
Trump, anybody who's on the Trump team or the administration, all banned. Publishers were
01:05:12.940
saying no books by you. Imagine that expanded to, you can't come in here for coffee. You can't come
01:05:19.640
in here to watch a Knicks game. You can't like, it could go so far beyond that. And at least, at least,
01:05:25.020
um, in the example I just raised, it would require a name and the person would have had to have asked for
01:05:32.900
something and submitted a record: this is who I am, and this is where I'm at. It's not just like
01:05:37.220
walking in for a cup of coffee, like doing the things we do to exist. And just your
01:05:43.520
mere face, your face print tells them so much about you. It's like making yourself instantly famous
01:05:50.600
basically. And you don't want to be famous. You want to be a private civilian. I do always feel
01:05:56.360
funny talking about this topic with somebody like you who, you know, the ship has sailed for you.
01:06:02.740
Most places where you go around, they know that you're Megyn Kelly, but the rest of us still
01:06:07.080
have a certain degree of anonymity as we move through the world. And I do, my big fear with
01:06:13.440
facial recognition technology is that it brings an end to that. And that it takes all the information
01:06:18.860
that over the last couple of decades that we've been online, that we've been putting information
01:06:24.460
out there about ourselves, that it's been collected without our knowing it. There's all these dossiers
01:06:29.620
now that basically exist for all of us, that that could just be attached to us in the real world,
01:06:35.500
that our face becomes the token to, to be able to access all of this information about you
01:06:42.520
all of the time. And it can be used to judge you in ways. Yeah. I mean, whether you're a liberal or
01:06:48.480
you're conservative or you're, you know, rich or you're poor or who you work for.
01:06:55.940
These guys who started this firm, they're more right-leaning. They, I guess if you had to put
01:07:01.960
money on who they would want to use it against, if they really went nefarious, it would be against
01:07:06.720
liberals because they're like, one of the big investors is Peter Thiel, who's a conservative
01:07:11.020
investor. And the other guys, as I understand, they, they met at the Trump convention in 2016,
01:07:16.520
the Republican National Convention, like these are more right-leaners.
01:07:18.860
Yeah. They met, they met before the Trump convention, but, uh, apparently the kind of
01:07:23.980
idea for Clearview first came about when they were at the Trump convention, they were thinking,
01:07:29.200
wow, there's all these strangers here. Um, you don't know who anyone is. Wouldn't it be nice if
01:07:34.120
you had some kind of tool, an app on your phone where you just kind of pointed it at somebody
01:07:38.580
and it told you, give you an indication, you know, who this is. Are they a friend? Are they a foe?
01:07:43.760
Are they somebody I should get to know? So even in the beginning of this company,
01:07:47.820
they were thinking about this kind of use in this kind of more polarized world,
01:07:53.540
you know, who's on my side, who's not on my side.
01:07:57.680
Yeah. I mean, it's very, it's, it's very scary. It could be used against everyone,
01:08:02.140
depending on whose hands it would fall into. And let's not kid ourselves. That's the concern.
01:08:06.940
It's that this Clearview AI will not be the only company using it. It's going to be widespread.
01:08:12.840
And there's a real question about whether it can be stopped at all. This really could be our
01:08:16.840
future in 10 years. Everyone might have it. And truly privacy may be a thing of the past.
01:08:21.820
I think one of the executives said that to you, like, it's over, Kashmir. Like there
01:08:27.120
is no privacy, that those days are over. Yeah. I mean, that was one of the investors
01:08:32.680
in Clearview AI. When I first started tracking down the company, I went to
01:08:38.820
his door and he ended up letting me in, in part because I was quite pregnant at the time and I said,
01:08:43.420
oh, you know, I've come all this way. He offered me water. We sat down and he was talking about how
01:08:48.080
excited he was about Clearview AI as an investment. He hoped that in the future, the
01:08:54.720
same way you Google someone's name, you would Clearview someone's face. And he said, you know,
01:08:59.960
right now, Clearview is just selling it to police departments. But his hope was that they would start
01:09:05.000
selling it to everybody, that it would be an app on everybody's smartphone. And I said, you know,
01:09:10.740
that seems kind of alarming to me, this idea that we wouldn't have the right to be anonymous anymore.
01:09:16.740
And he said, yeah, I realize it's dystopian, but I just think that's the nature of technology that
01:09:21.060
it's, you know, it's eroding privacy and there's nothing we can do about it. And, you know, I think
01:09:26.240
tech companies are selling these kinds of tools. That's what they want people to believe, you know,
01:09:30.380
give up. There's no hope for your privacy, just accept this. But I think that we still can protect it.
01:09:36.780
And there are examples in the past of times that we've done that. So I remain optimistic that we
01:09:42.000
might still preserve a bit of our anonymity. Well, and you've got the one state of Illinois,
01:09:46.660
which is like the one state that's done something to protect its citizens from this kind of technology.
01:09:51.500
We can talk about that in a minute, but let's just spend a minute on. So finally they did,
01:09:55.840
you were pregnant and you did what a good reporter will do, which is somehow get yourself in
01:10:00.240
the door. And eventually they started talking to you. And, um, there's a very interesting guy behind
01:10:07.080
the technology. What's his name? Hoan Ton-That. Hoan Ton-That. Okay. Who is Hoan Ton-That?
01:10:16.320
Hoan Ton-That is the kind of technical mastermind behind Clearview AI. He grew up in Australia,
01:10:22.540
was always really interested in computers, technology. At 19 years old, he drops out of college
01:10:29.700
in Canberra, Australia, and he moves halfway around the world to San Francisco, um, kind of chasing the
01:10:37.020
tech dream. And he at first was creating Facebook quizzes and iPhone games, um, and not really having
01:10:45.440
a lot of success. And eventually he ends up moving to New York, kind of falling in with this more
01:10:50.980
right-leaning crowd. And then he goes and creates this incredibly powerful technology, Clearview AI.
01:11:00.060
And I kind of, you know, at first the company did not want to talk to me. Eventually they came around
01:11:04.300
and I've actually spent a lot of time with Hoan Ton-That, really interesting, um, character. And I asked
01:11:10.180
him, you know, how did you go from building iPhone games to this incredibly powerful, you know,
01:11:16.680
potentially world-changing technology. And he said, I was standing on the shoulders of giants.
01:11:22.620
He said at the beginning, he just went on to Twitter and followed machine learning experts.
01:11:27.840
He went on to GitHub, this kind of place where, uh, computer scientists share code. And he looked
01:11:32.940
up facial recognition. And we both started laughing when he's telling me this. He said,
01:11:37.980
I realize it sounds like I Googled a flying car and then I built one. But Clearview's
01:11:44.880
journey, Hoan Ton-That's journey really, um, reveals what has happened in technology, which is that
01:11:50.660
there has been a lot of sharing and open sourcing, um, of these AI tools. And it allowed him and kind
01:11:57.780
of like a ragtag band of people to create a really, really powerful technology. And at first I thought
01:12:04.580
that Clearview, um, had had this kind of technological breakthrough to create this tool. But in my reporting
01:12:10.940
for the book, I discovered that as you were saying before, Facebook and Google had both created
01:12:16.820
technology like this internally and decided not to release it because they thought it was too
01:12:21.920
dangerous. And so what Clearview had done was more of an ethical breakthrough than a technological one.
01:12:28.720
Ethical breakthrough. That's a nice way of putting it. A disregard perhaps, but again, I'm not against
01:12:34.280
the company. I love what they're doing in law enforcement. The scary thing is when it goes beyond that.
01:12:38.660
So let's spend a minute on what they are doing for law enforcement, because when I get to the
01:12:42.720
stories about them nailing pedophiles, I'm cheering on my feet. And they are using this and
01:12:50.360
have found pedophiles through some incredible detections of, as I was kind of saying, not even
01:12:57.700
people who are front and center in various photographs. Yeah. I talked to a Department of
01:13:02.960
Homeland Security agent, um, who, this was actually the first time that the Department of Homeland
01:13:07.700
Security used Clearview. He had this case where, um, they had, uh, come across an image of abuse
01:13:15.580
in a Yahoo account of somebody who was based outside of the country. And he had the
01:13:15.580
photos and he's thinking, what do I do? How do I solve this case? Um, and so
01:13:23.520
he, um, he took a screenshot of the abuser's face and he sent it to
01:13:39.040
other child crime investigators. And he said, Hey, does anyone else recognize this person? Have you
01:13:43.400
seen them in other photos? Um, they knew it was somewhere in the U.S. because of, uh, an electrical
01:13:49.140
outlet. They could see it was a U.S. outlet. So they knew it was somewhere in the United States.
01:13:52.440
And there was another agent who had access to Clearview at that time. And so she ran the photo
01:13:58.200
and sent him one of the results, which was an Instagram photo. And when he first looked at it,
01:14:04.140
he didn't see the abuser's face. And he said, oh, he's not here. And the other agent told him,
01:14:10.200
look in the background. And in the background of this Instagram photo is this guy standing at a booth
01:14:16.160
and that wound up being the person. And so this investigator followed these breadcrumbs,
01:14:22.140
figured out where he worked, figured out who he was, figured out where he lived in Las Vegas,
01:14:26.620
and they wound up arresting him. And he's in jail now, you know, getting this child out of his, um,
01:14:33.800
his access. And the Department of Homeland Security, ICE, that was the unit he was part of. They said,
01:14:39.860
we, we have to have this tool. And they wound up signing up for it. And so the Department of Homeland
01:14:44.240
Security now does have this contract with Clearview that they've had for a few years now.
01:14:48.880
And he said, you know, we just never would have found that guy without this technology. So you
01:14:53.360
can see why this is so appealing to law enforcement. You know, when you only have a face to go on,
01:14:59.100
this is something that could help you solve that case. So this is like a guy who wasn't necessarily
01:15:04.460
even posing for, not the illegal photo, but the other photo, he just happened to be caught in the
01:15:10.940
background of someone's photo that's available on the internet or on Facebook or in someplace where
01:15:15.760
Clearview searches. And he made the mistake of having a picture of him committing this disgusting
01:15:21.740
sin and crime in a picture that was on his email or however law enforcement got its hands on it to
01:15:26.620
begin with. And they matched it just from the back, him being in the background of a, I mean, that's
01:15:32.820
amazing that what you point out in the book, and it kind of got me thinking about it though,
01:15:38.260
is on the privacy front there, even with law enforcement, they're doing something that is,
01:15:45.360
you know, a little disconcerting, which is you and I are potentially in that photo lineup,
01:15:53.000
right? Like our face print, maybe not us because we're females, but I'm just saying in general,
01:15:58.020
maybe all American men were in that photo lineup without really consenting to be in the photo lineup
01:16:06.540
with their quote face print. Right. I mean, Clearview AI says that they now have 40 billion
01:16:13.060
photos in their databank. And so I know for a fact that you and I are in that, you know,
01:16:20.640
in that database. And so every time somebody does run a Clearview search, they are searching all of
01:16:26.580
those photos. They are searching through all of our faces essentially for a match. And so that,
01:16:32.440
that worries some constitutional experts. They say, hey, you know, I think if the United States
01:16:37.220
government built this, you know, we would probably be fighting back. This seems almost
01:16:42.680
unconstitutional, you know, that we're all part of every search that's being run through this tool
01:16:47.100
and, and it can go wrong. You know, there have been a handful of cases now where people have been
01:16:53.700
arrested for the crime of looking like someone else, because we're not all unique snowflakes. Some of us
01:16:59.440
look similar to other people. And so there is this concern that if you act on the facial recognition
01:17:05.960
search alone, you might end up arresting the wrong person. I would imagine they
01:17:11.300
require more than that, right? It can't just be only Clearview recognition.
01:17:17.880
So hopefully, ideally, if the police are doing it right, it won't just be Clearview identification.
01:17:22.860
But I have written about cases where they have arrested people based on not much more evidence
01:17:29.220
than that. A guy who was arrested in Atlanta for shoplifting, essentially stealing purses in and around
01:17:39.280
New Orleans. And he was arrested. He's like, why am I being arrested? He got pulled over.
01:17:45.120
Why am I being arrested? And they said, oh, for larceny in, in Jefferson Parish. He said,
01:17:51.580
where's Jefferson Parish? They said, it's in Louisiana. He said, I've never been to Louisiana.
01:17:55.780
But he ended up getting arrested and spending a week in jail before they realized they had the
01:18:02.300
wrong person. He did look a lot like the offender and they had done a Clearview search. He'd come up
01:18:09.340
as a match. And when the police looked at his Facebook profile, they saw that he had a lot of
01:18:13.740
friends in New Orleans. So based on very little information, I mean, not, not enough, no one
01:18:19.340
should be arrested based on that. He had been arrested. And so sometimes these things do go
01:18:23.680
wrong because of confirmation bias, automation bias, where the police just rely too heavily on this
01:18:30.360
high-tech tool, which does seem so amazing and often is so amazing.
01:18:35.360
Like, how accurate is it then? Because, you know, I've told the audience before, you know,
01:18:40.560
of all the weird conspiracy theories that are out there. And there's, there are plenty of weird
01:18:44.520
ones. One of the weird ones that's out there is that I am Nicole Brown Simpson, like either
01:18:52.200
reincarnated or she never died. I'm not exactly sure how it works, but somehow we're the same person.
01:18:58.160
So truly, like, what about two people who look very similar in certain of their features?
01:19:04.140
Is Clearview generally very good at distinguishing between two similar looking people or not so good?
01:19:12.960
Well, so there's, um, there is a federal lab called the National Institute of Standards and
01:19:18.840
Technology or NIST that tests all the facial recognition algorithms or runs these tests periodically.
01:19:24.320
And a lot of these algorithms now are incredibly accurate, like they're 99% accurate. But it does
01:19:31.520
depend on, you know, the quality of the image that you run. Is it a grainy, you know, still from a
01:19:38.460
surveillance tape? Uh, it might not work as well under those circumstances. Um, anecdotally, you know,
01:19:45.180
I have seen it work quite well when Hoan Ton-That has run Clearview searches on my own face. There are no
01:19:52.000
doppelgangers who come back. It is me. It's photos that I've put out there. It's me at concerts in
01:19:59.800
2005 in the background of other people's photos. Um, yes, there was a photo of me in the background
01:20:08.360
of someone else's, um, it was, well, there was a woman in the background of someone's photo walking
01:20:12.680
by. And at first I didn't think it was me until I recognized my coat that I had bought at a vintage
01:20:17.720
store in Tokyo. It was so unique. It had to be me. I mean, uh, Hoan Ton-That said, it's a time
01:20:23.880
machine I invented, and it really is. I mean, it was incredible. I was able to
01:20:28.080
connect my face to me, you know, in profile in 2003, 2004. I mean, it's kind of
01:20:35.120
astounding how well it can work under the right conditions. Oh my gosh. All right. So that's one
01:20:39.700
thing. I mean, maybe people out there are feeling less safe than I am knowing that it's in the hands
01:20:46.240
of law enforcement right now across the country. Many law enforcement divisions already have this on
01:20:50.820
you. Um, my default is generally to trust law enforcement. I have a cop
01:20:57.000
in the family. I don't know. But I feel very much less secure when it comes to
01:21:02.240
private citizens having this stuff, because it's not even just, like, Megyn Kelly. Okay. Anybody can
01:21:08.220
Google that and see. Kashmir Hill, what, you know, what comes up about Kashmir? It's so much more.
01:21:13.680
Like you say, it's like private photos that you had no idea about. Maybe whatever,
01:21:17.980
maybe you went to some march, you know, at one point, or who knows what you did when you were
01:21:23.140
a stupid kid that now private citizens or your enemies know about and could use against you.
01:21:30.140
Or like the thing I worry about is the freaks out there. People who are crazy, who just want
01:21:34.780
information on you that you would never voluntarily give. Now they've got it. And maybe even they have
01:21:41.440
photos like, like, for example, in my case, I don't publish any of my addresses for very obvious
01:21:47.140
reasons. But what if my neighbor was out playing football with their kids on the front lawn? And
01:21:53.080
I got caught in the background of one, you know, now am I going to have to run out there and be
01:21:56.640
like, give me that photo, delete that photograph, right? You can't do that. But I could be there
01:22:02.740
and I could be in front of my house and now it's identifiable, like all this stuff. So let me take a
01:22:08.220
quick break and we're going to pick it up there and talk about that risk and the glasses, which is what
01:22:13.860
got my attention to this whole case to begin with. More with Kashmir Hill right after this.
01:22:19.180
I'm Megyn Kelly, host of The Megyn Kelly Show on Sirius XM. It's your home for open, honest and
01:22:26.600
provocative conversations with the most interesting and important political, legal and cultural figures
01:22:31.540
today. You can catch The Megyn Kelly Show on Triumph, a Sirius XM channel featuring lots of hosts
01:22:36.960
you may know and probably love. Great people like Dr. Laura, Glenn Beck, Nancy Grace, Dave Ramsey and
01:22:44.360
yours truly, Megyn Kelly. You can stream The Megyn Kelly Show on Sirius XM at home or anywhere you are,
01:22:51.180
no car required. I do it all the time. I love the Sirius XM app. It has ad-free music coverage of every
01:22:59.660
major sport, comedy talk, podcast and more. Subscribe now, get your first three months for free.
01:23:04.280
Go to SiriusXM.com slash MKShow to subscribe and get three months free. That's SiriusXM.com
01:23:13.080
slash MKShow and get three months free. Offer details apply.
01:23:23.040
Now, I've been thinking about this, including yesterday we had on Nancy Grace and we were
01:23:27.740
talking about the Bryan Kohberger Idaho murders case. And I want to play you this clip because it
01:23:33.840
relates to some of this technology potentially, where she was stating here her suspicions. She made
01:23:40.440
it clear this is her view about Bryan Kohberger and who he is, and then offered some facts about
01:23:47.920
what we know about some of his past interactions with women. Listen to this soundbite. And you know,
01:23:53.280
another thing which is not going to get brought up at trial, I guarantee you, because the defense is
01:23:57.740
going to argue it's too incendiary and prejudicial, blah, blah, blah. And so the theory that Bryan Kohberger
01:24:04.080
is in fact an incel, involuntary celibate, and hates women because he can't be with women. And
01:24:11.140
remember, he was also banned from a bar because he would go up to women and say things like,
01:24:17.140
what's your home address? I would run for the hills as if I had seen a monster. If some creepy dude
01:24:22.360
comes up to me at a bar and says, what's your home address, lady? Uh-uh. N-O. He had to get thrown out of
01:24:27.560
that bar. So if that happens to you in today's day and age, and you're not a public figure, right?
01:24:33.260
Like I am, you're usually fine because the person doesn't know your name. They can't Google you.
01:24:40.200
And it's, you do have to spend some money and make some effort to find somebody's address,
01:24:45.440
even a private citizen today, but it's knowable. I mean, let me tell you people,
01:24:48.960
if you've gone down to the DMV and given them your real address, like most people do when they go to the
01:24:53.520
DMV, because you registered to vote and all that, it's findable on the internet for $40. It's very
01:24:58.880
easy to find somebody's home address. Very easy. So it's creepy to think that, you know, now if,
01:25:06.120
if this technology is available to private citizens, that bar encounter takes on a whole new meaning
01:25:12.460
because he'll know who you are like that. He'll have your name. He'll have those photos. He'll know you went
01:25:19.820
to the concert, you know, where you were wearing the jacket from Tokyo, you know, 20 years ago,
01:25:24.480
he will have so much information about you instantly. And that brings me to the glasses
01:25:30.440
because it's not going to require a big mainframe computer for him to get that info about you
01:25:36.620
under this Clearview technology. Yeah. I mean, so Clearview is working on these augmented reality
01:25:44.860
glasses. Uh, they had funding from the Air Force to develop them. They would be used at military
01:25:49.380
bases so that soldiers can theoretically identify threats from very far away. Um, but yeah, I mean,
01:25:56.320
you can imagine this world in which maybe we do all start wearing augmented reality glasses and
01:26:02.560
with tools like Clearview, you might be able to identify the people around you in real time.
01:26:08.440
Um, I hate to tell you, we are already in that world to a certain extent. Clearview has limited its tool
01:26:15.420
to law enforcement and the government, but there are other copycat companies that have created
01:26:21.860
the same kind of technology as Clearview. Their databases aren't as big, but they're on the
01:26:27.760
internet right now. Sites that you can use for free, sites that you can pay a subscription to
01:26:32.760
where you upload a photo of somebody and it will show you other places on the internet where their
01:26:38.440
photo appears, where you might be able to find out what their name is, you know, where they live.
01:26:42.980
So this is, this is not a future scenario. This could happen to you in a bar, you know, tonight,
01:26:49.580
um, where somebody walks up to you, they are creepy. You never want to see them again. They
01:26:56.320
surreptitiously take your photo and all of a sudden they could know who you are. I mean, I do think that
01:27:02.620
that is a very scary scenario. On the other side, maybe you're talking to somebody who seems great.
01:27:09.240
They're saying all the right things. You take their photo, you look them up and all of a sudden
01:27:14.040
you see that they have this, uh, criminal record or they have this online reputation that you find
01:27:21.280
really disturbing and you want to walk away. So it's like, you know, technology in so many ways,
01:27:26.160
it's just this double edged sword. There's positive use cases and negative use cases. And it really is
01:27:31.540
about who's using it and how they're using it. Oh my gosh. I mean, I'll tell you this,
01:27:36.580
the one thing: you should not put your home address on your driver's license or give it to the
01:27:41.300
government, get a PO box, just get a PO box. It's a bigger pain in the ass to get your mail,
01:27:46.320
but it will put a layer between you. And I mean, look, I've done it because I'm well
01:27:52.420
known, but now everyone's well known. There are no more civilians with technology
01:27:59.000
like this out there. So take those steps before it becomes a problem in your life. I mean, you have to do
01:28:04.920
it preemptively, before the weird guy is trying to find you. It's just so dark. I don't know.
01:28:11.620
It's, I know you write about this in the book, but it's very much like Minority Report, where like
01:28:15.680
everything about us is out there. Like there's that scene in Minority Report where Tom Cruise is
01:28:20.160
walking through the, like the shopping mall and all the ads are personalized to him because they
01:28:25.700
can see, I don't know if it's his iris or his face. But like, the future's here. And we cut it.
01:28:30.700
Here it is, just to remind those who haven't seen the movie in a while.
01:28:34.400
A road diverges in the desert. Lexus. The road you're on, John Anderton, is the one less
01:28:46.280
Good evening. You can move the old-fashioned way.
01:28:51.160
So I see that, Kashmir. And the only thing I think is, how do I opt out? How do I say, I don't want them
01:29:14.780
doing that to me. I want to opt out somehow. So can we opt out?
01:29:19.660
Yeah, it's funny. I think that we see that movie that way and that people working in technology see
01:29:31.920
So there are ways to opt out. It depends on where you live. Basically your face has different privacy
01:29:39.240
protections depending on your address. So Clearview AI, for example, will allow you to get out of their
01:29:48.320
database if you live in a state that has a privacy law that requires them to delete information about
01:29:54.960
you. And there's just a handful of states that have those privacy laws. California, Colorado,
01:30:00.820
Connecticut are examples. If you live in those places, you can go to Clearview AI's website and
01:30:07.480
you'll have to upload a photo of yourself. And you'll be able to see your report, like see what's
01:30:14.980
in their database about you. And then you can tell them to delete you.
01:30:21.580
And same goes, I think, for Illinois. We talked about how Illinois has this law. It's a
01:30:27.560
unique law to protect people. But yeah, if you live in Illinois, it says that companies can't
01:30:33.980
collect your face print, collect your biometric information without your consent, or they have
01:30:39.640
to pay a very hefty fine. So we talked earlier about Madison Square Garden and how they ban lawyers.
01:30:46.440
The company that owns Madison Square Garden is doing that at all their venues in New York City,
01:30:51.640
but not at their theater in Chicago, because they would need the lawyers' consent to have their face prints
01:30:58.760
and ban them from coming in. And yeah, some of these other tools I talked about that are on
01:31:04.100
the internet right now where you can do this. A lot of them have opt-outs, but again, you have to
01:31:10.780
submit your face, kind of tell them who you are in order to get out of their databases, which not
01:31:15.400
everyone is comfortable doing if you care about privacy. Right. It's like I was saying, when they
01:31:20.860
make you enter your email to unsubscribe and you're like, well, wait a minute, you emailed me to begin
01:31:26.200
with. So you have my email. So what is this? What am I being asked to enter into here? It feels like
01:31:31.440
our relationship is getting stronger, not weaker, which is my goal. Okay. So I like that. I like
01:31:38.540
the potential of that. I know that my friend, a man I deeply admire, argued this
01:31:46.500
case on behalf of Clearview: Floyd Abrams, of New York Times versus Sullivan. He's the father of Dan
01:31:54.720
Abrams, who's also a friend. Floyd's 84 years old. He's a giant in legal circles. So Clearview
01:32:01.200
hired him to go in and argue against the ACLU, which sued them over this. And my pal Floyd,
01:32:07.920
I guess either he didn't win, or it wasn't looking like he was going to win. And what
01:32:12.260
happened? So, yeah, so they hired Floyd Abrams because he is the expert on the First Amendment,
01:32:18.620
which is the, you know, freedom of the press, freedom of information. And Clearview was making
01:32:24.260
this argument that they have a First Amendment right to collect public information that's on the
01:32:31.380
internet and analyze it. And they said they're just like Google, you know, they're just scanning the
01:32:38.040
internet and collecting it and organizing it. And instead of organizing it, you know, by name,
01:32:43.180
they're organizing it by face. And so, yes, Floyd made this argument in a few of the different
01:32:49.640
lawsuits, and there have been quite a few, against Clearview AI, including in Illinois, where the
01:32:54.540
ACLU sued them. And the judge there, well, the case didn't go all the way. It wound
01:33:01.280
up settling. But the judge said, no, the First Amendment is not going to protect Clearview AI.
01:33:07.240
Illinois, the state of Illinois still has the right to say that you're not allowed to do this
01:33:11.180
particular thing with somebody's face print. And so that suit did settle with Clearview agreeing
01:33:18.540
in the future to only sell this database, you know, of billions of photos to law enforcement,
01:33:25.620
to the government. And they said they won't sell it to private entities. They won't sell it to
01:33:29.640
individuals. So the ACLU saw that as quite a win. And Clearview saw it as a win, too, because they said
01:33:35.680
that's what we're already doing. And that's just what we'll continue to do.
01:33:38.320
Hmm. I wonder if they could get, like, you know, one of those actresses who's had all that work
01:33:43.840
done, like the Melanie Griffith, or like the Meg Ryan of, you know, You've Got Mail versus
01:33:50.280
the Meg Ryan of today. Would it notice? You know, like, are the criminals going to start getting
01:33:55.380
plastic surgery to get past this? Someone's going to come up with a moisturizing cream that dulls the
01:34:00.580
Clearview lens, you know, that you would put on. I don't know, there's going to be some
01:34:04.120
technological advancement, probably to counteract the creep in the bar with the glasses, don't you
01:34:11.920
think? Maybe those 3D plastic masks that you, like, pull over your head, like in Mission: Impossible.
01:34:19.080
It is, it can be, it can be hard to evade facial recognition. I did this experiment with some of my
01:34:26.180
colleagues at the Times on one of these sites that's available to anyone to use, a site called
01:34:31.120
PimEyes. And, you know, we uploaded photos where, you know, somebody's face was half covered,
01:34:36.980
like with a COVID mask, their nose, their mouth, and it was still able to recognize them and find
01:34:42.940
photos of them. I mean, it is astounding how powerful facial recognition has gotten. So
01:34:47.360
it can be hard to evade it. It is possible. I talked to one lawyer who managed to evade the ban
01:34:53.060
and go to a Knicks versus Cavs game at Madison Square Garden, even though she was on the list
01:34:59.540
by wearing a baseball cap, glasses, and a COVID mask. That was enough to get through MSG security.
01:35:06.980
The lawyers. All right, we only have like 30 seconds left, but is there anything you want
01:35:10.880
to flag for us that we need to be worried about in addition to all the stuff we've already discussed?
01:35:14.960
I guess just thinking, knowing that this, this power is out there now, just think about the
01:35:22.020
photos that you do put online and whether they need to be public photos or whether you want to
01:35:27.740
make them not be on the internet, or if you do, make them private so that these companies, and there's
01:35:33.760
more and more of them out there, aren't out there scraping them and using them in ways that you
01:35:38.440
wouldn't want or didn't expect. You know, it's like yet another reason not to put your kid on the
01:35:43.800
internet. Do not put your kid's face all over the internet. Be careful. You don't know how those
01:35:49.820
photos are going to come back to haunt him or her. What a fascinating discussion. Kashmir Hill,
01:35:55.920
the book is Your Face Belongs to Us. You'll learn a lot. You'll be fascinated. It's a quick,
01:36:01.060
easy read. Thank you so much for writing it. Thank you, Megyn. Wow. Okay. I want to tell the audience,
01:36:06.800
we're going to be back on Monday with Maureen Callahan. She's going to come here inside the studio,
01:36:11.380
and we're going to talk to her about some of the latest shenanigans. The Royals. Did you see the
01:36:17.960
clip that's going around about Madonna? I'm dying to talk to Maureen about this, among other things.
01:36:23.720
And we're also going to have the head of Stop Anti-Semitism here to unveil their anti-Semite of
01:36:29.280
the year. Have a great weekend, everyone, and we'll see you then. Thanks for listening to The
01:36:36.260
Megyn Kelly Show. No BS, no agenda, and no fear.