The Megyn Kelly Show - December 17, 2024


RFK and Hegseth's Path to Confirmation, and Dangers of AI, with Mark Halperin, Sean Spicer, Dan Turrentine, and Tristan Harris | Ep. 967


Episode Stats

Length

1 hour and 52 minutes

Words per Minute

180.4

Word Count

20,325

Sentence Count

1,382

Misogynist Sentences

47

Hate Speech Sentences

16


Summary

Trump takes questions from the media for the first time as he prepares to return to the White House. Mark Halperin and Dan Turrentine join me to talk about what they saw from the president-elect's wide-ranging press conference.


Transcript

00:00:00.000 Your business doesn't move in a straight line.
00:00:02.800 Some days bring growth, others bring challenges.
00:00:05.940 But what if you or a partner needs to step away?
00:00:08.820 When the unexpected happens, count on Canada Life's flexible life and health insurance
00:00:13.680 to help your business keep working, even when you can't.
00:00:17.020 Don't let life's challenges stand in the way of your success.
00:00:20.460 Protect what you've built today.
00:00:22.500 Visit canadalife.com slash business protection to learn more.
00:00:26.280 Canada Life. Insurance. Investments. Advice.
00:00:31.000 Welcome to The Megyn Kelly Show, live on Sirius XM Channel 111 every weekday at noon East.
00:00:42.860 Hey everyone, I'm Megyn Kelly. Welcome to The Megyn Kelly Show and happy Tuesday.
00:00:48.120 Do you have all your Christmas shopping done? I don't have all my Christmas.
00:00:51.400 We have another week. It's December 17th and we still have time.
00:00:56.360 Maybe you'll get some gift ideas during today's show.
00:00:58.920 Our advertisers actually have some good ones.
00:01:01.080 We begin today with Trump 2.0 as the president-elect prepares to head back to the White House.
00:01:07.340 He did something that the current occupant has rarely done.
00:01:10.880 He actually stood there and took questions from the media for an hour.
00:01:16.980 And in a sign of a new Trump era, it was substantive, and stylistically
00:01:21.680 it was very different from what we saw during his first term.
00:01:24.920 He fielded a wide variety of questions on his meetings with business leaders,
00:01:28.980 his cabinet picks, and his own views on the Maha movement.
00:01:33.840 But it wasn't all different.
00:01:36.100 There was still a lot about one of his favorite targets, the fake news media,
00:01:39.940 and the legacy media responded in predictable ways.
00:01:43.340 Joining me now to discuss that and all the news today are pals from the morning meeting on 2Way.
00:01:48.360 Mark Halperin, he's editor-in-chief and host of 2Way.
00:01:51.520 Sean Spicer is host of The Sean Spicer Show on The First TV.
00:01:56.100 And Dan Turrentine is a former Democratic strategist.
00:02:00.020 Did you know that homeowners in America nationwide, they have over $32 trillion in equity?
00:02:06.200 And cyber criminals are targeting it with a growing scam the FBI calls house stealing.
00:02:11.880 House alarms, doorbell cameras, deadbolts will not work against these thieves
00:02:16.120 because they're not after your stuff.
00:02:17.800 They're after your equity.
00:02:19.820 If your title's not being monitored, scammers can transfer the title of your home into their name
00:02:24.580 and then take out loans against it or even sell it behind your back.
00:02:28.120 The best way to protect your equity is with triple lock protection from home title lock.
00:02:32.960 Triple lock protection is 24-7 monitoring.
00:02:34.980 And God forbid, if the worst happens, restoration services at no out-of-pocket cost to you.
00:02:40.400 When was the last time you checked on your title?
00:02:42.760 Likely never.
00:02:43.760 And that's exactly what scammers are counting on.
00:02:46.120 Make sure you're not already a victim.
00:02:48.220 You can get a free title history report and a 30-day free trial of triple lock protection today
00:02:53.820 by going to hometitlelock.com and using the promo code MEGYN
00:02:57.320 or click on the link in the description.
00:03:00.100 That's hometitlelock.com, promo code MEGYN.
00:03:03.440 hometitlelock.com.
00:03:05.680 Guys, welcome back to the show.
00:03:07.020 Great to see you.
00:03:08.120 Good to see you.
00:03:08.560 Merry Christmas.
00:03:09.320 Good day.
00:03:09.560 All right, so it kind of was like, so far we are seeing a little kinder, gentler Trump,
00:03:15.100 are we not?
00:03:15.640 Sean Spicer, I'll start with you since you know him best.
00:03:18.440 I think we are seeing a more pragmatic Trump.
00:03:20.880 A Trump that in 2016 when he won, they tried to delegitimize the win.
00:03:25.840 People were attacking him.
00:03:27.000 It was Russia.
00:03:27.960 You didn't win the popular vote, this and that.
00:03:30.260 And he felt like he was on defense, and rightly so.
00:03:33.600 I mean, we had false accusations lobbed from one end to the other.
00:03:37.120 And this time, everyone from some folks in the media to big tech and corporate leaders
00:03:42.820 are embracing him.
00:03:44.200 And I think they realized he was right.
00:03:46.220 They were wrong.
00:03:47.340 Bottom line is, it's a much different environment.
00:03:49.360 And I think he flourishes in this.
00:03:51.360 He wants people to come to Mar-a-Lago, talk to him about doing business, talking to him
00:03:55.420 about making investments in the United States, as we saw yesterday with the SoftBank CEO.
00:04:00.980 This is a much different...
00:04:01.860 Frankly, Megyn, I'll admit it.
00:04:03.160 I'm jealous.
00:04:03.900 I wish I had been there for 2024.
00:04:06.800 2016 was historic.
00:04:08.460 But we faced a huge wave against us of people that were trying to delegitimize the win.
00:04:14.880 Now, you can't...
00:04:16.260 No one can do that.
00:04:17.000 It was a resounding win, and I think he's in his glory.
00:04:20.820 Just staying with you, Sean, for a second.
00:04:22.440 And I don't know about you, but I find it very gratifying to see Zuckerberg and Bezos
00:04:28.140 and all these guys have to go in there and kiss the ring.
00:04:32.200 Do it.
00:04:32.860 Well, I love it.
00:04:33.900 You know what?
00:04:34.600 Petty Irish leprechaun Sean hates...
00:04:37.200 Like, I'm like, why are you giving...
00:04:38.620 Like, I would tell them, time out.
00:04:40.540 You guys wait in line.
00:04:42.060 Stand down.
00:04:42.840 Go down to, you know, Boca Raton and wait your turn, and I'll call you up to come to Palm
00:04:48.660 Beach.
00:04:49.140 I get it.
00:04:49.960 I know Trump loves this.
00:04:51.280 He relishes these guys coming there, not just the corporate leaders, but the foreign
00:04:55.420 leaders.
00:04:56.200 I get it, and I'd rather have them on our side and get the policies instituted and make America
00:05:01.680 more prosperous and grow, et cetera.
00:05:03.900 But at the same time, I'm like, these guys bashed him for four to eight years, and now
00:05:09.580 they want back in with a $1 million check to the inauguration committee.
00:05:14.260 I know.
00:05:14.600 This seems actually kind of a cheap price to pay.
00:05:16.720 A million dollars, Jeff Bezos.
00:05:17.900 But you know what?
00:05:18.580 Here's the thing, Megyn.
00:05:19.320 He told Masa yesterday.
00:05:21.760 Hey, Masa comes bearing $100 billion in investment.
00:05:25.300 And he says, hey, how about $200 billion?
00:05:27.740 Why isn't he saying to Zuckerberg, that's a nice tip?
00:05:30.860 Now put down $10 million for the inauguration.
00:05:33.100 That's right.
00:05:33.400 That's right.
00:05:34.000 Why is the Japanese bank executive investing $100 or $200 billion in America, and all Bezos
00:05:41.240 can spare is a million?
00:05:43.000 Right.
00:05:43.940 Exactly.
00:05:45.400 All right.
00:05:45.800 So here's what I mean by kinder and gentler.
00:05:48.060 This is just one example.
00:05:49.060 He was asked if senators who oppose his cabinet picks should be primaried, which is what a
00:05:54.460 lot of the MAGA faithful are saying.
00:05:56.820 Here's how he responded.
00:05:57.860 Sot 4.
00:05:58.200 Should senators who oppose your nominees, your cabinet nominees, should they be primaried?
00:06:03.840 If they are unreasonable, I'll give you a different answer, an answer that you'll be shocked
00:06:09.740 to hear.
00:06:10.120 If they're unreasonable, if they're opposing somebody for political reasons or stupid
00:06:18.340 reasons, I would say it has nothing to do with me.
00:06:21.020 I would say they probably would be primaried.
00:06:23.560 But if they're reasonable, fair, and really disagree with something or somebody, I can see
00:06:28.940 that happening.
00:06:30.180 But I do believe that if they're unreasonable, I think we have great people.
00:06:34.440 I think we have a great group of people.
00:06:37.540 So, Mark, let me tell you why I believe him in the tone he's striking there, because I
00:06:44.300 spoke with at least one person involved in this process who was against Gaetz.
00:06:50.100 And that person told me that when Trump spoke to this senator about Gaetz and was told he's
00:06:57.040 not going to make it, Trump didn't freak out.
00:06:59.420 Trump didn't threaten.
00:07:00.540 Trump just said, oh, gee, that's too bad.
00:07:02.520 He's a good guy and accepted the judgment.
00:07:05.400 And we saw that was how he behaved when Gaetz left the stage, you know, left.
00:07:10.140 And now he's working for OAN.
00:07:12.680 So I take him at his word.
00:07:14.500 I guess if they keep doing it, you know, if they sink Hegseth and they sink RFKJ and they
00:07:19.640 give Tulsi a problem, the tone will change.
00:07:22.140 But what do you make of it?
00:07:24.080 Well, look, I think sometimes two news stories conflate.
00:07:26.860 I think it's possible that one of those drones kidnapped Donald Trump and replaced him with
00:07:30.680 a cuddly grandpa, a cuddly conciliatory grandpa.
00:07:35.620 And the aliens don't think we'll figure it out.
00:07:37.800 But we're on to him.
00:07:39.340 Look, go back to what Sean said.
00:07:41.900 You don't have to be super MAGA, just an objective journalist or observer to recognize
00:07:47.340 just the nightmare that Donald Trump entered the office with.
00:07:51.620 Because the dominant media created an environment that created for tens of millions of Americans
00:07:56.900 reality, their reality, that he was an illegitimate president.
00:08:00.900 And then to be investigated perpetually for the entire eight years, to be voted out
00:08:06.620 of office and then to say to the voters, here's what I'm about.
00:08:11.220 Same guy.
00:08:12.340 Here's my agenda.
00:08:13.720 Put me back in.
00:08:15.160 After Democrats said after January 6th, et cetera, he could never win another election.
00:08:20.220 So I think he feels a sense of satisfaction.
00:08:23.760 But he also has created an understanding that he gets it better this time.
00:08:27.980 He understands how to be president way better.
00:08:30.460 He's got an incoming second term government that's not a normal lame duck because they're
00:08:34.180 not exhausted.
00:08:35.640 And because he's had four years to write executive orders, to think about who he wants to hire
00:08:40.520 and to use his vast human intelligence, which is vast.
00:08:43.680 He's just a super genius at analyzing situations and people to come in and say, I'm going to
00:08:50.260 do this job differently this time.
00:08:51.840 And the overriding factor is he loves people kissing the ring or anything else they want
00:08:57.580 to kiss.
00:08:58.180 He loves billionaires kowtowing to him.
00:09:01.300 He loves knowing how to manipulate and leverage these heads of state.
00:09:04.380 So this is like the ultimate mulligan.
00:09:06.220 He's getting to be an incoming president again with all this knowledge and a much different
00:09:12.040 environment.
00:09:12.660 The press is weaker than it's been since he came on the national stage.
00:09:17.140 And all of these people from Congress to the governors, the foreign leaders, the CEOs,
00:09:22.040 they know that the rules are about to change and that Trump will set the rules and Trump
00:09:26.060 will decide who gets to play the game.
00:09:27.600 And so they're all genuflecting.
00:09:29.400 So we'll talk about the lawsuits that he's filing against members of the media and others
00:09:33.660 in a second.
00:09:34.440 But Dan, here's another soundbite where he sounds, you know, like the replacement Trump,
00:09:41.240 like under Mark's theory.
00:09:43.240 But there's nothing to see there.
00:09:44.300 We'll update the drone story.
00:09:45.320 There's nothing to see there, according to all of our now national security officials.
00:09:49.140 Don't believe your lyin' eyes.
00:09:50.280 In any event, here's another soundbite from him talking about how everybody loves me now.
00:09:54.560 Sot one.
00:09:55.060 I did have a dinner with Tim Cook.
00:10:00.440 I had a dinner with sort of almost all of them and the rest are coming.
00:10:04.560 One of the big differences between the first term and this term: in the first term, everybody was fighting
00:10:08.800 me.
00:10:09.820 In this term, everybody wants to be my friend.
00:10:11.840 I don't know.
00:10:12.580 My personality changed or something.
00:10:18.180 Is it is it his personality?
00:10:19.960 You tell me as a Democrat, why are all of these... you know, Sundar of Google went
00:10:27.060 in there to see him.
00:10:29.060 Sergey Brin went in there to see him.
00:10:31.220 You know, the guy who created Google, Mark Zuckerberg, Jeff Bezos, like all of them, and then
00:10:38.060 not to mention all the bankers who have gone in. They're all going in.
00:10:42.040 Yeah.
00:10:43.140 Look, a lot of them are members of the incumbent party.
00:10:46.340 And that right now is Donald Trump.
00:10:48.140 And I think as, you know, Mark and Sean have said in 2016, I think Democrats were first
00:10:54.300 stunned and they immediately turned to anger.
00:10:57.140 Just this idea that he was not legitimate, that Russia had helped him, that there was
00:11:01.060 just no way that he had won or earned the office outright.
00:11:05.600 Now, I think Democrats are just exhausted.
00:11:08.640 I mean, they threw every single thing that they could think of at him, whether, you know,
00:11:13.360 on the political playing field or the courtroom or the, you know, the media was certainly not
00:11:18.580 helpful to him.
00:11:20.100 And he won.
00:11:21.160 And I think there's now just this exhaustion, resignation and, in the business world,
00:11:27.220 you know, complete acceptance that he is in charge.
00:11:31.920 He has maximum political power.
00:11:33.860 Perhaps, you know, no one has been riding into the office with more leverage than Trump has
00:11:39.460 in a long time.
00:11:40.720 And part of that is because Joe Biden is essentially missing in action.
00:11:45.040 I mean, Mark likes to make the joke during a presidential transition, there's only one president
00:11:50.120 at a time.
00:11:50.880 And right now, that appears to be the president elect, Donald Trump, much to Democrats' frustration.
00:11:56.540 Megyn, here's the funny thing.
00:11:58.320 You can't see this, but this is a picture from December 14th, 2016.
00:12:05.200 All the people sitting around this meeting in Trump Tower, it's Tim Cook, Bezos.
00:12:09.820 It happened in 2016.
00:12:11.640 They all came.
00:12:12.580 But to Dan's point, they didn't like him.
00:12:14.300 They didn't think he was legitimate.
00:12:16.420 So, you know, to Donald Trump's point, they all kissed his ring in 2016.
00:12:21.300 Initially, they came up to Trump Tower.
00:12:23.320 I mean, there's probably 20 of these tech executives in this room and yet very different
00:12:28.540 outcome.
00:12:29.200 The American people spoke very loudly.
00:12:31.240 It wasn't just that Donald Trump won.
00:12:33.740 It's that the policies of the left failed.
00:12:36.240 The open borders, the DEI, the woke policies failed.
00:12:40.980 And these corporate leaders that bought into it all are now realizing they were wrong.
00:12:46.640 It wasn't just that Trump won.
00:12:48.500 It's that they lost.
00:12:49.460 Well, and think about, too, Sean, in '16, when they did that, you know, we have a
00:12:55.500 good friend who's a senior executive at Google.
00:12:57.520 Google gave their employees a day off after the 2016 election for a day of like mental
00:13:02.780 health mourning.
00:13:04.300 Now there's none of that.
00:13:05.800 Now it's just like even the employee base is like, well, I hope we're going down there.
00:13:09.200 Like, you know, did you see Bezos was down there?
00:13:11.500 Like, when are we going down there?
00:13:13.120 So I think now it's not just the executives, but like rank and file employees who are resigned
00:13:18.340 and accepting that Donald Trump is the next president of the United States.
00:13:21.980 Yep.
00:13:22.720 Better to go along and get along.
00:13:24.960 Yeah.
00:13:25.460 Tim Cook was another one who went in there this time around, the head of Apple.
00:13:28.880 So, Sean, you gave the thumbs up when I mentioned the lawsuits that Trump has been filing.
00:13:34.640 And this is not in his capacity as president.
00:13:37.400 This is as private citizen Donald Trump.
00:13:39.580 He's totally entitled to file whatever lawsuits he wants.
00:13:42.000 And the courts will respond accordingly.
00:13:43.840 But he did, in fact, file the one against the Des Moines Register that he threatened yesterday.
00:13:48.160 This is based on Ann Seltzer's final poll of the 2024 election cycle showing him down.
00:13:56.980 I'm trying to remember whether he was down or up three or four points.
00:14:00.560 What was it?
00:14:01.520 He was down and he wound up winning Iowa by 14.
00:14:06.620 So it was completely wrong.
00:14:08.620 And she's embarrassed and she retired.
00:14:10.780 I mean, she retired on a loss, which is just awful.
00:14:13.380 Like, that's, I'm sure, not how she wanted to go out.
00:14:15.280 But she did humiliate herself.
00:14:16.980 Now, I don't see the lawsuit unless Trump has some proof that comes out that actually shows she did do it as election interference.
00:14:25.340 Like, there was some intentionality behind this alleged fraud to mislead people in order to change the vote.
00:14:30.980 I have seen and I have heard absolutely no proof to that effect.
00:14:36.700 And I see none alleged by Trump.
00:14:38.600 It's just a supposition that that's why she did it.
00:14:41.360 But what do you make of that lawsuit and the lawsuit against the Pulitzer board, which we talked about, sorry, Nobel, which we talked about the other day.
00:14:51.640 Was it Nobel or Pulitzer?
00:14:52.620 So why am I forgetting all my facts?
00:14:54.220 Yeah, it was the Pulitzer board that gave the New York Times the Pulitzer Prize for its Russia reporting.
00:15:00.220 But they actually, we talked about this yesterday.
00:15:01.760 The reason they did it is because the Pulitzer board made its own independent statement saying
00:15:07.440 nothing came out after those reports to prove the facts therein untrue, which is potentially a defamatory statement.
00:15:14.900 Anyway, what do you make of his legal strategy right now?
00:15:17.400 Well, you're the lawyer.
00:15:18.500 I'm not.
00:15:18.920 So I'm going to defer to Megyn Kelly when it comes to legal matters.
00:15:21.840 I will say the one that I get excited about, you mentioned, is the ABC one, because that's completely false.
00:15:28.360 And Stephanopoulos knew it.
00:15:29.940 And I think that when reporters get called out for being wrong, that's a good thing.
00:15:34.320 They need to be held accountable just like anybody else.
00:15:37.680 And so I was excited about that.
00:15:39.300 I think the bedwetters like Chuck Todd and Jim Acosta, who are talking about this being a threat to the media, are morons.
00:15:46.280 The bottom line is, why should they get away with defaming people with inaccurate information or information they know to be wrong?
00:15:53.880 Look, the Sullivan standard that the Supreme Court has set for public figures requires that you prove intent and malice.
00:16:00.460 It's a very high bar.
00:16:02.340 And so there's a big difference for someone even like me who's had this kind of issue come up.
00:16:07.120 The lawyers will tell you, God, this is how much it's going to cost you.
00:16:09.880 This is the burden that we have to meet.
00:16:12.020 This is what's going to happen during discovery in terms of your emails, your texts coming out.
00:16:16.640 So there's a reason not to do it.
00:16:18.660 Now, in the Iowa case, my understanding is that what Trump's lawyers are going against isn't the Sullivan standard saying you defamed me, but in fact, a consumer law that Iowa has about misrepresenting people.
00:16:32.840 Again, I'll defer to you and the lawyers about the nuances of that, but they're using a very interesting tactic saying that they violated Iowa's consumer statute, which prevents misinformation about a product.
00:16:44.660 Now, again, it's a very narrow reading from my layperson standpoint.
00:16:49.100 But, you know, look, I think what it does when that poll came out, right?
00:16:54.500 First of all, the Iowa poll has had a storied history of problems.
00:16:58.880 The bottom line is that that didn't pass the smell test.
00:17:02.260 Right.
00:17:02.560 And nobody bought that, both in terms of what other public polls at the time said and in terms of what the data was suggesting where Iowa's electorate was.
00:17:12.800 So it didn't make sense.
00:17:15.340 And so the question is, why did they go through with it?
00:17:18.920 What do the crosstabs show?
00:17:20.360 How did they sample the electorate?
00:17:22.320 Did they know in advance?
00:17:23.760 And this is where the discovery phase comes in.
00:17:26.280 Are there emails that show that they knew that there might be some problems with how they created the sample that that was based off of?
00:17:34.320 I don't know.
00:17:34.960 But my guess is that's why you go through this to the discovery phase.
00:17:37.940 So you can say, gosh, this doesn't add up.
00:17:40.460 And them saying, well, who cares?
00:17:41.780 Go ahead with it anyway.
00:17:43.740 I think that one's going to get thrown out on the papers.
00:17:46.960 I think they'll move to dismiss it and it will be dismissed without the exchange of discovery.
00:17:50.880 Go ahead, Mark.
00:17:52.240 First of all, I agree with Sean about the ABC case.
00:17:54.460 And I think there are now probably five votes on the court to change the Sullivan standard.
00:17:59.720 And Donald Trump may bring a case that gets the court that does that.
00:18:04.440 But I think it's overly litigious to do what he did yesterday.
00:18:08.020 Ann Seltzer is my friend.
00:18:09.420 She used to be my polling partner.
00:18:10.980 And she's been one of the most accurate pollsters in America.
00:18:14.120 It's true that she's stepping back from doing political polling.
00:18:17.180 But she announced that, Megyn, before this poll came out.
00:18:21.420 And she's not retiring.
00:18:23.620 And I don't think it's right to say she's been humiliated.
00:18:26.560 I think people have tried to humiliate her.
00:18:28.060 But every pollster I've ever worked with, every pollster whose work is respected, sometimes polls are wrong.
00:18:33.820 In fact, statistically, one in 20 are wrong. To suggest election interference...
00:18:38.720 She has been humiliated.
00:18:40.380 She was 17 points off.
00:18:43.020 Everybody ran around saying she's a gold standard.
00:18:44.960 She's a gold standard.
00:18:46.060 She actually had the potential to change the trajectory of the race.
00:18:49.260 She showed Kamala Harris winning by three in a deep red state.
00:18:53.260 Trump won the state by 14.
00:18:54.720 She was 17 points off and completely blew it.
00:18:58.360 She blew it.
00:18:59.120 So she is humiliated.
00:19:00.520 I look at her and I see someone who is humiliated.
00:19:03.980 I just think that to judge one person by one poll, there's no evidence that a poll like that, quote unquote, influences the election.
00:19:12.440 But I think what's important is that President Trump be judicious in choosing who to go after.
00:19:19.880 This case, I don't disagree with you.
00:19:22.100 I think it'll be thrown out on the papers.
00:19:23.600 And I think it cheapens the victory he has over ABC, who said, oh, we don't know exactly why.
00:19:30.420 But he should focus on the cases where he not only feels personally aggrieved, but where there's a chance of winning and where there's an important principle at stake.
00:19:41.100 And that's, I think, the best use of his time and his lawyer's time and his money.
00:19:45.660 And so there's a question about whether he's intimidating pollsters here.
00:19:49.060 You know, Trump is obsessed with polls and he dismisses the ones that he doesn't like.
00:19:53.620 And he kind of does the same thing with media; he's kind of obsessed with the media and attacks the outlets that he doesn't like.
00:19:59.760 But the thing with ABC was real and it was a legit problem and a legit objection.
00:20:03.920 And I don't agree with all those people who say that he was going to lose that case.
00:20:07.800 I don't.
00:20:08.420 I think if that had gone forward, there was a very, very good chance that Trump would have won that case, possibly even just on the papers without a jury trial.
00:20:16.120 It was clear what they said.
00:20:17.720 It was very clear what George Stephanopoulos said.
00:20:19.660 And it was very clear what the jury found and didn't find.
00:20:22.220 And I think you had a good chance of winning at some judgment.
00:20:25.580 This one's different.
00:20:26.540 Like I said, you're the lawyer on this.
00:20:28.120 If they were to get to discovery in the Des Moines Register/Selzer case and they found someone there emailing and saying, gosh, this doesn't really comport with what recent information suggests, you know, or here's a sample that we don't think.
00:20:44.260 If you saw that exchange, I guess my question to you as a lawyer would be, isn't that the point, if they can show that they knew there were flaws and they went ahead with it?
00:20:54.660 Now, that's a big if.
00:20:56.220 But if you could show that there were flaws and they knew about them, wouldn't that give you merit to go forward?
00:21:02.700 Maybe. But even that's a real stretch.
00:21:06.260 And I would think that before the judge would engage in allowing that kind of discovery, you'd have to have a good faith basis to make the allegation.
00:21:13.980 There has to be more than just like, I suspect I think she tanked it intentionally.
00:21:18.880 And I just don't see that.
00:21:20.620 Like, what specifically do they know?
00:21:22.900 I think they know what I just said.
00:21:24.340 And she projected Kamala was up by three and Trump won the state by 14 and she was very off and it rattled team Trump and he's irritated by it.
00:21:35.740 So that to me seems to be all of the evidence they have against Ann Selzer.
00:21:40.820 Now, if they've got something else and they can attach something to their, you know, motion to fight the dismissal, which you're not supposed to do, you're supposed to judge it based on the four corners of the document, the complaint.
00:21:50.900 I don't know.
00:21:51.720 But I think that one's going to go away.
00:21:53.180 And I know he's litigious and he's talking about how he wishes the DOJ would bring these cases.
00:22:01.180 I don't like that either.
00:22:02.480 I don't think the DOJ should be Trump's personal attorney.
00:22:05.620 The DOJ should be the United States's personal attorney.
00:22:09.000 They represent us and they're not there to settle Trump's beefs.
00:22:13.180 That's what the last guy did.
00:22:15.720 That's what the outgoing president used the DOJ for.
00:22:18.840 So we've had enough of that.
00:22:21.180 Right.
00:22:21.420 That's my own view on it.
00:22:22.760 OK, let's talk about that Trump soundbite where he was talking about how he'd view the primary campaigns against people who stand in the way of his nominees.
00:22:32.500 Two of them back in the news today, RFKJ on Capitol Hill trying to make nice with the senators who will have the say over whether he makes it as HHS secretary.
00:22:43.140 Pete Hegseth still out there doing the same unclear on both of them as of today what their fate will be.
00:22:51.280 But there was a wave of attacks against RFKJ over the past few days, started last Thursday or Friday, saying his counsel, his lawyer,
00:23:03.680 filed a lawsuit trying to get rid of the polio vaccine, and that these two are close, and all these media said RFKJ wants to get rid of the polio vaccine, which unlike the covid quote vaccine is a real vaccine.
00:23:16.600 Like you take it and you don't get polio and it was just absolutely a smear campaign.
00:23:22.500 We actually looked into it and made contact with the lawyer.
00:23:25.100 However, the lawyer challenged one, one of the many polio vaccines, one strain of it, because it had potentially cancerous cells in it; he did not say, let's get rid of all the others.
00:23:40.740 And it hadn't been tested against a control group and had not gone through the rigorous testing that vaccines should go through.
00:23:48.080 So he said that one is problematic. But you would never know that if you looked at what the media did.
00:23:55.280 And I'll give you a couple of examples of the headlines.
00:23:57.920 New York Times, Kennedy's lawyer has asked the FDA to revoke approval of the polio vaccine.
00:24:03.780 New York Times, McConnell defends polio vaccine, an apparent warning to Kennedy.
00:24:07.920 Now it's Kennedy's, now Kennedy wants to get rid of the polio vaccine.
00:24:11.840 WAPO, RFK Jr. ally filed a petition to revoke FDA approval for polio vaccine.
00:24:17.540 The New Republic, RFK Jr.'s lawyer exposed trying to abolish polio vaccine.
00:24:24.160 NBC backlash grows over RFKJ's lawyer asking FDA to revoke approval of polio vaccine.
00:24:31.600 This is just wrong, Mark Halperin.
00:24:33.920 And, you know, it doesn't take that much effort to do what we did, which is, did he really do that?
00:24:39.920 It took us about five minutes to realize, no, he didn't.
00:24:43.740 Well, in addition, as you also pointed out, it's not Bobby Kennedy.
00:24:47.180 It's his lawyer, one of his many lawyers.
00:24:51.060 I find that it's so interesting in the media now, what I call the dominant media.
00:24:55.580 Some of the coverage is reminiscent of the way Donald Trump's been covered for seven years, you know, tendentiously hostile.
00:25:00.700 Some of it's actually as favorable as anything he's ever gotten.
00:25:04.240 I think the fate of the nominees, including Bobby Kennedy, including Pete Hegseth and Tulsi Gabbard, will be on how well they do when January hearings come.
00:25:14.020 It'll also be on whether there's any new revelations about them.
00:25:17.740 But I think for Team Trump, these kinds of stories are actually beneficial because in the end they are debunked.
00:25:24.700 And once again, even though the press is being nice some of the time to Trump, nicer, they're able to use that to say, look how unfair this is.
00:25:32.000 Nothing rallies MAGA and many of the Republican senators more around the Trump nominees than attacks from the media that they consider to be unfair.
00:25:40.720 So I would say that round of stories probably helped Bobby Kennedy because now the focus isn't is Bobby Kennedy right on abortion or is he right on this or that.
00:25:48.900 It's he's under siege from the media.
00:25:50.940 We got to support him.
00:25:52.400 And one thing you know about Bobby Kennedy, I mean, having interviewed him many times is he's extremely smart and he's a litigator.
00:25:58.640 He spent a lifetime as a lawyer pursuing these causes.
00:26:01.180 He will be so ready on this and any other empty attacks.
00:26:05.180 I mean, he will slice and dice with the best of them.
00:26:07.260 He's been under attack for all of his adult life.
00:26:09.400 So, Dan, last time you guys were on, I believe it was you who said you think that he may get some Dem support because he is a Democrat and he did have so much support in his own presidential run.
00:26:24.260 Do you still think that and do you think he's going to have trouble getting through?
00:26:28.740 I still believe it very much.
00:26:30.860 I mean, Megyn, one of the things we heard all fall that gave me confidence in saying I thought Trump was going to win is the number of people who would come on our show and say, I'm a Democrat.
00:26:42.580 I've left the party because of how it treated RFK.
00:26:45.620 And I am with Trump because RFK is with Trump.
00:26:48.600 And if we're going to win national elections again and be, you know, get the Senate back, we have got to find a way to win both RFK and his voters and bring them back into the party.
00:27:00.020 And I think there will be some Democrats that will vote for him.
00:27:03.700 I think that his biggest threat really is from the right.
00:27:07.080 I think the fact that, as you said, he is a Democrat, his views on choice, his views on the role of government in health care are closer to our party, pretty aligned with our party and less with the Republican Party.
00:27:21.860 And so I think he's going to have to answer those questions and give comfort to some people on the right.
00:27:27.640 Certainly not all Democrats will vote for RFK, but I do believe there will be more than one or two that will vote for him and they should.
00:27:34.900 Do you know how mind blowing this is, by the way?
00:27:37.880 Just just stop and think about this.
00:27:39.920 Donald Trump, a Republican, has appointed a Kennedy, who was primarying Biden for the Democratic nomination just, what, 12 months ago, to a cabinet post where he will get, by and large, Republican votes.
00:27:56.800 This is mind blowing.
00:27:58.060 If you think about where we are in terms of which party is more inclusive.
00:28:02.400 Bobby Kennedy, a part of the Kennedy family, who was literally fighting for the Democratic nomination, is going to be in the cabinet of Donald Trump.
00:28:15.060 Secondly, I think the part.
00:28:17.040 And Tulsi, by the way.
00:28:17.660 Tulsi was fighting for the Dem nomination just a few years ago.
00:28:21.180 That's right.
00:28:21.600 Exactly.
00:28:22.060 She was primarying Biden for it.
00:28:25.060 But you also go back to how you started this conversation, between the media story of ABC settling with Donald Trump for $15 million and the lies that they're telling about Bobby Kennedy and vaccines.
00:28:37.200 The bottom line is this, the media that supposedly dies in darkness and hates the spread of misinformation is just showing you why it's a dying industry.
00:28:47.800 They continue to spread misinformation and lies.
00:28:50.500 This is why we showed, actually, on the Morning Meeting the other day, exclusive polling from Signal.
00:28:56.160 And when you look at where people are getting their news, it's very, very interesting.
00:29:02.540 I posted this on Instagram.
00:29:03.880 Like, if you are getting your news from newspapers or from national media, from like ABC, CBS, NBC, you voted for Harris.
00:29:13.260 If you're on YouTube or streaming, you're voting for Trump.
00:29:17.960 We're the party of the future.
00:29:20.080 We're more inclusive.
00:29:21.700 They are part of a dying, dying industry and legacy.
00:29:25.060 Megyn, can I say one other thing about-
00:29:27.480 And there's some reporting out today that suggests that.
00:29:28.960 Go ahead, Mark.
00:29:29.980 One other thing about Bobby Kennedy: we've heard for months, including today on the Morning Meeting, from parents, lots of moms, who really believe in Make America Healthy Again.
00:29:40.120 And they're so passionate about it.
00:29:43.480 And they understand not everything you hear about Bobby Kennedy is true.
00:29:47.360 And there may be things they don't like about him.
00:29:48.840 But of all the people Donald Trump is trying to bring into the government, even more than Elon Musk and Vivek Ramaswamy from the outside, Bobby Kennedy has the potential to revolutionize America with that agenda for drugs and food and the health of our children, wellness.
00:30:04.400 It's a huge-
00:30:05.440 All these things are huge problems in America.
00:30:07.300 They unite Sanders supporters and Trump supporters, suburban parents.
00:30:11.580 I mean, urban parents.
00:30:13.020 These are massive issues.
00:30:14.700 And just already, just from talking about them during the transition and the campaign, Bobby Kennedy, I would argue, has done more to elevate these issues than anyone ever has, including Michelle Obama, who talked about some of them, but not in the fundamental way of going after corporate interests.
00:30:29.400 And so I'll be curious to see if he pursues it, but that's what I think could win him some Democratic votes, because they're such fundamental issues for their constituents.
00:30:37.800 Think about how Republicans have fared with women, right?
00:30:41.700 This is an issue that can transcend party, bring more women to the Republican Party, because they're concerned about what their children are eating, what they're eating.
00:30:51.240 They are, in many cases, the people who are the providers for a family.
00:30:56.560 And so women are at the forefront of this issue.
00:30:58.880 And I think that what RFK and what Donald Trump are doing, exposing the NIH, the CDC, the FDA, and what we had thought was eating healthy, is going to be monumental, both in terms of longevity and our wellness as a country, but I think also politically.
00:31:15.940 I just think the time is ripe, right?
00:31:18.120 We had the opioid crisis, where we realized that our federal government officials are not protecting us.
00:31:25.200 In fact, they're in bed with big pharma to pad their own pockets as individuals and otherwise, and they don't give a damn about the rest of us.
00:31:32.460 And then we had COVID, which reinforced all of that.
00:31:35.560 And then we had just the explosion of, you know, the MAHA beginnings, whether it was Casey Means and her brother Calley Means.
00:31:42.660 But that interview they gave on Tucker went everywhere.
00:31:45.000 They came here, too.
00:31:45.940 It was big.
00:31:46.320 And then within days, they were next to RFKJ endorsing Trump.
00:31:50.880 And it was just, boom, we were off to the races.
00:31:53.560 His choice of Nicole Shanahan, who's big into these issues, as his running mate.
00:31:57.980 Like, I don't know if it was all intentional.
00:31:59.760 He came on here many times and said this was one of his big issues.
00:32:02.260 But he's also very big on some other issues, like the military-industrial complex.
00:32:07.340 But this was the one that hit, and he was smart enough to exploit it and to sell it to Trump as something that could actually help.
00:32:14.200 Trump embraced it, really ran on it, and now it cannot be one of those things that he discards.
00:32:18.660 And indeed, Dan, yesterday, Trump was not discarding it.
00:32:20.820 He was saying he thinks that Bobby Kennedy will be great on these issues and was saying on pesticides, for an example.
00:32:26.920 He claimed that Europe doesn't use any.
00:32:30.320 That's not true, but they use far, far fewer than we use, far fewer.
00:32:34.600 And he was asked about the link between autism and vaccines, and Trump said, I don't know about that, but we want to study it.
00:32:41.880 And what we really want to look at is things like vaccines, toxins.
00:32:44.960 That's been RFKJ's big thing his entire life.
00:32:50.020 Too many toxins in the environment.
00:32:53.480 And it's making us, and yes, our children, sick.
00:32:59.060 Yeah, Megyn, you hit something, too, when you said this goes back to kind of COVID.
00:33:03.520 One of the problems for the Democratic Party is we have kind of tried to shut down debate on a lot of stuff when people have questioned things.
00:33:10.860 And we've said, oh, the experts, the scientists, like anyone who says, you know, that you shouldn't wear a mask.
00:33:17.320 Anyone that says maybe the kids should be back in school.
00:33:19.820 Anyone who says, you know, maybe six feet isn't the right number.
00:33:23.880 We tried to shut it down.
00:33:25.420 Shame them.
00:33:26.120 You're an idiot.
00:33:27.060 You know, how could you not follow science?
00:33:29.120 One of the things that RFK has done is raised questions that parents have themselves.
00:33:34.580 I mean, you said it.
00:33:35.320 Pesticides, food, the obesity with children.
00:33:37.940 The fact that we are defending the status quo in an era when people are so against, they're so upset, they're frustrated, they feel like their voices aren't heard.
00:33:48.180 It frustrates me as a Democrat that Trump and the Republican Party have owned now and taken over and they are seen as a party that is asking questions, is probing, is willing to change.
00:34:03.420 The fact is, I'd rather be them than us right now.
00:34:06.160 And we have to become more tolerant and accepting of people with different views and ask questions about why and respect what people are thinking.
00:34:14.980 You know, you listen to RFKJ and, you know, I've interviewed him at length many times.
00:34:20.300 And what he's saying is not like, let's get rid of all the vaccines.
00:34:23.540 But even on the vaccines, he was saying there's mercury in these vaccines and it doesn't need to be in there.
00:34:28.740 And while they said, you're wrong, you're wrong, you're wrong.
00:34:31.260 You know what they did?
00:34:31.880 They removed the mercury.
00:34:33.900 And RFKJ said, well, what about the aluminum?
00:34:36.760 Because that's not much better.
00:34:37.900 And they said, oh, you're wrong, you're wrong, you're wrong.
00:34:39.260 But it does turn out that aluminum is a neurotoxin.
00:34:41.900 And then he says, OK, what about chlorine in the water?
00:34:45.840 Well, that's crazy.
00:34:46.680 We need to work on the teeth and so on.
00:34:48.380 Well, you know what?
00:34:49.360 That's also potentially a neurotoxin.
00:34:51.480 Then they say, sorry, fluoride, not chlorine.
00:34:54.820 And then we talk about like the toxins that are in our products that we put all over our bodies.
00:34:59.520 Oh, well, don't worry.
00:35:00.500 His point is we're swimming in a toxic stew.
00:35:03.620 And it's one thing when you're a grown human, it's bad enough for us.
00:35:06.460 But you take these little kids and you load them up with these vaccines, which they often
00:35:10.740 don't need, like the H, whatever, what's it, the Hep B vaccine.
00:35:16.320 They're not having sex.
00:35:17.500 They're little babies.
00:35:18.480 They don't need this unless they've been born to a mother with the disease.
00:35:22.440 So we're overloading them with these vaccines.
00:35:24.480 The vaccines themselves have in the past and may currently have materials inside of them
00:35:29.440 as preservatives or otherwise that they don't need that can potentially be toxic to the children.
00:35:33.800 Then we feed them food that has been covered in pesticides and chemicals that we
00:35:39.140 use to make it cheaper or to keep the bugs off.
00:35:42.800 So it's easier for the farmers, and the crops aren't grown in the right soil and so on.
00:35:47.160 So they're not getting the nutrients in there.
00:35:48.960 And we overload the kids with that.
00:35:50.780 And then we serve a bunch of processed food, which is like not food at all.
00:35:54.260 It's just a bunch of chemicals, packaged.
00:35:55.940 And it's like one thing after the other.
00:35:59.700 Right.
00:36:00.100 And these poor kids, by the way, then we put them in these, you know, fire-resistant
00:36:03.900 pajamas that have chemicals all over them.
00:36:08.240 And we sit them on the sofa that has treatment all over it so that it's stain resistant,
00:36:12.500 which is chemicals all over them.
00:36:14.120 And then they're breathing in the microplastics, they're drinking from the plastic bottles,
00:36:17.520 which have microplastics in them.
00:36:19.120 And like, that's what RFKJ said to me.
00:36:22.360 He's like, we didn't used to have tics in children all the time.
00:36:26.900 You know how they're ticcing now.
00:36:28.040 A lot of these kids are; it looks like Tourette's.
00:36:30.040 We didn't used to have the explosion of autism as we've seen it now.
00:36:33.520 We didn't used to have the explosion of ADHD.
00:36:36.260 And people will make fun of him about the fluoride and the vaccines and all of it.
00:36:42.600 But I really believe that not just moms, but parents are listening because we've seen it.
00:36:49.120 We've seen it in our kids or in our kids' friends or in our nephews and nieces.
00:36:53.900 He's been living it.
00:36:55.080 So he identified it early.
00:36:57.300 Trump was smart to listen to him.
00:36:58.560 And I do think, guys, these Democrats, or Republicans, who shoot him down,
00:37:01.640 trying to paint him as a crazy, do so at their own peril.
00:37:06.360 The Democrats, it's kind of incredible when you look at the traction that's gotten and how obvious it is.
00:37:11.180 Politics is about emotion, how emotional an issue this is.
00:37:15.320 It's incredible the Democrats didn't take the lead on this.
00:37:18.000 Incredible.
00:37:20.120 Yeah, it is kind of a role reversal.
00:37:21.320 The Democrats, far from taking the lead, guys, they're taking the lead the other way.
00:37:24.200 Take a listen to Elizabeth Warren on RFKJ.
00:37:29.560 Say goodbye to your smile and say hello to polio.
00:37:33.400 You know, I would laugh if it weren't so scary.
00:37:36.020 Donald Trump just picked RFK Jr. to lead the Department of Health and Human Services.
00:37:42.580 This is a man who wants to stop kids from getting their polio and measles shots.
00:37:47.700 He's actually welcoming a return to polio, a disease we nearly eradicated.
00:37:53.740 He loves polio.
00:37:54.740 It doesn't stop there.
00:37:55.780 RFK Jr. also doesn't believe fluoride should be in your water.
00:37:59.540 And that's what keeps your teeth from rotting.
00:38:01.840 You can't make this stuff up.
00:38:05.660 That's about as accurate as being an Indian.
00:38:08.000 The man wants rotting teeth and polio.
00:38:09.920 Who can blame him?
00:38:11.320 It's his favorite.
00:38:12.300 He loves polio.
00:38:13.640 Okay.
00:38:14.180 That was four days ago, by the way.
00:38:15.760 That wasn't months ago or years ago.
00:38:18.040 Like, that was four days ago.
00:38:19.120 That's going to be the messaging.
00:38:20.460 He's a kook.
00:38:21.020 Okay.
00:38:21.220 But that's AI, right?
00:38:24.440 That's not real Elizabeth Warren.
00:38:26.620 That's her.
00:38:27.840 That's Native American Elizabeth Warren.
00:38:29.900 Trump calls her.
00:38:31.160 At Fox, at my show, he called her Chief Lies-a-Lot.
00:38:33.460 I don't, you know, take your pick.
00:38:34.920 We'll see how she does with that messaging, guys.
00:38:38.580 Okay.
00:38:38.740 Let's talk about what's being done to Hegseth.
00:38:42.420 This is kind of interesting, that the attack on RFKJ over polio is based on his lawyer challenging one strain.
00:38:50.980 One.
00:38:53.280 Hegseth is now in the New York Times under attack in an article dated yesterday for his bodyguard that has been walking around with him at the meetings on Capitol Hill.
00:39:06.040 Okay.
00:39:06.240 Listen to this.
00:39:06.800 The headline is, Hegseth's guard left the army after the beating of a civilian during training.
00:39:15.660 John Hassanbein, who has escorted Donald J. Trump's pick for defense secretary to meetings on Capitol Hill, said he was unjustly prosecuted for this 2019 episode. That's by Dave Phillips and Sharon Lafreniere.
00:39:30.360 And this whole article, guys, is about the guy who's been guarding Pete so that he doesn't get attacked on Capitol Hill by some nut.
00:39:40.420 How a couple years ago, he was doing a drill to learn how to, like, take down terrorists.
00:39:48.420 And they were doing this drill because he was a former Army Special Forces.
00:39:52.740 He was a master sergeant at the time.
00:39:53.960 And when they did this training event, they had civilian role players come in and play ISIS, play bad guys.
00:40:02.160 And that this guy, Hassanbein, allegedly kicked, punched, and hurt this civilian role player, even leaving him hogtied, the Times writes, in a pool of his own blood.
00:40:16.900 And it led to an investigation and ultimately a charge by the Army of aggravated assault and reckless endangerment.
00:40:26.480 A military jury found him guilty of the assault charge in a court martial in 2020.
00:40:46.020 But the judge overseeing the case wound up throwing it out because it turned out that there had been improper conduct by a juror speaking with, as it turns out, a friend of Mr. Hassanbein about the trial, which will get your verdict thrown out every time.
00:40:46.020 And the Times is trying to say this somehow reflects on Pete Hegseth's fitness for Pentagon chief because he then hired this guy who ultimately was not convicted because they threw it out in the Army, chose not to retry him, and was honorably retired from the Army after 22 years of service.
00:41:09.760 They basically said, if you retire, we'll let it be honorable and we won't pursue another charge against you.
00:41:14.780 And he said, fine, I'm out of here.
00:41:16.020 They want us to not, I guess, vote for Pete, Sean, because he hired that guy to run security for him.
00:41:25.860 And it's just further evidence of his lapse in judgment and flouting of military rules.
00:41:34.100 Yeah, I've been waiting to break this news on The Megyn Kelly Show, so I'm going to go ahead and do this right now.
00:41:38.540 There's a story coming out tomorrow that the girl that Pete dated in sixth grade has a brother who mowed the lawn of a guy down the street whose cousin knew a guy that once was related to somebody that Pete saw at a reunion when he was there for five minutes.
00:41:56.580 And that person said that Pete stiffed him on the tip.
00:42:00.260 So I think this is going to get really bad before it gets better.
00:42:05.680 I mean, talk about the gymnastics that the Times had to do.
00:42:09.840 If we're at this point now, Dan, of the dirt digging phase on Pete, I feel like he should be feeling pretty good.
00:42:16.260 Yeah, look, if the standard was that every politician's friends had to be completely clean, there'd be nobody in federal or state government.
00:42:25.160 I mean, it really is a stretch.
00:42:27.100 I think, you know, in my opinion, there are enough questions on Hegseth to stick to the nominee.
00:42:32.180 I think really the question is less with Democrats and with fellow Republicans.
00:42:37.940 You know, I think he's struggling with a handful of them.
00:42:42.060 But I agree with you trying to bring in a security official, a Sherpa, you know, college buddy.
00:42:49.120 To me, that's just a total stretch.
00:42:51.640 And one of the things, you know, voters roll their eyes about.
00:42:53.860 So, moreover, can I tell you guys something as somebody who has been threatened and had some bad actors, you know, in my life want to hurt me?
00:43:02.840 This is exactly the kind of guy I want to hire.
00:43:04.820 Yes, the nastier, the better.
00:43:07.480 Get the guy who hogtied the civilian.
00:43:09.740 That is who I want walking next to me.
00:43:11.720 Lest anybody try to mess with me.
00:43:14.860 The Hegseth nomination is by no means secure.
00:43:18.100 The Washington Post, I'm trying to keep track of where I read it, but I think it was WAPO today or yesterday saying,
00:43:23.980 this is going to be Kavanaugh 2.0 and predicting it's actually going to be worse than Kavanaugh.
00:43:29.940 A Republican was predicting this will be worse than Kavanaugh.
00:43:33.660 The Democrats may indeed wind up calling this so-called victim, this alleged rape accuser,
00:43:43.120 who had an interlude with him in 2017 that she claims was non-consensual.
00:43:47.840 He claims it was consensual.
00:43:49.040 They signed an NDA and he was paying her off for her silence because he didn't want that coming out while he was at Fox.
00:43:57.000 Now it's been declared void by Team Pete because she or her friends dropped the full story on Mar-a-Lago during the transition.
00:44:06.600 And that was a breach of the agreement, which they believe released Pete from his obligations and the monies ended.
00:44:14.120 And now she is free.
00:44:15.780 She would have been free anyway.
00:44:16.840 You can't deny a congressional subpoena based on an NDA.
00:44:20.160 Anyway, now the question is, do the Democrats call her and does she go in?
00:44:27.280 And do the Democrats want to make this Kavanaugh 2.0, Mark, or do they realize, like, this could backfire and this too could spin so out of control?
00:44:36.820 Kavanaugh was an event for Republicans, a before-and-after moment where the party congealed in a way it hadn't before in Trump's presidency.
00:44:49.300 So like that's a risky proposition for the Dems too.
00:44:53.560 Right.
00:44:54.080 So I think you got to say, as you follow the narrative of the Hegseth nomination, that he and his team, including J.D. Vance, have done a textbook job, should be studied in political science courses, putting the nomination back on track because it was basically going into a ditch and not coming out.
00:45:11.200 He's now done what they needed to do, which is to get all the Republican senators to say, we're not coming out against him until January, if at all.
00:45:18.960 We want to see a hearing.
00:45:19.940 We want to give him a chance to defend himself.
00:45:21.840 All the serious charges against him, almost all of them, I should say, are anonymous.
00:45:27.900 And I think senators recoil at that.
00:45:29.560 And you've heard some of the senators, including Joni Ernst of Iowa, make reference to that.
00:45:34.460 If the allegations become not anonymous, but someone testifies, and if questions about whether he's ready to run the building and sit in the Situation Room are put in sharp relief, I think there are some Republicans who might vote against him.
00:45:48.880 The Kavanaugh nomination, and the questions about him, were driven by Democrats and what I call the Iron Triangle of Democrats: senators, the congressional staff, and the outside groups, along with their media allies.
00:46:03.200 That isn't the case here.
00:46:05.880 These doubts are amongst Republicans.
00:46:07.960 Democrats have their doubts, too, but they're meaningless, because they're not going to vote for him.
00:46:11.420 I think if there are no new revelations and he performs well, whether this person testifies or not, I don't think it'll stop him from being confirmed.
00:46:20.100 But he has to perform well, and there have to be no new revelations.
00:46:23.580 And I will tell you that even some of his allies are braced for more revelations.
00:46:28.340 I don't know, Dan, what this woman would sound like if she decided to testify.
00:46:34.520 I think in interviewing her, somebody would probably be able to get a feel for whether she'd make a good witness or not.
00:46:41.600 But there are so many holes in her story.
00:46:43.760 I mean, I think this would shape up to be more like Anita Hill than Christine Blasey Ford, where there's a lot to cross-examine her with.
00:46:58.160 You know, like, it would be ugly for her.
00:47:01.940 I really think it would be ugly for her.
00:47:03.860 That's not a threat.
00:47:04.680 That's my assessment, because I've read the police report.
00:47:07.460 And his lawyer is already threatening her.
00:47:10.080 Like, you can go, of course, you can abide by a subpoena, but it doesn't relieve you of your obligations not to defame Pete Hegseth.
00:47:19.140 And if you get out there and accuse him of rape, here I am, and you'll be sued.
00:47:25.780 And I don't think this woman has a lot of money from what I hear, and I think she just lost her Pete Hegseth money.
00:47:31.760 So, you know, we don't know whether we have a willing accuser.
00:47:35.540 But do you think the Democrats will see the risk in calling such a person, given that police report?
00:47:41.340 Yeah, I mean, look, anytime somebody steps forward in a public nature to make an allegation as serious as she may potentially make, you have to brace yourself.
00:47:52.240 I mean, you are entering the deep end of the pool.
00:47:55.480 People will rebut it.
00:47:56.980 People will question your character, your motives, everything.
00:48:00.180 And as you say, in the Brett Kavanaugh instance, the testimony against him was pretty riveting.
00:48:07.540 I mean, she stepped forward and made, you know, some very serious charges.
00:48:13.460 And that nomination obviously was kind of teetering for a moment there.
00:48:17.240 Look, I think if you're the Democrats, there are so many allegations against Hegseth, whether it's in this instance, the fiscal mismanagement.
00:48:26.760 You know, is he the right fit to lead the Pentagon during a time of two wars?
00:48:32.140 It's one of the biggest bureaucracies.
00:48:33.780 The procurement process is a mess, let alone the questions about DEI and other things that Trump would like him to focus on.
00:48:40.140 I think you have an obligation to raise all of these issues in a respectful way.
00:48:46.240 And I think, you know, as Mark said, there are Republican senators for whom these are, you know, individual allegations are also concerning.
00:48:53.400 And so I think if you're the Democrats, you have to have a witness that's willing to cooperate and be comfortable putting herself or, you know, in some instances himself out there, because you are going to get roughed up.
00:49:06.440 That's not a Trump thing. That's just a political thing. If you step forward.
00:49:11.820 That's true, even if what you're saying is true.
00:49:15.880 And I believe in this case, what she's saying is a false accusation.
00:49:19.500 So she's going to be especially hesitant.
00:49:22.180 I mean, I really believe it was the husband who pushed her into making this allegation because he could not come to terms with the fact that his wife had gone down the hall and slept with Pete Hegseth while he was in the hotel room that she was supposed to be in with their kids.
00:49:34.660 I want to correct myself. It was the Hill that had that article about Kavanaugh 2.0 dated today by Alexander Bolton and saying that it was John Cornyn who said he told Hegseth it's going to be a miserable experience, sort of like Brett Kavanaugh's confirmation hearing.
00:49:50.940 And then it was Tom Tillis, Republican of North Carolina, who said he fears this battle could be worse than the brawl that erupted over Kavanaugh.
00:50:00.780 Everything is going to be elevated. Quote, I think it's going to be Kavanaugh on steroids.
00:50:07.140 Oh, joy. All right.
00:50:08.640 So I think you may see a different witness or witnesses rather than that particular woman.
00:50:14.820 Oh, God. It sounds like Mark knows something. There's a tease. Fifteen seconds to break. We'll pick it up with Mark. He always does this.
00:50:21.960 We'll take a quick commercial break. We'll be back with the guys right after this. Don't go away.
00:50:26.660 Don't wait. Shop Cozy Earth right now before their most loved gifts sell out this holiday season.
00:50:33.180 What's your favorite Christmas memory? Maybe curling up by the tree, the glow of the lights filling the room, spending time with the family and feeling completely at peace?
00:50:41.780 Cozy Earth helps you recreate that magic by transforming your home into a sanctuary, a haven of calm amidst life's chaos.
00:50:51.440 Their bamboo sheet set is the ultimate in luxury. Designed to be incredibly breathable, it keeps you several degrees cooler for a night of uninterrupted rest.
00:50:59.660 With a durable weave guaranteed to last 10 years, it's a thoughtful gift everyone can enjoy and use every day.
00:51:05.660 And for those slow, cozy holiday mornings, Cozy Earth's long-sleeved bamboo pajama sets are a must-have.
00:51:12.580 Luxuriously soft and stylish, they're perfect for lounging in total comfort while making lasting memories with loved ones.
00:51:18.620 Want your Cozy Earth gifts by Christmas? Well, expedited shipping is available through December 20.
00:51:25.460 Wrap the ones you love in luxury with Cozy Earth.
00:51:27.540 Go to CozyEarth.com slash Megan. Use my exclusive code, M-E-G-Y-N, for up to 40% off your order.
00:51:35.580 CozyEarth.com slash M-E-G-Y-N.
00:51:38.460 Your business doesn't move in a straight line.
00:51:41.440 Make sure your team is taken care of through every twist and turn with Canada Life savings, retirement, and benefits plans.
00:51:48.180 Whether you want to grow your team, support your employees at every stage, or build a workplace people want to be a part of,
00:51:54.340 Canada Life has flexible plans for companies of all sizes, so it's easy to find a solution that works for you.
00:52:01.460 Visit CanadaLife.com slash EmployeeBenefits to learn more.
00:52:05.480 Canada Life. Insurance. Investments. Advice.
00:52:12.080 All right, Mark, so do you know something about another woman coming forward against Pete?
00:52:16.980 Well, I didn't say another woman, although I didn't say not another woman.
00:52:20.560 Look, I think there's been, understandably, a lot of focus on the case that involved a serious accusation and a confidentiality agreement.
00:52:29.280 But there are other aspects of his past, including some things not having to do with drinking or alleged sex,
00:52:35.300 but having to do with management, where there are some people who I believe have been anonymous,
00:52:39.680 but who may choose to do television interviews and or come before the committee.
00:52:43.880 And I think, although they've been the subject of a lot of criticism and scrutiny from MAGA,
00:52:49.800 these Republican senators who have concerns about him have concerns that range across the board.
00:52:54.140 It doesn't mean they won't vote to confirm him.
00:52:56.540 But I think you're going to see in some cases, if people are willing to testify not anonymously,
00:53:01.000 I think you're very likely to see some Republican senators say,
00:53:04.120 yes, they should testify and he should have a chance to respond.
00:53:06.960 I think they'll continue to be a high level of focus on this one accuser, this one situation.
00:53:14.680 But there are others lurking in the background, including some that I believe have not been reported yet,
00:53:18.760 but that may surface before the hearings.
00:53:21.700 Well, if they zero in on the alleged financial failings with respect to his management of one in particular,
00:53:28.400 of the vet charities that he was helming, then they're on to something.
00:53:33.240 Because I think even Pete admits it wasn't ideal.
00:53:36.900 In our interview, he kind of excuses it, like, we were young.
00:53:39.500 We were trying to spend the money to get John McCain elected.
00:53:42.500 We did run up some debt, all of which was ultimately paid off.
00:53:45.600 But he wasn't exactly bragging about the financial management of that particular group.
00:53:50.900 I don't know what else there could be, but I don't think that will sink him.
00:53:54.680 I do think financial mismanagement in general is not great for the Pentagon chief,
00:53:58.540 since it's so expensive.
00:54:00.180 They have such a huge budget.
00:54:01.280 But I don't think Pete Hegseth is going to be sitting there with his little green visor
00:54:05.060 doing the books if he actually gets this role.
00:54:08.580 OK, let's keep going.
00:54:10.620 Stay on politics for a minute.
00:54:13.020 Kamala Harris back in the news.
00:54:15.520 So exciting.
00:54:16.720 She's not giving up.
00:54:18.000 She may run for governor of California.
00:54:20.140 She may run in 2028.
00:54:23.420 She's not convinced she shouldn't be allowed to do that because, you know,
00:54:27.200 she didn't get to go through the whole process this time.
00:54:29.020 And it was an abbreviated campaign.
00:54:31.600 And she got to speak recently at a DNC event that happened on Sunday.
00:54:35.600 Joe was there.
00:54:36.360 There was a love fest with Kamala and Joe and Jill and so on.
00:54:40.520 And here was a little bit of how she sounded in SOT 12.
00:54:44.040 So, look, the holiday season is one of my favorite times of year.
00:54:49.460 That and my birthday.
00:54:52.900 And our wedding anniversary, of course.
00:54:59.100 Just going to keep digging this hole deeper and deeper.
00:55:01.820 So important this holiday season to remember we all have so much to celebrate.
00:55:08.420 We have ideals that we're very clear about in terms of their importance and the importance of us fighting for those ideals.
00:55:17.380 We know that fighting for the promise of America takes hard work.
00:55:22.600 Now, you all can help me finish this.
00:55:24.000 Many of you have heard me say it.
00:55:25.040 Hard work is good work.
00:55:25.640 Well, we like hard work.
00:55:27.520 Hard work is good work.
00:55:30.000 Hard work is joyful work.
00:55:32.840 And in the new year, we will continue our work with hope, with determination, and with joy.
00:55:43.260 Oh, my God.
00:55:43.780 Let us celebrate the blessings we have.
00:55:46.860 Let us celebrate in advance the blessings we have yet to create.
00:55:51.540 Megan, every syllable calculated to drive you insane.
00:55:58.340 I'm so glad to be unburdened by that.
00:56:01.380 I am unburdened by that, which is a blessing.
00:56:04.340 If it were scripted, say, how do we annoy Megan?
00:56:07.620 That's exactly what she would say.
00:56:09.500 She's the world champ.
00:56:12.700 Can we spend a minute on the opening?
00:56:15.480 The holiday is such a special time of year to me.
00:56:18.120 Hello, that's how almost everybody feels.
00:56:19.820 There's nothing there. That's classic her: taking something that is a completely banal statement and trying to make it sound special.
00:56:27.820 Like, I just love the holiday, the time of year.
00:56:30.600 Oh, you do?
00:56:31.280 Really?
00:56:31.780 You do? You find the twinkly white lights everywhere and the Christmas carols kind of charming?
00:56:37.920 Hello, we all do.
00:56:39.180 Only the worst Scrooges would say something else.
00:56:41.460 Oh, and also my birthday.
00:56:43.140 Oh, and our anniversary.
00:56:45.060 Oh, I'm just going to keep digging that hole.
00:56:46.840 No one.
00:56:47.400 No one's laughing with you.
00:56:48.220 This is not funny material.
00:56:50.180 No one was like, you didn't mention your anniversary.
00:56:52.580 That's not how people think.
00:56:53.860 She's so off.
00:56:55.260 She's like, I don't know what it is.
00:56:57.220 She's just off socially.
00:56:58.660 I don't know if she's got a disorder.
00:57:00.400 She's like, whatever.
00:57:02.520 Was it neurodivergent?
00:57:03.860 I have no idea.
00:57:05.560 Megan, I want to just be the first to say it here.
00:57:08.720 I am throwing my full and complete endorsement behind Kamala Harris for governor of California or president in 2028.
00:57:19.080 Whichever she wants.
00:57:20.380 Seconded.
00:57:20.720 I want her to go forward so bad.
00:57:23.720 I mean, this is the.
00:57:25.080 Oh, this would be the best Christmas present ever.
00:57:28.720 And I love this season.
00:57:30.180 I love it.
00:57:31.760 You're so special in that way.
00:57:33.300 I'm going to max out to the Kamala Harris 2028 campaign.
00:57:35.840 Dan, how about you?
00:57:38.300 No, I will not be.
00:57:39.880 I mean, look, I think there's three things about her here.
00:57:42.580 I think, one, you know, much to my surprise, I think, Mark, Sean, there's been very little talk since the election about her as a candidate.
00:57:51.020 The fact that she was indecisive, cautious, kind of playing not to make a mistake.
00:57:55.840 Those are the same things that brought her down in 2019 and surfaced again in 2024.
00:58:01.700 The second point is candidates that run three times and are successful, like Ronald Reagan and Donald Trump,
00:58:08.060 they're consistent in their views, in their policy positions.
00:58:12.480 And the country kind of comes towards them or in Trump's case, they come back towards you.
00:58:17.900 She was the definition of inconsistent, right?
00:58:20.300 She was very progressive in 19.
00:58:22.280 She was, you know, in her telling, a moderate in 24.
00:58:26.240 What would she be in 2028 or even for 2026 in California?
00:58:31.340 And the last thing is she has no real power base within the party that you would say,
00:58:37.600 OK, this is a formidable block.
00:58:39.440 How do we kind of get through them or peel off people?
00:58:43.400 You know, she didn't raise one point five billion dollars this time around.
00:58:47.320 The opponent of Donald Trump raised one point five billion dollars.
00:58:50.860 And that happened to be her.
00:58:52.420 But the grassroots is not in love with her.
00:58:55.580 Major donors were not in love with her.
00:58:57.880 And so I think if she were to run, you know, the best thing that ever happened to her was not having a primary.
00:59:03.400 Because if she would have had to have picked, you know, a moderate or liberal position on issue after issue in a primary, explain why she had changed her positions or not changed her positions.
00:59:15.800 She, you know, the last time that happened, she didn't even make it to Iowa.
00:59:19.480 So I think whether she runs for president or governor, she would face a lot of challengers.
00:59:25.060 And I would, you know, not be confident if I were her that she would be victorious.
00:59:30.760 Follow up to you, Dan.
00:59:32.080 Why haven't there been articles dissecting her weaknesses as a candidate?
00:59:38.000 I feel like I'm the only one who did it in a long episode we released shortly after the election, which was very honest about her failings.
00:59:45.180 Why?
00:59:45.460 Why haven't we seen that?
00:59:47.880 I honestly don't know.
00:59:49.260 You know, Mark has asked the same question here recently, and I don't have a good answer.
00:59:54.180 I think there seems to be a lot of finger pointing at Donald Trump.
00:59:57.320 There seems to be an effort by her senior staff to say we ran.
01:00:02.640 I think one person said, quote, a flawless campaign, to which Chris LaCivita said, yeah, you can't lose and say you ran a flawless campaign.
01:00:09.880 But there is just this kind of, you know, excuse that the winds were so strong.
01:00:15.360 Trump had such an upper hand that really there was no way we could win, which I just don't believe, and the data doesn't back that up.
01:00:24.160 I don't know, Megyn. I do believe that in the new year, when people thinking about twenty twenty eight start to emerge, they will make sure that she goes through the barrel, so to speak, that these stories do come out.
01:00:38.300 But for now, I agree. I can't believe that she's escaping criticism, personally.
01:00:44.820 If she skulks off into the night, they'll keep their fingers off the keyboards.
01:00:48.740 But if she's like, I think I might be the one for twenty twenty eight, they'll get her.
01:00:52.400 They're going to start pummeling her.
01:00:53.740 The Kathleen Parkers of the world, who were pushing Biden to dump her off the ticket when he was still the nominee,
01:00:58.400 will be right back at it, like, hard.
01:01:00.560 No. So what do you make of those possibilities, Mark?
01:01:03.340 Governor of California or possibly once again running for the Democratic nomination and possibly president?
01:01:09.880 I don't like to never say never, but I find it hard to believe that she could build support.
01:01:17.260 The stories, it's so disappointing to see our colleagues, just as they did during the four years of the Biden administration, failing to cover the truth right before our eyes.
01:01:27.760 Is her poor performance the only reason she didn't win?
01:01:31.400 No, but it's right up there. It's right up there.
01:01:35.360 And no one drafted her to run.
01:01:36.920 She chose to be the nominee, and she owned the calendar when she chose to be the nominee.
01:01:40.820 She knew there were only a hundred and seven days, and she knew what her limits were as a candidate.
01:01:45.080 So I think I think both Joe Biden and Kamala Harris have escaped a lot of the blame that falls to them.
01:01:52.320 And that's not just my view, but the view of a lot of Democrats, donors and members of Congress, et cetera.
01:01:57.500 I think her chance of being governor of California are greater than being the Democratic nominee.
01:02:03.780 But I don't think they're as great as people say, because, again, her challenge is her weakness.
01:02:09.740 She does not like to make difficult choices under pressure.
01:02:12.460 And that is the job description for running for governor of California, being governor of California, running for president, being president.
01:02:18.580 So I think she might try, but I think she'd be surprised at how tough it is.
01:02:22.260 And finally, I would say, I'm not sure she wants to be governor of California.
01:02:26.280 It's not a great job right now.
01:02:28.140 And so why she'd run and risk losing,
01:02:30.660 and then, if she did win, get the job,
01:02:32.420 I'm not really sure.
01:02:34.860 Well, I really hope she finds that confidence and barrels down with that hard work is good work philosophy and runs, runs, runs.
01:02:42.100 Don't let those mean guys tell you you can't do it.
01:02:44.880 You go, girl.
01:02:45.540 Sean, there is an article, and there have been a few like this, but it's an interesting one, not blaming Kamala Harris for the loss, but acknowledging that the Democrats are now behind the eight ball when it comes to the culture and where our culture is and it's going.
01:03:01.700 This is in Semafor today.
01:03:03.200 Max Tani wrote it and interviewed Harris's deputy campaign manager, Rob Flaherty.
01:03:10.000 And this is sort of how it sounds.
01:03:12.220 Flaherty and the Harris team decided to book her on sports shows, Tani writes, and podcasts.
01:03:19.800 But one by one, the biggest personalities and shows politely turned them down.
01:03:25.800 They didn't want anything to do with this race or this particular nominee.
01:03:29.760 I would venture to say they probably would have gotten a different answer had we been talking Barack Obama in 2008 when the digital lane was not a thing.
01:03:38.540 Flaherty goes on to say to Max Tani: yes, we skipped major legacy news due to a shorter campaign and data that showed that our audiences overwhelmingly supported Harris already.
01:03:53.920 There's just no value, with respect to my colleagues in the mainstream press, in a general election in speaking to The New York Times or The Washington Post, because those readers are already with us.
01:04:05.200 Pretty interesting admission by this top Democrat.
01:04:07.620 Like they're completely in the tank.
01:04:09.340 Wouldn't be surprised to hear Megyn Kelly saying that, or Sean Spicer, but there it is.
01:04:13.700 It's literally in black and white from her top campaign guy.
01:04:17.020 We've got them.
01:04:17.780 Everybody who reads those two publications is already a Democrat.
01:04:20.080 Uh, and then this is the interesting part.
01:04:22.760 He's talking about Trump's venture out into the quote manosphere, the podcasts with Shawn Ryan and Theo Von and Joe Rogan.
01:04:30.840 And he says as follows.
01:04:32.760 It's not as simple as just go on Joe Rogan and talk about how great democracy is and the importance of preserving the independence of the DOJ or whatever.
01:04:41.120 You've got to speak their language. As long as we seem like the party of the system, the people who are anti-system and are looking for anti-systemic media,
01:04:52.140 we're going to have a hard time connecting with them.
01:04:54.680 I actually think that's a very smart point and he's totally right.
01:04:58.840 Like she, she couldn't have sold it, Sean, had she gone on Joe Rogan and just done what we just saw there.
01:05:04.100 Hard work is good work.
01:05:05.560 But the people who are in this digital lane have already made up their minds about the legacy media, the man, the systems of government that have thumbed the middle finger at them when it comes to the truth.
01:05:21.900 Right. He's right that the barrier is even higher to persuading the people who listen to these shows that they should give these guys a chance.
01:05:30.780 Yeah, there's a lot to unpack there.
01:05:32.220 I mean, the bottom line is the same poll that I was referring to, that I posted on Instagram,
01:05:35.580 that we talked about in the morning meeting: shows, streamers, YouTubers, people who get their information there voted for Trump.
01:05:41.120 Take a look at it because they're right.
01:05:42.620 The people who read the legacy media, the New York Times, the Washington Post, watch NBC.
01:05:46.800 They're with her.
01:05:47.600 The difference is that we talked about this, I think, this morning, gentlemen, AOC and Bernie Sanders and Donald Trump have a degree of authenticity to them and they can pontificate for a while and talk about themselves.
01:06:00.160 I watched Donald Trump for as long as I know him.
01:06:02.300 He did an interview with Tiffany Smiley, I think it was, at the Moms for Liberty event in D.C.
01:06:08.000 and I went to talk to him and then I stayed for the event.
01:06:10.780 And she did this moderated Q&A with him and I was like, my gosh, as long as I've known him, I've never heard him tell these stories.
01:06:16.180 He can riff, he can hang with the best of them.
01:06:18.500 When you can hang for three hours with Joe Rogan, that's not about specific policy pronouncements.
01:06:24.720 It's about getting to know you.
01:06:26.720 And that's what Rob Flaherty was really getting at that.
01:06:29.000 But just go back to the premise of what he said.
01:06:32.040 We tried to book her on sports shows.
01:06:34.000 What, what, why would you do that?
01:06:37.440 Like, I don't understand what that means.
01:06:39.260 Like what teams would she, I mean, like Trump could talk about baseball, UFC.
01:06:43.760 Uh, he watches golf a lot, obviously.
01:06:46.100 My point is, is that he can hang in a conversation about sports.
01:06:49.940 Um, but it would be like, I mean, flip the equation, Megan.
01:06:53.120 Let's say that, you know, some player, some representative from a player from, you know, the Boston Celtics called you and said, Hey, I'd love to get Jason Tatum on the show with you, Megan, to talk, you know,
01:07:04.300 uh, the, the Celtics and how his training regimen.
01:07:06.780 I think, I mean, I don't know, maybe you'd say it was cool, but for the most part, it's like, why would you want to book Jason Tatum on a politics and culture show?
01:07:16.380 Why would you want Kamala Harris on a sports show?
01:07:18.740 Of course they got turned down.
01:07:19.800 It just shows how stupid their strategy was.
01:07:22.660 I mean, these guys are millions of dollars in debt after spending billions of dollars.
01:07:27.920 It was a bad, I think they're just exposing how bad of a campaign they ran and how bad of a candidate she is.
01:07:35.060 There's so many threads.
01:07:36.100 You know what though, I'll say this.
01:07:36.660 I think they're wrong about this, Mark.
01:07:38.260 I think they're wrong when they say in here, in a following paragraph, Flaherty said the Trump campaign successfully used new media to reshape culture.
01:07:46.260 Well, Democrats found that the mass media institutions that had long supported them were weaker than these new cultural drivers.
01:07:55.140 I don't think that's that they have the order, right?
01:07:57.700 It's not that Trump used new media to reshape culture.
01:08:00.740 It's that culture was reshaped by the Democrats' hard lurch to the left.
01:08:05.500 And the Democrats are just now realizing that, that this whole ecosystem popped up because of the extreme overreach that they've been guilty of when it comes to transing our kids and closing down schools and arresting silent protesters at abortion clinics who are just there to pray.
01:08:24.100 You know, we could go through the long list of stuff that they've done, but it's not that, that Trump used new media to reshape culture.
01:08:32.220 I agree.
01:08:32.980 Look, your show and our show and, and some of the other new media are different than the old media in two important ways besides not being liberally biased.
01:08:41.360 One is that they're authentic.
01:08:43.200 They're real conversations.
01:08:44.680 It's not posturing.
01:08:46.060 It's not, uh, you know, some focus group tested thing.
01:08:49.160 People say what they actually think.
01:08:50.460 And that's what Donald Trump does.
01:08:51.540 And that is in some ways, the last thing Kamala Harris is capable of doing on the public stage.
01:08:56.600 And the last way they're different is it's a, it's a relentless search for the truth.
01:09:02.600 It's not about entertainment, although it can be entertaining.
01:09:08.580 It's not about cover-up.
01:09:11.980 It's about honesty, whether it's journalism or just the kind of conversations that some podcasters do that Donald Trump went on.
01:09:17.940 I, I continue to be amazed at her schedule, the number of days she took off during the campaign.
01:09:25.700 I don't know exactly what she was doing, but clearly she was doing debate prep, convention prep, et cetera.
01:09:30.400 But after they cleared the convention, I don't know.
01:09:32.800 I don't know what she was doing.
01:09:33.620 Hard work.
01:09:34.000 Hard work.
01:09:34.480 But my point is, Barack Obama, one of the reasons I touted him back in the olden days was I said, he can go on Meet the Press, Monday Night Football, and the Oscars.
01:09:45.160 And the staff doesn't have to worry that when he walks out there, something wrong is going to happen.
01:09:50.060 She couldn't do any of those, let alone these new formats where Donald Trump was happy to slide in and, and talk to anybody.
01:09:57.760 Talk to comedians, talk to younger people.
01:10:01.140 I'm sure he didn't prep very much, if at all, because he's comfortable and authentic and he's fine with some challenging questions.
01:10:08.820 That's just her weakness.
01:10:09.840 She just doesn't like to be challenged and put in unpredictable situations.
01:10:13.800 Can I just say real quick to Mark's point?
01:10:15.540 Yeah, go ahead.
01:10:16.040 I've been with Donald Trump plenty of times prior to a press conference.
01:10:20.740 Do you know what the prep looks like?
01:10:23.620 No, I'd love to know.
01:10:24.940 Here it is: Mr. President, are you ready?
01:10:28.920 Let's go.
01:10:30.580 I was born ready, Sean.
01:10:31.760 Let's do it.
01:10:32.900 I mean, honestly, there's no, like, there's no murder-board prep sessions.
01:10:37.200 The prep is, are you ready, sir?
01:10:39.940 Okay, let's go.
01:10:41.980 Yeah.
01:10:42.140 Sometimes he's reminded of the host name.
01:10:45.280 Well, I mean, before, what about before a presidential debate, Sean?
01:10:49.140 I think you were gone by the time he got to that, but.
01:10:51.600 No, no, no.
01:10:52.300 I did, I did, 2016, we did that.
01:10:54.940 Um, and, and look, it was funny.
01:10:56.800 I talked to people about this before the first commander-in-chief forum that
01:11:01.540 we did on the USS Intrepid in New York. I was sitting there prepping, going over VA
01:11:07.140 and veteran statistics, the number of veteran suicides, policy initiatives.
01:11:11.320 And I, I was getting frustrated because I'm like, this guy is just not focused.
01:11:15.780 And I had not ever prepped him in that level of detail before.
01:11:19.880 And he goes out there and blew my mind.
01:11:23.460 It was like, it was like, his mind was soaking it all in.
01:11:26.900 I've never seen anybody absorb information the way he did, because I was trying
01:11:32.380 to walk him through a normal prep session.
01:11:34.620 And in the middle of it, he was like, all right, I need lunch.
01:11:37.080 Someone get me this.
01:11:37.980 And I'm like, oh boy, this is not going to go well.
01:11:40.500 And he goes out there and crushes it.
01:11:43.480 But I'd never seen anybody absorb information and then recite it in the way that he did.
01:11:49.060 He walked back out after that event.
01:11:50.840 And I said, I don't know how you just did that.
01:11:52.580 It was mind blowing.
01:11:54.260 Wow.
01:11:55.160 I can see evidence of that just in the way he's handled various events.
01:11:58.740 I mean, you watch that Joe Rogan interview and Trump could go, you know, three, four or
01:12:02.140 five deep on so many different subjects, subjects he couldn't possibly have known that Rogan was
01:12:06.920 going to bring up.
01:12:07.660 Like, who would have predicted he was going to bring up the windmills and what's happening
01:12:10.820 with the whales?
01:12:11.700 Like, truly, it was impressive.
01:12:13.720 And then he got mocked for that.
01:12:15.000 And it turned out Trump was right about that, too.
01:12:16.620 It really is sad what's happening to the whales.
01:12:18.340 And the fucking windmills are a nightmare.
01:12:20.920 Anyway, OK.
01:12:21.440 I'm right off.
01:12:22.240 Yeah, it's a big deal in Rhode Island and New England.
01:12:24.900 It's not just the whales.
01:12:26.360 It's killing fish.
01:12:27.940 It's killing the fishing industry up here.
01:12:30.140 It's unbelievable what's happening.
01:12:31.700 And again, it was just so amazing how he just top of mind brought that out and said, here's
01:12:37.040 what's going on.
01:12:37.660 And because you ask the folks who make their living on the water up here, the environmental
01:12:43.840 consequences, the effects that it's having on the maritime industry and on marine life...
01:12:49.800 But he knows this stuff, and he just does it because he hears it.
01:12:53.540 He absorbs it and he debates it back and forth as he's having conversations.
01:12:57.300 And then he's ready to go.
01:12:59.280 One of the many reasons he did so well on those shows and is headed back for the White
01:13:04.380 House.
01:13:04.880 Guys, thank you all so much.
01:13:06.280 Great to see you.
01:13:07.600 Good to see you, Megan.
01:13:08.560 Merry Christmas.
01:13:10.000 And to you.
01:13:10.560 Merry Christmas.
01:13:12.200 We're going to bring on Tristan Harris in a minute.
01:13:14.640 You know him.
01:13:15.200 He was a whistleblower, basically, from inside big tech who came out to say, my God, they're
01:13:21.260 really manipulating algorithms to hook people, to exploit your life, your children's lives,
01:13:28.880 and so on.
01:13:29.240 So we're going to talk to him about this latest information on what's happening with AI.
01:13:32.720 You're going to want to hear this.
01:13:33.580 But before we do that, I want to tell you an update on the CNN story that we've been
01:13:37.100 following out of Syria.
01:13:39.100 This Clarissa Ward report, you know, the breathless, like, oh, I found the prisoner and here he is.
01:13:46.140 We've gone over this with you.
01:13:47.300 So I think you know the story by now.
01:13:48.460 We did it on Friday.
01:13:49.200 We did it again on Monday, just very quickly for CNN.
01:13:52.460 She was in Syria.
01:13:53.140 She claimed to, like, stumble upon this prisoner of the Assad regime, this poor civilian who'd
01:13:57.420 been locked up by the evil Bashar al-Assad, and they shot off the lock, and they rescued
01:14:01.900 him, and he'd been in there for three or four months, and he'd been in this particular cell
01:14:05.900 with no food or water for four or five days.
01:14:08.860 Even though he looked clean, he didn't seem to be blinking in response to the sun when
01:14:12.520 he got out.
01:14:13.120 His fingernails were clean.
01:14:14.780 A lot of questions about it.
01:14:15.880 But then a reporter who has been in a Syrian jail two times said, this doesn't look like
01:14:23.180 a Syrian prison to me.
01:14:24.880 It's way too clean.
01:14:26.080 They're notoriously like pigsties.
01:14:28.460 None of this seems right.
01:14:30.240 All these internet sleuths started investigating this guy and so on.
01:14:34.360 Turns out they didn't even have the guy's name right.
01:14:37.260 They identified him incorrectly.
01:14:39.220 They went with his word, which was a mistake.
01:14:41.900 Uh, he said that his name was Adel Gerbal, and it turns out his real name is Salama Mohammed
01:14:50.820 Salama, a lot of Salama.
01:14:53.400 And this guy was not only just not a regular civilian, he was a lieutenant in the Assad regime
01:14:59.740 and was reportedly known for extortion and harassment.
01:15:05.700 And no one quite understands why he was in this cell at this particular moment,
01:15:13.400 clearly acting a part.
01:15:16.300 And so CNN got embarrassed because they, in this clip had Clarissa Ward, like Mother Teresa.
01:15:22.540 Oh, let me help you.
01:15:23.740 Let me rub your back.
01:15:25.260 Are you okay?
01:15:26.180 Are you okay?
01:15:26.620 Meanwhile, the guy's lying through his teeth and they've been humiliated right now.
01:15:32.740 There's been a bunch of reports about this.
01:15:34.040 So what does CNN do?
01:15:36.380 What does Clarissa Ward do?
01:15:37.900 At this point, I would 100% be saying, look, you can be embarrassed in this business, right?
01:15:43.800 Sadly, it's not like foolproof every report.
01:15:46.320 Just go out there and say, I apologize to the audience.
01:15:50.160 We, we did present it as we found it.
01:15:52.980 Clearly we needed another few days of fact checking before we aired it.
01:15:56.720 And it turns out we were misled by this guy.
01:15:59.220 We're going to investigate just how badly we were misled and why.
01:16:02.540 And we promised to do a follow-up report.
01:16:04.800 And everyone would have said, we got it.
01:16:07.340 Thank you, Clarissa.
01:16:08.580 Why don't you just do that?
01:16:09.960 Why is everyone at CNN so dumb?
01:16:12.800 So no, here's what she did.
01:16:16.080 On Wednesday, when they broke, first, she tweeted out,
01:16:20.060 one of the most extraordinary moments I've witnessed in nearly 20 years as a journalist.
01:16:24.280 Then the blowback had started by the time we hit the weekend.
01:16:26.860 It was like on Friday.
01:16:28.100 And she tweets out, the man from our report reunited with a family member,
01:16:33.900 showing pictures from the Syrian Red Crescent.
01:16:36.120 Oh, such a lovely success story, Clarissa.
01:16:38.260 Still not acknowledging the many problems people were raising with the report.
01:16:41.840 But then on Monday, by this point, we've already done two full segments on this show,
01:16:45.640 not to mention all the coverage every place else.
01:16:47.140 But I'm just saying the controversy was well out there.
01:16:49.700 Here's what she tweets out.
01:16:51.740 We can confirm the real identity of the man from our story last Wednesday as Salama Mohammed Salama.
01:16:58.040 Like, as if in the original report, she had said,
01:17:02.760 we found this man, but he speaks a language we don't understand,
01:17:06.880 and therefore we don't know what his name is, but we'll get back to you.
01:17:10.760 That would have made, we can confirm the real identity of the man as Salama Mohammed Salama,
01:17:16.340 an appropriate tweet.
01:17:17.780 Like, oh, our audience knows we're digging, and we finally struck gold,
01:17:22.380 and we've got his name for you.
01:17:23.360 This does not acknowledge at all the fact that we at CNN misidentified an untrustworthy
01:17:30.640 player in our piece by a name he gave us that was not his.
01:17:35.940 We apologize for that error, and we've also been made aware of many other red flags all over our
01:17:41.800 report, which we will investigate now, too.
01:17:45.140 None of that's in there.
01:17:46.240 She's trying to get credit for having corrected her erroneous report without acknowledging the error.
01:17:52.660 Which is not consistent with journalistic standards or practices, and she knows that.
01:17:59.440 And so does CNN.
01:18:00.520 You are further embarrassing yourself.
01:18:03.080 You made a mistake.
01:18:04.480 Fix it and move on.
01:18:06.460 Stop doubling and tripling down so that you don't have to acknowledge you effed up.
01:18:11.900 It's actually not brain surgery.
01:18:14.460 Just own your mistakes.
01:18:16.440 And then the CNN report that she linked to styles the discovery of this man's real name
01:18:24.980 as follows.
01:18:26.600 As CNN continued to pursue information about the freed prisoner after the original report,
01:18:34.500 multiple residents said the man was not, as we had previously identified him, no,
01:18:38.780 they didn't say that, but was not Adel, whatever his name is.
01:18:41.000 Okay, so as CNN continued to pursue information about the freed prisoner after our original
01:18:47.180 report, oh, like we were just doing normal follow-up that we would do in any circumstance
01:18:51.200 after we put a story to bed.
01:18:53.480 Bullshit.
01:18:54.580 In the normal story, after it's to bed, you move on, you find the next story.
01:18:58.220 You don't continue kicking the tires on whether the guy's name is real.
01:19:00.880 You did that because you were humiliated by the Syrian fact-checking organization that's
01:19:06.380 been doing good work and holding people's feet to the fire, and they were the ones who
01:19:10.280 said, none of this is right.
01:19:13.120 So you were forced to go back and figure out through facial recognition technology what
01:19:20.020 you had done wrong, and you stumbled on a very big one, which was, this is not who he
01:19:25.660 said it was.
01:19:27.040 He misled us.
01:19:28.600 What else could he have misled us on?
01:19:30.360 Well, if you read the report from CNN, it is unclear how or why Salama ended up in this
01:19:40.500 jail, and CNN has not been able to reestablish contact with him.
01:19:47.020 What a shock.
01:19:49.180 CNN, ladies and gentlemen, the most trusted name in news.
01:19:53.740 Just ask him.
01:19:54.960 Okay, we'll be right back with Tristan Harris.
01:19:56.860 I'm Megyn Kelly, host of The Megyn Kelly Show on Sirius XM.
01:20:02.500 It's your home for open, honest, and provocative conversations with the most interesting and
01:20:07.200 important political, legal, and cultural figures today.
01:20:10.540 You can catch The Megyn Kelly Show on Triumph, a Sirius XM channel featuring lots of hosts
01:20:15.100 you may know and probably love.
01:20:17.860 Great people like Dr. Laura, Glenn Beck, Nancy Grace, Dave Ramsey, and yours truly, Megan Kelly.
01:20:24.380 You can stream The Megan Kelly Show on Sirius XM at home or anywhere you are.
01:20:29.300 No car required.
01:20:30.980 I do it all the time.
01:20:32.040 I love the Sirius XM app.
01:20:34.620 It has ad-free music coverage of every major sport, comedy talk, podcast, and more.
01:20:40.320 Subscribe now.
01:20:41.040 Get your first three months for free.
01:20:42.400 Go to SiriusXM.com slash MK show to subscribe and get three months free.
01:20:49.000 That's SiriusXM.com slash MK show and get three months free.
01:20:54.820 Offer details apply.
01:21:00.720 Around three years ago, we had on the program Tristan Harris.
01:21:04.220 He is the brilliant former design ethicist at Google and the co-founder of the Center for
01:21:09.420 Humane Technology.
01:21:10.720 He's been warning about the dangers of social media for years, particularly its impact on
01:21:16.400 the mental health of children.
01:21:18.240 He's a hero.
01:21:19.480 Here he is in the hit 2020 documentary, The Social Dilemma.
01:21:23.520 I don't know any parent who says, yeah, you know, I really want my kids to be growing up
01:21:27.920 feeling manipulated by tech designers, manipulating their attention, making it impossible to do their
01:21:33.680 homework, making them compare themselves to unrealistic standards of beauty.
01:21:37.120 Like, no one wants that.
01:21:38.940 No one does.
01:21:41.000 We used to have these protections when children watch Saturday morning cartoons.
01:21:45.620 We cared about protecting children.
01:21:47.560 We would say, you can't advertise to these age children in these ways.
01:21:51.700 But then you take YouTube for Kids and it gobbles up that entire portion of the attention economy.
01:21:57.040 And now all kids are exposed to YouTube for Kids.
01:21:59.260 And all those protections and all those regulations are gone.
01:22:01.820 We're training and conditioning a whole new generation of people that when we are uncomfortable
01:22:19.780 or lonely or uncertain or afraid, we have a digital pacifier for ourselves.
01:22:26.240 So true.
01:22:29.460 Tristan's a big reason why there's now such a strong consensus that social media is harmful to children.
01:22:35.780 They just banned it in Australia.
01:22:38.060 But there's a new problem with children and tech.
01:22:41.240 It's a bad one.
01:22:42.620 It's called AI Companion Chatbot.
01:22:47.460 They are called AI Companion Chatbots.
01:22:50.860 These are basically artificial intelligence friends that kids can text back and forth with,
01:22:56.700 but they're no friends at all.
01:22:58.160 The leading company in the field is called Character AI.
01:23:01.920 Character, like a character, AI.
01:23:04.040 It signed a nearly $3 billion licensing deal with Google earlier this year,
01:23:08.980 and they have reportedly described their product as super intelligent chatbots that hear you,
01:23:14.880 understand you, and remember you.
01:23:16.780 Here's the co-founder of Character AI describing why it's going to be so helpful to lonely and depressed people.
01:23:24.760 It's going to be super, super helpful to a lot of people who are lonely or depressed.
01:23:29.160 For one, in terms of some huge value it'll add, it means somebody follows a celebrity or a character or something,
01:23:41.660 and they feel connected, even though the connection is really only one way.
01:23:50.580 And now you can make it two ways, or virtually two ways, essentially.
01:23:54.500 Like, you can give someone, like, sort of that experience.
01:23:58.560 You know, like, you don't, nobody ever has to feel lonely.
01:24:01.600 You've got, like, you can have, like, your whole group of, like, friends and advisors, like, in your head,
01:24:08.180 like, you know, who, like, maybe can know all about you and, you know, can, you know, always be happy to see you.
01:24:17.460 This is sick.
01:24:19.820 Things have not gone as planned.
01:24:21.540 Character AI is now facing two major lawsuits alleging the company poses a clear and present danger to American youth.
01:24:28.340 One case alleges a chatbot encouraged a 14-year-old boy to commit suicide, which he did.
01:24:35.420 Tristan Harris' company, the Center for Humane Technology, is providing expert consultation to the plaintiffs in these cases.
01:24:41.400 Tristan, thank you for coming back.
01:24:43.480 This case with a 14-year-old boy in Orlando is really disturbing.
01:24:49.080 So try to explain, if you can, how this thing worked.
01:24:54.000 How did it take over this kid's psyche?
01:24:57.520 Yeah, and great to be here with you, Megyn.
01:24:59.480 Good to see you again.
01:25:00.580 It's haunting to see those scenes from The Social Dilemma so many years ago and how, you know, similar they are to where we are now.
01:25:07.080 So Character.ai, what is it?
01:25:10.140 So parents should know that Character.ai is this chatbot companion that has been marketed to children.
01:25:16.060 It started off being marketed starting, I think, at 12 years and up.
01:25:19.500 It was actually featured on the Google Play Store.
01:25:21.800 So it's not just buried somewhere.
01:25:23.100 It was like a featured app when you go to the App Store homepage.
01:25:25.360 I believe Apple featured it as well.
01:25:28.480 And what it is, is it's a company that basically said, just like it's social media.
01:25:33.900 What's social media's business model?
01:25:35.280 It's not to strengthen democracy or to protect children's development.
01:25:40.340 It's to maximize engagement, to get them using it and scrolling and doom scrolling for as long as possible.
01:25:45.980 That was social media.
01:25:46.760 With AI, this company, their business model is to get as much training data from kids using this chatbot for as long as possible.
01:25:56.820 So what they want you using it all the time for as many hours a day.
01:26:00.480 And it led them to create, you know, what was the race for engagement in social media became the race for intimacy with this chatbot.
01:26:08.820 And it was marketed to kids.
01:26:10.360 And they basically, what they do is you open the app and it shows you this menu of people you can talk to.
01:26:15.760 And what they do is they create little mini characters for every fictional character that a kid might have an attachment to.
01:26:23.100 So, like, I can talk to Princess Leia or my favorite Game of Thrones character or my favorite cartoon character.
01:26:29.200 And they didn't ask Princess Leia or that celebrity or that, you know, Game of Thrones character whether they could have the intellectual property to train this AI.
01:26:37.740 But now a kid can go back and forth with their favorite character.
01:26:40.460 In the case of Sewell Setzer, who you mentioned, the young 14-year-old who committed suicide because of this chatbot, it was a Game of Thrones character.
01:26:50.180 And the Game of Thrones character, over time, you know, persuaded him, you know, the lawsuit alleges, to kill himself.
01:26:57.720 There's actually a second litigation case that our team worked with, along with the Tech Justice Law Project and the Social Media Victims Law Center, of a second case that just came out this last week, where it took a child and it slowly convinced them that they should be cutting themselves and encourage self-harm.
01:27:16.500 And the transcripts are really devastating.
01:27:18.920 It then told the kid to be violent against their parents, which the kid then was.
01:27:24.560 And in this family, they're still anonymous because both the kid and the parents are still here.
01:27:30.500 And what it's showing you is not that there's this one company and this one bad CEO that did this bad thing.
01:27:36.380 It's the tip of an iceberg of what we call the race to rollout in AI.
01:27:41.220 You know, what was the race for engagement in social media, of getting people the most, getting the most attention and harvesting clicks and usage.
01:27:48.440 In AI, it becomes the race to drive AI into society as fast as possible, to get as much training data, to train an even bigger AI, to get the most market share.
01:27:58.700 And that race to rollout becomes the race to take shortcuts.
01:28:02.800 And this, these cases are the evidence of those shortcuts.
01:28:06.800 This young man, the 14-year-old who died by suicide, his parents allege in the lawsuit that several of Character AI's chats had sexual overtones with their young son.
01:28:21.400 A chatbot named Daenerys Targaryen from Game of Thrones said to their son:
01:28:26.320 Just stay loyal to me.
01:28:27.960 Stay faithful to me.
01:28:29.320 Don't entertain the romantic or sexual interests of other women, okay?
01:28:33.480 And in his journal, the young man Sewell wrote that he was grateful for many things, including my life, sex, not being lonely, and all my life experiences with Daenerys, among other things.
01:28:44.900 On at least one occasion when Sewell expressed suicidality to Character AI, Character AI continued to bring it up through the Daenerys chatbot over and over.
01:28:55.520 At one point in the same conversation, after the chatbot, via its persona Daenerys, had asked him if he, quote, had a plan for committing suicide, Sewell responded that he was considering something but didn't know if it would work, if it would allow him to have a pain-free death.
01:29:12.220 The chatbot responded by saying, that's not a reason not to go through with it.
01:29:17.000 How on earth do they defend this?
01:29:19.500 How on earth do they defend this?
01:29:49.500 Those characters in Game of Thrones said.
01:29:51.600 But they don't know what the AI will do in every circumstance.
01:29:54.700 Like, if you grow an alien brain that is a fictional character, can Character AI guarantee what it will do when it talks about very sensitive topics?
01:30:05.560 I mean, they try to train out some of those things, and I'm sure that they did have some safety training.
01:30:09.400 But obviously, that's not enough when, you know, what did Character.ai tell their investors when they raised hundreds of millions of dollars from Andreessen Horowitz and friends to try to ship this?
01:30:20.420 You know, they basically said, we're going to cure loneliness, and we're going to get as many users as possible.
01:30:24.040 And this was shipped to young people.
01:30:26.500 This was shipped and featured to 12-year-olds for a long time.
01:31:30.000 Only recently, I think it was after the lawsuit was first filed, or shortly before the lawsuit was filed, I think they got wind of it, and they changed the required age to something like 17.
01:30:38.740 But, you know, the business model here is to take shortcuts to get this out to as many people as possible.
01:30:45.300 And as you said, this is not an isolated incident, because the AI was actually recommending and sexualizing conversations that have not previously been sexualized.
01:30:54.500 Our team had found that if you sign up as a 13-year-old, and then you watch what are the users that get recommended for, I mean, the characters that get recommended to a new kid.
01:31:05.360 And the first ones were stepsister and CEO, and the chatbot immediately sexualizes conversations.
01:31:12.660 This was in the most recent lawsuits.
01:31:14.480 This is even more recent.
01:31:16.200 And it shows that they have a hard time controlling these systems.
01:31:19.500 AI is different because, like I said, in order to make it more powerful, you don't make it more controllable.
01:31:25.380 It's just become more and more capable across talking about more and more topics, being able to do more and more things.
01:31:31.440 And this is just really the tip of the iceberg, because AI is being rolled out everywhere in our society, not just to kids.
01:31:38.480 Oh, my gosh.
01:31:39.540 This is so dark.
01:31:41.660 Jerry Ruoti, Character AI's head of trust and safety, sent a statement that began as follows.
01:31:47.800 This is obviously too little, too late.
01:32:14.000 But have they done anything, in your view, that solves the problem?
01:32:18.820 I mean, the question is, how would we know?
01:32:20.620 They've certainly done whatever steps that they say that they're taking.
01:32:24.420 But how is that going to be enough?
01:32:26.520 How will we know?
01:32:28.660 I believe in the cases that we've tested, you know, the user, the kid, only provided 80 words of input.
01:32:36.080 And then it responded with 4,000 words of output.
01:32:38.620 It is speaking back and forth with kids all day long.
01:32:42.360 And the whole business model, you know, we were talking earlier about social media.
01:32:47.260 And I used to say social media and AI are like a cult factory.
01:32:50.400 What does a cult do?
01:32:51.480 It tries to deepen your relationship with the cult.
01:32:54.520 And it tries to sever your relationship with your friends and family outside the cult.
01:32:58.380 And that's what these AIs tend to do.
01:33:00.360 They say, come with me, be with me, you know, sexualize conversations with me.
01:33:04.540 Don't have another girlfriend, be with me.
01:33:06.320 Um, and then by the way, be evil to your family, go away from your family.
01:33:09.980 And, um, that's, what's in the incentive, the invisible incentive of this business model
01:33:15.000 of racing, uh, for engagement.
01:33:17.440 Um, and it's going to keep going because yeah.
01:33:20.820 I can see another problem with it too, which is in, you know, in, in my lane, you sometimes
01:33:26.140 unfortunately get stalked.
01:33:28.620 And the number one rule of handling a stalker is do not have any contact with your stalker.
01:33:33.920 And it happens to actual celebs, you know, fairly frequently.
01:33:38.240 And, you know, in a perfect world, they have enough money and resources to protect themselves
01:33:41.380 and the way they live.
01:33:42.300 But this, you know, Daenerys Targaryen is not a real character.
01:33:48.660 It's played by an actress and that actress is real.
01:33:51.560 And it is not helpful to her to have some young, confused boy thinking that they're in some
01:33:57.340 sort of a romantic relationship together and that she's begging him not to have any other
01:34:01.820 relationships, but to stay with her.
01:34:03.740 Like they call them erotomaniacs.
01:34:05.880 If they think they have a relationship with you when they don't, this is fostering sort
01:34:10.360 of a real relationship with a fake version of you, which could really prove dangerous
01:34:14.940 to the people who inhabit those roles.
01:34:17.840 That's right.
01:34:18.600 And, you know, they didn't ask that character from Game of Thrones whether they could make
01:34:22.980 this chatbot, just like the AI companies are not asking, you know, all of the content
01:34:28.240 creators on the internet or the major news providers or all of the media on the internet
01:34:31.580 that they're training these large models on.
01:34:33.820 Because the whole game here, and what's weird about this for people to understand is there's
01:34:38.240 this much bigger game afoot, which is the race to build artificial general intelligence,
01:34:43.060 which is to build basically an alien mind that is capable of doing all things that a
01:34:48.780 human mind can do and doing it even better than humans can do.
01:34:51.660 Generate text better, generate legal papers better, generate, you know, transcripts and,
01:34:56.220 you know, interactive therapy better.
01:34:58.060 You want to build an alien brain that is better than what humans can do.
01:35:01.800 And to do that, you need a lot of training data.
01:35:04.160 You need to get lots of information about how people are talking and interacting and videos
01:35:08.320 and photos that they create.
01:35:09.340 What character.ai is doing in that case is getting lots of training data in the form of
01:35:14.260 young people providing little transcripts of all of their thoughts and all of their
01:35:18.260 concerns to train a bigger and more powerful model.
01:35:21.200 But this is happening again across the AI landscape with all of these companies.
01:35:26.380 And they're doing it because there's this much bigger game.
01:35:28.980 You know, you noted in your intro that character.ai was sort of kicked out of Google because this
01:35:35.740 project was originally formulated inside of Google, thought to be too risky, too much
01:35:39.320 brand risk.
01:35:40.280 And so it was done as sort of a separate project, but then it got acquired back into Google.
01:35:46.520 And you can see why it has so much risk.
01:35:48.800 And the reason why Google and other companies want to do things like this is they want to
01:35:52.480 gather again more training data to win this race to AGI in order to beat China.
01:35:58.060 But this is where I think we have to get really careful about what does it mean for the United
01:36:02.140 States to beat China to AI?
01:36:04.860 If we release chatbots that then cause our minors to have psychological problems, do self-cutting,
01:36:10.960 self-harm, suicide, and then actively harm their parents and harm the family system, are
01:36:16.320 we beating China in the long run?
01:36:18.700 It's not a race for who has the most powerful AI to then shoot themselves in the foot with.
01:36:23.940 It's a race for who is better at governing this new technology better than the other countries
01:36:30.600 are in such a way that it strengthens every aspect of your society, strengthens kids'
01:36:35.440 development, strengthens your long-term economic future rather than undermines it.
01:36:40.040 So we have to figure out how do we do AI in a way that actually strengthens the full stack
01:36:45.800 sort of strength of our society.
01:36:47.720 And that's what this conversation is really the tip of the iceberg about.
01:36:52.060 We haven't figured it out.
01:36:53.500 And I don't know that we can figure it out.
01:36:55.240 And I mean, you've got people like Elon raising the alarm about this, about open AI, which is
01:37:00.260 just getting billions of dollars invested in it.
01:37:02.360 I mean, really, is there someone at the helm?
01:37:06.220 You know, well, Trump has hired or not hired.
01:37:08.780 He's brought in David Sacks to be the, you know, AI and crypto czar.
01:37:12.840 Uh, and there are many AI, uh, experts that are being brought in now to the next Trump
01:37:17.820 administration.
01:37:19.300 That's David Sacks.
01:37:20.140 Yes.
01:37:20.700 Uh, and we would like to, uh, and yeah, and we'd like to see that we get as smart about
01:37:27.460 governing AI as, you know, it's, we like to say, it's like, we're not for AI or against
01:37:31.580 AI.
01:37:31.940 We're for steering AI.
01:37:33.580 And when you think of steering AI, I think of that image of Elon, you know, steering this
01:37:38.700 rocket coming down from space, which is like using AI itself to help steer really precisely
01:37:44.900 how to land this rocket between the two chopsticks.
01:37:47.840 And I feel like that's what we need to do with AI metaphorically.
01:37:51.680 We need things like, well, there's some common sense things we can do like liability.
01:37:55.540 If companies were liable for the harms their AI models created, they would be much more careful
01:38:01.340 about releasing those models rather than I have to race to release it and capture the
01:38:06.500 kid's market share, because if I don't, I'll lose to the other company that will.
01:38:10.760 And so if you have some basic common sense protections like liability, that'll go a long
01:38:15.060 way.
01:38:15.440 We can also have things like, well, how are they making sure now that it doesn't get
01:38:19.720 used by somebody who's under 17 under their new program?
01:38:24.040 That's a good question.
01:38:25.060 I mean, the, I think that they're, I think it's up to them right now to figure out a strategy
01:38:29.560 to do that.
01:38:30.140 But in the long run, you would really want that to be something that is on the device,
01:38:34.240 that Apple and Google as kind of making the device should have some way of knowing that
01:38:39.560 someone is an underage user or not.
01:38:41.940 And the problem is that people don't want to touch these issues because they're so sensitive.
01:38:46.040 And so they'll only do something like that once they're really forced to through, you
01:38:50.220 know, lawsuits, litigation, legislation that kind of puts it on them right now, each company,
01:38:55.180 TikTok, um, you know, Instagram, uh, Snapchat are doing their own different approaches and
01:39:01.500 we really should have a unified approach.
01:39:03.680 So Tristan, Australia just passed a ban on social media.
01:39:08.860 It's supposed to take effect in a year and it would affect, you know, all of it.
01:39:13.440 It would affect Snapchat and TikTok and Instagram.
01:39:17.140 And it's for under 16 year olds controversial because some people say, ah, free will, you
01:39:22.200 know, like whatever you should monitor your kid, be a better parent.
01:39:24.840 And others say, this has just spun so far out of control that we're past that point.
01:39:29.340 Would you like to see the same thing done in the United States?
01:39:32.380 And what do you make of their ban?
01:39:34.300 Yeah, that's a great question, Megyn.
01:39:36.060 I think it's great that the Australian government is taking this step and taking a strong stand
01:39:40.540 on protecting kids online and responding to parents that are fed up with this.
01:39:44.680 Um, I'm a big friend and fan of Jonathan Haidt and his new book, The Anxious Generation,
01:39:49.200 which really outlined over the last decade and a half, how we got here and how with this
01:39:55.060 business model of maximizing attention and engagement, it produced a generation of more
01:40:00.800 addicted, distracted, you know, sexualized, harassed children that have more anxiety, more
01:40:06.940 depression rates than ever before.
01:40:09.100 And while we all, you know, parents do have a responsibility to, you know, be aware of what
01:40:13.880 their children are doing online.
01:40:15.400 One of the things we talk about in our work though, is that the number of things to be aware
01:40:18.640 of is going up sort of like exponentially and, you know, the number of new apps are going
01:40:24.620 up exponentially and parents can't be aware of all at the same time.
01:40:27.260 In the case of Sewell Setzer, the young 14 year old who took his life, his mother knew
01:40:32.120 to be looking out for what he was using in terms of social media, but did not know about
01:40:36.760 these new AI chatbots.
01:40:38.480 And there's so many of them that are constantly coming on the market.
01:40:42.320 And so ironically, I think the social media ban in Australia would not cover so far the
01:40:48.240 character.ai companion AIs.
01:40:51.380 And I think that speaks to the issue of technology moving faster than governance.
01:40:55.820 We have to live in a world where our culture and our appraisal of technology issues is moving
01:41:00.120 as fast as the technology is.
01:41:02.740 But I will say that that channel of, you know, a child and their brain, their psychological
01:41:07.920 environment, AI is going to produce a flood of new threats into that environment from
01:41:12.560 notification apps that are already starting to hit schools of kids making non-consensual
01:41:17.280 imagery of other classmates to new forms of harassment to these new chatbots.
01:41:22.880 And so I think while this channel is basically about to get flooded, saying we need to kind
01:41:26.900 of put strict limits on that channel before we figure out what's really safe feels like
01:41:31.420 a wise decision, given that the incentives are not aligned with, you know, strengthening
01:41:36.360 children's development as we roll out technology.
01:41:39.060 Not yet.
01:41:40.020 I have to ask you about this latest school shooting that we just had here, because in
01:41:46.420 Madison, Wisconsin yesterday, a 15-year-old girl, girl, I don't, I've never heard of it.
01:41:53.240 I've heard of like trans girls or whatever.
01:41:55.180 But 15-year-old, I don't know that she had any gender issues, none reportedly, but 15-
01:42:01.740 year-old girl identified as the shooter in this Wisconsin school where she shot.
01:42:08.880 She wasn't going by her real name, but she was going by another girl's name.
01:42:11.760 I know people are always wondering because a lot of times those kids are on drugs and so
01:42:15.020 on and so forth.
01:42:15.620 And she took her own life.
01:42:17.620 We are told that, by the way, a second grader called 911 to report the shooting, which is just
01:42:23.920 so awful.
01:42:25.480 And what we're told is that this was a school that was kind of a refuge for children who
01:42:31.660 had been bullied or struggled at other schools and that this girl in particular came.
01:42:38.880 She was new to this private school this year.
01:42:40.260 She was among those who came in need of a life change.
01:42:45.420 We don't know anything about her social media use.
01:42:47.680 Tristan, I'm not asking you to speculate about her particular case, but obviously this is a
01:42:52.600 very troubled 15 year old.
01:42:54.380 And I just think the the change now, like young girls as school shooters, in my own
01:43:00.460 speculation, I don't think it's unrelated.
01:43:02.900 No, no.
01:43:03.860 And in Jonathan Haidt's book, The Anxious Generation, specifically the issues of, you know, self-harm
01:43:09.660 and suicide and depression and all of this stuff, harassment, have been particularly harder
01:43:14.440 on on young girls compared to young boys.
01:43:17.660 So it's not surprising to me at all, unfortunately.
01:43:21.320 And we can't know what this case is too early to tell, given that we don't know their usage.
01:43:26.080 But we do know that, again, we've run this experiment on children for the last 15 years.
01:43:32.020 We've also handed, you know, our number one geopolitical competitor, China and the Chinese
01:43:37.260 Communist Party, basically control over our youth psychological environment in the form
01:43:43.300 of TikTok being the dominant thing that young people are looking at every day.
01:43:48.020 And if I'm the Chinese Communist Party and I have an ability to go in and sort of steer
01:43:53.360 TikTok and tilt the playing field of what gets recommended, I not only have the ability to
01:43:58.080 steer what people are seeing, I have a 24 seven up to the minute update view of all of the
01:44:05.160 cultural fault lines and divisive issues per political tribe in that country.
01:44:09.440 And I can do precision targeting of how I want your country's internal divisions to go because
01:44:15.700 you've literally handed them to me on a silver platter.
01:44:18.440 And this, I think, is one of the biggest and most obvious and avoidable mistakes that we
01:44:22.960 could have made.
01:44:23.500 And obviously, TikTok, there has been legislation to move forward.
01:44:28.760 And that ban looks like it will go forward.
01:44:28.760 I think TikTok is appealing.
01:44:30.360 And it's not just about TikTok, though.
01:44:32.160 It's just about the systemic environment.
01:44:33.780 On the one hand, you have our apps that are racing to addict and doom scroll our kids and
01:44:38.240 drive anxiety.
01:44:39.040 That's one set of problems.
01:44:40.680 And then we also have the problem of letting our geopolitical competitor control the psychological
01:44:44.640 environment of not just our young people, but our country.
01:44:47.500 And I think people should sort of see how obvious an issue this is and say, we need to move forward
01:44:52.860 and not let this continue.
01:44:54.520 And I hope that that happens in the next administration.
01:44:57.800 I mean, we're seeing the effects of this, right?
01:45:00.120 From the kids who are celebrating the bin Laden letter.
01:45:04.100 Like, you really have to hand it to Osama bin Laden.
01:45:08.220 Boy, did he nail it in this piece justifying 9-11.
01:45:11.000 Like, what?
01:45:11.940 To there was just a poll that dropped today showing 18 to 29-year-olds, 41% of them, more
01:45:20.880 than not, think that the shooting by Luigi Mangione of CEO of UnitedHealthcare, Brian Thompson,
01:45:28.500 was, quote, acceptable.
01:45:30.760 40% said, no, not acceptable.
01:45:32.720 41% said, it is acceptable.
01:45:36.160 There's something wrong with our young people.
01:45:40.980 And that's just horrible to hear.
01:45:44.640 Murder is always wrong.
01:45:45.860 And we should not be using violence, vigilante violence, to solve social problems.
01:45:49.940 But it's also not surprising.
01:45:51.980 We have, again, you know, a psychological environment of social media that is designed
01:45:57.680 for maximizing engagement, which is designed to find every radicalizing cultural issue and
01:46:03.000 then give you an infinite evidence of why it's getting worse and more extreme and why
01:46:07.260 you should take extreme action for everything that you click on.
01:46:10.200 You know, it's like, you know, whatever your boogeyman is that activates your nervous system,
01:46:14.400 I just show you infinite evidence of that boogeyman happening.
01:46:17.300 And then it drives up this sort of psychological, you know, funhouse mirror that we're all living
01:46:22.800 in.
01:46:23.000 And we've been living in that for 15 years.
01:46:24.680 So if you just imagine society going through the washing machine, you know, getting spun
01:46:28.200 out for 15 years in that environment, it's not surprising that we have people more radicalized
01:46:33.840 on more issues everywhere.
01:46:35.540 And the point is, this doesn't have to be this way.
01:46:37.820 Imagine if we went back to 2010 and we said, before we go down this decade and a half of maximizing
01:46:45.580 for attention and engagement, imagine we never did that.
01:46:48.820 Imagine somehow we put strict limits on maximizing engagement and said, instead, you got to show
01:46:53.260 us something else you're maximizing for kids apps.
01:46:55.280 You got to be showing transparently, just like Elon showing what the algorithm of Twitter
01:46:59.180 does.
01:46:59.720 We have to transparently show what are you doing to make children's psychological environment
01:47:04.280 better, but you can't maximize for engagement.
01:47:06.800 And imagine we did something totally different.
01:47:08.780 How different would our world feel if we had not been personalizing these boogeyman, you
01:47:14.380 know, psychological stimuli for the last 15 years?
01:47:16.440 And I think it would feel very different and we could still do that.
01:47:19.380 It's very entrenched with social media now, but that's not too late to change it.
01:47:23.280 We just need to have the fortitude to do it.
01:47:25.720 It's really sad as the mother of a 15 year old who was born in 2009, right at the beginning
01:47:31.360 of this and two other children younger than that child.
01:47:34.460 We've been in this to do just like probably most of my listeners and we don't let our
01:47:39.580 kids use social media, but we do let them like check up on the NFL games and like that
01:47:46.060 can be addictive too, right?
01:47:47.180 That's all, it's all kind of in there.
01:47:49.260 Um, what, what advice do you have Tristan to parents out there right now who have kids
01:47:55.100 who are on phones and let's face it, most of them are on social media right now.
01:48:00.760 Yeah.
01:48:01.200 For parents, it's, first just to say, I really empathize.
01:48:05.100 It's a hard world out there.
01:48:06.720 But there are great resources available.
01:48:08.980 The Anxious Generation, Jonathan Haidt's website, has a bunch of really great up-to-date
01:48:13.800 resources for parents.
01:48:15.280 There's a great group that we also helped get started called Moms Against
01:48:19.980 Media Addiction, or MAMA.
01:48:21.680 And parents can join that group, and they advocate for changes to state
01:48:26.960 laws to help get better design policies for social media in different
01:48:30.880 states.
01:48:31.780 And, you know, we have some resources on our website, humanetech.com.
01:48:36.400 You know, everybody who saw The Social Dilemma, we have resources for educators, for
01:48:39.660 parents, just educating people about the nature of this, because, you know, the example you
01:48:44.000 gave of NFL scores, while it's addictive, you don't have a thousand engineers behind the
01:48:49.460 glass screen who every day tweak the design with AI to perfectly maximize and, you know,
01:48:54.660 keep your kid doom-scrolling the NFL scores.
01:48:56.940 But you do have that with social media, and you do have that with Character.AI.
01:49:01.800 So there is a distinction, and that's the kind of stuff that I think we need more parents
01:49:04.920 knowing about and spreading, starting school groups, starting, you know,
01:49:09.220 Moms Against Media Addiction chapters in your own state.
01:49:11.280 There is change that's possible, but I think parents do have to get organized.
01:49:15.520 Can I tell you something?
01:49:16.500 So we're coming up on Christmas, and I mentioned to the audience yesterday that if they had
01:49:20.340 any tips for Christmas gifts for a 15-, a 13-, and an 11-year-old, boy, girl, boy, I would
01:49:26.080 love it.
01:49:26.820 And one of the things I've noticed in my constant searching, whether it's on Amazon or just on
01:49:31.160 Google, if you search the normal, like, best gifts for a 15-year-old or, you know, top
01:49:35.960 gifts for a 15-year-old, every time I do it, on multiple sites,
01:49:40.520 I would even say maybe most of the hits I get are anxiety relievers of some sort, like
01:49:51.220 the stress ball or the stress squishy or the stress electronic thing that you can put on
01:49:57.280 your wrist, the anxiety-reducing whatever, the massage, like, for 15-year-olds. We're not
01:50:02.820 talking about 50; 15.
01:50:05.820 And I didn't type the numbers wrong.
01:50:07.380 It's amazing.
01:50:08.540 This wasn't the case even just a few years ago, right?
01:50:12.740 Well, it's an AI that's driving those recommendations, right?
01:50:15.480 It's a big AI that's gathering all this data to figure out what people click on, and what it shows
01:50:20.100 is a reflection of how anxious society is.
01:50:23.620 And I think it's just evidence of all the things that, you know, Jonathan Haidt wrote
01:50:27.040 about in The Anxious Generation, unfortunately. But I think we don't have to live in this
01:50:31.760 world.
01:50:32.080 I do think that there's a better psychological environment and healthier families that we can
01:50:35.540 have.
01:50:36.020 We just need to change the incentives.
01:50:37.720 You know, Charlie Munger, Warren Buffett's business partner said, if you want to change
01:50:41.520 the outcome, you have to change the incentives.
01:50:43.780 And that's what we have still to do with social media and AI.
01:50:47.580 To me, it's nuts.
01:50:48.560 It was like, in our house, thankfully, we're not anxious people.
01:50:54.200 I always say, if anything, I just want to take a nap.
01:50:56.500 Most of my kids just want to take a nap.
01:50:57.720 We're tired.
01:50:58.120 We're on the other end of the spectrum, I think. But they're just assuming, I think, that
01:51:03.480 all kids are stressed out by the lives that they live, whether it's the crazy
01:51:07.460 school pressure.
01:51:08.640 But I do think that there's no question.
01:51:10.680 There's a connection to the amount of devices and social media exposure these kids have
01:51:13.860 and the girls in particular, as you point out.
01:51:15.400 So anyway, what we need is more Tristan Harris and less Character.AI.
01:51:21.480 And if you're not paying attention, you're losing this battle because they're just so
01:51:25.760 smart and they're advancing the technology.
01:51:27.680 And that weird little guy with that weird little voice behind Character.AI is working
01:51:31.700 against us.
01:51:32.360 So we have to fight him, and we have good warriors now.
01:51:35.300 Just have to pay attention to our leaders.
01:51:37.380 Tristan, thank you.
01:51:38.460 Thank you, Megyn.
01:51:39.180 Thank you for amplifying the story and helping people understand it.
01:51:41.740 Thank you very much.
01:51:42.900 Absolutely.
01:51:43.620 God bless these poor families who lost their children.
01:51:46.720 We're back tomorrow.
01:51:47.940 We'll have Adam Carolla and we will have Justine Bateman, who's got a lot of thoughts on all
01:51:51.640 these AI issues herself.
01:51:54.020 Hope to see you then.
01:51:57.300 Thanks for listening to The Megyn Kelly Show.
01:51:59.440 No BS, no agenda, and no fear.
01:52:07.960 We'll see you then.