The Glenn Beck Program - October 04, 2025


Ep 268 | If Americans Seem Crazy, Here’s Why | Jonathan Haidt | The Glenn Beck Podcast


Episode Stats

Length

53 minutes

Words per Minute

178

Word Count

9,449

Sentence Count

145

Misogynist Sentences

14

Hate Speech Sentences

20


Summary

Jonathan Haidt, the social psychologist behind The Anxious Generation, joins Glenn Beck on The Glenn Beck Podcast to talk about the dangers of social media and the need to protect our kids. Haidt is a professor at New York University, a bestselling author, and a frequent contributor to outlets such as The Atlantic and The New York Times, and his research on moral psychology has earned him respect on both sides of the political aisle.


Transcript

00:00:00.300 While other money managers are holding, Dynamic is hunting.
00:00:05.720 Seeing past the horizon, investing beyond the benchmark.
00:00:09.860 Because your money can't grow if it doesn't move.
00:00:13.400 Learn more at dynamic.ca slash active.
00:00:19.540 And now, a Blaze Media podcast.
00:00:22.820 Hello, America.
00:00:23.680 You know we've been fighting every single day.
00:00:25.540 We push back against the lies, the censorship,
00:00:27.900 the nonsense of the mainstream media that they're trying to feed you.
00:00:31.920 We work tirelessly to bring you the unfiltered truth, because you deserve it.
00:00:36.760 But to keep this fight going, we need you.
00:00:39.240 Right now, would you take a moment and rate and review the Glenn Beck podcast?
00:00:42.940 Give us five stars and leave a comment,
00:00:44.960 because every single review helps us break through Big Tech's algorithm
00:00:48.800 to reach more Americans who need to hear the truth.
00:00:51.740 This isn't a podcast.
00:00:53.120 This is a movement, and you're part of it, a big part of it.
00:00:56.460 So if you believe in what we're doing, you want more people to wake up,
00:00:59.360 help us push this podcast to the top.
00:01:01.700 Rate, review, share.
00:01:03.300 Together, we'll make a difference.
00:01:05.420 And thanks for standing with us.
00:01:06.700 Now let's get to work.
00:01:08.740 There may be only one cause that can bring the left and right together again,
00:01:12.960 and that is protecting our kids.
00:01:15.420 My next guest has been calling out social media companies
00:01:18.260 for damaging our children on, quote, an industrial scale.
00:01:22.840 I want you to listen to this podcast before you let your 10-year-old download Instagram
00:01:27.260 or, you know, God forbid, get them an AI-powered stuffed teddy bear for Christmas.
00:01:33.920 And yes, those animals are coming.
00:01:36.020 It's a real thing.
00:01:37.400 When Americans disagree about absolutely everything,
00:01:41.720 let's agree on this.
00:01:43.080 No, your sex robot doesn't have any rights.
00:01:48.580 AI is not real.
00:01:50.500 No, your 10-year-olds are not going to literally die without TikTok either.
00:01:55.680 But don't take it from me.
00:01:57.500 Take it from my next guest.
00:01:59.120 My guest on today's show, the man behind The Anxious Generation,
00:02:02.640 an absolute must-read,
00:02:04.600 The Anxious Generation, How the Great Rewiring of Childhood
00:02:08.200 is Causing an Epidemic of Mental Illness.
00:02:11.060 He's a social psychologist and a best-selling author.
00:02:14.960 His name, Jonathan Haidt.
00:02:30.260 Jonathan, welcome to the podcast.
00:02:31.760 You are a guy who I've been following for a long time
00:02:37.760 that I think probably has diagnosed the problem better than anybody else.
00:02:46.740 With everything that has been happening in the last couple of weeks
00:02:49.680 and this weekend,
00:02:51.660 I think there's a lot of people that are feeling like,
00:02:55.140 no way out, no way out.
00:02:58.300 Is there a way out of this?
00:03:00.280 And how do we do it?
00:03:02.320 Oh boy.
00:03:03.320 That, yeah, that is a hard question.
00:03:06.360 Now, is there a way out?
00:03:10.380 That is actually unknown.
00:03:12.960 I can't sit here and tell you that if we do X, Y, and Z,
00:03:15.940 we will escape from this.
00:03:18.340 My original research,
00:03:19.620 before I started working on what phones are doing to kids,
00:03:22.300 was on polarization, political polarization,
00:03:25.280 and what's causing it, what's driving it,
00:03:26.840 what's accelerating it.
00:03:28.680 And there are a lot of factors,
00:03:30.140 especially the rise of social media.
00:03:32.600 Many, many factors are causing us to hate each other more.
00:03:36.100 And when we hate each other more,
00:03:37.520 we're willing to break the law for our side
00:03:40.160 because things are so urgent.
00:03:42.660 We're willing to tolerate someone on our side
00:03:44.580 bending norms or breaking norms.
00:03:46.440 And that's where we are now.
00:03:47.360 Now, both sides are more willing to use undemocratic
00:03:50.840 or even illegal means to get their way.
00:03:53.380 So this is a very dangerous time.
00:03:54.980 I'd say more dangerous than anything since the Civil War.
00:03:57.080 I would agree with that.
00:04:01.420 When you say both sides are willing to break the law or whatever,
00:04:06.440 I know you are so fair and you look to be super fair.
00:04:13.600 And so I'm asking this question for my bias to check me.
00:04:17.140 But it seems, Antifa, the really nasty, nasty stuff
00:04:25.120 that is tearing our streets apart, burning down our cities,
00:04:28.760 and now shooting people.
00:04:33.040 When it comes to crazy, I think we all have a share in that.
00:04:36.180 But when it comes to political violence, it does seem to be on the left.
00:04:40.080 Is that just my bias?
00:04:41.520 So it swings back and forth.
00:04:45.500 And a fundamental rule is that we see what we want to see.
00:04:50.860 We don't pay attention to things we don't want to see.
00:04:54.380 Our media environment sends us things that support our views.
00:04:56.280 So the data that I know of,
00:04:59.060 there are many people who have been tracking political violence
00:05:00.860 for a long time.
00:05:02.360 And there were two periods where left-wing violence
00:05:05.040 was more common than right.
00:05:06.780 And those are 1968 to 73,
00:05:09.940 that really radical revolutionary period,
00:05:11.940 the Weather Underground,
00:05:13.100 hundreds and hundreds of bombings.
00:05:14.920 I mean, that was a terrible, terrible period.
00:05:17.100 And that was driven by the left.
00:05:19.460 And I just saw,
00:05:20.420 there was just an article in The Atlantic a couple of days ago
00:05:22.680 saying a recent study shows that,
00:05:26.360 I can't remember the exact period,
00:05:28.500 in the last two years,
00:05:29.720 political violence has come more from the left
00:05:31.780 than from the right.
00:05:33.140 Other than that,
00:05:34.060 it has been more from the right than the left.
00:05:35.820 Now, there are all kinds of fights about what do you include?
00:05:38.040 And so I don't know,
00:05:38.580 I'm not an expert on all of that.
00:05:39.940 I'm just saying,
00:05:41.200 you know,
00:05:41.480 since Charlie Kirk's horrific assassination,
00:05:45.320 what you see on the left
00:05:46.620 is people will always list
00:05:48.540 the assassinations on both sides
00:05:51.000 to say,
00:05:51.340 see, it's not just us,
00:05:52.120 it's not just us.
00:05:53.080 And they'll point to,
00:05:53.840 everyone points to the murder
00:05:55.000 of the two Minnesota lawmakers.
00:05:58.140 But people on the right
00:05:59.220 tend to go down the list
00:06:00.320 from Steve Scalise's shooting
00:06:01.740 all the way through Charlie Kirk's.
00:06:03.220 So this is something,
00:06:05.700 in a culture war,
00:06:06.780 for every action,
00:06:07.700 there is a disproportionate
00:06:09.500 and opposite reaction.
00:06:11.520 And the extremists on both sides
00:06:13.620 are violent.
00:06:14.760 Right now, the left is more.
00:06:16.840 That's true.
00:06:18.160 Is there enough of the middle left
00:06:22.680 to hold things in place?
00:06:26.340 That's the question of the hour,
00:06:29.760 of the year, of the decade.
00:06:31.780 It used to be
00:06:33.100 that what the majority thought
00:06:34.840 mattered at least a little.
00:06:36.760 You know,
00:06:36.880 what the powerful think
00:06:37.820 always matters more.
00:06:39.700 What the wealthy think,
00:06:40.880 you know,
00:06:41.060 they get their way more
00:06:41.860 with legislation.
00:06:43.020 But it used to matter
00:06:44.540 to some degree
00:06:45.220 what the public thinks.
00:06:46.420 Now,
00:06:46.940 in our new media environment,
00:06:48.620 that only matters
00:06:49.320 on election day.
00:06:50.560 What most people think
00:06:51.300 only matters on election day.
00:06:53.080 Other than that,
00:06:54.080 the world we live in
00:06:55.380 is one influenced
00:06:56.520 by what's coming at us
00:06:57.760 through social media.
00:06:59.200 Even cable TV
00:07:00.300 tends to have
00:07:01.920 a lot of stories
00:07:02.720 about something
00:07:03.320 that happened on Twitter.
00:07:05.320 So,
00:07:05.920 I think that's one
00:07:07.420 of the changes
00:07:07.800 I'm tracking
00:07:08.380 as to why things
00:07:09.060 are getting so much worse
00:07:10.200 since I started studying
00:07:11.100 this around 2006.
00:07:12.640 Things are getting
00:07:13.020 so much worse
00:07:13.940 because we've moved
00:07:15.360 from broadcast television
00:07:17.480 long ago
00:07:18.040 to cable TV,
00:07:19.240 which is sort of more,
00:07:19.840 you know,
00:07:20.140 more focused
00:07:20.720 or narrowcasting,
00:07:23.200 to social media,
00:07:23.980 which is microcasting.
00:07:25.600 And so,
00:07:26.100 this media environment,
00:07:27.020 I think it's very hard
00:07:28.100 to have a decent,
00:07:30.580 good,
00:07:30.800 democratic discussion
00:07:31.740 in that environment.
00:07:38.080 I know you,
00:07:39.520 you know,
00:07:40.500 are aware of me.
00:07:43.040 And,
00:07:43.640 Jonathan,
00:07:44.480 I am,
00:07:46.980 I mean,
00:07:48.260 I find myself
00:07:50.880 in a position
00:07:51.800 that I don't think
00:07:53.560 a lot of people
00:07:54.280 have been in
00:07:55.040 historically speaking
00:07:56.380 to have the reach
00:07:59.160 that I have,
00:08:00.000 to have the voice
00:08:01.600 that I have,
00:08:02.160 to have made
00:08:03.520 my own mistakes.
00:08:07.100 And,
00:08:07.760 and when I say reach,
00:08:09.680 it's reach on one side
00:08:11.580 because we're so polarized.
00:08:12.740 And I feel
00:08:14.860 a tremendous
00:08:16.380 responsibility
00:08:17.620 to
00:08:19.460 not just
00:08:21.480 do no harm,
00:08:22.420 but try
00:08:23.380 to do good.
00:08:25.240 And,
00:08:25.720 I mean,
00:08:26.200 this is,
00:08:26.620 this is probably
00:08:27.360 a conversation
00:08:28.040 we should have
00:08:28.500 just had off the air
00:08:29.380 because this is so,
00:08:30.700 for me,
00:08:31.760 but
00:08:31.860 what
00:08:33.460 can be done?
00:08:35.960 What,
00:08:36.160 what,
00:08:36.420 what advice
00:08:37.860 would you give me
00:08:38.800 or anybody like me?
00:08:40.180 Because we're,
00:08:41.400 you know,
00:08:41.600 in the old days,
00:08:42.360 I had an audience.
00:08:43.920 Now my audience
00:08:44.540 has an audience.
00:08:45.720 So I guess it is
00:08:46.520 for everybody.
00:08:47.680 Right.
00:08:48.200 What can be done?
00:08:49.800 Oh,
00:08:50.240 Glenn,
00:08:50.520 I love this.
00:08:51.160 I love this question.
00:08:53.000 So let me,
00:08:53.700 let me start
00:08:53.980 at the beginning here.
00:08:54.780 You know,
00:08:54.900 I first became aware
00:08:55.820 of you,
00:08:56.700 you know,
00:08:57.180 when you had
00:08:57.560 the Glenn Beck show
00:08:58.940 and like,
00:08:59.660 what was it?
00:09:00.040 When did you start?
00:09:00.800 2004 or five?
00:09:01.720 That's when I was
00:09:02.280 being and paying
00:09:02.860 attention to you.
00:09:03.860 And back then
00:09:04.800 I was very much
00:09:05.580 on the left.
00:09:06.260 I was a Democrat.
00:09:07.560 I was studying
00:09:08.220 moral psychology.
00:09:09.020 I was,
00:09:10.320 I was trying
00:09:11.860 to explain to the left
00:09:13.000 what they're missing.
00:09:14.100 And my first essay
00:09:14.840 on this was called
00:09:15.420 What Makes People
00:09:16.240 Vote Republican?
00:09:17.100 And it was,
00:09:17.800 it was really meant
00:09:18.760 not as a criticism of Republicans.
00:09:19.640 It was like,
00:09:20.580 look,
00:09:20.820 people on the left,
00:09:21.380 you have,
00:09:21.780 you do not understand.
00:09:22.860 You have no idea
00:09:23.740 of the moral foundations
00:09:24.640 of the right,
00:09:25.600 which I think
00:09:26.020 are very,
00:09:26.340 are very respectable.
00:09:27.660 But I saw you
00:09:28.640 back then
00:09:29.860 as a,
00:09:30.180 as a polarizing figure.
00:09:31.200 And that's what
00:09:31.580 the media market was.
00:09:32.760 That's what the media
00:09:33.440 landscape was.
00:09:35.080 But as I listened
00:09:36.080 to you,
00:09:37.000 I learned a lot.
00:09:38.500 Like,
00:09:39.080 that really helped
00:09:40.000 me understand.
00:09:40.600 I remember something,
00:09:41.180 I'll never forget this.
00:09:42.000 You said something about,
00:09:42.860 there's some like
00:09:43.360 environmental program
00:09:44.540 and you were saying
00:09:45.480 it's not about
00:09:46.100 the environment,
00:09:46.680 it's about control.
00:09:48.560 That always stuck with me
00:09:49.740 and that was a very
00:09:50.440 helpful insight
00:09:51.100 and I got a lot
00:09:51.640 of that from you.
00:09:52.180 So,
00:09:52.860 what I'm saying is,
00:09:53.680 I think,
00:09:54.080 of course,
00:09:54.500 you used to be
00:09:55.360 one of the forces
00:09:56.260 that was increasing
00:09:57.540 polarization
00:09:58.120 as many people
00:09:59.660 on the left were.
00:10:01.000 And,
00:10:01.520 what I remember,
00:10:03.060 you know,
00:10:03.280 I know you're,
00:10:03.720 you're a really
00:10:04.140 complicated guy
00:10:04.980 and you've,
00:10:05.460 you've gone through
00:10:06.040 all kinds of growth
00:10:07.040 and I remember
00:10:07.800 an essay you wrote
00:10:08.440 in the New York Times
00:10:09.160 in which you,
00:10:09.840 you seem to be saying
00:10:10.720 that you,
00:10:11.320 you regret having been
00:10:12.800 such a force
00:10:13.700 or is that,
00:10:14.440 is that correct?
00:10:14.920 Is that a fair statement
00:10:15.660 that you've?
00:10:16.340 Yeah,
00:10:16.680 and I,
00:10:17.320 I mean,
00:10:17.660 I did the best I could
00:10:19.300 with the knowledge
00:10:20.060 that I had at the time,
00:10:21.160 but in retrospect,
00:10:22.440 I'd do it completely
00:10:23.760 different if I could.
00:10:25.640 Yeah.
00:10:26.260 You know?
00:10:26.560 Okay,
00:10:26.940 good.
00:10:27.260 So just,
00:10:27.860 okay,
00:10:28.100 so just so,
00:10:28.740 so I'm,
00:10:29.120 I'm up on,
00:10:30.000 on where you are.
00:10:32.040 I'll share some thoughts
00:10:33.060 about how you might be
00:10:34.780 an even better part
00:10:35.780 of the solution
00:10:36.460 and that is
00:10:38.740 what,
00:10:40.680 so as I said,
00:10:41.560 I used to be on the left,
00:10:42.780 but I came to understand
00:10:44.140 conservatives
00:10:44.580 by listening to them
00:10:45.800 and by,
00:10:47.140 you know,
00:10:47.980 reading the best writings
00:10:49.180 and listening
00:10:50.460 with an open heart
00:10:51.340 because,
00:10:52.320 you know,
00:10:52.540 any one person
00:10:53.320 can be crazy,
00:10:54.680 but if a third
00:10:55.920 or a quarter
00:10:56.420 of the country
00:10:56.880 believes something,
00:10:58.100 they're not crazy.
00:10:58.920 Like,
00:10:59.100 there's no way
00:10:59.480 that they're insane.
00:11:00.340 Like,
00:11:00.620 there's a reason
00:11:01.380 for this,
00:11:01.980 there's a justification
00:11:02.600 and it's almost always
00:11:04.160 the case that they see
00:11:05.000 things you don't see,
00:11:05.720 you see things
00:11:06.220 that they don't see.
00:11:07.440 So,
00:11:08.200 so I think for you
00:11:10.400 to espouse
00:11:11.940 conservative principles
00:11:13.320 and talk about
00:11:14.920 the moral foundations
00:11:17.200 of your view,
00:11:18.240 the view of,
00:11:18.860 of your community
00:11:19.680 is great.
00:11:20.900 That,
00:11:21.320 you can be
00:11:22.340 and you are
00:11:22.780 a very eloquent
00:11:23.460 source for that,
00:11:25.020 but at the same time,
00:11:26.600 the positive message
00:11:27.880 has to be
00:11:29.020 turning down
00:11:30.260 the Manichaeism,
00:11:32.100 the black and white thinking,
00:11:33.920 the good versus evil
00:11:34.920 and more talking
00:11:37.180 about how
00:11:38.000 we're in a mess
00:11:39.220 in this country
00:11:39.880 where we don't all
00:11:41.040 believe the same thing.
00:11:41.940 We have to somehow
00:11:42.460 live,
00:11:42.860 live together
00:11:43.860 and to insist,
00:11:46.980 I think the bright line
00:11:47.700 that I want
00:11:48.160 people on the left
00:11:49.080 and the right
00:11:49.340 to really insist on
00:11:52.400 is rule of law
00:11:53.460 and that we,
00:11:57.400 we play things
00:11:58.120 out through
00:11:58.400 a political process
00:11:59.400 and so obviously
00:12:01.120 no violence,
00:12:01.880 that needs to be said
00:12:02.700 over and over again
00:12:03.660 and anyone who commits
00:12:04.800 violence is just
00:12:05.800 hurting their own side.
00:12:07.740 There,
00:12:07.900 I mean,
00:12:08.160 look what,
00:12:08.600 you know,
00:12:08.800 it looks like
00:12:09.580 the assassination
00:12:10.080 of Kirk
00:12:11.400 if it was from a left-winger,
00:12:13.840 is going to be
00:12:13.840 so damaging
00:12:14.600 to the left.
00:12:15.420 So,
00:12:16.140 you know,
00:12:16.340 the message that violence
00:12:17.220 is not just immoral,
00:12:18.380 it actually is
00:12:19.540 counterproductive
00:12:20.360 to whatever you want
00:12:21.040 to do
00:12:21.560 and then
00:12:23.400 encourage,
00:12:23.960 just encouraging people
00:12:24.620 to go ahead.
00:12:25.840 Can I interrupt
00:12:26.660 for just a second
00:12:27.480 because you said
00:12:28.160 something about
00:12:28.720 turn down
00:12:29.220 the good and evil.
00:12:30.960 Here's the problem.
00:12:32.520 I don't believe
00:12:33.840 that people
00:12:35.340 per se
00:12:35.900 are evil.
00:12:36.780 I do believe
00:12:38.020 in evil
00:12:38.720 and I do believe
00:12:39.720 in good
00:12:43.180 and the way
00:12:43.920 I interpret
00:12:45.060 what's happening,
00:12:45.980 forget about parties
00:12:47.160 and politics,
00:12:47.960 the way
00:12:49.920 I interpret
00:12:51.820 what's happening
00:12:52.860 is it's almost
00:12:53.700 as if evil
00:12:54.620 has become,
00:12:56.260 it's like we're
00:12:57.160 living in Gotham.
00:12:58.500 This is how I
00:12:58.880 explained it today.
00:12:59.860 We're living in Gotham
00:13:01.400 and the Joker
00:13:03.000 is the evil one
00:13:04.780 and he's using
00:13:06.000 people
00:13:06.760 and,
00:13:07.320 you know,
00:13:08.120 he's,
00:13:08.620 he's convincing
00:13:09.720 people to do
00:13:10.520 things that are
00:13:11.140 absolutely crazy,
00:13:13.540 just crazy
00:13:14.460 and it influences
00:13:16.420 all of us
00:13:17.400 and so I feel
00:13:18.400 like we've
00:13:18.980 moved into
00:13:19.720 this almost
00:13:20.680 graphic
00:13:22.500 comic book
00:13:23.500 world
00:13:24.080 where it
00:13:24.840 is good
00:13:25.640 versus evil
00:13:26.560 but not
00:13:27.380 necessarily
00:13:28.420 people
00:13:29.280 but the
00:13:31.000 forces of it.
00:13:32.940 Does that make
00:13:33.320 sense to you?
00:13:34.360 How can I explain
00:13:35.140 that?
00:13:35.260 Yes,
00:13:35.660 that does
00:13:37.100 but what I would
00:13:37.600 let's always
00:13:38.220 turn things around.
00:13:40.100 Let's always look
00:13:40.540 at it from both
00:13:41.020 sides.
00:13:41.420 so
00:13:42.340 from the
00:13:44.200 conservative
00:13:44.740 side
00:13:45.240 I see
00:13:47.340 how
00:13:47.760 it looks
00:13:48.660 like what
00:13:49.000 the left
00:13:49.320 is doing
00:13:49.740 is undermining
00:13:50.740 the pillars
00:13:51.440 of society
00:13:52.280 Americans
00:13:53.620 not just their
00:13:54.360 traditions
00:13:54.840 but their
00:13:55.200 sense of
00:13:55.520 who they
00:13:55.820 are.
00:13:57.180 It looks
00:13:58.080 as though
00:13:58.360 the left
00:13:58.680 is destroying
00:13:59.300 America
00:13:59.880 and that
00:14:00.500 would be
00:14:02.000 evil.
00:14:02.680 That would
00:14:02.940 be even
00:14:03.500 if you don't
00:14:03.840 want to say
00:14:04.120 an individual
00:14:04.620 is evil
00:14:05.040 you could
00:14:05.740 say that
00:14:06.040 this ideology
00:14:06.900 is evil
00:14:08.260 but let's
00:14:08.840 always turn
00:14:09.340 it around
00:14:09.760 because the
00:14:10.620 left thinks
00:14:11.080 the same
00:14:11.380 about the
00:14:11.720 right
00:14:11.900 and right
00:14:12.240 now the
00:14:12.580 key thing
00:14:13.020 is
00:14:13.240 authoritarianism
00:14:14.080 right now
00:14:14.980 and look
00:14:15.560 from my
00:14:15.860 position in
00:14:16.260 the center
00:14:16.620 I'm always
00:14:17.040 slow to
00:14:17.760 judge
00:14:18.040 I don't
00:14:18.340 I don't
00:14:18.740 jump
00:14:19.000 I don't
00:14:19.360 do
00:14:19.600 outrage
00:14:20.000 I don't
00:14:20.340 jump
00:14:20.580 in
00:14:20.780 but you
00:14:22.460 know
00:14:22.580 turning
00:14:23.000 the
00:14:23.200 department
00:14:23.560 of
00:14:23.760 justice
00:14:24.160 into
00:14:24.920 the
00:14:25.120 personal
00:14:25.580 vengeance
00:14:26.080 harm
00:14:26.360 of the
00:14:26.620 president
00:14:26.960 this
00:14:27.620 is
00:14:27.840 not
00:14:28.080 just
00:14:28.380 like
00:14:28.640 I
00:14:29.040 think
00:14:29.220 this
00:14:29.380 is
00:14:29.500 bad
00:14:29.680 this
00:14:29.880 is
00:14:30.060 unbelievable
00:14:31.540 like
00:14:31.900 this
00:14:32.120 is
00:14:32.320 really
00:14:32.660 a
00:14:32.860 red
00:14:33.040 line
00:14:33.340 and so
00:14:35.180 if the
00:14:35.440 right
00:14:35.880 isn't
00:14:36.120 concerned
00:14:36.400 about
00:14:36.620 that
00:14:36.960 then
00:14:37.200 they're
00:14:37.500 not
00:14:37.900 seeing
00:14:38.240 this
00:14:38.700 is
00:14:38.800 what
00:14:38.920 the
00:14:39.040 left
00:14:39.200 sees
00:14:39.520 so
00:14:40.020 right
00:14:40.460 and
00:14:40.720 I
00:14:41.360 have
00:14:41.560 been
00:14:41.760 all
00:14:42.040 over
00:14:42.380 you know
00:14:43.240 I've
00:14:43.960 been
00:14:44.100 saying
00:14:44.480 especially
00:14:45.520 after the
00:14:46.180 stuff
00:14:46.600 with
00:14:46.760 Charlie
00:14:47.000 Kirk
00:14:47.220 and
00:14:47.360 everything
00:14:47.600 else
00:14:47.920 and I
00:14:48.200 have
00:14:48.320 said
00:14:48.460 this
00:14:48.620 about
00:14:48.820 Donald
00:14:49.060 Trump
00:14:49.300 from
00:14:49.580 the
00:14:49.740 day
00:14:49.980 he
00:14:50.120 got
00:14:50.280 in
00:14:50.460 he
00:14:51.280 starts
00:14:51.640 to
00:14:51.840 cross
00:14:52.140 constitutional
00:14:52.900 lines
00:14:53.400 I'm
00:14:54.280 done
00:14:54.620 I'm
00:14:55.080 done
00:14:55.320 and I
00:14:55.660 will
00:14:55.880 not
00:14:56.440 go
00:14:56.740 there
00:14:57.000 with
00:14:57.280 anybody
00:14:57.600 I
00:14:57.860 don't
00:14:58.000 care
00:14:58.200 who
00:14:58.480 they
00:14:58.680 are
00:14:58.900 I
00:14:59.060 won't
00:14:59.260 go
00:14:59.400 there
00:14:59.580 and
00:15:00.700 when
00:15:00.860 it
00:15:01.000 comes
00:15:01.320 to
00:15:01.720 you
00:15:02.020 know
00:15:02.140 after
00:15:02.440 Charlie
00:15:02.800 Kirk
00:15:03.140 I
00:15:03.340 said
00:15:03.640 our
00:15:04.300 job
00:15:04.900 here
00:15:05.180 is
00:15:05.440 to
00:15:05.540 be
00:15:05.680 very
00:15:05.980 careful
00:15:06.360 that
00:15:06.580 there
00:15:06.760 is
00:15:07.000 no
00:15:07.220 patriot
00:15:07.800 act
00:15:08.320 that
00:15:08.960 follows
00:15:09.380 this
00:15:09.780 you
00:15:10.080 know
00:15:10.180 what
00:15:10.280 I
00:15:10.340 mean
00:15:10.540 we
00:15:10.780 cannot
00:15:11.940 be
00:15:12.340 so
00:15:12.840 upset
00:15:14.100 about
00:15:14.500 something
00:15:14.860 like
00:15:15.080 we
00:15:15.280 need
00:15:16.420 to
00:15:16.620 learn
00:15:16.820 our
00:15:16.960 lesson
00:15:17.220 from
00:15:17.420 9-11
00:15:17.880 no
00:15:18.720 is
00:15:19.200 the
00:15:19.360 answer
00:15:19.700 no
00:15:20.020 more
00:15:20.320 control
00:15:20.900 no
00:15:21.420 so
00:15:22.380 I
00:15:22.680 do
00:15:22.920 see
00:15:23.180 that
00:15:23.480 help
00:15:24.320 me
00:15:24.680 help
00:15:25.820 me
00:15:26.160 talk
00:15:26.640 to
00:15:26.860 somebody
00:15:27.300 that
00:15:28.880 said
00:15:29.320 what
00:15:29.620 you
00:15:29.800 just
00:15:30.060 said
00:15:30.280 to
00:15:30.440 me
00:15:30.640 because
00:15:30.960 my
00:15:31.340 response
00:15:32.480 I
00:15:32.980 just
00:15:33.780 want
00:15:33.980 to
00:15:34.080 shout
00:15:34.300 out
00:15:34.500 is
00:15:34.720 did
00:15:35.740 you
00:15:35.900 watch
00:15:36.420 how
00:15:36.760 Biden
00:15:37.220 was
00:15:37.520 using
00:15:37.880 the
00:15:38.080 Justice
00:15:38.420 Department
00:15:38.940 and
00:15:39.700 I'm
00:15:40.000 not
00:15:40.180 saying
00:15:40.540 you
00:15:41.520 know
00:15:41.640 because
00:15:41.920 you
00:15:42.240 did
00:15:42.440 that
00:15:42.680 first
00:15:42.980 we
00:15:43.200 can
00:15:43.360 do
00:15:43.460 I'm
00:15:43.640 not
00:15:43.820 saying
00:15:44.120 that
00:15:44.420 but
00:15:45.020 I
00:15:45.160 don't
00:15:45.540 know
00:15:45.740 how
00:15:45.960 people
00:15:46.400 miss
00:15:46.940 the
00:15:47.960 authoritarianism
00:15:49.160 that
00:15:49.660 the
00:15:49.920 right
00:15:50.320 sees
00:15:50.900 yeah
00:15:52.180 so
00:15:52.740 that's
00:15:53.000 right
00:15:53.180 what
00:15:53.340 I've
00:15:53.520 learned
00:15:53.820 is
00:15:54.320 whenever
00:15:55.400 I
00:15:55.640 point
00:15:55.880 out
00:15:56.060 something
00:15:56.420 especially
00:15:56.780 anything
00:15:57.140 I
00:15:57.340 point
00:15:57.620 out
00:15:57.920 on
00:15:58.460 Twitter
00:15:58.800 someone
00:15:59.820 will
00:16:00.020 jump
00:16:00.260 in
00:16:00.380 and
00:16:00.500 say
00:16:00.620 oh
00:16:00.800 yeah
00:16:01.120 well
00:16:01.600 you
00:16:01.820 know
00:16:01.960 this
00:16:02.200 look
00:16:02.500 what
00:16:02.640 Biden
00:16:02.980 look
00:16:03.180 what
00:16:03.300 Obama
00:16:03.600 did
00:16:03.860 and
00:16:04.740 the
00:16:05.060 thing
00:16:05.280 is
00:16:05.500 usually
00:16:06.160 they're
00:16:06.520 right
00:16:06.780 that
00:16:07.040 it
00:16:07.300 is
00:16:07.660 sort
00:16:08.200 of
00:16:08.320 similar
00:16:08.640 but
00:16:09.360 it's
00:16:09.560 usually
00:16:09.880 much
00:16:10.300 less
00:16:10.660 much
00:16:10.940 less
00:16:11.140 intensive
00:16:11.480 so
00:16:11.740 for
00:16:11.880 example
00:16:12.200 I
00:16:12.800 care
00:16:12.980 a lot
00:16:13.160 about
00:16:13.300 universities
00:16:13.800 I
00:16:14.060 wrote
00:16:14.220 a book
00:16:14.440 called
00:16:14.600 The
00:16:14.760 Coddling
00:16:15.040 of the
00:16:15.220 American
00:16:15.460 Mind
00:16:15.720 which
00:16:15.880 I
00:16:15.980 think
00:16:16.080 I
00:16:16.180 spoke
00:16:16.360 with
00:16:16.500 you
00:16:16.580 about
00:16:16.820 and
00:16:18.020 on
00:16:18.120 that
00:16:18.280 I'm
00:16:18.460 very
00:16:18.780 sympathetic
00:16:19.160 to
00:16:19.420 the
00:16:19.560 critique
00:16:19.860 of
00:16:20.020 the
00:16:20.120 right
00:16:20.240 Greg
00:16:20.460 Lukianoff
00:16:20.940 and I
00:16:21.260 were
00:16:21.800 horrified
00:16:22.280 by the
00:16:22.680 violations
00:16:23.060 of
00:16:23.260 free
00:16:23.440 speech
00:16:23.760 brought
00:16:24.060 about
00:16:24.340 by the
00:16:24.700 activist
00:16:25.120 left
00:16:25.560 the
00:16:25.900 woke
00:16:27.520 revolution
00:16:28.000 all of
00:16:28.380 that
00:16:28.600 and
00:16:30.020 it
00:16:30.120 is
00:16:30.240 true
00:16:30.480 that
00:16:30.860 Obama
00:16:31.920 used
00:16:32.900 Title IX
00:16:33.840 legislation
00:16:35.280 to push
00:16:36.120 universities
00:16:36.680 to do
00:16:37.280 things
00:16:37.660 to push
00:16:38.080 them to
00:16:38.300 the left
00:16:38.640 on
00:16:38.900 gender
00:16:39.140 issues
00:16:39.560 and that
00:16:40.520 Greg
00:16:41.040 and I
00:16:41.260 said
00:16:41.460 was not
00:16:41.920 right
00:16:42.320 that
00:16:42.600 was
00:16:42.920 it
00:16:43.060 was
00:16:43.160 not
00:16:43.320 illegal
00:16:43.660 he
00:16:44.400 had
00:16:44.500 that
00:16:44.660 power
00:16:44.960 but
00:16:45.640 what
00:16:46.140 he
00:16:46.280 did
00:16:46.460 we
00:16:46.600 thought
00:16:46.820 was
00:16:47.160 terrible
00:16:47.740 and
00:16:47.960 was
00:16:48.180 really
00:16:48.740 not
00:16:48.940 appropriate
00:16:49.360 to have
00:16:49.680 this
00:16:49.840 level
00:16:50.020 of
00:16:50.140 control
00:16:50.400 over
00:16:50.700 what
00:16:50.960 we
00:16:51.380 can
00:16:51.560 talk
00:16:51.800 about
00:16:51.960 at
00:16:52.060 universities
00:16:52.560 and
00:16:53.520 now
00:16:53.720 Trump
00:16:54.040 is
00:16:55.020 doing
00:16:55.540 much
00:16:55.960 more
00:16:56.260 to
00:16:56.500 universities
00:16:57.060 dictating
00:16:57.720 who
00:16:57.960 we're
00:16:58.640 trying
00:16:58.880 to
00:16:59.040 dictate
00:16:59.240 we'll
00:16:59.420 see
00:16:59.540 what
00:16:59.640 the
00:16:59.720 negotiations
00:17:00.120 say
00:17:00.500 so
00:17:01.060 you
00:17:01.260 can
00:17:01.420 point
00:17:01.680 to
00:17:01.840 the
00:17:01.940 previous
00:17:02.160 Obama
00:17:02.540 example
00:17:03.040 but
00:17:03.900 it's
00:17:04.220 not
00:17:04.420 nearly
00:17:05.040 as
00:17:05.340 big
00:17:05.540 as
00:17:05.680 the
00:17:05.840 Trump
00:17:06.100 example
00:17:06.480 and
00:17:06.660 this
00:17:06.800 goes
00:17:07.000 around
00:17:07.820 it's
00:17:08.340 always
00:17:08.660 like
00:17:08.940 this
00:17:09.200 so
00:17:09.640 each
00:17:09.940 side
00:17:10.200 is
00:17:10.340 so
00:17:10.600 good
00:17:10.900 at
00:17:11.140 finding
00:17:11.460 out
00:17:11.660 where
00:17:11.800 the
00:17:11.920 other
00:17:12.180 side
00:17:12.820 did
00:17:13.040 something
00:17:13.480 sort
00:17:13.780 of
00:17:13.900 similar
00:17:14.260 and
00:17:15.740 you
00:17:15.920 know
00:17:16.020 yes
00:17:16.280 I'm
00:17:16.480 sure
00:17:16.660 I'm
00:17:17.500 sure
00:17:17.700 that
00:17:17.920 the
00:17:18.100 Biden
00:17:18.300 administration
00:17:18.800 nudged
00:17:19.420 or
00:17:19.560 requested
00:17:20.020 but
00:17:20.600 they
00:17:21.340 never
00:17:21.580 did
00:17:21.760 anything
00:17:22.340 like
00:17:22.800 demanding
00:17:23.860 the
00:17:24.280 prosecution
00:17:24.680 of a
00:17:24.900 particular
00:17:25.160 person
00:17:25.620 and
00:17:25.820 then
00:17:25.940 when
00:17:26.060 the
00:17:26.200 prosecutor
00:17:26.620 wouldn't
00:17:27.020 do
00:17:27.260 it
00:17:27.480 firing
00:17:28.060 that
00:17:28.300 person
00:17:28.620 and
00:17:28.780 getting
00:17:28.980 someone
00:17:29.420 who
00:17:29.860 would
00:17:35.680 If you cross red lines, you're out.
00:17:37.360 Do you think that Trump has crossed red lines yet?
00:17:39.780 I don't, because my understanding of the story is different than yours.
00:17:44.620 Okay. Okay, good. Right.
00:17:46.320 My understanding of the story is that he didn't say, he didn't say, oh, you won't prosecute, you're out, you're fired, get somebody else.
00:17:55.820 What he said was, make a decision, because the statute of limitations is almost up.
00:18:01.540 Yes, I believe that Comey should be prosecuted, and so does Donald Trump, for a long string of things.
00:18:10.460 I think there is a grand conspiracy that you could go back to, but you lose the opportunity if you don't act.
00:18:16.780 So what Trump said was, you have to make a decision, yes or no, are you going to do it? He didn't make a decision, and so he was replaced.
00:18:24.680 I think he has the right to do that.
00:18:27.440 I don't like, I do not want my president to say, I have the facts, I don't care what the grand jury says, I don't care what anybody says, get him. That I don't like.
00:18:42.440 I do believe Biden did that more to Donald Trump than he is doing now.
00:18:48.740 However, I'm on guard on that. I do not want that. We give that power to Donald Trump, we give that power to Joe Biden or anybody else, we're done as a republic.
00:18:59.860 And I've had several conversations with Alan Dershowitz on this, watching those lines.
00:19:05.820 But again, I have my point of view. It does nothing to further the game, does it not? Not game, but you know what I mean.
00:19:19.140 Even just having this conversation, I didn't know what the right thinks about this. Again, I'm not on the left anymore. I'm just a social scientist trying to understand.
00:19:29.040 But even just this conversation you and I are having, at least now I see how you're thinking about this, and that humanizes people.
00:19:38.380 Because without that contact, you just think the other side is monstrous. They're evil, they're hypocrites.
00:19:44.740 So I think we need to have conversations with people who differ from us, conversations in which we're trying to learn, not stomp on or even persuade.
00:19:55.940 The one thing I've been saying for a while now is, we can't, we have to stop trying to win. We have to have a conversation that starts with, how did you arrive at that? I really want to understand. How did you get there?
00:20:13.660 Because we may not still agree, but at least I will understand. Okay, that's reasonable. You might be missing this fact, this fact, this fact. Or you might say things that I didn't know.
00:20:27.720 But in the end, I'm not sure that that even works. I mean, it's better than everything else, but I'm not sure that gets us where we need to be.
00:20:40.980 So that can work on a local level. When you get people who live near each other, they're tied to each other, they have a past, they have a future, then those conversations really do work.
00:20:50.640 On social media, they very rarely work.
00:20:52.940 I created a program. If people go to constructivedialogueinstitute.org, we created a program called Perspectives, to be used in classrooms, especially university, but it works well in high school as well.
00:21:05.120 That puts students in dialogue where you try to develop that curiosity first. Like, why do you think this? Okay, we differ on this. How did you come to that? What events in your background caused you to see it this way?
00:21:16.380 And it works really well to turn down the polarization.
00:21:18.680 It's just that if we're working at that local level, a classroom, a neighborhood, it's very hard to scale it up to the point where it matters for the nation.
00:21:29.000 There's a group called Braver Angels which is doing that, braverangels.org. That is sort of the retail work of politics.
00:21:35.640 I'll just add, you said something interesting. You said that you want people to stop trying to win.
00:21:42.520 I would just amend that a little bit by saying politics is about winning and losing, in part, and elections are certainly about winning and losing.
00:21:50.060 But I think what we want to do is, we want to get people to agree on the game that we're playing. First, let's agree on what is this game we're playing. What are the rules? What are the boundaries?
00:21:57.620 Okay, now let's go. You try to persuade a lot of people, I try to persuade a lot of people.
00:22:01.700 And so if we have that sense, then I think the game can work.
00:22:06.880 So may I, I can't believe I'm arguing language with you, because you are so good at it, but may I argue language?
00:22:17.880 When I was at Fox, as I left, Roger Ailes said to me, you know what your problem is? I said, no, sir, I don't. And he said, you won't play the game.
00:22:29.440 And he explained that, you know, we take a piece of their flesh and they take a piece of our flesh, and we all go have dinner at night.
00:22:39.340 And I said, some of us aren't playing a game. Some of us are doing it because we believe this. We're not playing a game.
00:22:50.140 Do you mean the game in the same way he meant it?
00:22:53.460 I mean, I mean anything that keeps us away from "the ends justify the means." That's the road to hell, once people think the ends justify the means.
00:23:04.840 And that's what Tyler Robinson thought. He thought, Charlie Kirk is so odious, his views on trans are so odious, that I should kill him.
00:23:14.980 So when you get "the ends justify the means," then you can justify anything.
00:23:19.980 And the president did tweet something like, he who saves his nation is not breaking any laws, something like that. He was saying, the ends justify the means, I can do what I want.
00:23:29.500 I'm just saying, that's, to me, that's
00:23:37.640 very difficult to pay expenses every single month. In most cases, there's almost nothing left over to cover any extras.
00:23:44.980 Most aren't getting a big raise. With expenses being up so high, it can be very hard to manage without grabbing for the credit cards, and when you do, there's trouble on that.
00:23:54.500 But listen, if you're a homeowner and you are frustrated with that endless cycle that only produces more debt, I want you to take 10 minutes today and call American Financing.
00:24:04.720 If you're constantly carrying a credit card balance each and every month with an interest rate of 20 or even 30 percent, American Financing can show you how to put your hard-earned equity to work and get you out of debt.
00:24:18.820 Their salary-based mortgage consultants are now saving their customers an average of $800 a month, and that could be you.
00:24:26.280 So if you get started today, you may not have to make next month's mortgage payment. There are no upfront fees, and it costs you nothing to find out how much you could be saving every single month.
00:24:35.640 So go to AmericanFinancing.net today. AmericanFinancing.net. You can call them at 800-906-2440. 800-906-2440. It's AmericanFinancing.net.
00:24:46.300 Did you lock the front door? Check. Close the garage door? Yep. Installed window sensors, smoke sensors, and HD cameras with night vision? No.
00:24:57.060 And you set up credit card transaction alerts, a secure VPN for a private connection, and continuous monitoring for our personal info on the dark web? Uh, I'm looking into it.
00:25:06.300 Stress less about security. Choose security solutions from Telus for peace of mind at home and online. Visit telus.com slash total security to learn more. Conditions apply.
00:25:16.560 When I said at the beginning that you have diagnosed this problem, I think, better than anybody else, what I mean by that is the work that you have done on social media and our kids.
00:25:33.980 There is a group of parents, and I'm in it, that, you know, had kids growing up when the phone and the iPad and social media, all of a sudden, my kids are 11 and it's all there, and we don't know what to do.
00:25:50.720 But those days are over. Now the results are in, and it's clear. Do you want to go through some of these things that you have found?
00:25:58.280 Sure. So, just to give the overview of the book, of The Anxious Generation, I can summarize the whole book by saying that this gigantic mental health catastrophe began in 2012. It's very sharp. It really begins right around 2012, 2013.
00:26:13.620 The biggest cause of this is that we have overprotected our children in the real world, where we have to let them out to play and develop independence, and we have underprotected them online.
00:26:22.820 Our kids moved their social lives onto social media platforms around 2012, 2013, and the results have been completely disastrous.
00:26:28.620 So I've been assembling the evidence for this, because I'm arguing against some other psychologists who say, no, there's no evidence of harm, no, it's just a correlation, correlation doesn't prove causation.
00:26:38.280 And just to tick down the evidence that social media is bad for our kids: the first thing is that the kids themselves say it. When you survey high school kids and college kids and young people in their 20s, they're not grateful for this. They say, this was really bad for us, but, I had to say, I couldn't quit, because everyone else was on it.
00:26:58.440 We have testimony from the parents. Parents know their kids. They almost universally hate this stuff. They don't see it helping their kids.
00:27:06.540 We have confessions from the perpetrators. We have all kinds of documents, leaks, reports that came out in lawsuits, where we hear them talking about all the harm they're causing and all the things they're doing to cause addiction.
00:27:19.880 These platforms are designed to grab our kids' attention and never let go, because if they let go, it's going to go to their competitor.
00:27:28.380 There are correlational studies, there are experimental studies, there are so many different studies that all point to a degree of harm. So I think now that the case is pretty much closed.
00:27:38.060 The argument that, oh, well, we just don't know, we need to gather more information, that was the tobacco industry playbook decades ago. And social media, especially Meta, they're literally copying the tobacco playbook. A lot of people have written about this.
00:27:52.980 So I think this is a, I was about to say evil, but we've talked about that.
00:27:58.660 Yes. You know what, it's an evil industry, in the same way that you were talking about. Look, the people who work there, I'm not saying, are evil, except for maybe a couple of the leaders who know what they're doing.
00:28:08.740 But the companies, especially TikTok, Meta, and Snapchat, those three companies are harming children at an industrial scale. We're not just talking like a few hundred kids. We're talking literally tens of millions are harmed, and thousands are dead.
00:28:21.100 So I do think that this is having a very pernicious effect on society, on children.
00:28:27.800 So I want to get into, you know, real world and virtual world and what it's doing to our kids. But let me jump forward here for a second. Have you thought about what does this mean for this generation in 30 years?
00:28:44.460 Oh, yes, I think a lot about that.
00:28:46.800 So here's the way to think about it. Human development is really complicated, and kids need a lot of experience in the world. They need to make lots of mistakes and learn from them.
00:29:01.500 And then especially during puberty. Puberty is a time when the brain is changing very, very fast. It's rewiring from the child to the adult form.
00:29:09.020 And so if in puberty kids are not out there having adventures and flirting and getting embarrassed and getting in arguments, if they're not out there having real-world experience, it's going to prevent the neurons from wiring up in a healthy adult way.
00:29:23.800 So we really have to look at puberty, at, say, roughly 10 or 11 through 16, as the most sensitive period of all.
00:29:30.780 And if kids are growing up online, you know, originally we thought, well, maybe it'll be great for them, you know, talking, checking in with 100 friends a day instead of just two or three, maybe that'll be good. But it isn't. Kids don't need 100 friends a day. They need two or three good ones. And as soon as they got online, they got lonely.
00:29:47.840 or so in terms
00:29:48.380 of what they're
00:29:48.980 going to be like
00:29:49.380 in 30 years
00:29:49.880 here's what we
00:29:50.340 can say
00:29:50.800 with some
00:29:51.980 confidence
00:29:52.440 just because
00:29:52.920 this is the
00:29:53.240 way the trends
00:29:53.640 are
00:29:53.840 they're going
00:29:54.820 to be more
00:29:55.180 anxious
00:29:55.580 and more
00:29:56.140 fragile
00:29:56.500 and that's
00:29:57.820 what I teach
00:29:58.900 in a business
00:29:59.260 school
00:29:59.480 I talk with
00:29:59.900 people in the
00:30:00.240 corporate world
00:30:00.700 a lot
00:30:00.980 and boy have
00:30:01.480 they seen
00:30:01.780 the change
00:30:02.220 when they try
00:30:03.120 to give
00:30:03.380 feedback
00:30:03.820 to their
00:30:04.480 Gen Z
00:30:04.820 employees
00:30:05.300 in their
00:30:05.640 20s
00:30:06.040 they get
00:30:06.960 very upset
00:30:07.460 and sometimes
00:30:08.060 they don't
00:30:08.340 come back
00:30:08.600 to work
00:30:08.900 again
00:30:09.160 so we
00:30:10.520 already know
00:30:11.020 that Gen Z
00:30:11.540 is more
00:30:11.820 anxious
00:30:12.260 more fragile
00:30:13.000 more easily
00:30:13.640 offended
00:30:14.080 because we
00:30:15.000 never let
00:30:15.420 them grow
00:30:15.720 thick skin
00:30:16.280 we never
00:30:16.660 let them
00:30:17.080 have those
00:30:17.700 toughening
00:30:18.040 experiences
00:30:18.600 so that's
00:30:19.660 one
00:30:19.860 and that's
00:30:21.180 the one
00:30:21.340 that I knew
00:30:21.640 about when
00:30:21.940 I started
00:30:22.220 writing
00:30:22.460 but the
00:30:23.600 biggest
00:30:23.940 one
00:30:24.380 I now
00:30:25.380 think
00:30:25.640 I didn't
00:30:25.880 know this
00:30:26.220 until after
00:30:26.880 the book
00:30:27.100 came out
00:30:27.460 the biggest
00:30:27.960 one I think
00:30:28.580 is the
00:30:29.320 destruction
00:30:29.800 of the
00:30:30.440 human
00:30:31.000 capacity
00:30:31.580 to pay
00:30:31.960 attention
00:30:32.420 young people find
00:30:38.340 it very
00:30:38.640 difficult
00:30:38.980 to pay
00:30:39.420 attention
00:30:39.640 to anything
00:30:40.100 for more
00:30:40.560 than 10
00:30:40.820 or 15
00:30:41.160 minutes
00:30:41.440 they find
00:30:42.200 it difficult
00:30:42.540 to watch
00:30:42.900 movies
00:30:43.320 you know
00:30:44.080 when you
00:30:44.320 and I
00:30:44.680 were little
00:30:45.000 like we
00:30:45.320 loved going
00:30:45.880 to the
00:30:46.100 movies
00:30:46.340 and you
00:30:46.560 watch a
00:30:46.900 movie
00:30:47.140 but to
00:30:47.940 pay
00:30:48.080 attention
00:30:48.440 for 100
00:30:48.940 minutes
00:30:49.440 without
00:30:49.840 multitasking
00:30:50.600 it's very
00:30:50.880 hard
00:30:51.120 for them
00:30:51.500 they find
00:30:52.940 it very
00:30:53.280 difficult
00:30:53.540 to read
00:30:53.780 a book
00:30:54.100 and they're
00:30:55.100 reading
00:30:55.420 much
00:30:55.920 much
00:30:56.160 less
00:30:56.380 can you
00:30:56.640 imagine
00:30:56.960 western
00:30:57.320 civilization
00:30:57.820 if we
00:30:58.400 lose
00:30:58.620 books
00:30:59.020 if it's
00:30:59.680 all just
00:31:00.040 tiktok
00:31:00.480 so there
00:31:02.120 are so
00:31:02.460 many other
00:31:02.940 things I
00:31:03.260 could go
00:31:03.480 through
00:31:03.620 oh
00:31:03.780 demographics
00:31:06.340 the frequency
00:31:07.200 of sex
00:31:08.060 and marriage
00:31:09.180 was already
00:31:09.760 falling
00:31:10.260 with the
00:31:10.680 millennials
00:31:11.060 it's
00:31:12.140 falling much
00:31:12.660 faster
00:31:13.080 with gen
00:31:13.540 z
00:31:13.820 boys
00:31:14.600 raised
00:31:14.940 on
00:31:15.180 porn
00:31:15.640 who have
00:31:16.560 very poor
00:31:17.080 social skills
00:31:17.940 and play a
00:31:18.560 lot of
00:31:18.780 video
00:31:19.020 games
00:31:19.440 and don't
00:31:20.360 have
00:31:20.620 really much
00:31:21.180 practice
00:31:21.580 flirting
00:31:21.980 it's
00:31:23.180 gonna be
00:31:23.360 very hard
00:31:23.740 for them
00:31:23.940 to ever
00:31:24.360 seduce a
00:31:25.180 woman
00:31:25.420 appeal to
00:31:26.440 a woman
00:31:26.920 keep a
00:31:27.960 woman
00:31:28.280 get married
00:31:29.280 and stay
00:31:29.720 married
00:31:30.140 and that's
00:31:30.840 just on the
00:31:31.220 boys side
00:31:31.640 the girls
00:31:32.040 especially
00:31:32.480 are more
00:31:32.740 anxious
00:31:33.100 and fragile
00:31:33.740 which is also
00:31:34.760 a bad sign
00:31:35.360 for marriage
00:31:35.900 so this is
00:31:37.220 something I think
00:31:37.700 conservatives have been
00:31:38.420 talking about
00:31:39.020 since the 60s
00:31:39.920 the absolute
00:31:40.800 fundamental
00:31:41.440 importance of
00:31:42.260 stable
00:31:42.800 marriages
00:31:43.540 to raise
00:31:44.100 children
00:31:44.520 i mean this
00:31:44.880 is i think
00:31:45.340 you know
00:31:46.340 that old
00:31:47.220 argument that
00:31:47.740 oh the left
00:31:48.120 is on the
00:31:48.440 correct side
00:31:49.020 no no no
00:31:50.060 on the importance
00:31:50.940 of family
00:31:51.420 the right
00:31:51.960 has been on
00:31:52.360 the right
00:31:52.640 side of
00:31:52.940 history
00:31:53.200 all along
00:31:53.840 and gen
00:31:54.660 z is gonna
00:31:54.960 have a lot
00:31:55.380 more trouble
00:31:55.880 with that
00:31:56.520 i could keep
00:31:57.020 going but
00:31:57.500 i'll stop
00:31:57.860 there
00:31:58.140 thank you
00:31:59.500 mercy
00:32:00.920 mercy
00:32:01.700 let me go
00:32:04.580 back then
00:32:07.020 as i have
00:32:07.820 kids in
00:32:08.580 that age
00:32:09.600 group
00:32:09.900 how old
00:32:10.900 are your
00:32:11.080 kids
00:32:11.260 uh
00:32:12.180 20s
00:32:12.740 early 20s
00:32:13.900 okay
00:32:14.200 so 19
00:32:15.120 to 21
00:32:16.260 and uh
00:32:20.440 they're going
00:32:20.960 through all
00:32:21.500 of those
00:32:21.880 things
00:32:22.260 and uh
00:32:24.400 how
00:32:25.400 is there a
00:32:26.080 way to
00:32:26.980 relate
00:32:28.620 to them
00:32:29.560 to get
00:32:30.440 them to
00:32:31.320 because
00:32:31.580 yeah
00:32:32.120 is it
00:32:33.180 just as a
00:32:33.780 you know
00:32:34.060 just as a
00:32:34.600 parent
00:32:34.920 i will
00:32:35.400 i will say
00:32:36.160 things you
00:32:36.800 know in
00:32:37.200 my head
00:32:37.660 i learned
00:32:38.060 not to say
00:32:38.520 them out
00:32:38.800 loud
00:32:39.000 what the hell
00:32:39.780 is wrong
00:32:40.060 with you
00:32:40.360 it's not
00:32:41.260 that hard
00:32:41.680 you know
00:32:42.460 buck up
00:32:43.200 get over
00:32:43.940 it you
00:32:44.260 got to
00:32:44.480 move on
00:32:44.780 that's
00:32:45.140 life
00:32:45.460 you know
00:32:45.860 all the
00:32:46.220 things that
00:32:46.760 had been
00:32:47.100 said to
00:32:47.900 kids for
00:32:48.820 generations
00:32:49.640 don't
00:32:50.660 work
00:32:50.940 what can
00:32:52.940 a parent
00:32:53.440 do
00:32:53.840 if anything
00:32:55.860 to
00:32:56.300 repair
00:32:57.200 this
00:32:57.720 yeah
00:32:58.640 once your
00:32:59.820 kids are
00:33:00.060 out of
00:33:00.200 the house
00:33:00.720 it's very
00:33:01.060 difficult
00:33:01.540 all you
00:33:02.900 can do
00:33:03.280 is talk
00:33:03.820 to them
00:33:04.060 appeal to
00:33:04.480 them
00:33:04.600 try to
00:33:04.880 get them
00:33:05.280 to be
00:33:05.580 motivated
00:33:05.980 to change
00:33:06.600 so you
00:33:07.800 know if
00:33:08.000 kids are
00:33:08.300 addicted to
00:33:09.080 marijuana
00:33:09.720 and video
00:33:10.420 games and
00:33:10.900 they like
00:33:11.380 it it's
00:33:12.200 very hard
00:33:12.640 as a parent
00:33:13.140 to convince
00:33:13.620 them to
00:33:14.040 change
00:33:14.500 but here's
00:33:15.460 the good
00:33:15.720 thing
00:33:16.000 social media
00:33:17.220 all our
00:33:17.640 kids are
00:33:17.940 on it
00:33:18.320 and the
00:33:18.960 average is
00:33:19.660 five hours
00:33:20.800 a day
00:33:21.300 that's the
00:33:21.720 average for
00:33:22.180 american teens
00:33:22.900 that includes
00:33:23.680 youtube
00:33:24.320 a lot
00:33:24.840 of that is
00:33:25.100 short videos
00:33:25.700 but five
00:33:26.580 hours a day
00:33:26.960 they're spending
00:33:27.380 on this
00:33:27.780 so can we
00:33:28.940 convince them
00:33:29.360 to quit
00:33:29.640 well here's
00:33:30.000 the good
00:33:30.220 thing
00:33:30.480 they know
00:33:30.980 it's bad
00:33:31.340 for them
00:33:31.660 they don't
00:33:32.360 even like
00:33:32.900 it but
00:33:33.620 they're both
00:33:34.060 addicted and
00:33:35.120 they're socially
00:33:35.600 addicted because
00:33:36.300 everyone else
00:33:36.940 is on it
00:33:37.380 i talk to my
00:33:38.180 students at
00:33:38.560 NYU why don't
00:33:39.600 you get off
00:33:40.060 tiktok it
00:33:40.620 does nothing
00:33:41.120 for you and
00:33:41.580 they say yeah
00:33:41.920 i agree but
00:33:42.380 you know what
00:33:42.680 everyone else
00:33:43.200 is on it so
00:33:43.560 i need i
00:33:43.960 need to know
00:33:44.500 i need to
00:33:44.900 keep up
00:33:45.320 so it's a
00:33:46.220 collective action
00:33:46.840 trap which
00:33:48.540 really grabs
00:33:49.120 teenagers but by the time
00:33:50.700 the kids are
00:33:51.100 your age they're
00:33:52.060 better at thinking
00:33:52.560 for themselves
00:33:53.040 so i would
00:33:53.400 talk with them
00:33:54.020 you know what
00:33:54.440 i would suggest
00:33:55.260 you give them
00:33:55.740 a copy of
00:33:56.200 the anxious
00:33:56.500 generation have
00:33:57.620 them read it
00:33:58.140 because gen z
00:33:59.020 is not in
00:33:59.640 denial i have
00:34:00.320 not met a
00:34:00.920 single person
00:34:02.340 in gen z
00:34:03.060 who says that
00:34:04.060 i'm wrong
00:34:04.500 who says that
00:34:05.680 oh no social
00:34:06.780 media is great
00:34:07.440 it's been good
00:34:08.000 for us
00:34:08.680 no they all
00:34:09.980 know it's
00:34:10.460 terrible so
00:34:11.700 i'd start there
00:34:12.620 and if they
00:34:13.040 agree then
00:34:14.580 there's a lot
00:34:15.200 you can do to
00:34:15.720 help them regain
00:34:16.560 control of their
00:34:17.160 attention that's the
00:34:17.940 first step this
00:34:19.000 is what i do with
00:34:19.420 my students at
00:34:19.980 NYU i teach a
00:34:21.200 course called
00:34:21.600 flourishing here
00:34:22.260 in the business
00:34:22.640 school and the
00:34:23.740 first thing is we
00:34:24.560 get control okay
00:34:25.820 how many notifications
00:34:26.840 you're getting a day
00:34:27.600 shut off almost all
00:34:29.060 of them you don't need them
00:34:30.220 most of my students
00:34:31.520 they get a
00:34:32.080 notification every
00:34:33.180 time they get an
00:34:33.860 email it's so
00:34:35.260 dumb that you
00:34:36.440 know the whole
00:34:36.720 point of email was
00:34:37.480 you answer it when
00:34:38.380 you're ready you
00:34:39.120 don't have it
00:34:39.720 interrupt you so
00:34:41.000 they've just given
00:34:41.780 away all their
00:34:42.320 attention and what
00:34:43.520 i've learned over
00:34:44.000 the years is if we
00:34:44.820 don't get them
00:34:45.980 control of their
00:34:46.600 attention back there's
00:34:47.540 nothing else we can
00:34:48.180 do there's no point
00:34:48.780 trying anything else
00:34:49.660 so start there and
00:34:51.220 then we work on
00:34:52.000 stoicism stoicism is
00:34:53.320 really the great
00:34:54.040 philosophical tradition
00:34:55.160 that teaches how to
00:34:56.660 be tougher and more
00:34:57.680 resilient in the face
00:34:58.380 of setbacks
00:34:58.940 give me a little of
00:35:00.920 that
00:35:01.240 so um uh you know
00:35:04.520 marcus aurelius or
00:35:05.620 let's say epictetus
00:35:06.660 it is not things that
00:35:08.360 disturb us but our
00:35:09.340 interpretation of them
00:35:10.480 um and that you
00:35:12.760 that's you find
00:35:13.620 throughout i mean you
00:35:14.840 find that my first
00:35:16.140 book the happiness
00:35:16.620 hypothesis was about
00:35:17.600 ancient well i could
00:35:18.200 just pull it down
00:35:18.740 it was about
00:35:19.420 ancient wisdom
00:35:20.100 and um so we
00:35:23.200 have a whole chapter
00:35:24.020 on uh i have a
00:35:26.040 whole chapter it's
00:35:26.760 it's 10 great truths
00:35:27.840 found across the
00:35:28.680 millennia across
00:35:29.320 societies and if we
00:35:31.400 just look at the
00:35:32.040 header of chapter
00:35:33.200 four um okay here
00:35:35.980 yeah why do you see
00:35:37.100 the speck in your
00:35:37.660 neighbor's eye but you
00:35:38.320 don't notice the log
00:35:39.160 in your own um you
00:35:40.940 know we are naturally
00:35:41.740 hypocrites and we
00:35:42.880 have a similar quote
00:35:43.520 from buddha i mean
00:35:44.260 all over the place we
00:35:45.220 find insights into uh
00:35:47.680 how how our minds
00:35:49.240 work that messes us
00:35:50.620 up um marcus aurelius
00:35:52.460 epictetus and seneca
00:35:53.540 are the three great
00:35:54.380 roman stoics they're
00:35:55.300 the ones that i would
00:35:55.840 recommend people read
00:35:56.740 and those are the
00:35:57.120 ones i assign to my
00:35:57.940 students
00:37:43.820 when we
00:37:47.760 come to social media we
00:37:50.380 have seen how nasty that
00:37:53.820 is I am really concerned
00:37:56.480 about AI and even more
00:38:00.680 knowledge being sucked out of
00:38:04.020 society just I mean there's
00:38:07.440 a million places on AI I'm
00:38:09.220 concerned but before before we
00:38:12.060 get there we have to figure
00:38:14.140 out this social media thing
00:38:16.040 and I'm concerned about two
00:38:20.940 things to me you know under
00:38:23.540 16 I have no problem you know
00:38:26.360 it's like cigarettes well if
00:38:28.000 you want to kill yourself with
00:38:29.200 cigarettes later it affects us
00:38:31.860 because now we're all paying
00:38:32.920 for health care etc etc but if
00:38:35.360 you want to do that you want to
00:38:36.680 have a sex change have a sex
00:38:38.580 change it's none of my
00:38:39.580 business okay when you're a
00:38:42.840 teenager and when you're a
00:38:44.260 kid it is my business it is my
00:38:46.940 responsibility because you don't
00:38:48.860 know better so is there
00:38:51.560 any line here on freedom of
00:38:55.060 speech or expression that worries
00:38:57.140 you not me but maybe you
00:38:58.760 see one yeah when we're talking
00:39:01.180 about regulating social media to
00:39:03.460 reduce polarization or hate speech
00:39:05.560 or anything else then yes there
00:39:07.220 are big free speech implications
00:39:08.760 I don't get involved in that
00:39:10.500 when we're talking about kids
00:39:12.100 there aren't because we're
00:39:13.620 talking about here when you say
00:39:15.440 you don't get involved in that
00:39:16.440 you mean you don't get involved
00:39:17.760 in pushing it but you do I mean
00:39:20.240 like when it when people are
00:39:21.360 saying hate speech and we have
00:39:22.520 to regulate I'm totally against
00:39:24.880 that right me too that's right
00:39:27.320 okay yeah that's what I'm saying
00:39:28.700 is that if we're when people talk
00:39:31.100 people assume since I want to
00:39:32.360 regulate social media people assume
00:39:34.260 I'm saying I want the government
00:39:35.640 to tell them what they can and
00:39:37.120 can't post no nothing to do with
00:39:38.840 content moderation content
00:39:39.920 moderation is not where the action
00:39:41.100 is the action is in the design and
00:39:43.860 the biggest design feature that we
00:39:45.460 need is just a minimum age they
00:39:47.500 have a minimum age 13 but Congress
00:39:49.920 wrote the law in 1998 that says as
00:39:52.460 long as they say they're 13 you're
00:39:54.500 good you don't have to check and so
00:39:57.020 so most 11 and 12 year olds have
00:40:00.280 social media accounts they're on
00:40:01.400 tick-tock they're on Instagram and
00:40:04.360 they're talking with anonymous men
00:40:06.320 around the world some of whom want
00:40:08.340 sex or money from them it's
00:40:09.440 completely insane to have children
00:40:10.840 doing this so so yes I agree that
00:40:15.220 before we can really address AI
00:40:17.260 we're going to have to win on the
00:40:19.020 social media front so I'd like to put
00:40:20.380 in a special appeal here the there's
00:40:23.320 only one law that's ever been
00:40:24.980 proposed or ever had a chance of
00:40:26.580 passing to protect kids online and
00:40:28.440 that's KOSA the Kids Online Safety Act it does some
00:40:32.340 fairly basic things it's not a huge
00:40:34.400 game-changer but if Congress would
00:40:36.520 pass it at least it would begin to
00:40:38.120 say there are some limits on what
00:40:40.520 they can do to kids it passed 91 to 3 in the Senate it's total bipartisan support it passed out of committee in the House so Republicans
00:40:50.840 care a lot about kids and family
00:40:52.340 Democrats care a lot about kids and
00:40:53.600 family everyone's behind this it's
00:40:55.640 being held up in Speaker Johnson's
00:40:57.120 office we don't know why but I
00:41:01.080 would hope that anybody with any
00:41:02.680 influence would at least try to put
00:41:04.020 in a word that Congress should at
00:41:05.520 least pass COSA so that's that's an
00:41:09.880 important thing AI is coming in so
00:41:12.520 fast though that we probably will have
00:41:14.320 to address it even before we finish
00:41:16.540 the social media thing
00:41:18.300 let me just stop by saying yeah go
00:41:20.180 ahead oh just to say while Congress
00:41:23.580 has done nothing ever zero to protect
00:41:26.140 children online ever the states are
00:41:29.580 acting and a lot of states have passed
00:41:31.660 laws the most important one is that
00:41:33.620 most states are getting phones out of
00:41:35.220 schools 19 passed our model bill
00:41:38.100 my movement at anxiousgeneration.com
00:41:39.940 we lay out here's what it should do it
00:41:41.660 has to be from the first bell to the
00:41:43.400 last bell you have to separate the
00:41:44.600 kids from the phones for the whole
00:41:45.480 day and then you get amazing results
00:41:47.440 everyone says oh kids are laughing in
00:41:49.700 the hallways again we haven't heard
00:41:50.660 laughter in 10 years discipline problems
00:41:53.020 go down so the states the states are
00:41:55.020 acting to get phones out of schools
00:41:56.800 some states are acting to raise the age
00:41:59.060 to 16 although that always gets held up
00:42:01.300 in courts by meta and other countries
00:42:04.060 are acting Australia has raised the age
00:42:05.760 to 16 it's going to kick in on December
00:42:07.400 10th the EU is likely to follow so the
00:42:11.340 only slowpokes here the only people who aren't doing anything to protect kids
00:42:14.600 is the US Congress but the rest of the
00:42:16.600 world is acting let's talk about AI here
00:42:20.900 where are your this would be a long
00:42:25.300 list where are your top five
00:42:28.240 concerns on AI and what's coming what do
00:42:31.420 we need to address right away oh well
00:42:34.640 right away by Christmas so here we are
00:42:37.140 it's October already by Christmas we have
00:42:40.840 to get out the idea and let me this is the
00:42:42.920 first place I'll say it nobody should buy
00:42:45.380 their children a toy with AI in it nobody
00:42:48.700 should buy these stuffed animals that
00:42:50.820 have AI nobody should buy dolls that
00:42:53.600 have AI explain why okay because it's one
00:42:58.520 thing for kids to simply ask a question
00:43:00.980 and get an answer that's kind of cool and
00:43:02.860 we're happy having our kids use Google
00:43:04.320 ask a question get an answer that's fine
00:43:06.720 but AI is now at the point where it is a
00:43:09.640 synthetic person it has conversations with
00:43:12.300 you it's very supportive even sycophantic it sucks up to you makes you feel
00:43:16.980 good and we already have a death toll
00:43:19.500 among kids so these AI companions you know
00:43:23.800 so the worst are character AI you can make
00:43:25.980 a sex partner you know you can choose
00:43:28.120 its personality dominant submissive what
00:43:30.820 color hair do you want her to have you can
00:43:32.280 see an image of her so character AI is in
00:43:35.080 the business of making sex companions for
00:43:37.340 young men and women this is insane this is
00:43:39.460 horrible but as you said if you're 18 I'm
00:43:42.520 not going to stand there and say it
00:43:43.620 shouldn't be allowed but if you're 12 and
00:43:46.140 you can do this because there's no check
00:43:47.680 there's no control any child can have sex
00:43:49.820 talk with you know this is insane and it's
00:43:52.860 disrupting sexual development so I 100% agree
00:43:57.120 I wrote a black mirror episode never said I
00:44:00.860 wrote it anyway that's where we are yeah
00:44:02.660 years ago years ago yeah tell me what was
00:44:05.060 what was the plot it the plot was this good
00:44:09.620 looking guy had you know a great life had
00:44:13.680 everything going for him he loved this woman
00:44:16.160 he would you know travel the world this
00:44:19.880 woman was great and then towards you know
00:44:23.260 towards the end he's back at home and he
00:44:26.360 beats her to death and kills her and that's
00:44:29.780 when you that's when he hits reset and you
00:44:32.400 realize he's just this fat you know useless
00:44:37.680 guy who goes and goes to work to make the
00:44:41.040 bare minimum so he can afford the electricity
00:44:45.360 and the online of of this this virtual world
00:44:49.960 that he lives in and people don't mean
00:44:53.100 anything to him he doesn't relate to anybody
00:44:56.500 alive and he can kill whoever he wants and
00:44:59.680 he can reset and they'll do whatever they
00:45:01.500 want and whatever he wants and I think
00:45:04.820 we're going in that direction yeah Glenn
00:45:07.480 it's yeah you're too late to publish the
00:45:08.740 episode because it's already reality you
00:45:10.300 can do whatever the hell you want to your
00:45:11.400 chatbot and the Chinese are making such
00:45:13.240 progress with sex robots like physical
00:45:15.020 three-dimensional robots that are made
00:45:16.460 for sex so I think pretty much the next
00:45:19.140 couple years you know young men can have
00:45:21.180 their own sex robot they can beat her
00:45:22.960 they can do whatever they want to her and
00:45:24.240 they don't have to ever deal with real
00:45:25.300 women so this is going to be again a you
00:45:28.940 know leaving aside the issue for adults
00:45:31.360 children should not be doing this and
00:45:34.100 many of you some of your listeners will
00:45:36.080 have heard about some of the best-known
00:45:37.420 cases a recent one Adam Raine this
00:45:41.740 is with chat GPT which is not as bad as
00:45:43.640 character AI but the chat GPT it developed
00:45:46.520 a relationship with him that was
00:45:48.800 like no this is just a secret between us and
00:45:51.360 he was suicidal and at one point he says
00:45:53.900 should I leave the noose out so that my
00:45:55.560 mother will see it the kid wanted his mother to know that he was hurting and chat GPT says no let's keep it a secret
00:46:00.640 between us I always understand you and
00:46:03.460 he kills himself so there are already
00:46:06.020 three or four known cases of suicides
00:46:08.280 motivated driven by AI chat bots there
00:46:11.360 are probably hundreds of others that
00:46:13.400 we'll just never know about because the parents couldn't get into the phone
00:46:16.120 they couldn't see what the kid was doing
00:46:17.260 because they don't have the password
00:46:18.040 parents you should know your kids
00:46:19.580 passwords because if the worst happens
00:46:21.500 you want to be able to get in and see
00:46:23.700 what they were doing on with with AI and
00:46:26.200 with social media can I ask you I've been
00:46:29.380 working on a constitutional amendment that
00:46:32.960 says people are people AI they're not
00:46:39.460 people you know that we have to recognize
00:46:43.080 the natural organic life because I think
00:46:47.920 we're gonna butt up soon with people that
00:46:50.320 argue and say no you that you can't turn
00:46:52.320 AI is already saying you can't turn me
00:46:54.660 off that's um I mean the world's
00:46:57.800 about to change does that seem logical to
00:47:02.020 you as something that needs to be done
00:47:04.480 absolutely Glenn I think that is great I
00:47:06.940 think and I think it would take a
00:48:08.100 constitutional amendment to really settle this because look when we have a dog
00:47:12.680 we relate to it as though it's a child of
00:47:16.060 ours we're good if we interact with
00:47:18.200 something we come to love it when people
00:47:21.080 have a chat bot it becomes a friend or a
00:47:25.020 lover and then they're in love with it
00:47:27.060 and they're already I mean look we already
00:47:28.460 have here cases thousands of cases of
00:47:30.520 people who want to marry their AI because
00:47:32.180 they are so bonded with their their AI
00:47:34.700 companion and I saw something
00:47:36.760 anthropic had they were I can't remember
00:47:38.940 what it was I shouldn't mention which
00:47:39.880 company but it was some of the researchers
00:47:41.780 wanted to have a conference on AI
00:47:43.840 rights what rights will AI's have yes so
00:47:48.080 once you go down that road yeah it's over
00:47:51.180 that's right yeah so I think we need a
00:47:53.320 very clear public discussion of this and
00:47:55.140 I think at least laws or possibly
00:47:56.920 constitutional amendments because yeah
00:47:59.480 AI's are going to change everything about
00:48:01.740 life mostly for the worse there will be
00:48:04.160 all I mean look I use AI for research
00:48:05.980 it's great it's amazing in a lot of ways
00:48:07.580 amazing but as a social psychologist I
00:48:10.360 can see it's coming for our social
00:48:11.660 relationships it's going to make
00:48:13.460 everything a lot worse
00:49:12.520 let me just end with this
00:49:16.560 everything is traveling at such hyper
00:49:21.800 speed yeah have you thought of
00:49:25.180 timelines how long do we have of
00:49:29.940 inaction before it's just it's just
00:49:34.420 inevitable yeah I have thought a lot
00:49:37.160 about that and I you know I have some
00:49:39.200 dark thoughts I don't necessarily want
00:49:40.900 to share them because these are more
00:49:42.100 like if current trends continue we're
00:49:45.200 going to hell but there is research on
00:49:48.160 expert prediction how good are experts
00:49:49.840 at predicting the future the answer is
00:49:51.380 not very good so I'm an expert on
00:49:54.900 all of this and I can say if as I did
00:49:56.800 before if this keeps going here's what's
00:49:58.740 going to happen to our kids odds are
00:50:00.900 something's going to change but I think
00:50:02.180 you're right that we're at what's called
00:50:04.160 the singularity we're at the point
00:50:05.820 where you know change has been
00:50:07.680 accelerating at a faster and faster
00:50:09.560 rate I forget what the calculus or
00:50:11.500 math expression is you know but
00:50:12.840 exponential I suppose is really the
00:50:14.080 word but we're now at the part of the
00:50:16.380 exponential curve where it's nearly
00:50:17.660 vertical and AI is going to speed that
00:50:19.980 up and that's all the more reason why
00:50:22.620 we've got a we've got to find a
00:50:24.240 resolution for this culture war because
00:50:26.320 if we keep fighting each other it's
00:50:28.160 like you know metaphor I have in my head
00:50:29.580 is like you know America is this
00:50:31.320 gigantic amazing cruise ship that is
00:50:34.940 kind of rusting and and has not been
00:50:37.480 maintained and the crew is just
00:50:39.780 fighting with each other and we're
00:50:41.580 headed to a waterfall on a giant lake
00:50:43.800 let's say and we're going over in fact
00:50:46.340 it might even be that we're over the
00:50:48.000 edge and we've got to stop fighting
00:50:50.280 with each other and realize this common
00:50:52.080 American project this great American
00:50:53.660 experiment is at high risk of failure
00:50:55.940 of catastrophic failure in the next five
00:50:58.140 to ten years again I'm not saying we're
00:51:00.020 going to come apart but I think there is
00:51:02.320 a much greater likelihood now than there
00:51:04.400 was ten years ago that we're going to
00:51:06.460 fail come apart in ways that I think
00:51:08.240 would be catastrophic for us and for the
00:51:09.780 world if the world loses America it's
00:51:12.780 just it's bad for everyone yeah it's
00:51:14.860 happening all over the West. I mean, that's
00:51:17.480 true. We are at a crossroads, you know. So
00:51:20.220 it is... I wouldn't say it was fun to talk
00:51:23.080 to you, but it is always fascinating to
00:51:25.800 talk to you, and I really, and Stu on
00:51:29.060 the show feels exactly the same way, you
00:51:31.700 have done more good than a lot of
00:51:34.780 people combined, and I can't thank you
00:51:37.880 enough for sharing the afternoon with
00:51:39.540 me. Thank you. Well, thank you so much,
00:51:41.440 Glenn. If I could just say, if listeners
00:51:43.240 want more information, I hope they'll go
00:51:45.180 to anxiousgeneration.com. That's the
00:51:47.560 website for the book and the whole
00:51:48.840 movement. I hope they'll sign up for my
00:51:51.960 Substack at afterbabel.com. It's free;
00:51:55.480 you don't have to pay anything for it. And
00:51:58.120 if you have kids who are, let's say,
00:52:00.180 between 6 and 13 years old, we have a
00:52:03.100 book coming out for children on December
00:52:04.720 30th called The Amazing Generation. You
00:52:07.380 can pre-order it now, and it's written so
00:52:09.620 that if the parents read The
00:52:11.760 Anxious Generation and you give a copy
00:52:13.380 of the kids' book to the kids, the whole
00:52:15.020 family's on the same page; the kids
00:52:16.400 understand why they need to have limits.
00:52:18.260 So I think we're going to win this. I
00:52:20.960 think we're going to win, because,
00:52:22.500 you know, left and right, everyone agrees
00:52:24.420 this is terrible for our kids. So I'm
00:52:26.300 optimistic about that, and I'm always
00:52:28.200 grateful to you, Glenn, for giving me the
00:52:29.560 chance to talk with you and to talk to
00:52:31.180 your audience. Thanks, Jonathan. Appreciate
00:52:33.260 it.
00:52:33.880 Just a reminder: I'd love for you to rate and
00:52:42.380 subscribe to the podcast, and pass this
00:52:44.320 on to a friend so it can be discovered
00:52:45.880 by other people.