The Glenn Beck Program - March 02, 2024


Ep 211 | Dr. Phil's WARNING for Parents & His Advice for Trump's Legal Team | The Glenn Beck Podcast


Episode Stats

Length

1 hour and 29 minutes

Words per Minute

141.57

Word Count

12,625

Sentence Count

1,125

Misogynist Sentences

17

Hate Speech Sentences

14


Summary

Dr. Phil has been a staple on daytime television for over 25 years, saving countless troubled marriages with his ability to deliver unapologetic tough love with a dash of Southern charm. He's guided thousands of people who have been struggling with everything from their weight to their relationships. Before blazing into his number one television career, Dr. Phil was a successful trial consultant with clients like Exxon and Oprah Winfrey.


Transcript

00:00:00.000 This winter, take a trip to Tampa on Porter Airlines.
00:00:05.460 Enjoy the warm Tampa Bay temperatures and warm Porter hospitality on your way there.
00:00:11.420 All Porter fares include beer, wine, and snacks and free, fast-streaming Wi-Fi on planes with no middle seats.
00:00:18.860 And your Tampa Bay vacation includes good times, relaxation, and great Gulf Coast weather.
00:00:25.240 Visit flyporter.com and actually enjoy economy.
00:00:30.000 And now, a Blaze Media podcast.
00:00:34.020 Today's guest may have been your first therapist, depending on your age.
00:00:39.220 He has been a staple on daytime television for over 25 years, saving countless troubled marriages with his ability to deliver unapologetic tough love with a dash of Southern charm.
00:00:52.620 He's a no-nonsense guy.
00:00:54.600 He's, you know, guided, I don't know, thousands of people who have been struggling with everything from their weight to their relationships.
00:01:02.820 Maybe you can help me on that a bit.
00:01:05.300 Before blazing into his number one television career, he was a successful trial consultant.
00:01:13.060 He had clients like Exxon.
00:01:16.300 In fact, one that really put him on the map was his client, Oprah Winfrey.
00:01:20.900 He's a New York Times best-selling author.
00:01:22.880 He has a star on the Hollywood Walk of Fame.
00:01:24.980 Founder of a brand-new media company here in Texas.
00:01:28.320 He ditched Hollywood for the free state of Texas.
00:01:31.440 So, just like he always says to the people on his show, how's that working for you?
00:01:37.060 Dr. Phil, he's got a new book out.
00:01:39.140 It is called We've Got Issues: How You Can Stand Strong for America's Soul and Sanity.
00:01:44.620 This, I think, is a Dr. Phil, at least I haven't seen before.
00:01:50.280 I think you're going to enjoy our conversation.
00:01:52.520 Welcome to the program, Dr. Phil.
00:01:54.440 Before we get to Dr. Phil, let me tell you, last year, because of you,
00:02:00.440 Preborn's network of clinics saw over 58,000 babies saved.
00:02:04.660 Thank you to everybody who made this possible.
00:02:07.260 We should celebrate the lives of the precious babies that were saved,
00:02:11.180 and quite honestly, the lives of the moms.
00:02:14.340 When Charlotte found out she was pregnant, she was seven weeks along.
00:02:17.640 In the back of her mind, she had no support from anybody.
00:02:20.260 She thought abortion's going to be the best solution.
00:02:22.480 But then she went into a preborn clinic, and they gave her a free ultrasound,
00:02:28.940 paid for by somebody like you.
00:02:30.640 And she saw her baby on the ultrasound.
00:02:32.960 She heard the heartbeat, and she chose life.
00:02:35.760 Now, she also needed help.
00:02:38.200 So she got all of the postnatal care that she needed,
00:02:41.480 and all the way to baby clothes and books and diapers and everything else she might have needed
00:02:45.640 for up to two years.
00:02:47.460 Each of these babies are miraculous, and so are their moms.
00:02:50.920 Preborn celebrates 200 miracles.
00:02:55.000 $28 a day can be the difference between life and death.
00:03:00.160 $28 a month.
00:03:02.220 $28 once.
00:03:04.860 When a mom meets her baby on the ultrasound and hears their heartbeat,
00:03:08.080 it is a divine connection.
00:03:09.460 It doubles the baby's chance at life.
00:03:11.220 Will you be the person that either makes a major gift or even a $10 gift
00:03:17.340 to help another mom and baby survive?
00:03:21.280 Just dial pound 250, say the keyword baby.
00:03:23.420 Pound 250, keyword baby.
00:03:24.880 Preborn.com slash Glenn.
00:03:26.960 That's preborn.com slash Glenn.
00:03:28.920 Dr. Phil, nice to have you here.
00:03:45.580 Thanks for having me.
00:03:46.460 And I mean that in two ways, too.
00:03:47.980 Nice to have you in the studio, but also nice to have you in Texas.
00:03:51.260 Well, I'm glad to be back.
00:03:52.840 You know, we're from here.
00:03:53.700 I used to live in Las Colinas.
00:03:54.840 Yeah, that's where the studio is, right here.
00:03:57.820 Yeah, right here.
00:03:58.680 So, I've actually recorded a couple of audio books in this building.
00:04:02.660 Really?
00:04:03.320 Yeah, a long time ago, right by the front door there.
00:04:05.660 Really?
00:04:06.820 Yeah, it's changed a lot.
00:04:09.020 It was the old Paramount lot.
00:04:10.880 I bought it about 10 years ago, and we've changed it a lot.
00:04:13.340 So, it's good to have you back.
00:04:15.020 Well, you've done more than changed it a lot.
00:04:16.940 You've really built this thing up.
00:04:18.720 It's amazing what you've done here.
00:04:20.580 I was in it when it was more warehouse than studio.
00:04:24.000 Yeah.
00:04:24.840 So, you've really turned this into a broadcast center.
00:04:27.580 Yeah, yeah.
00:04:28.520 I mean, this studio is the largest studio in the Americas that's in daily TV production.
00:04:36.200 Yeah.
00:04:37.340 It's pretty amazing.
00:04:39.600 Anyway, I have watched you for years.
00:04:45.100 But I think you're changing a great deal.
00:04:51.940 Reading the book, We've Got Issues.
00:04:55.440 Holy cow.
00:04:56.280 This is not the Dr. Phil.
00:04:58.160 I mean, it is and it isn't.
00:04:59.480 It's not the Dr. Phil that I know.
00:05:02.400 Yeah.
00:05:02.580 You said it right.
00:05:03.900 It is and it isn't.
00:05:05.820 I mean, it's the same kind of common sense, shoot from the hip.
00:05:11.580 But I highlighted a few things I just want to go through.
00:05:15.480 Because this is...
00:05:16.060 Well, let me just read this.
00:05:19.840 You said at the very, very beginning that you have been doing this for a very long time
00:05:26.800 and you've been listening to people who have problems, relationships, but you noticed something
00:05:35.320 change.
00:05:36.580 What was it?
00:05:37.260 Well, you know, it's been a process, really.
00:05:41.460 You have to understand, having been doing this for 25 years or a little more, actually,
00:05:48.900 spending the time I did on Oprah and I started my own show in 2002.
00:05:54.820 And I didn't really think about it until I sat down and started timelining this out, Glenn.
00:06:01.000 But in 2002, the first text message hadn't been sent.
00:06:07.980 There were no text messages.
00:06:09.760 We weren't at all digital.
00:06:12.600 So along about '06, '07, we started to get much more into the internet.
00:06:20.360 And then '08, '09...
00:06:22.640 iPhone.
00:06:23.580 It was like a bunch of C-130s flew over and dropped smartphones on everybody.
00:06:29.200 And that's when I saw as big a change in our society as has happened in my lifetime, for
00:06:37.480 sure.
00:06:38.260 I think as big a change to mankind as has happened since the Industrial Revolution.
00:06:45.260 We think about it.
00:06:46.120 We are walking around with as much computing power in our hand as we had when we did the
00:06:50.940 moonshot.
00:06:51.720 And that changed everything.
00:06:53.500 Yeah, and especially with what's coming, they say that the last 400 years, all of the changes
00:07:02.800 in the last 400 years will now be compressed between right now and 2030, 2035.
00:07:10.380 Yeah.
00:07:11.780 Man is not geared for that.
00:07:15.580 I mean, we are animals and our instincts, everything comes from millions of years of experience.
00:07:23.740 We're not ready for this.
00:07:24.740 And it's showing, because if you look, particularly at our young people who immerse themselves in
00:07:31.740 this technology, we're seeing the highest levels of anxiety, depression, loneliness, suicidality
00:07:39.080 among our young people, starting in '09, '10, right after we had all of this technology boom,
00:07:47.300 that have been recorded, the highest levels that have been recorded since they started keeping
00:07:52.040 records for that sort of thing.
00:07:54.180 Our young people stopped living their lives and started watching people live their lives
00:07:59.440 and comparing themselves to that.
00:08:01.660 But the problem was, they're comparing themselves to fictional lives.
00:08:05.400 These aren't real lives.
00:08:06.520 These influencers over there, I've had them on the show that have said, look, I shoot a
00:08:12.300 video with all these fancy clothes and saying, okay, I'm in a rush.
00:08:16.200 I'm going to the NBA All-Star Game.
00:08:18.280 And they say, as soon as the video's over, I carefully take those clothes off because I
00:08:23.720 don't own them.
00:08:24.480 I have to take them back to the store because I just brought them home.
00:08:28.880 Now, I don't have the money for those, so I take them back.
00:08:31.500 I put on my sweats and get on the couch.
00:08:33.280 I'm not going to the NBA All-Star Game.
00:08:35.860 So kids watch this and say, by comparison, what a loser am I?
00:08:39.760 I mean, I'm not, I don't have that life.
00:08:42.900 So their self-esteem goes down.
00:08:44.700 Their self-worth goes down.
00:08:45.980 By comparison, they get anxious and depressed.
00:08:49.400 And they're comparing themselves to this fantasy life that doesn't even exist.
00:08:54.740 We have a place out in Santa Monica where they have a fake fuselage to a private jet that
00:09:01.580 rents out for 15 minutes at a time where these influencers go in and pretend they're on a
00:09:06.740 private jet going to-
00:09:08.340 You've got to be kidding me.
00:09:09.760 Going to Cabo or going to Aspen, they put on their ski clothes and say, oh, off to Aspen.
00:09:15.760 They'll go in and shoot a whole year's worth of content, changing clothes from beach to ski
00:09:22.180 to whatever.
00:09:24.440 And publish all of that.
00:09:26.500 And kids compare themselves to that and say, I don't ever go anywhere.
00:09:29.600 Neither do they.
00:09:30.780 They went over to Santa Monica and shot all this phony content and put it out on the internet
00:09:35.980 like they're some kind of rock star.
00:09:38.540 Unbelievable.
00:09:38.980 You know, when I first got into radio and then later television, it took- it was hard
00:09:47.320 work to curate an audience, to know who you were, and then to create and curate an audience.
00:09:54.900 Now my audience has an audience.
00:09:57.880 Yes.
00:09:58.440 And everything that people used to say about, oh, he's only saying that because he wants
00:10:02.600 to get rich, or he's only saying that because he wants, you know, people to watch him.
00:10:06.520 No, you can't be, you can't be who you are or I am for very long if you're fake, I think.
00:10:15.180 Oh, they sniff it out in a hurry.
00:10:16.960 But the people in the audience now who have their own audience, that is what they're doing.
00:10:22.000 And they don't recognize it.
00:10:23.700 And people, they're just, I don't know.
00:10:26.520 It's like we're in some sort of weird nightmare.
00:10:29.800 We are, and it doesn't last very long, but the problem is there's one standing there to
00:10:35.560 take their place as soon as they're gone.
00:10:37.900 They'll get 100,000 followers, maybe they'll get 500,000, but they flame out in a short
00:10:44.120 period of time, but then there's the next one coming right behind them.
00:10:47.700 Um, I was looking at some stats, uh, just today and it's something like 60% of Gen Z, 25 and
00:11:00.160 under that say they would rather be an influencer than a doctor, a lawyer, an architect, whatever.
00:11:07.440 They would rather do that.
00:11:08.980 And they honestly think that's going to work.
00:11:12.680 They don't understand how difficult it is to monetize that content, how difficult it
00:11:18.380 is to make that content.
00:11:20.220 They just think, I'll just put stuff up and get money.
00:11:24.880 So when the iPhone first came out and I noticed everybody started doing this, looking down,
00:11:30.880 um, I said, we are running the biggest experiment on humankind that there has ever been.
00:11:38.160 We don't know what's, what's going to happen.
00:11:41.180 We're just seeing here.
00:11:43.180 Everybody completely change your life with this.
00:11:48.320 And people thought that was crazy at the time because you know, it, it does bring a lot of
00:11:53.740 connection.
00:11:54.540 You can, it was amazing when you could actually see somebody on the other side of the world
00:11:59.620 that was just an individual telling you what was going on.
00:12:02.920 So is it, is it this bad experiment that we're running or is it that and the combination of
00:12:12.620 really bad actors, uh, that are using knowingly using this and creating such dystopian.
00:12:22.860 It, it's at so many different levels.
00:12:25.460 And look, obviously there are great advantages to this technology, right?
00:12:30.900 Uh, you know, some kids don't even know what a library is.
00:12:34.140 And if you happen to be listening, it's a big building with books in it.
00:12:38.820 Wait, what's a book?
00:12:40.020 Yeah, exactly.
00:12:41.680 Uh, now just think how much information we have at fingertip.
00:12:46.360 Now you got to check and make sure that it's not wrong information.
00:12:51.040 Um, but, but even the stuff that we, because we're digital now, I just, just read this,
00:12:57.080 that information in our libraries, that's solid, but the digital information can easily
00:13:08.500 be manipulated.
00:13:09.400 And some of it is you look for things that, you know, you saw, you know, it was online.
00:13:14.660 It's gone.
00:13:15.620 So even real reality is being edited.
00:13:21.060 Yeah.
00:13:21.400 And it's going to get worse, not better.
00:13:23.880 This AI and I, I, I've seen myself in ads for products and it's me.
00:13:32.400 I mean, I look at it.
00:13:33.800 It's a deep fake.
00:13:35.520 It looks like me.
00:13:36.860 It sounds like me.
00:13:37.960 And I'm peddling a product I've never seen or heard of.
00:13:41.640 And we send cease and desist letters.
00:13:43.880 Um, it just passes on to somebody else.
00:13:47.340 They just shut that down and open up a new entity and they're right back at it again.
00:13:53.380 It's like stomping ants at a picnic.
00:13:55.520 I mean, it's, you, you can't, you, you can't get rid of them as fast as they pop back up.
00:14:01.060 But obviously there are huge positives to this, but you ask, is it, is it bad actors?
00:14:08.620 Um, there are huge bad actors at every level.
00:14:14.760 I, I, I deal with women that get caught up in these romance scams.
00:14:21.040 I've had them that they've worked their whole life.
00:14:25.060 Husband passes away.
00:14:26.660 They get a million dollar insurance policy.
00:14:28.760 They've worked their whole life, saved up three, $400,000.
00:14:32.200 And it's all gone in six weeks.
00:14:36.580 Some Nigerian in some workroom that's got 30 or 40 of them up on their computer connects
00:14:45.160 with them, steals somebody's identity.
00:14:48.920 And they've got a playbook.
00:14:50.200 We actually got a copy of their playbook and they start scamming these women and take them
00:14:55.360 for every penny they're worth.
00:14:56.600 So there are bad actors in that regard.
00:14:59.800 And then we've got bad actors, I think, in terms of who's running the algorithms.
00:15:06.240 Yeah.
00:15:06.840 You talk, you talk about that.
00:15:08.800 Is that scary?
00:15:09.960 That is really frightening.
00:15:11.560 I wrote a book a long time ago about AI and talked about, don't fear the technology per se.
00:15:20.880 Fear the people who are writing the algorithms.
00:15:23.660 Because we don't know their motivation.
00:15:26.760 We may never know who they were.
00:15:28.840 But you wrote, while you're getting fed highly curated, highly filtered information, you aren't
00:15:36.640 getting other information.
00:15:38.900 And you talk about how you're not even in charge anymore.
00:15:44.540 Explain.
00:15:45.860 Well, the thing is, we'll have a feed.
00:15:49.000 You open up Instagram or TikTok or whatever and it starts, you start scrolling through
00:15:56.080 and it's showing you this and showing you that.
00:15:58.100 And you think, well, this is coming from somewhere and I wonder why I'm seeing what I'm seeing.
00:16:06.060 Maybe most people don't wonder why they're seeing what they're seeing.
00:16:08.580 But the fact is, I include a study in here where they opened one up, opened up an account with
00:16:18.240 a 13-year-old girl.
00:16:20.360 They created an account with a 13-year-old girl, just put up her name, and 13 years old.
00:16:26.800 And within minutes, they started feeding her toxic information, just really information
00:16:38.960 that was upsetting for her and not in her best interest.
00:16:43.560 So they came back and said, well, all right, let's see what happens if we give a clue about
00:16:49.260 a 13-year-old.
00:16:50.640 So they changed the label to Lauren Lose Weight, gave a clue to the algorithm about what she
00:16:57.840 was about.
00:16:59.040 The amount of toxic information she got within minutes went up like 10x.
00:17:07.480 In a matter of minutes, they started directing her to 700-calorie diets, 400-calorie diets,
00:17:15.740 anorexia sites, all sorts of things started bombarding her.
00:17:22.000 Now, you say, well, why would they do that?
00:17:26.160 If they show you a box of puppies, and you think, whoa, that's really cute, and so you
00:17:31.840 click it a few times, you think, yeah, cute puppies, okay.
00:17:33.940 But if they show you something upsetting, like sick puppies or abandoned puppies, something
00:17:44.320 that upsets you emotionally, it gets you jacked up, you're going to really start clicking because
00:17:51.300 now you're emotionally invested.
00:17:53.360 And so instead of just kind of clicking and laughing, clicking and laughing, you start really
00:17:58.060 clicking.
00:17:58.600 And the more you click, the more money they make.
00:18:00.720 So they feed these girls this information that gets them emotionally invested, gets them
00:18:07.120 emotionally upset.
00:18:08.520 They stay on longer, they click longer, and what happens, of course, then is more ads come
00:18:15.160 at them and they make more money.
00:18:16.900 Now, they do this knowing, and we've seen the information, that the girls get anxious,
00:18:22.100 they get depressed, their self-worth goes down.
00:18:24.980 It hurts them to see this.
00:18:28.600 They don't care.
00:18:29.480 It's a money grab.
00:18:31.840 So they continue to feed them upsetting content because they click more and get more ad exposure.
00:18:39.420 And they know that.
00:18:40.700 We've seen the documents that say they know that.
00:18:43.160 So they feed them upsetting information because it creates more ad revenue.
00:18:47.920 So your kids are not just seeing what randomly comes at them.
00:18:53.080 They're actually being targeted by harmful information because it creates more money for the major social media platforms.
00:19:02.420 They're knowingly doing that.
00:19:04.160 And your child doesn't know it, and you don't know it, but they're victimizing your child consciously, and you don't know it.
00:19:11.900 And I'm putting it in here because people need to know it.
00:19:14.840 And what you talk about here, too, is the censoring of information, stuff that you may want to know, may not want to know, but it's what they want you to know.
00:19:29.520 And that censoring of information, especially in my world, is growing at a dramatic and terrifying pace.
00:19:40.260 It's a terrifying pace.
00:19:42.020 You know, I spent a lot of time in the litigation arena.
00:19:45.380 You know, not three minutes from here, I had a company, Courtroom Sciences, Inc.
00:19:51.840 We did trial science work.
00:19:54.120 And so we spent a lot of time studying how jurors problem solve cases.
00:20:00.940 There might be a thousand facts in a case.
00:20:04.540 And we discover that out of that, a jury might break this down to maybe 50 or 60 facts.
00:20:14.620 And out of that 50 or 60, eight or 10 may drive their decision.
00:20:19.720 It was our job to isolate what are those eight or 10 decision-driving facts and how can they be presented in the most effective way.
00:20:30.540 And if you understand what those facts are and how they can be presented most impactfully, then you've got a real leg up.
00:20:44.380 And one of the things we learned real quick is jurors decide cases on what they see and hear, not on what they don't see and hear.
00:20:54.260 So you'll have a lawyer that says, well, we tried to get something in and they objected and there was a big fight.
00:21:00.760 And the judge said, well, we can't let that in now, maybe later.
00:21:04.980 And they look at the jury and they say, well, they knew we had something powerful.
00:21:09.540 We had a big impact.
00:21:10.860 No, you did not.
00:21:12.660 They decide on what they see and hear, not what was implied, not what you inferred.
00:21:18.260 They need to see it and they need to hear it.
00:21:21.780 And if they don't, it doesn't have a lasting effect on them.
00:21:26.220 So people that are censoring and deleting information, making sure it doesn't get in your feed, then I promise you across time.
00:21:34.740 They lose.
00:21:35.780 They lose.
00:21:37.000 They're not.
00:21:38.040 Those are not the decision-driving factors in somebody making up their opinion, forming an opinion and solving a problem based on that information.
00:21:48.260 And when they're curating this information, when they're choosing what you see and what you don't see, they're forming your opinions.
00:21:55.640 And that's scary.
00:21:57.220 What's frightening to me is they have, you know, you don't think of, you worked for Exxon and, you know, did court cases with Exxon.
00:22:06.960 When we used to always look at those companies and go, well, they're not going to lose because they got all the money in the world and they know exactly.
00:22:14.920 They can just figure out the jurors and everything else.
00:22:18.260 And that's what you do.
00:22:19.340 But that's what's being done on every American now by our own government and by social media.
00:22:26.460 We have we have gone from this country.
00:22:29.240 And maybe I'm naive, Dr. Phil.
00:22:31.380 But I got to tell you, in 2008, when everybody was starting to be called a racist, I thought, I think we're doing really well.
00:22:39.000 We're not perfect.
00:22:39.860 But we're not 1965.
00:22:42.840 You know, I grew up at a time when you didn't really notice it.
00:22:47.480 The Martin Luther King idea, at least I grew up in Seattle, at least there, it wasn't an issue.
00:22:54.240 And I really thought we were making progress.
00:22:57.040 And then all of a sudden we're being told, you're a racist, you're a racist, you're a racist.
00:23:01.060 And it is some of the greatest psychological and behavioral scientists alive today that are doing it.
00:23:09.640 Well, that's what I call in the book, tyranny of the fringe.
00:23:14.320 We have these and this is psychopolitics.
00:23:19.240 And I'm not a politician.
00:23:21.000 Talk about that before you start.
00:23:22.400 Tell me what psychopolitics is.
00:23:23.880 You talk about it in the book.
00:23:25.060 I do.
00:23:25.600 And a lot of this, we have to remember, when we talk about Russia, for example, Pavlov, who was one of the greatest behavioral psychologists in the history of the field, was Russian.
00:23:43.280 And so, trust me, they are good at what they do.
00:23:50.240 And we have a document from the 60s that I talk about in the book.
00:24:02.080 And they were talking about the subverting American society, American culture.
00:24:10.020 And they describe it as psychopolitics as well.
00:24:14.400 And they're talking about how you can control the minds and the morale and the emotions of the society.
00:24:25.360 And their conclusion was, they've already done it for us because they're attacking each other.
00:24:32.800 This is like George Orwell's 1984.
00:24:35.760 They may have freedom of speech under the First Amendment, but they're muzzling each other.
00:24:40.500 This cancel culture that we have now is an advanced version of what they were talking about with the psychopolitics of the 60s.
00:24:49.700 But when I'm talking about psychopolitics, I'm talking about brainwashing people, controlling what people say, what they feel comfortable talking about.
00:25:01.700 And if they dare to question what these activists are talking about, what they're pushing, what they're peddling, then they are attacked with a vengeance.
00:25:16.520 They're labeled phobic.
00:25:17.640 They're labeled haters.
00:25:19.560 And it's to the point where they call their job.
00:25:23.180 They contact their job.
00:25:25.160 They get them fired.
00:25:26.480 They get them where their own family won't talk to them.
00:25:28.720 And that's not theory.
00:25:31.540 That's happening.
00:25:33.080 I was struck by, I was over in London during Gay Pride week and month.
00:25:38.740 And I mean, on the castle, on every government building, in every store, they were flying the rainbow flag.
00:25:47.800 Everyone, like without exception, everyone.
00:25:52.100 And I just, I kept walking down the street.
00:25:54.400 I kept thinking to myself, everyone, everyone wants to fly that flag.
00:25:59.820 And I think it's a lot like the people in Germany that hung the political party flag.
00:26:08.680 They were just saying, leave me alone.
00:26:10.400 I'm fine.
00:26:12.460 You don't have to mess with me.
00:26:14.300 And it is happening.
00:26:17.420 And it seems as though there are those who are awake, who see it, and see it for what it is.
00:26:25.300 And they're not necessarily political.
00:26:29.340 They just remember what right and wrong is.
00:26:33.620 And then there's those who, I mean, I've done my job for 25 years trying to say, wake up, wake up, wake up.
00:26:41.680 And I don't know if I've made an impact.
00:26:44.620 I don't know how else to say, wake up.
00:26:51.240 It's a trance.
00:26:54.520 How do you break this?
00:26:56.060 Well, it is a trance in particular areas.
00:27:00.320 And I wonder, because I, my position is this.
00:27:07.960 I want us to deal with the facts.
00:27:12.080 Let's deal with the facts.
00:27:13.800 One of the things I talk about in the book, and I entitled the book, We've Got Issues,
00:27:22.580 because, look, I love this country.
00:27:28.140 And, you know, I get hate mail for saying I love this country.
00:27:32.160 But I do.
00:27:33.140 I love this country.
00:27:34.200 I stand up when the flag goes by.
00:27:36.400 I put my hand over my heart when they play the national anthem.
00:27:41.760 And I love this country enough to acknowledge that we've got problems.
00:27:47.320 I'm not so defensive about it.
00:27:49.220 But I love it enough to not be defensive about the fact that we've got problems.
00:27:53.380 Big problems.
00:27:54.520 And I think that's a good thing to say, I admit we've got problems.
00:28:00.420 I don't have to be defensive about that.
00:28:01.960 Sure we do.
00:28:04.300 And I see things like trigger warnings.
00:28:08.960 The majority of universities, my research has shown me that the majority of universities
00:28:17.740 have utilized or are utilizing trigger warnings.
00:28:23.420 I saw a couple of universities are using trigger warnings for Romeo and Juliet,
00:28:29.440 where they say trigger warning, suicide content.
00:28:34.160 Well, spoiler alert, come on, kind of gave the storyline away.
00:28:45.820 But here's my problem.
00:28:49.240 When you research trigger warnings, and please, if you're listening to this, fact check me.
00:28:56.200 Please fact check me.
00:28:58.020 Go to, don't go to Google.
00:29:00.420 Go to Scholastic Google.
00:29:02.240 I mean, go another level and research trigger warnings, and you will find that the vast majority,
00:29:11.440 overwhelming body of literature says trigger warnings not only don't work.
00:29:17.020 They make you weaker.
00:29:17.940 They actually make you anxious.
00:29:20.680 They actually create the problem that they were designed to avoid.
00:29:25.380 Now, here's why they don't work.
00:29:30.420 There is evidence-based therapy designed to teach people to cope with things that stress them out, right?
00:29:39.620 Systematic desensitization, dialectical behavior therapy.
00:29:43.720 There are a number of evidence-based therapies to teach people to overcome these stressors in their life.
00:29:53.820 Why?
00:29:54.420 Because you can't avoid them.
00:29:56.820 So what do trigger warnings do?
00:29:58.720 They say, okay, some things are going to come up that might stress you, so we're going to warn you, which is stressful.
00:30:05.860 And you can go over here and sit in a corner and avoid this and pretend it's not there.
00:30:12.300 Problem is, when you get out of college and get out in the real world, that doesn't happen.
00:30:16.960 Well, it's starting to.
00:30:18.600 Well, sadly.
00:30:19.700 Yeah.
00:30:20.020 But that pendulum is swinging back.
00:30:21.940 Yes.
00:30:22.220 So you're not preparing people for the real world.
00:30:25.980 These trigger warnings don't work.
00:30:29.880 Research says they actually hurt and that the better method is to learn to cope with them.
00:30:35.820 So later in life, you're not still paralyzed, whether it's PTSD or whatever it may be.
00:30:42.360 You can have them.
00:30:43.900 And the trigger warnings have been for some really ridiculous things.
00:30:47.280 But assuming that they're for something that was traumatic, you need to learn to cope with that.
00:30:53.860 Now, here's my problem.
00:30:55.200 These universities that are employing them have the same access to the same research that I do.
00:31:02.140 So if I can go out there and find out that trigger warnings are contraindicated, that you should not use them, that they actually create problems.
00:31:12.200 They don't help anything.
00:31:13.460 They actually create problems.
00:31:14.680 If I can look that up and find it, so can every university that's employing them.
00:31:21.140 So why are they doing it anyway?
00:31:22.780 Because they're virtue signaling.
00:31:24.640 They're wanting to seem like, I am super woke here.
00:31:27.960 I'm really sensitive.
00:31:31.520 I'm protecting all of the students and creating a safe place for them to get an education.
00:31:37.660 The problem is that's not an education.
00:31:40.140 The problem is that's teaching them to go on green and stop on red.
00:31:46.100 Here are the keys.
00:31:47.400 Have a great ride.
00:31:49.240 That's not the way the world works.
00:31:51.280 Now, if I can look that up and find it and see it, so can they.
00:31:55.140 So they are knowingly teaching these people something that doesn't help them and actually hurts them.
00:32:02.940 But you said, don't look it up on Google.
00:32:07.020 Scholastic.
00:32:08.400 You write.
00:32:11.160 Natan Sharansky wrote a book called The Case for Democracy.
00:32:16.360 In it, Sharansky created what he called the Town Square Test.
00:32:21.360 Explain the Town Square Test.
00:32:23.820 Well, the whole idea here is you have to be willing to speak up, speak out, and say what you think.
00:32:37.180 And the whole idea is that the Internet, for example, should be like a town square, right?
00:32:47.040 You should be able to talk about whatever you want to talk about and have an exchange of ideas.
00:32:52.440 Yeah.
00:32:52.640 Good luck with that.
00:32:55.080 Yeah.
00:32:55.700 Good luck with that.
00:32:56.820 It doesn't work that way.
00:32:58.620 When you put out something that is at odds with the agenda, the agenda is intolerant when you're dealing with these activists.
00:33:13.540 And I think it was Richard Feynman that said, I would rather have questions I can't answer than answers I can't question.
00:33:21.780 And that's a real problem.
00:33:24.840 If you've got answers you can't question, that's worse than having questions you can't answer.
00:33:32.860 And that's where we are right now.
00:33:35.680 You've got answers you can't question.
00:33:39.000 And that's worse than having questions you can't answer.
00:33:41.320 You said at one point that the town square test, the way to know if you're living in a fear society, is if a person cannot walk into the middle of town square and express his or her views without fear of arrest, imprisonment, or physical harm.
00:33:56.480 By that definition, we're not living in a fear society, at least not yet.
00:34:00.540 In a fear society, in a real town square, when a person's getting silenced, you actually see them getting attacked or muzzled or arrested or dragged away.
00:34:09.540 This is what I called a few years ago and got a lot of heat for it, a digital ghetto.
00:34:17.080 The Germans just moved people into ghettos.
00:34:19.840 They could talk all they want behind that wall.
00:34:22.480 So they were, in effect, erased.
00:34:27.240 We're burning books, but not physically.
00:34:31.040 We're ghettoizing people, just not physically.
00:34:34.420 Is there a difference?
00:34:35.500 Well, I don't think there's a difference.
00:34:39.320 And here's the thing.
00:34:40.820 Think about George Orwell's 1984, and think about how prophetic it was.
00:34:50.840 It was written in 1948.
00:34:53.120 And he talked about someone would say something they shouldn't say.
00:34:59.360 They would fail the town square test, and they would get unpersoned.
00:35:06.600 They would just disappear.
00:35:09.040 I mean, everything, they would disappear from the records.
00:35:13.040 They're just gone.
00:35:13.860 They would just, you couldn't find anything about them.
00:35:16.860 We call it canceling them now.
00:35:19.260 It's the same thing as he wrote about in 1948.
00:35:22.640 And in the book, they started eliminating words.
00:35:32.000 And there was just what they called Newspeak.
00:35:34.540 And you could only use these words and not use any others.
00:35:37.480 Same thing.
00:35:37.940 And people started saying, well, I actually like that because I don't want to have to make any decisions.
00:35:42.480 Just tell me what words I can use, and I'll use those, and then I don't have to think.
00:35:48.100 Wow.
00:35:49.080 And how crazy is that?
00:35:52.720 We have the First Amendment protecting free speech, and we're muzzling each other.
00:36:00.280 It's not the government coming in and taking it away.
00:36:02.500 It's not.
00:36:03.060 We're muzzling each other.
00:36:04.740 You say something I don't like, and we will attack you.
00:36:09.120 We will cancel you.
00:36:10.580 We will get you fired.
00:36:11.780 We will get you labeled as a hater.
00:36:13.500 We will get you labeled as phobic, whatever.
00:36:18.140 And you'll wear the scarlet letter and be unacceptable from then on.
00:36:24.360 And it's got people, I have one statistic in the book that says the percentage of people that are afraid to express their opinion has tripled since 1950.
00:36:40.360 Jeez.
00:36:40.840 So in the last 75 years.
00:36:43.180 And think of 1950 as the beginning of the Red Scare.
00:36:45.620 Yeah.
00:36:46.560 It's tripled since then.
00:36:48.020 People are just saying, I'd just rather not say anything.
00:36:51.860 How bad is that?
00:36:53.320 You don't have a civilization when that happens.
00:36:57.800 No, you don't.
00:36:58.960 And what you're saying is you've been trying to wake people up, and have.
00:37:04.240 I mean, don't sell yourself short.
00:37:06.620 You've awakened a lot of people.
00:37:10.000 It's a process.
00:37:11.860 And I think that a lot of these activists have pushed too far, too long, too hard.
00:37:19.460 And people have started saying, wait a minute.
00:37:21.620 You're now messing with my kids.
00:37:23.580 You're messing with my education.
00:37:26.040 Too much is too much.
00:37:28.500 Enough's enough.
00:37:29.320 Too much is too much.
00:37:32.760 My grandmother said, you've quit preaching and gone to meddling now.
00:37:36.940 That's not okay.
00:37:38.100 Right.
00:37:40.400 There's all kinds of research, historic research, that shows that the final stages of an empire
00:37:49.340 always come at the end with questionable sexuality, questionable bad morals on sex,
00:38:03.160 LGBTQ kind of stuff.
00:38:07.260 And for some reason, that's the last straw that comes before it collapses.
00:38:15.020 Is that true?
00:38:16.060 Do you know?
00:38:16.940 I don't know.
00:38:18.160 But see, I'm the incurable optimist.
00:38:23.200 And maybe that's a flaw.
00:38:24.760 But I believe in mankind.
00:38:30.340 I believe that if we really want this culture, this society to flourish, that it's the number
00:38:48.140 one principle I write in the book, be who you are on purpose.
00:38:52.840 Don't wake up and ride the river wherever it's going.
00:38:58.300 Be who you are on purpose.
00:38:59.880 You got to decide what's important to me and what am I willing to do to stand up for that.
00:39:11.080 That's why the subtitle to the book is How to Stand Strong for America's Soul and Sanity.
00:39:17.320 And the soul of the nation is a big word, right?
00:39:21.240 I mean, to talk about the very soul of the nation.
00:39:24.540 But when people are trying to rewrite history, biology, science, all of that, how does that work?
00:39:36.040 I mean, you don't just decide, you know, I don't like the way this is, so I'm going to rewrite it.
00:39:44.120 And it's not enough to just live and let live.
00:39:49.080 I'm like, if that's what you want to think, okay.
00:39:52.140 That's not enough.
00:39:53.580 We're going to demand that you stand up and say you agree with us.
00:39:58.540 Yeah.
00:39:58.820 It's not enough that you let us think that way.
00:40:01.320 You have to stand up and agree with us.
00:40:03.380 That's where I have a problem.
00:40:06.100 It's in the Bible, Sodom and Gomorrah.
00:40:08.840 You know, the angel is in the house.
00:40:14.120 And the people come pounding on the door.
00:40:18.180 Offers his son or his daughter.
00:40:21.060 Here, take my daughter.
00:40:22.460 No.
00:40:23.520 The stranger.
00:40:24.720 Bring the stranger.
00:40:25.860 He must participate.
00:40:27.800 We're in that position right now where it isn't enough.
00:40:33.020 You must.
00:40:34.060 I mean, again, I go back to the store owners.
00:40:37.180 How do you get a country where only a third voted for the Nazis?
00:40:41.500 How do you get them to raise their hand and give the Hitler salute?
00:40:44.400 How do you get them?
00:40:45.660 Fear.
00:40:46.420 They were physically beaten in the streets by the SA.
00:40:50.080 We are not physically beaten, but we have a massive psychological game being played.
00:40:58.440 We do.
00:40:59.000 And the Internet's the game changer.
00:41:01.580 The social media platforms, all that are game changers.
00:41:06.400 Because think about it, before that, if you were up in Kansas and you were living out on
00:41:13.820 the farm and you had some wacky idea, you could tell a couple mouth breathers down at
00:41:18.820 the bar.
00:41:19.420 Right.
00:41:19.620 And you guys could get out in the woods and talk to each other and that's about as far
00:41:26.540 as it went.
00:41:28.060 But now, you know, on the Internet, it gets oxygen because you got enough other people
00:41:35.400 that say, oh, well, I want to find a cause.
00:41:38.480 I want to find something that distinguishes me.
00:41:41.940 I want to belong to something.
00:41:43.480 And that's why I say that I'm so worried that family in America is under attack.
00:41:51.380 Religion has dropped below 50% in America for the first time in our country's history.
00:41:56.360 People want to belong to something somewhere, somehow.
00:41:59.940 And absent a quality choice, they'll grab onto anything.
00:42:03.800 And so, you know, here's somebody with a wacky idea and they'll love bomb you and accept
00:42:11.280 you and tell you, oh, that's great.
00:42:15.340 The way you're thinking, yeah, you get me.
00:42:17.460 We're in this together.
00:42:19.420 And so, before you know it, those four or five guys out in the woods now are connected
00:42:26.200 with four or five hundred.
00:42:27.880 And then those four or five hundred, that's why Rick Alan Ross, who's, I think, the
00:42:34.180 best cult guy around, says we've probably got 10,000 cults operating in America right
00:42:41.040 now because of the Internet.
00:42:43.420 And they'll connect that way.
00:42:45.540 And then maybe once a year, they all get together physically.
00:42:48.620 So, is this a problem?
00:42:51.220 Because I've been fascinated with technology my whole life.
00:42:57.180 And I started reading Ray Kurzweil back in the 90s and what was coming and AI and, you
00:43:06.280 know, ASI and all of the things that we're now beginning to experience.
00:43:11.480 And it is very hopeful and very, it's miraculous, but it is also deadly.
00:43:18.980 So, is it, is it the fact that, I mean, how many movies do you have to watch where you
00:43:26.300 see the AI go wrong?
00:43:28.080 Know how, turn the air back on, you know what I mean?
00:43:31.200 How many times?
00:43:32.720 And we just seem to have this, like, normalcy bias in a way that this time it's different.
00:43:37.700 It's never different.
00:43:38.880 It's never different.
00:43:39.760 No, and I think the way to inoculate against that is, as I said, we have to ask ourselves,
00:43:52.120 what are we all about?
00:43:55.000 And are we doing anything about what we're about?
00:43:59.160 Are we being who we are on purpose?
00:44:01.980 Are we living with intention?
00:44:03.500 And I look at what's being taught in the colleges and universities right now.
00:44:09.660 I said not long ago that our elite universities were fostering intellectual rot.
00:44:18.380 They're not teaching critical thinking.
00:44:21.580 I mean, we have this invasion of Israel and, I mean, those were murdering assassins that came
00:44:29.560 in and attacked, raped, beheaded, set on fire, and children in their cribs set on fire.
00:44:39.220 And I said, look, I am going to speak about this.
00:44:43.480 And I was speaking to representatives of the Israeli government.
00:44:48.000 And I said, look, I can't talk about this based on descriptions and hearsay.
00:44:53.680 I don't want to see visual proof of this, but I can't talk about it if I don't.
00:45:03.680 So the Israeli consulate had the IDF bring to my home here in Dallas classified footage
00:45:16.060 that has not been released to this day.
00:45:18.780 And I watched, and a lot of it was GoPros from Hamas.
00:45:27.660 Some of it was cell phone video from Israelis that were murdered and fell on their cell phones
00:45:34.680 and they opened them up and saw what was there, saw what happened.
00:45:41.220 And these were not acts of war.
00:45:44.260 And as I said, I don't know enough about politics to speak about it.
00:45:49.960 I think a lot of people don't that speak about it.
00:45:54.220 I certainly don't know geopolitical dynamics.
00:45:59.620 But I do know right from wrong.
00:46:02.100 I know murder when I see it.
00:46:03.940 I know when somebody goes into a noncombatant's house and kills an infant
00:46:09.540 and says, well, they're settlers.
00:46:14.200 And it's just wrong.
00:46:16.100 And then I see hundreds of students on campus.
00:46:21.820 Some of them, I just, I see a banner that says,
00:46:28.200 Gays for Palestine.
00:46:30.740 Do a little homework.
00:46:35.660 Just a little homework.
00:46:36.800 Seriously?
00:46:37.560 Right.
00:46:37.960 Walk that banner into the Gaza Strip and see how far you get.
00:46:41.260 Right.
00:46:41.740 You're cheering on people that would sooner kill you than look at you.
00:46:46.060 Yes.
00:46:46.340 And how have they not been taught critical thinking?
00:46:54.720 And do I write off the fact that many Palestinians have been killed,
00:47:04.280 20,000 and counting, and many of them children or civilians?
00:47:09.480 No, I don't.
00:47:10.300 But being killed in collateral damage from a bomb dropped as an act of war
00:47:17.520 is not the moral equivalent of what was done by Hamas when they came into Israel.
00:47:22.340 You said a couple things that I think are interesting.
00:47:24.760 You talked about, you know, you're not an expert on it,
00:47:27.820 but you know the difference between right and wrong.
00:47:30.020 I think our society has been trained just to listen to the experts.
00:47:33.580 You don't know.
00:47:34.520 You're not smart enough.
00:47:35.580 You don't know all the information.
00:47:36.760 But you can, as an individual, and must as an individual,
00:47:41.360 look at the situations, listen to all sides, listen to what's going on,
00:47:46.380 and then make a judgment, not necessarily as the person in charge,
00:47:51.120 but absolutely a judgment if you can tell the difference between right and wrong.
00:47:55.780 If you're murdering and setting babies on fire,
00:47:59.000 I don't need to listen to anything else you say.
00:48:02.120 That's such a violation of right.
00:48:05.020 I've got a moral compass.
00:48:08.420 Does America?
00:48:09.720 I think they do, but I don't think that they're finding a voice the way some of the activists are.
00:48:17.740 Why?
00:48:18.500 Because they're not organized.
00:48:21.600 And here's the thing.
00:48:23.320 The activists, the tyranny of the friends that I talk about in the book,
00:48:28.000 they have an identified enemy.
00:48:30.560 It's often us.
00:48:34.780 They have an identified enemy.
00:48:37.000 And so once they identify an enemy, then they can rally towards that enemy.
00:48:44.000 When they have an identified enemy, they'll have a place to go, a time to arrive, a target to focus on.
00:48:54.200 Whereas this middle America doesn't have an identified enemy.
00:49:02.260 They don't want an enemy.
00:49:04.360 America does.
00:49:05.220 Most Americans don't want an enemy.
00:49:07.360 They want to live peacefully and accept one another and love one another,
00:49:11.760 which I'm so glad about in one sense, but you have to pick your battles.
00:49:20.080 And they're not picking a battle.
00:49:22.480 And so here we've got this tyranny of the fringe out here that are snipers.
00:49:28.140 And they're targeting people, and they're pushing this narrative.
00:49:36.840 And middle America, millions and millions and millions and tens of millions of people don't have an identified enemy.
00:49:44.400 They don't want to hurt anyone.
00:49:46.960 They don't want to hurt.
00:49:47.580 It's weird also that they don't know.
00:49:49.540 I mean, you talk about core principles.
00:49:52.660 The one thing about America, we didn't agree on anything, on anything, except a few core principles.
00:49:59.740 We enshrined them as the Bill of Rights.
00:50:02.960 And all of our laws were based on Judeo-Christian principles.
00:50:08.060 Don't kill.
00:50:09.420 You know, don't steal.
00:50:10.400 Don't lie.
00:50:11.520 Those kinds of things were important.
00:50:14.340 We don't agree on the Bill of Rights anymore.
00:50:16.120 We don't even agree that it's a good document.
00:50:19.320 Yeah.
00:50:20.020 How do you bring that back, and can it be brought back together?
00:50:24.080 It has to be, because, you know, I talk about the fact that we have to choose all behaviors based on results.
00:50:36.520 And that means that we have to support a meritocracy.
00:50:40.260 Look, this stuff about equality of outcome, come on.
00:50:46.580 I mean, if you've got a guy sitting home in a beanbag eating Cheetos, and he's going to get the same outcome as the person that—
00:50:54.780 I'm going to sit in the beanbag and eat Cheetos all day, too.
00:50:57.200 Yeah, he gets the same outcome as the person that gets up at 6 o'clock and goes and totes that bale and works at the lumberyard all day or goes to medical school or whatever it is.
00:51:10.940 And I saw this happen when they said about mismanaging COVID and spent $5.5 trillion.
00:51:22.640 And, again, I'm not being political, I'm being psychological here.
00:51:27.280 When you pay people not to work, and, in fact, you pay them more to not work than to work, and when you figure in that gas was $5, $6, $7 a gallon in L.A., I drove past some stations where it was $7 and a quarter.
00:51:43.120 I took pictures because I went home to Robin and said, gee, look at this.
00:51:51.180 And they're having to commute, and so it's going to cost them $300, $400 a week to commute.
00:51:56.740 Why do it?
00:51:57.560 Or they can sit home in that beanbag, and they get unemployment plus a $600 a week bonus and then another bonus on top of that.
00:52:05.460 And then they get a stimulus check for, what was it, $1,250 per person.
00:52:10.940 I had some friends with a family of four that, honest to God, they were getting $10,000.
00:52:21.980 You destroy it.
00:52:25.960 And when you take all of it together, it was like $5.5 trillion.
00:52:30.100 $4.4 trillion of it went into checking or savings accounts.
00:52:33.220 They didn't spend it on rent and groceries.
00:52:35.460 They didn't need it, obviously, because it went into savings.
00:52:37.860 So you're paying people to not work, and then they say, I don't understand what happened to the supply chain.
00:52:45.880 Why we can't get anybody to unload all of these ships out in Long Beach Harbor?
00:52:50.040 They're backed up out here for miles, and we can't get anybody to unload them.
00:52:54.460 Well, why would they?
00:52:56.680 They're in the beanbag.
00:52:57.720 Well, it's going to get really bad.
00:53:25.800 It has to before it gets better.
00:53:28.520 You need to move quickly and find the safest ways to invest so you can protect yourself and your family from whatever dark days lie ahead.
00:53:35.700 Because the sun does come up, we all survive.
00:53:38.740 It's just in what condition do we get to the other side?
00:53:41.480 That's why I recommend you protect your hard-earned savings with an asset you can trust, gold.
00:53:46.760 I made my very first gold purchase in the days when I was listening to Rush Limbaugh, I think back in the very late 90s, early 2000s.
00:53:55.400 And I was listening to Rush, and he talked about Lear Capital.
00:53:59.060 It was his sponsor of gold for a very, very long time.
00:54:02.560 The person I called at Lear Capital still works there today.
00:54:05.900 And the investment I made has, I mean, it's eye-bleed crazy.
00:54:12.320 It's maybe five times as much.
00:54:15.240 Lear helped me prepare for the coming insanity.
00:54:17.500 They can help you do the same thing.
00:54:19.500 Don't wait around for things to get worse.
00:54:21.240 Do what's right for you and your money today.
00:54:24.600 I want you just to call Lear Capital and just get a booklet that they have on what you can do.
00:54:33.820 Find out if it's right for you.
00:54:35.260 There's no obligation.
00:54:36.860 There's no cost to this.
00:54:38.080 They're going to send you a free booklet.
00:54:39.220 You do your own homework.
00:54:40.280 You talk to your spouse.
00:54:41.540 You pray on it.
00:54:42.840 You'll also, if you decide to buy, Lear will credit your account $250 towards your purchase because you got this from me.
00:54:49.760 Call today, 800-889-3070, 800-889-3070, Lear Capital.
00:54:57.380 So, you said several times, I don't want to get political.
00:55:02.560 But in reading your book, you don't talk about politics.
00:55:04.900 There's no left or right in there.
00:55:06.720 No.
00:55:07.820 But it does, and maybe it's because everything is political now, but it does seem, common sense seems political right now.
00:55:16.500 Well, politicians talk about a lot of the cultural issues that I talk about.
00:55:20.440 Right.
00:55:20.660 That's their problem, not mine.
00:55:23.480 I'm talking about cultural issues in terms of the fact that I think family is the backbone of America, and I think families are under attack.
00:55:33.400 They're under attack by the big social media platforms.
00:55:37.320 I think they're under attack by these fringe activists.
00:55:40.620 I think they're under attack.
00:55:41.740 Some of it's unintended consequences.
00:55:44.520 Some of it's on purpose.
00:55:46.340 And that's why I say, be who you are on purpose.
00:55:53.100 Decide, my family's going to be pulled together.
00:55:56.020 My family is, we're going to consciously make this family strong again.
00:56:01.100 And I think you have to make some of those choices.
00:56:04.500 Yeah.
00:56:05.000 You've got to support a meritocracy.
00:56:07.460 Can I give some of the principles you said?
00:56:09.980 You said, I've been working on my 10 working principles for a healthy society.
00:56:14.360 Let me give these to you.
00:56:15.460 Number one, be who you are on purpose.
00:56:17.800 That's just a simple conversation that we won't have.
00:56:21.120 With no political party, we'll have this conversation.
00:56:23.700 What do we believe?
00:56:24.580 Do you believe in socialism?
00:56:27.380 Do you believe in some sort of fascism without the killing centers?
00:56:32.800 Or do you believe in the Bill of Rights and individual freedom?
00:56:36.600 We won't have that conversation.
00:56:37.980 In fact, they'll argue that we can't have that conversation.
00:56:41.180 But that's what we have to have to know who we are.
00:56:43.940 Two, focus on solving problems rather than winning arguments.
00:56:48.720 What a great principle.
00:56:50.860 Isn't that the truth?
00:56:51.700 And you can know whether you're dealing with somebody that is trying to win an argument
00:56:56.940 or solve a problem in the first three or four sentences you just sit down and talk to them.
00:57:01.400 Because if they sit down and say, okay, how can I help here?
00:57:07.600 What can we do together to work on this?
00:57:10.640 If I'm sitting down to negotiate with somebody, the first thing I always do is say, look,
00:57:16.100 we've got some differences.
00:57:17.460 We'll get to those.
00:57:18.360 But let's talk about what we have in common first.
00:57:22.600 Let's talk about what we agree on.
00:57:24.420 Because then we've got a foundation to build on.
00:57:26.400 Because I've never one time done that, that we didn't realize we have a lot more in common
00:57:35.480 than we thought we did.
00:57:36.740 Every time you have a conversation with somebody, if they're honest, if they're engaged in an honest
00:57:41.920 conversation, and as you say, not trying to win, it always works that way.
00:57:47.500 And you don't have to love everything about the other person to love that person.
00:57:51.640 You don't have to like everything they do in order to work with them.
00:57:56.500 Principle number three, don't reward bad behavior or support conduct you don't value.
00:58:01.420 These are so fundamental psychological principles.
00:58:07.160 If your kid's throwing a tantrum in the grocery store, do you go give him a piece of candy?
00:58:11.480 Of course you don't.
00:58:13.800 So why reward bad behavior?
00:58:16.280 Why reelect politicians that aren't doing the job?
00:58:20.700 If you've got pit bulls walking around your neighborhood, jumping on people and chewing them
00:58:26.120 up, get a new dog catcher.
00:58:28.660 He ain't doing his job.
00:58:30.240 Number four, measure all actions based on results and all thoughts based on rationality.
00:58:37.780 That, we are told, you don't understand science.
00:58:42.440 Science says there are 93 genders.
00:58:46.340 That's not based on anything.
00:58:48.540 No, it's real easy to check some of these things out.
00:58:54.240 And you know, rationality sounds like a big word.
00:58:58.080 Is it based on verifiable fact?
00:59:01.980 Does it protect and prolong your life?
00:59:04.920 Does it get you closer to what you want and need?
00:59:09.180 There are some simple building blocks to answer whether something's rational or not.
00:59:14.980 This isn't a subjective opinion.
00:59:18.080 You can ask yourself, first off, is it based on verifiable fact?
00:59:20.800 And if it's not, get a new thought.
00:59:25.280 Well, they'll tell you.
00:59:26.740 Let's just take on an issue that I know you've gotten a lot of heat for.
00:59:31.300 Transgenderism.
00:59:31.860 The American in me says, look, dude, okay, that is not my deal.
00:59:40.160 Whatever.
00:59:41.420 But don't expect me to say, oh, look at that woman over there.
00:59:46.700 I'm not going to.
00:59:47.680 I'm not going to.
00:59:48.520 Don't expect me to say that I have to call you or treat you the same because, honestly, you have some sort of mental disorder.
01:00:01.380 If you actually think you're something trapped inside of your body, you're not.
01:00:08.360 You're not.
01:00:09.020 And there might be, but how do you say those things in polite society?
01:00:17.560 Our American ethos has always been live and let live.
01:00:23.400 Look, if you want to identify as Glinda, that's up to you.
01:00:29.840 So, if you want me to say that sex is assigned at birth rather than defined at conception or chromosomally, I can't find the science to support that.
01:00:47.660 Right.
01:00:48.440 But if you want to identify differently, what business is it of mine to tell you that you can't?
01:00:55.980 Correct.
01:00:56.800 But don't force me to say that it's normal or rational or teach my kids that this is something that you should pursue or even experiment with.
01:01:09.620 And that's number one of rationality.
01:01:11.680 Is it based on verifiable science?
01:01:13.520 And, you know, I had Dr. Carole Hooven on my show, professor at Harvard, one of the most respected and popular professors.
01:01:25.980 And she taught a course in, I think it's, I forget the title, I think it was biogenetics.
01:01:33.540 It's talked about in the book.
01:01:35.220 She was on my show and Fox and Friends and nicest woman.
01:01:40.100 I mean, this is a sweetheart of a woman's spirit.
01:01:46.300 So smart.
01:01:47.660 I mean, scary smart.
01:01:48.660 And we had a transgender guest on who had transitioned to a male and was a coach and was very happy in that position.
01:02:08.760 And when I came to Dr. Hooven, who was going to talk about transgender athletes, I came to her to talk about that.
01:02:18.320 And before she could even answer, she was very emotional and she was talking to this coach and said, I'm so happy for you that you're happy.
01:02:31.240 And she said, well, thank you for saying that.
01:02:43.200 And he said, you know, there's, you know, there are more than two sexes.
01:02:50.920 And she said, well, but there aren't.
01:02:52.360 And he said, well, yes, there are.
01:02:54.140 I said, well, but there aren't.
01:02:55.280 And she said, I'm just telling you, you'll never be able to get a biological male to compete fairly with females.
01:03:10.340 We're seeing that.
01:03:11.820 And she said, I've got the research here.
01:03:14.280 It's not my opinion.
01:03:15.240 I don't care.
01:03:16.900 And he said, well, how many were in your study?
01:03:18.480 She said, well, it wasn't my study.
01:03:19.460 I did a meta-analysis of, I think, 54 studies that looked at all of this.
01:03:26.540 And even with testosterone blockers, like, for example, you can't change the wingspan.
01:03:32.360 You can't change lung capacity.
01:03:34.280 You can't change all of these different things.
01:03:37.220 And you can modify it some, but you'll never get on a level playing field.
01:03:42.220 And she even had the percentages broken down.
01:03:44.260 And she said, like, with swimming, you can get within 10%.
01:03:47.200 But most swimming events are timed in hundreds of a second.
01:03:51.620 And if it's a two-minute race, 10% would be 12 seconds.
01:03:57.780 You know, they'd be down there turning around while he's standing up there waiting.
01:04:01.940 And said, you'll never get it.
01:04:05.400 She got back to Harvard, and they labeled her transphobic.
01:04:09.420 Oh, my gosh.
01:04:09.880 And drummed her out of that university after 20 years.
01:04:14.260 For being transphobic.
01:04:16.940 And she is so far from transphobic.
01:04:19.440 That is the most accepting woman you could ever meet.
01:04:22.220 I don't know anybody.
01:04:23.880 I'm sure there are.
01:04:25.120 But I personally don't know anybody. When you saw Bruce Jenner,
01:04:29.840 And he told his story of, I've been like this my whole life.
01:04:34.520 I feel like I've, I mean, what he spoke about when he was 20, when we were watching him win gold,
01:04:40.880 what was going through his head, I felt horrible for him.
01:04:45.220 I'm like, that's torture.
01:04:47.160 That is just torture.
01:04:49.540 And if he's happy now, that's great.
01:04:53.000 I am happy.
01:04:54.060 But you have to draw the line and say, look, if I'm bringing you to the hospital, I'm not going to tell them and argue with them that you're the most beautiful woman ever.
01:05:06.600 You're a man.
01:05:08.060 And it's important for them to know, because that's science.
01:05:11.040 And look, if if they want to identify as a transgender female, I get it.
01:05:22.320 And I and I talk in the book, I say, look, I'm not sure that I'm describing this right.
01:05:28.940 And if not, please help me, because for a long time, they did not say that sex and gender were the same thing as I understood it.
01:05:41.580 And I'm live and let live.
01:05:46.100 And there's gender dysphoria.
01:05:48.880 And I'm worried about what's happening with children,
01:05:54.220 if they're pushing that agenda. But look, the damage.
01:06:02.860 You know, my son, at probably seven or eight, was exposed to pornography.
01:06:11.380 A babysitter.
01:06:12.540 He was online and it tore him apart.
01:06:16.520 He was in a rabbit hole where he was seeing hardcore stuff.
01:06:21.340 And it really tore him apart.
01:06:24.220 How can people say that that is good and natural for children to see?
01:06:31.880 I mean, little children we've known forever protect their innocence.
01:06:38.000 Yeah, I think there's a real problem with what some people are wanting to make available to kids and illustrated books and that sort of thing.
01:06:48.640 Um, let's go to principle number five.
01:06:51.760 Consciously choose which voices are in your life and deserve the most attention.
01:06:57.100 That's stop scrolling.
01:06:58.800 Yeah, you bet.
01:07:00.460 Uh, principle six.
01:07:01.640 Don't stay silent so others can remain comfortable.
01:07:05.020 That's really hard.
01:07:06.300 Yeah, that's what we've got to get people to do is, is, is be willing to speak up even if it makes other people upset.
01:07:15.960 If it's who you are and what you believe, you got to be willing to speak it.
01:07:19.100 Um, principle seven, actively live in support of meritocracy.
01:07:24.180 That's what made us.
01:07:25.580 Exactly.
01:07:27.200 Principle eight, identify and build your consequential knowledge.
01:07:30.640 What does that mean?
01:07:31.080 Here's the thing, and that's probably an awkward word choice, but I meant it to be, because I want it to stick out in people's minds.
01:07:43.060 Consequential knowledge is knowledge you have, skill set that you have, talent you have, ability you have, where they can't replace you by noon tomorrow.
01:07:52.320 If you're working somewhere and your job is opening the gate or filling an order or whatever, they can replace you by tomorrow.
01:08:03.100 If you are a computer repair person or you're, um, a brain surgeon, or you're doing something where,
01:08:11.740 if you have the institutional knowledge, just the institutional knowledge, you don't let that person go.
01:08:19.620 I've had the same assistant, uh, that runs my office for 45 years.
01:08:25.520 Wow.
01:08:26.160 She knows where everything is.
01:08:28.720 She knows how to get that file cabinet open.
01:08:31.080 She knows how to do this.
01:08:32.180 She knows everybody at every vendor, every account.
01:08:36.220 If, if something happens to her, we're just going to have to shut down.
01:08:40.540 I mean, that's, that's institutional knowledge, but find out what you're good at and vertically develop at that.
01:08:48.820 There's nothing wrong with being a jack of all trades, but you better be a master of one. Have something you're good at so that
01:08:55.740 you can't be replaced by noon tomorrow.
01:08:58.420 Find that, develop it.
01:09:00.280 And then you're protected.
01:09:02.600 Last principle.
01:09:04.040 Oh no, there's two more.
01:09:05.300 Principle number nine, work hard to understand the way others see things.
01:09:09.400 I can't think of... I mean, all of these are so perfect.
01:09:15.780 No one's asking, how did you get there?
01:09:19.800 You know, I work with law enforcement some, and I do training with them on different things from different angles.
01:09:30.300 And you talk to the FBI, you talk to hostage negotiators, they'll tell you, you're never going to get a hostage out alive if you don't convince the hostage taker that you understand why they took that hostage to begin with.
01:09:47.440 That's the number one predictor of whether or not you're going to get those hostages out alive.
01:09:53.200 If they understand that, okay, he gets why I took these hostages to begin with, whether it's for political reasons or you've been hurt, once they understand, Glenn understands why I did this.
01:10:06.680 Then you've got a chance of getting them out that, hey, he gets me, he understands why this happened.
01:10:13.740 That's when you'll get them out.
01:10:14.940 They need to know they've been heard, that you have heard what they have to say, why they did what they did.
01:10:21.920 We need to work hard at making people understand we get them, we see things through their eyes.
01:10:27.600 We don't do that enough.
01:10:29.320 Oh, in fact, we do the opposite.
01:10:30.720 Yeah.
01:10:30.900 And then the last one, number 10, is treat yourself and others with dignity and respect, which seems so simple.
01:10:39.160 I mean, you could look at that and say, what, did you run out of 10?
01:10:41.300 You needed one more?
01:10:42.660 No, that is not as easy as it sounds.
01:10:45.540 You can't give away what you don't have.
01:10:47.900 If you don't treat yourself with dignity and respect, you can't give it to other people.
01:10:53.800 Let me change subjects here.
01:10:56.160 You're just talking about hostages.
01:10:57.880 And I feel as though some people are hostage to their own normalcy bias or confirmation bias, and you just can't get them out.
01:11:12.520 Let me take a situation that is political and don't make it political.
01:11:17.020 Donald Trump is seeing jury after jury after jury.
01:11:22.420 In Washington, D.C., he's going to get a jury pool.
01:11:26.780 Only 6% of the population voted for him.
01:11:31.160 Okay.
01:11:31.280 If you were advising, how do you get them to understand the case from his perspective?
01:11:46.440 What would you be advising his attorneys?
01:11:48.360 I would absolutely do what I call plaintiffing the defense.
01:12:01.860 You have got to put the prosecution on trial.
01:12:09.920 You don't want to go in there and defend Donald Trump.
01:12:15.480 You want to put the other side on trial.
01:12:19.800 You need to—you're going to come off a whole lot better if you can flip the script and say,
01:12:29.040 we need to decide what the motives are of the other side of this case.
01:12:37.960 And I would be asking them to ask themselves: you're a first-draft historian here.
01:12:46.720 What are you going to write?
01:12:48.240 Are you going to put together a case here where you are bringing this case in a way that is actually going to alter the course of American politics,
01:13:07.280 or are you going to let the electorate make a decision?
01:13:12.780 And I think you have to put the other side on trial.
01:13:17.860 I think you'll do a whole lot better by plaintiffing the defense instead of defending the defense.
01:13:26.240 I think you're a lot better off if you put the other side on trial instead of defending your guy.
01:13:31.980 And is that because—
01:13:33.860 Six percent voted for your guy.
01:13:40.080 Let's talk about—you're coming to Texas.
01:13:44.620 You're starting a new network.
01:13:49.860 And—well, tell me about the network.
01:13:53.540 Well, it's Merit Street Media, and it's a 24-hour-a-day, seven-day-a-week network.
01:13:59.940 We're launching on April 7th.
01:14:01.540 That is hard.
01:14:02.440 It is more than hard.
01:14:04.780 We're going to have four hours a day of news,
01:14:07.240 and the news is going to be factual.
01:14:10.380 It's going to be based on, you know, empirical fact
01:14:15.680 and let people decide whether they think it's good news or bad news.
01:14:20.540 That's up to them.
01:14:23.100 And my show is nightly, in primetime.
01:14:29.060 And we have a lot of others that are involved.
01:14:36.040 Nancy Grace is going to be on.
01:14:37.880 She'll be kind of at the top of our true crime vertical.
01:14:44.860 Mike Rowe is on.
01:14:47.540 Bear Grylls is on.
01:14:50.220 We've got—there will be some of my shows from daytime.
01:14:54.000 We'll have some legacy programming.
01:14:55.740 We've got a really interesting show from the behavior panel.
01:15:00.560 I don't know if you've run across these guys.
01:15:02.320 They're guys I've worked with from law enforcement.
01:15:04.280 They're from the military, homeland security, law enforcement,
01:15:11.420 guys that are really experts in deception detection and interrogation and all.
01:15:19.900 And they're going to have a really good time analyzing people that—from politicians to—
01:15:28.440 Oh, my gosh.
01:15:29.340 Is that great.
01:15:31.060 To every walk of life.
01:15:34.560 And these guys are the best in the world.
01:15:39.560 I mean, they get down to pupil dilation, blink rate.
01:15:43.900 Wow.
01:15:44.380 You know, there are some things—if you're interrogating somebody and they're lying,
01:15:47.980 90% of the time, their feet are pointed towards the exit.
01:15:51.880 There are just things that people just don't know.
01:15:54.040 If you're lying, your blink rate goes from an average of 15 to into the 70s or 80s.
01:15:59.120 I mean, it's like playing whack-a-mole.
01:16:00.500 There are so many indications and signs that even if you know them all, you can't control them all.
01:16:07.240 And it's really interesting what we get into with these guys.
01:16:11.760 There's lots of things like that.
01:16:13.420 So it's going to be great.
01:16:17.300 And we think we're going to launch into somewhere between 75 and 90 million homes day one.
01:16:25.460 It's going to be great.
01:16:27.520 What's the network called Merit Street?
01:16:29.400 Yeah, Merit Street Media.
01:16:30.660 The call signs when you pull it up on DirecTV or Dish or whatever will be Merit.
01:16:35.540 And that wasn't chosen at random either, I guess.
01:16:39.920 So let me ask you.
01:16:41.780 I mean, I built this, and I'm 60 now.
01:16:46.540 And I built it at 50, and it damn near broke me.
01:16:52.980 Why?
01:16:54.080 Why?
01:16:54.720 You don't need the money.
01:16:56.360 You don't need the fame.
01:16:58.240 Why are you doing this?
01:17:01.320 I ask myself that once in a while.
01:17:04.000 I mean, how's that working out for you?
01:17:06.380 You know, Glenn, I really, it's, I tell people if you don't have a passion in your life, you need to find one.
01:17:18.420 And you're right.
01:17:20.260 I don't need to do this.
01:17:22.120 I want to do this.
01:17:23.400 I'm more excited about launching this network than I was when I launched Dr. Phil back in 2002.
01:17:30.060 And that was very exciting.
01:17:31.360 Why?
01:17:32.200 Because I feel like this country is really in danger right now.
01:17:39.760 I feel like we are under attack from within.
01:17:44.100 And I think we've got a lot of things going on right now that I have relevant information about.
01:17:51.680 I think I have some things to say.
01:17:54.260 I remember, you know, Roger King.
01:17:56.360 You remember Roger King.
01:17:57.340 He was king of syndication.
01:18:00.980 And I remember right over here at the Four Seasons out in the.
01:18:05.320 Las Colinas.
01:18:06.340 Yeah.
01:18:08.560 That was the first interview I gave when we were getting ready to take the show out to sell.
01:18:14.000 And for people that don't know, when you're in syndication, they sell your show market by market.
01:18:18.780 Now everything's so.
01:18:20.400 Yeah.
01:18:21.400 Now it's six or seven groups.
01:18:23.560 It used to be 200.
01:18:24.840 Yeah.
01:18:24.980 It would be maybe three or four big groups.
01:18:27.000 But then the rest of it, you went market by market.
01:18:30.640 But I remember he said, all right, we're going to make a pitch reel for this show.
01:18:33.680 So sit down there.
01:18:34.320 We're going to ask you some questions.
01:18:35.280 And the first question he asked is, all right, what's the show going to be about?
01:18:39.280 And I remember the first things I said on camera about it.
01:18:43.140 I said, I want to talk about things that matter to people who care.
01:18:47.260 And I didn't plan that or think about it.
01:18:50.120 I said, I want to talk about things that matter to people who care.
01:18:54.780 And I want to deliver common sense, usable information to people's homes every day for free.
01:19:04.080 And how can that not work?
01:19:07.700 And I was right.
01:19:10.080 I mean, if you talk about things that matter to moms and dads and husbands and wives and, you know, whoever, and those things that matter have changed.
01:19:22.360 As I said, when smartphones came out, it changed.
01:19:24.680 And over the last few years, it's changed to include more social issues than it used to because so much is going on.
01:19:36.160 People are concerned about immigration.
01:19:38.700 They're concerned about so many things that five years ago, it wasn't on their radar as much.
01:19:43.520 I was surprised to see you down on the border.
01:19:46.400 And you were clear on that.
01:19:53.180 You've been clear on a few things.
01:19:55.080 Yeah.
01:19:55.200 Well, I'm not one to, you know, like I said, be who you are on purpose and do it with intention.
01:20:03.600 You know, when I was asking those guys about that, I said, why haven't you talked about this?
01:20:08.460 He said, nobody's asked us this question in this pointed a fashion.
01:20:13.420 You know, I said, are we sending children off into prostitution with tax dollars?
01:20:18.660 They said, yes.
01:20:20.740 I said, and it's not in the clip.
01:20:22.720 I said, is the camera, you know, you're on camera, right?
01:20:26.340 Right.
01:20:26.620 He said, yeah, I know.
01:20:27.560 I'm glad you're asking.
01:20:28.480 In fact, in the clip, it says, I'm grateful you're asking.
01:20:33.400 Nobody's asking.
01:20:34.980 What the hell?
01:20:35.580 Nobody's asking.
01:20:36.180 Are you kidding me?
01:20:39.220 So I'm asking questions that I think people need to hear the answer to.
01:20:42.700 You said you wanted to reach a broader audience.
01:20:45.680 Things that are political in this day and age just narrow it.
01:20:54.460 I mean, they will.
01:20:56.120 I know you've been through the wringer.
01:20:58.180 You haven't been through the wringer yet.
01:21:01.440 It's... it's...
01:21:04.220 What audience was missing, first of all, that you didn't have?
01:21:08.320 Well, when you're on in daytime, you know, 90 percent of your audience is female.
01:21:12.620 You know, and a lot of people are working during that time and, you know, they can record
01:21:22.080 and watch it later.
01:21:23.160 And we were one of the few shows that people recorded and actually did watch.
01:21:27.880 Our numbers changed from live to live-plus-seven, you know, significantly.
01:21:32.280 But I think being on in primetime, I can speak a lot to the male audience.
01:21:38.260 I can speak a lot to those Americans that are out there working hard and now they're
01:21:41.740 home and can watch the show organically.
01:21:45.220 And so I want to do that.
01:21:48.660 And I like having a news department where if something's breaking, I can walk over there
01:21:55.040 and join them and talk about it.
01:21:56.580 And I'm not a news guy.
01:21:58.800 I don't know anything about producing news, but I've got people in there that do.
01:22:02.600 I'm real good at surrounding myself with people that are a lot smarter than me on things that
01:22:06.980 I don't know about.
01:22:07.720 And I don't have any trouble acknowledging that.
01:22:11.020 And you're right.
01:22:12.800 People are going to take pot shots at me when I talk about Israel and the border and stuff
01:22:16.840 like that.
01:22:17.360 But I've never had a need to be loved by strangers, which works out great, doesn't it?
01:22:25.840 That's come in handy for you.
01:22:26.900 It really does.
01:22:27.980 It really is.
01:22:29.320 You know, if you believe in what you believe in, that's all that matters to me.
01:22:33.960 I believe in it.
01:22:35.040 And if somebody checks my facts and I'm wrong, I'll say, hey, somebody checked my fact
01:22:41.200 and it was wrong, and here's the right one.
01:22:42.700 That's all right.
01:22:43.260 And I remember my first interview with Roger Ailes at Fox, I'd had dinner with him a couple
01:22:51.840 of times and he was a delightful man and a great storyteller.
01:22:55.480 And then I went to an interview dinner.
01:22:58.860 He was totally different.
01:23:00.900 And he didn't talk to me at the table for maybe three minutes.
01:23:05.180 Just silence.
01:23:06.800 I sat there and then he said, so.
01:23:10.520 What do you think of the 1972 China Accords?
01:23:17.320 I know nothing about it.
01:23:18.920 And I thought, well, I'm just going to tell the truth.
01:23:22.040 Don't know anything about it.
01:23:24.200 He didn't talk to me for another three, four minutes.
01:23:26.960 Next time, he said.
01:23:27.980 What do you think of the international relationship that was fostered by the Eisenhower administration?
01:23:41.140 And I sat there and I looked at him and I said.
01:23:47.620 I got a choice right now.
01:23:49.900 I could bluff.
01:23:51.220 And hope that you won't notice that I'm bluffing, but you're too smart for that.
01:23:58.700 Or I could shut this interview down right now and just tell you, I got no idea.
01:24:04.240 None.
01:24:05.580 He said, which one are you going with?
01:24:08.900 And I said, I think the second one.
01:24:11.620 He didn't talk to me for another five minutes.
01:24:16.040 And then he just started pummeling me with questions that, I mean, I think I lost 20 pounds of sweat.
01:24:22.040 You know, just.
01:24:23.920 And I got up from the table and I thought, well, this has been a total train wreck of an evening.
01:24:29.200 And instead, he stood up and he shook my hand and he said, it's very rare that you get to meet a man who knows what he knows, knows what he doesn't, and is willing to tell you the truth.
01:24:43.320 And I don't think, I think people worry too much about other people.
01:24:49.960 Yeah.
01:24:50.420 And they want to be right or have the answers or look smart.
01:24:54.780 Yeah.
01:24:55.140 It's so much better when you, smart people know when you're bluffing.
01:24:59.700 Yeah, for sure.
01:25:01.080 And I used to train witnesses that were CEOs of, you know, Fortune 100 companies.
01:25:08.520 And we'd put them up on the stand and cross-examine them before trial and ask them a question they didn't know.
01:25:16.640 And they'd try to say, well, blah, blah, blah, blah.
01:25:19.320 And I said, wait a minute.
01:25:21.660 You don't know.
01:25:22.960 Say you don't.
01:25:23.460 Well, I'm a CEO.
01:25:24.080 I should know.
01:25:24.740 No, you shouldn't.
01:25:26.280 You don't have to know that.
01:25:27.580 Just say, I don't know.
01:25:30.460 I've got people that do know, and I can make them available if you want, but I'm not involved in that, but I support the people who are.
01:25:42.140 And the jury will love you for that.
01:25:44.580 Just saying, I don't know, but I support the people who do.
01:25:49.120 They're the people who do.
01:25:50.460 And they'll love the fact that a CEO says, I don't know, but I support those who do.
01:25:57.440 And they'll love you for that.
01:25:58.740 And once we got that beat through their heads, once you got them to realize it's CEO-itis, they were great witnesses.
01:26:05.720 Yeah.
01:26:05.920 And we used to train that all the time.
01:26:08.780 And I think everybody has a little bit of that now.
01:26:10.960 Oh, yeah.
01:26:12.540 Everybody feels like they have to know, or they do know, and they don't know.
01:26:16.520 And they don't know.
01:26:17.300 They don't know.
01:26:17.800 And they'll get in trouble.
01:26:19.260 Dr. Phil, it's been great.
01:26:21.220 I really enjoyed this.
01:26:22.460 Yeah, me too.
01:26:22.900 It's going to be nice to have you as a neighbor.
01:26:24.280 I can't believe it's been this long before we've ever sat down and talked.
01:26:28.040 Because we've been in.
01:26:29.140 Oh, I know.
01:26:29.660 I was at the, it was the Paramount lot that you were on, right?
01:26:34.700 I was at the Paramount lot.
01:26:36.340 I'd see this gigantic building, your face all across it.
01:26:40.660 It was quite a run.
01:26:43.220 I loved it there.
01:26:45.180 And I've had a great relationship with CBS.
01:26:48.260 I still have primetime shows on with them.
01:26:50.340 And they've been great partners.
01:26:54.660 We're still in business together, have a great relationship.
01:26:57.800 And we were the longest-running show in the history of the Paramount lot.
01:27:02.600 And they are 105 years old.
01:27:06.720 That's a lot of history on that lot, too.
01:27:08.700 We were on stage 29.
01:27:10.380 And when I walked on the set for the first time,
01:27:13.900 they were getting ready to take down a picture of Lew Alcindor, not Kareem Abdul-Jabbar.
01:27:19.240 It was when he was at UCLA.
01:27:21.720 He had this picture up there, and it was kind of turned sideways.
01:27:24.320 And I said, what is that picture up there?
01:27:26.920 And they told me, and they said, how did he get there?
01:27:29.020 And they said, well, Arsenio Hall hung it when he was here.
01:27:33.580 Wow.
01:27:33.880 And I said, really?
01:27:35.680 And they said, yeah.
01:27:36.540 His was the longest-running show ever on this stage, stage 29.
01:27:41.940 He was here for five years, the longest run ever on this stage.
01:27:45.640 I said, do not touch that picture.
01:27:49.120 Do not touch that picture.
01:27:50.960 And we were there 21 years.
01:27:52.680 And when we left, I brought that picture.
01:27:54.760 And it's now hanging on my new set at Merit Street Media.
01:27:58.760 That is great.
01:27:59.180 And when my son launched the doctors on stage 30 next to us,
01:28:03.620 I had one of those pictures made and hung it at the same angle on stage 30 right next to him.
01:28:08.560 That is so great.
01:28:08.720 And he was there for, I think, 14 years.
01:28:10.900 Wow.
01:28:11.580 So that's... I tell you, you need to get one of those.
01:28:14.580 Yeah, I know.
01:28:15.480 I'll get one.
01:28:16.480 Yeah.
01:28:16.980 Thank you, Don.
01:28:17.320 Glenn, thanks so much for having me.
01:28:18.700 God bless you.
01:28:24.380 Just a reminder, I'd love you to rate and subscribe to the podcast
01:28:28.460 and pass this on to a friend so it can be discovered by other people.
01:28:31.540 Thank you.