Ep 211 | Dr. Phil's WARNING for Parents & His Advice for Trump's Legal Team | The Glenn Beck Podcast
Episode Stats
Length
1 hour and 29 minutes
Words per Minute
141.57
Summary
Dr. Phil has been a staple on daytime television for over 25 years, saving countless troubled marriages with his ability to deliver unapologetic tough love with a dash of Southern charm. He's guided thousands of people who have been struggling with everything from their weight to their relationships. Before blazing into his number one television career, Dr. Phil was a successful trial consultant. He had clients like Exxon and Oprah Winfrey.
Transcript
00:00:00.000
This winter, take a trip to Tampa on Porter Airlines.
00:00:05.460
Enjoy the warm Tampa Bay temperatures and warm Porter hospitality on your way there.
00:00:11.420
All Porter fares include beer, wine, and snacks and free, fast-streaming Wi-Fi on planes with no middle seats.
00:00:18.860
And your Tampa Bay vacation includes good times, relaxation, and great Gulf Coast weather.
00:00:25.240
Visit flyporter.com and actually enjoy economy.
00:00:34.020
Today's guest may have been your first therapist, depending on your age.
00:00:39.220
He has been a staple on daytime television for over 25 years, saving countless troubled marriages with his ability to deliver unapologetic tough love with a dash of Southern charm.
00:00:54.600
He's, you know, guided, I don't know, thousands of people who have been struggling with everything from their weight to their relationships.
00:01:05.300
Before blazing into his number one television career, he was a successful trial consultant.
00:01:16.300
In fact, one that really put him on the map was his client, Oprah Winfrey.
00:01:24.980
Founder of a brand-new media company here in Texas.
00:01:28.320
He ditched Hollywood for the free state of Texas.
00:01:31.440
So, just like he always says to the people on his show, how's that working for you?
00:01:39.140
It is called We've Got Issues: How You Can Stand Strong for America's Soul and Sanity.
00:01:44.620
This, I think, is a Dr. Phil, at least I haven't seen before.
00:01:50.280
I think you're going to enjoy our conversation.
00:01:54.440
Before we get to Dr. Phil, let me tell you, last year, because of you,
00:02:00.440
Preborn's network of clinics saw over 58,000 babies saved.
00:02:07.260
We should celebrate the lives of the precious babies that were saved,
00:02:14.340
When Charlotte found out she was pregnant, she was seven weeks along.
00:02:17.640
In the back of her mind, she had no support from anybody.
00:02:20.260
She thought abortion's going to be the best solution.
00:02:22.480
But then she went into a preborn clinic, and they gave her a free ultrasound,
00:02:38.200
So she got all of the postnatal care that she needed,
00:02:41.480
and all the way to baby clothes and books and diapers and everything else she might have needed
00:02:47.460
Each of these babies are miraculous, and so are their moms.
00:02:55.000
$28 a day can be the difference between life and death.
00:03:04.860
When a mom meets her baby on the ultrasound and hears their heartbeat,
00:03:11.220
Will you be the person that either makes a major gift or even a $10 gift
00:03:47.980
Nice to have you in the studio, but also nice to have you in Texas.
00:03:58.680
So, I've actually recorded a couple of audio books in this building.
00:04:03.320
Yeah, a long time ago, right by the front door there.
00:04:10.880
I bought it about 10 years ago, and we've changed it a lot.
00:04:20.580
I was in it when it was more warehouse than studio.
00:04:24.840
So, you've really turned this into a broadcast center.
00:04:28.520
I mean, this studio is the largest studio in the Americas that's in daily TV production.
00:05:05.820
I mean, it's the same kind of common sense, shoot from the hip.
00:05:11.580
But I highlighted a few things I just want to go through.
00:05:19.840
You said at the very, very beginning that you have been doing this for a very long time
00:05:26.800
and you've been listening to people who have problems, relationships, but you noticed something
00:05:41.460
You have to understand, having been doing this for 25 years or a little more, actually,
00:05:48.900
spending the time I did on Oprah and I started my own show in 2002.
00:05:54.820
And I didn't really think about it until I sat down and started timelining this out, Glenn.
00:06:01.000
But in 2002, the first text message hadn't been sent.
00:06:12.600
So along about '06, '07, we started to get much more into the internet.
00:06:23.580
It was like a bunch of C-130s flew over and dropped smartphones on everybody.
00:06:29.200
And that's when I saw as big a change in our society as has happened in my lifetime, for
00:06:38.260
I think as big a change to mankind as has happened since the Industrial Revolution.
00:06:46.120
We are walking around with as much computing power in our hand as we had when we did the
00:06:53.500
Yeah, and especially with what's coming, they say that the last 400 years, all of the changes
00:07:02.800
in the last 400 years will now be compressed between right now and 2030, 2035.
00:07:15.580
I mean, we are animals and our instincts, everything comes from millions of years of experience.
00:07:24.740
And it's showing, because if you look, particularly at our young people who immerse themselves in
00:07:31.740
this technology, we're seeing the highest levels of anxiety, depression, loneliness, suicidality
00:07:39.080
among our young people, starting in '09, '10, right after we had all of this technology boom,
00:07:47.300
that have been recorded, the highest levels that have been recorded since they started keeping
00:07:54.180
Our young people stopped living their lives and started watching people live their lives
00:08:01.660
But the problem was, they're comparing themselves to fictional lives.
00:08:06.520
These influencers over there, I've had them on the show that have said, look, I shoot a
00:08:12.300
video with all these fancy clothes and saying, okay, I'm in a rush.
00:08:18.280
And they say, as soon as the video's over, I carefully take those clothes off because I
00:08:24.480
I have to take them back to the store because I just brought them home.
00:08:28.880
Now, I don't have the money for those, so I take them back.
00:08:35.860
So kids watch this and say, by comparison, what a loser am I?
00:08:49.400
And they're comparing themselves to this fantasy life that doesn't even exist.
00:08:54.740
We have a place out in Santa Monica where they have a fake fuselage to a private jet that
00:09:01.580
rents out for 15 minutes at a time where these influencers go in and pretend they're on a
00:09:09.760
Going to Cabo or going to Aspen, they put on their ski clothes and say, oh, off to Aspen.
00:09:15.760
They'll go in and shoot a whole year's worth of content, changing clothes from beach to ski
00:09:26.500
And kids compare themselves to that and say, I don't ever go anywhere.
00:09:30.780
They went over to Santa Monica and shot all this phony content and put it out on the internet
00:09:38.980
You know, when I first got into radio and then later television, it took- it was hard
00:09:47.320
work to curate an audience, to know who you were, and then to create and curate an audience.
00:09:58.440
And everything that people used to say about, oh, he's only saying that because he wants
00:10:02.600
to get rich, or he's only saying that because he wants, you know, people to watch him.
00:10:06.520
No, you can't be, you can't be who you are or I am for very long if you're fake, I think.
00:10:16.960
But the people in the audience now who have their own audience, that is what they're doing.
00:10:26.520
It's like we're in some sort of weird nightmare.
00:10:29.800
We are, and it doesn't last very long, but the problem is there's one standing there to
00:10:37.900
They'll get 100,000 followers, maybe they'll get 500,000, but they flame out in a short
00:10:44.120
period of time, but then there's the next one coming right behind them.
00:10:47.700
Um, I was looking at some stats, uh, just today and it's something like 60% of Gen Z, 25 and
00:11:00.160
under that say they would rather be an influencer than a doctor, a lawyer, an architect, whatever.
00:11:12.680
They don't understand how difficult it is to monetize that content, how difficult it
00:11:20.220
They just think, I'll just put stuff up and get money.
00:11:24.880
So when the iPhone first came out and I noticed everybody started doing this, looking down,
00:11:30.880
um, I said, we are running the biggest experiment on humankind that has ever been run.
00:11:43.180
Everybody completely change your life with this.
00:11:48.320
And people thought that was crazy at the time because you know, it, it does bring a lot of
00:11:54.540
You can, it was amazing when you could actually see somebody on the other side of the world
00:11:59.620
that was just an individual telling you what was going on.
00:12:02.920
So is it, is it this bad experiment that we're running or is it that and the combination of
00:12:12.620
really bad actors, uh, that are knowingly using this and creating such dystopian.
00:12:25.460
And look, obviously there are great advantages to this technology, right?
00:12:30.900
Uh, you know, some kids don't even know what a library is.
00:12:34.140
And if you happen to be listening, it's a big building with books in it.
00:12:41.680
Uh, now just think how much information we have at our fingertips.
00:12:46.360
Now you got to check and make sure that it's not wrong information.
00:12:51.040
Um, but, but even the stuff that we, because we're digital now, I just, just read this,
00:12:57.080
that information in our libraries, that's solid, but the digital information can easily
00:13:09.400
And some of it is you look for things that, you know, you saw, you know, it was on the
00:13:23.880
This AI... I've seen myself in ads for products, and it's me.
00:13:37.960
And I'm peddling a product I've never seen or heard of.
00:13:47.340
They just shut that down and open up a new entity and they're right back at it again.
00:13:55.520
I mean, you can't get rid of them as fast as they pop back up.
00:14:01.060
But obviously there are huge positives to this, but you ask, is it, is it bad actors?
00:14:14.760
I, I, I deal with women that get caught up in these romance scams.
00:14:21.040
I've had them that they've worked their whole life.
00:14:28.760
They've worked their whole life, saved up three, $400,000.
00:14:36.580
Some Nigerian in some workroom that's got 30 or 40 of them up on their computer connects
00:14:50.200
We actually got a copy of their playbook and they start scamming these women and take them
00:14:59.800
And then we've got bad actors, I think, in terms of who's running the algorithms.
00:15:11.560
I wrote a book a long time ago about AI and talked about, don't fear the technology per se.
00:15:20.880
Fear the people who are writing the algorithms.
00:15:28.840
But you wrote, while you're getting fed highly curated, highly filtered information, you aren't
00:15:38.900
And you talk about how you're not even in charge anymore.
00:15:49.000
You open up Instagram or TikTok or whatever and it starts, you start scrolling through
00:15:56.080
and it's showing you this and showing you that.
00:15:58.100
And you think, well, this is coming from somewhere and I wonder why I'm seeing what I'm seeing.
00:16:06.060
Maybe most people don't wonder why they're seeing what they're seeing.
00:16:08.580
But the fact is, I include a study in here where they opened one up, opened up an account with
00:16:20.360
They created an account for a 13-year-old girl, just put up her name and her age, 13 years old.
00:16:26.800
And within minutes, they started feeding her toxic information, just really information
00:16:38.960
that was upsetting for her and not in her best interest.
00:16:43.560
So they came back and said, well, all right, let's see what happens if we give a clue about
00:16:50.640
So they changed the label to Lauren Lose Weight, gave a clue to the algorithm about what she
00:16:59.040
The amount of toxic information she got within minutes went up like 10x.
00:17:07.480
In a matter of minutes, they started directing her to 700-calorie diets, 400-calorie diets,
00:17:15.740
anorexia sites, all sorts of things started bombarding her.
00:17:26.160
If they show you a box of puppies, and you think, whoa, that's really cute, and so you
00:17:31.840
click it a few times, you think, yeah, cute puppies, okay.
00:17:33.940
But if they show you something upsetting, like sick puppies or abandoned puppies, something
00:17:44.320
that upsets you emotionally, it gets you jacked up, you're going to really start clicking because
00:17:53.360
And so instead of just kind of clicking and laughing, clicking and laughing, you start really
00:17:58.600
And the more you click, the more money they make.
00:18:00.720
So they feed these girls this information that gets them emotionally invested, gets them
00:18:08.520
They stay on longer, they click longer, and what happens, of course, then is more ads come
00:18:16.900
Now, they do this knowing, and we've seen the information, that the girls get anxious,
00:18:22.100
they get depressed, their self-worth goes down.
00:18:31.840
So they continue to feed them upsetting content because they click more and get more ad exposure.
00:18:40.700
We've seen the documents that say they know that.
00:18:43.160
So they feed them upsetting information because it creates more ad revenue.
00:18:47.920
So your kids are not just seeing what randomly comes at them.
00:18:53.080
They're actually being targeted by harmful information because it creates more money for the major social media platforms.
00:19:04.160
And your child doesn't know it, and you don't know it, but they're victimizing your child consciously.
00:19:11.900
And I'm putting it in here because people need to know it.
00:19:14.840
And what you talk about here, too, is the censoring of information, stuff that you may want to know, may not want to know, but it's what they want you to know.
00:19:29.520
And that censoring of information, especially in my world, is growing at a dramatic and terrifying pace.
00:19:42.020
You know, I spent a lot of time in the litigation arena.
00:19:45.380
You know, not three minutes from here, I had a company, Courtroom Sciences, Inc.
00:19:54.120
And so we spent a lot of time studying how jurors problem solve cases.
00:20:04.540
And we discovered that, out of that, a jury might break this down to maybe 50 or 60 facts.
00:20:14.620
And out of that 50 or 60, eight or 10 may drive their decision.
00:20:19.720
It was our job to isolate what are those eight or 10 decision-driving facts and how can they be presented in the most effective way.
00:20:30.540
And if you understand what those facts are and how they can be presented most impactfully, then you've got a real leg up.
00:20:44.380
And one of the things we learned real quick is jurors decide cases on what they see and hear, not on what they don't see and hear.
00:20:54.260
So you'll have a lawyer that says, well, we tried to get something in and they objected and there was a big fight.
00:21:00.760
And the judge said, well, we can't let that in now, maybe later.
00:21:04.980
And they look at the jury and they say, well, they knew we had something powerful.
00:21:12.660
They decide on what they see and hear, not what was implied, not what you inferred.
00:21:21.780
And if they don't, it doesn't have a lasting effect on them.
00:21:26.220
So people that are censoring and deleting information, making sure it doesn't get in your feed, then I promise you across time.
00:21:38.040
Those are not the decision-driving factors in somebody making up their opinion, forming an opinion and solving a problem based on that information.
00:21:48.260
And when they're curating this information, when they're choosing what you see and what you don't see, they're forming your opinions.
00:21:57.220
What's frightening to me is they have, you know, you don't think of, you worked for Exxon and, you know, did court cases with Exxon.
00:22:06.960
When we used to always look at those companies and go, well, they're not going to lose because they got all the money in the world and they know exactly.
00:22:14.920
They can just figure out the jurors and everything else.
00:22:19.340
But that's what's being done on every American now by our own government and by social media.
00:22:31.380
But I got to tell you, in 2008, when everybody was starting to be called a racist, I thought, I think we're doing really well.
00:22:42.840
You know, I grew up at a time when you didn't really notice it.
00:22:47.480
The Martin Luther King idea, at least I grew up in Seattle, at least there, it wasn't an issue.
00:22:57.040
And then all of a sudden we're being told, you're a racist, you're a racist, you're a racist.
00:23:01.060
And it is some of the greatest psychological and behavioral scientists alive today that are doing it.
00:23:09.640
Well, that's what I call in the book, tyranny of the fringe.
00:23:25.600
And a lot of this, we have to remember, when we talk about Russia, for example, Pavlov, who is one of the greatest behavioral psychologists in the history of the field, was Russian.
00:23:43.280
And so, trust me, they are good at what they do.
00:23:50.240
And we have a document from the 60s that I talk about in the book.
00:24:02.080
And they were talking about the subverting American society, American culture.
00:24:10.020
And they describe it as psychopolitics as well.
00:24:14.400
And they're talking about how you can control the minds and the morale and the emotions of the society.
00:24:25.360
And their conclusion was, they've already done it for us because they're attacking each other.
00:24:35.760
They may have freedom of speech under the First Amendment, but they're muzzling each other.
00:24:40.500
This cancel culture that we have now is an advanced version of what they were talking about with the psychopolitics of the 60s.
00:24:49.700
But when I'm talking about psychopolitics, I'm talking about brainwashing people, controlling what people say, what they feel comfortable talking about.
00:25:01.700
And if they dare to question what these activists are talking about, what they're pushing, what they're peddling, then they are attacked with a vengeance.
00:25:19.560
And it's to the point where they call their job.
00:25:26.480
They get them where their own family won't talk to them.
00:25:33.080
I was struck by, I was over in London during Gay Pride week and month.
00:25:38.740
And I mean, on the castle, on every government building, in every store, they were flying the rainbow flag.
00:25:54.400
I kept thinking to myself, everyone, everyone wants to fly that flag.
00:25:59.820
And I think it's a lot like the people in Germany that hung the political party flag.
00:26:17.420
And it seems as though there are those who are awake, who see it, and see it for what it is.
00:26:33.620
And then there's those who, I mean, I've done my job for 25 years trying to say, wake up, wake up, wake up.
00:27:13.800
One of the things I talk about in the book, and I entitled the book, We've Got Issues,
00:27:28.140
And, you know, I get hate mail for saying I love this country.
00:27:36.400
I put my hand over my heart when they play the national anthem.
00:27:41.760
And I love this country enough to acknowledge that we've got problems.
00:27:49.220
But I love it enough to not be defensive about the fact that we've got problems.
00:27:54.520
And I think that's a good thing to say, I admit we've got problems.
00:28:08.960
My research has shown me that the majority of universities
00:28:17.740
have utilized or are utilizing trigger warnings.
00:28:23.420
I saw a couple of universities are using trigger warnings for Romeo and Juliet,
00:28:29.440
where they say trigger warning, suicide content.
00:28:34.160
Well, spoiler alert, come on, kind of gave the storyline away.
00:28:49.240
When you research trigger warnings, and please, if you're listening to this, fact check me.
00:29:02.240
I mean, go another level and research trigger warnings, and you will find that the vast majority,
00:29:11.440
overwhelming body of literature says trigger warnings not only don't work.
00:29:20.680
They actually create the problem that they were designed to avoid.
00:29:30.420
There is evidence-based therapy designed to teach people to cope with things that stress them out, right?
00:29:39.620
Systematic desensitization, dialectical behavior therapy.
00:29:43.720
There are a number of evidence-based therapies to teach people to overcome these stressors in their life.
00:29:58.720
They say, okay, some things are going to come up that might stress you, so we're going to warn you, which is stressful.
00:30:05.860
And you can go over here and sit in a corner and avoid this and pretend it's not there.
00:30:12.300
Problem is, when you get out of college and get out in the real world, that doesn't happen.
00:30:22.220
So you're not preparing people for the real world.
00:30:29.880
Research says they actually hurt and that the better method is to learn to cope with them.
00:30:35.820
So later in life, you're not still paralyzed, whether it's PTSD or whatever it may be.
00:30:43.900
And the trigger warnings have been for some really ridiculous things.
00:30:47.280
But assuming that they're for something that was traumatic, you need to learn to cope with that.
00:30:55.200
These universities that are employing them have the same access to the same research that I do.
00:31:02.140
So if I can go out there and find out that trigger warnings are contraindicated, that you should not use them, that they actually create problems.
00:31:14.680
If I can look that up and find it, so can every university that's employing them.
00:31:24.640
They're wanting to seem like, I am super woke here.
00:31:31.520
I'm protecting all of the students and creating a safe place for them to get an education.
00:31:40.140
The problem is that's teaching them to stop on green and go on red.
00:31:51.280
Now, if I can look that up and find it and see it, so can they.
00:31:55.140
So they are knowingly teaching these people something that doesn't help them and actually hurts them.
00:32:11.160
Natan Sharansky wrote a book called The Case for Democracy.
00:32:16.360
In it, Sharansky created what he called the Town Square Test.
00:32:23.820
Well, the whole idea here is you have to be willing to speak up, speak out, and say what you think.
00:32:37.180
And the whole idea is that the Internet, for example, should be like a town square, right?
00:32:47.040
You should be able to talk about whatever you want to talk about and have an exchange of ideas.
00:32:58.620
When you put out something that is at odds with the agenda, the agenda is intolerant when you're dealing with these activists.
00:33:13.540
And I think it was Richard Feynman that said, I would rather have questions I can't answer than answers I can't question.
00:33:24.840
If you've got answers you can't question, that's worse than having questions you can't answer.
00:33:35.680
You've got answers you can't question.
00:33:39.000
And that's worse than having questions you can't answer.
00:33:41.320
You said at one point that the town square test, the way to know if you're living in a fear society, is if a person cannot walk into the middle of town square and express his or her views without fear of arrest, imprisonment, or physical harm.
00:33:56.480
By that definition, we're not living in a fear society, at least not yet.
00:34:00.540
In a fear society, in a real town square, when a person's getting silenced, you actually see them getting attacked or muzzled or arrested or dragged away.
00:34:09.540
This is what I called a few years ago and got a lot of heat for it, a digital ghetto.
00:34:19.840
They could talk all they want behind that wall.
00:34:40.820
Think about George Orwell's 1984, which was written in 1949. Think about how prophetic this was.
00:34:53.120
And he talked about someone would say something they shouldn't say.
00:34:59.360
They would fail the town square test, and they would get unpersoned.
00:35:09.040
I mean, everything, they would disappear from the records.
00:35:13.860
They would just, you couldn't find anything about them.
00:35:22.640
And in the book, they started eliminating words.
00:35:34.540
And you could only use these words and not use any others.
00:35:37.940
And people started saying, well, I actually like that because I don't want to have to make any decisions.
00:35:42.480
Just tell me what words I can use, and I'll use those, and then I don't have to think.
00:35:52.720
We have the First Amendment protecting free speech, and we're muzzling each other.
00:36:00.280
It's not the government coming in and taking it away.
00:36:04.740
You say something I don't like, and we will attack you.
00:36:18.140
And you'll wear the scarlet letter and be unacceptable anymore.
00:36:24.360
And it's got people, I have one statistic in the book that says the percentage of people that are afraid to express their opinion has tripled since 1950.
00:36:43.180
And think of 1950 as the beginning of the Red Scare.
00:36:48.020
People are just saying, I'd just rather not say anything.
00:36:53.320
You don't have a civilization when that happens.
00:36:58.960
And what you're saying is you've been trying to wake people up, and have.
00:37:11.860
And I think that a lot of these activists have pushed too far, too long, too hard.
00:37:32.760
My grandmother said, you've quit preaching and gone to meddling now.
00:37:40.400
There's all kinds of research, historic research, that shows that the final stages of an empire
00:37:49.340
always come at the end with questionable sexuality, questionable sexual morals,
00:38:07.260
And for some reason, that's the last straw that comes before it collapses.
00:38:30.340
I believe that if we really want this culture, this society to flourish, that it's the number
00:38:48.140
one principle I write in the book, be who you are on purpose.
00:38:52.840
Don't wake up and ride the river wherever it's going.
00:38:59.880
You got to decide what's important to me and what am I willing to do to stand up for that.
00:39:11.080
That's why the subtitle to the book is How You Can Stand Strong for America's Soul and Sanity.
00:39:17.320
And the soul of the nation is a big word, right?
00:39:21.240
I mean, to talk about the very soul of the nation.
00:39:24.540
But when people are trying to rewrite history, biology, science, all of that, how does that work?
00:39:36.040
I mean, you don't just decide, you know, I don't like the way this is, so I'm going to rewrite it.
00:39:49.080
I'm like, if that's what you want to think, okay.
00:39:53.580
We're going to demand that you stand up and say you agree with us.
00:39:58.820
It's not enough that you let us think that way.
00:40:27.800
We're in that position right now where it isn't enough.
00:40:37.180
How do you get a country where only a third voted for the Nazis?
00:40:41.500
How do you get them to raise their hand and give the Hitler salute?
00:40:46.420
They were physically beaten in the streets by the SA.
00:40:50.080
We are not physically beaten, but we have a massive psychological game being played.
00:41:01.580
The social media platforms, all that are game changers.
00:41:06.400
Because think about it, before that, if you were up in Kansas and you were living out on
00:41:13.820
the farm and you had some wacky idea, you could tell a couple mouth breathers down at
00:41:19.620
And you guys could get out in the woods and talk to each other and that's about as far
00:41:28.060
But now, you know, on the Internet, it gets oxygen because you got enough other people
00:41:38.480
I want to find something that distinguishes me.
00:41:43.480
And that's why I say that I'm so worried that family in America is under attack.
00:41:51.380
Religion has dropped below 50% in America for the first time in our country's history.
00:41:56.360
People want to belong to something somewhere, somehow.
00:41:59.940
And absent a quality choice, they'll grab onto anything.
00:42:03.800
And so, you know, here's somebody with a wacky idea and they'll love bomb you and accept
00:42:19.420
And so, before you know it, those four or five guys out in the woods now are connected
00:42:27.880
And then those four or five hundred, that's why Rick Alan Ross, who's, I think, the
00:42:34.180
best cult guy around, says we've probably got 10,000 cults operating in America right
00:42:45.540
And then maybe once a year, they all get together and physically.
00:42:51.220
Because I've been fascinated with technology my whole life.
00:42:57.180
And I started reading Ray Kurzweil back in the 90s and what was coming and AI and, you
00:43:06.280
know, ASI and all of the things that we're now beginning to experience.
00:43:11.480
And it is very hopeful and very, it's miraculous, but it is also deadly.
00:43:18.980
So, is it, is it the fact that, I mean, how many movies do you have to watch where you
00:43:28.080
know how, turn the air back on, you know what I mean?
00:43:32.720
And we just seem to have this, like, normalcy bias in a way that this time it's different.
00:43:39.760
No, and I think the way to inoculate against that is, as I said, we have to ask ourselves,
00:43:55.000
And are we doing anything about what we're about?
00:44:03.500
And I look at what's being taught in the colleges and universities right now.
00:44:09.660
I said not long ago that our elite universities were fostering intellectual rot.
00:44:21.580
I mean, we have this invasion of Israel and, I mean, those were murdering assassins that came
00:44:29.560
in and attacked, raped, beheaded, set on fire, and children in their cribs set on fire.
00:44:39.220
And I said, look, I am going to speak about this.
00:44:43.480
And I was speaking to representatives of the Israeli government.
00:44:48.000
And I said, look, I can't talk about this based on descriptions and hearsay.
00:44:53.680
I don't want to see visual proof of this, but I can't talk about it if I don't.
00:45:03.680
So the Israeli consulate had the IDF bring to my home here in Dallas classified footage
00:45:18.780
And I watched, and a lot of it was GoPros from Hamas.
00:45:27.660
Some of it was cell phone video from Israelis that were murdered and fell on their cell phones
00:45:34.680
and they opened them up and saw what was there, saw what happened.
00:45:44.260
And as I said, I don't know enough about politics to speak about it.
00:45:49.960
I think a lot of people don't that speak about it.
00:46:03.940
I know when somebody goes into a noncombatant's house and kills an infant
00:46:21.820
Some of them, I just, I see a banner that says,
00:46:37.960
Walk that banner into the Gaza Strip and see how far you get.
00:46:41.740
You're cheering on people that would sooner kill you than look at you.
00:46:46.340
And how have they not been taught critical thinking?
00:46:54.720
And do I write off the fact that many Palestinians have been killed,
00:47:04.280
20,000 and counting, and many of them children or civilians?
00:47:10.300
But being killed in collateral damage from a bomb dropped as an act of war
00:47:17.520
is not the moral equivalent of what was done by Hamas when they came into Israel.
00:47:22.340
You said a couple things that I think are interesting.
00:47:24.760
You talked about, you know, you're not an expert on it,
00:47:27.820
but you know the difference between right and wrong.
00:47:30.020
I think our society has been trained just to listen to the experts.
00:47:36.760
But you can, as an individual, and must as an individual,
00:47:41.360
look at the situations, listen to all sides, listen to what's going on,
00:47:46.380
and then make a judgment, not necessarily as the person in charge,
00:47:51.120
but absolutely a judgment if you can tell the difference between right and wrong.
00:47:55.780
If you're murdering and setting babies on fire,
00:47:59.000
I don't need to listen to anything else you say.
00:48:09.720
I think they do, but I don't think that they're finding a voice the way some of the activists are.
00:48:23.320
The activists, the tyranny of the fringe that I talk about in the book,
00:48:37.000
And so once they identify an enemy, then they can rally towards that enemy.
00:48:44.000
When they have an identified enemy, they'll have a place to go, a time to arrive, a target to focus on.
00:48:54.200
Whereas this middle America doesn't have an identified enemy.
00:49:07.360
They want to live peacefully and accept one another and love one another,
00:49:11.760
which I'm so glad about in one sense, but you have to pick your battles.
00:49:22.480
And so here we've got this tyranny of the fringe out here that are snipers.
00:49:28.140
And they're targeting people, and they're pushing this narrative.
00:49:36.840
And middle America, millions and millions and millions and tens of millions of people don't have an identified enemy.
00:49:52.660
The one thing about America, we didn't agree on anything, on anything, except a few core principles.
00:50:02.960
And all of our laws were based on Judeo-Christian principles.
00:50:20.020
How do you bring that back, and can it be brought back together?
00:50:24.080
It has to be, because, you know, I talk about the fact that we have to choose all behaviors based on results.
00:50:36.520
And that means that we have to support a meritocracy.
00:50:40.260
Look, this stuff about equality of outcome, come on.
00:50:46.580
I mean, if you've got a guy sitting home in a beanbag eating Cheetos, and he's going to get the same outcome as the person that—
00:50:54.780
I'm going to sit in the beanbag and eat Cheetos all day, too.
00:50:57.200
Yeah, he gets the same outcome as the person that gets up at 6 o'clock and goes and totes that bale and works at the lumberyard all day or goes to medical school or whatever it is.
00:51:10.940
And I saw this happen when they mismanaged COVID and spent $5.5 trillion.
00:51:22.640
And, again, I'm not being political, I'm being psychological here.
00:51:27.280
When you pay people not to work, and, in fact, you pay them more to not work than to work, and when you figure in that gas was $5, $6, $7 a gallon in L.A., I drove past some stations where it was $7 and a quarter.
00:51:43.120
I took pictures because I went home to Robin and said, gee, look at this.
00:51:51.180
And they're having to commute, and so it's going to cost them $300, $400 a week to commute.
00:51:57.560
Or they can sit home in that beanbag, and they get unemployment plus a $600 a week bonus and then another bonus on top of that.
00:52:05.460
And then they get a stimulus check for, what was it, $1,250 per person.
00:52:10.940
I had some friends with a family of four that, honest to God, they were getting $10,000.
00:52:25.960
And when you take all of it together, it was like $5.5 trillion.
00:52:30.100
$4.4 trillion of it went into checking or savings accounts.
00:52:35.460
They didn't need it, obviously, because it went into savings.
00:52:37.860
So you're paying people to not work, and then they say, I don't understand what happened to the supply chain.
00:52:45.880
Why can't we get anybody to unload all of these ships out in Long Beach Harbor?
00:52:50.040
They're backed up out here for miles, and we can't get anybody to unload them.
00:53:28.520
You need to move quickly and find the safest ways to invest so you can protect yourself and your family from whatever dark days lie ahead.
00:53:38.740
It's just in what condition do we get to the other side?
00:53:41.480
That's why I recommend you protect your hard-earned savings with an asset you can trust, gold.
00:53:46.760
I made my very first gold purchase in the days when I was listening to Rush Limbaugh, I think back in the very late 90s, early 2000s.
00:53:55.400
And I was listening to Rush, and he talked about Lear Capital.
00:53:59.060
It was his sponsor of gold for a very, very long time.
00:54:02.560
The person I called at Lear Capital still works there today.
00:54:05.900
And the investment I made has, I mean, it's eye-bleed crazy.
00:54:15.240
Lear helped me prepare for the coming insanity.
00:54:24.600
I want you just to call Lear Capital and just get a booklet that they have on what you can do.
00:54:42.840
You'll also, if you decide to buy, Lear will credit your account $250 towards your purchase because you got this from me.
00:54:49.760
Call today, 800-889-3070, 800-889-3070, Lear Capital.
00:54:57.380
So, you said several times, I don't want to get political.
00:55:02.560
But in reading your book, you don't talk about politics.
00:55:07.820
But it does, and maybe it's because everything is political now, but it does seem, common sense seems political right now.
00:55:16.500
Well, politicians talk about a lot of the cultural issues that I talk about.
00:55:23.480
I'm talking about cultural issues in terms of the fact that I think family is the backbone of America, and I think families are under attack.
00:55:33.400
They're under attack by the big social media platforms.
00:55:37.320
I think they're under attack by these fringe activists.
00:55:46.340
And that's why I say, be who you are on purpose.
00:55:53.100
Decide, my family's going to be pulled together.
00:55:56.020
My family is, we're going to consciously make this family strong again.
00:56:01.100
And I think you have to make some of those choices.
00:56:09.980
You said, I've been working on my 10 working principles for a healthy society.
00:56:17.800
That's just a simple conversation that we won't have.
00:56:21.120
No political party will have this conversation.
00:56:27.380
Do you believe in some sort of fascism without the killing centers?
00:56:32.800
Or do you believe in the Bill of Rights and individual freedom?
00:56:37.980
In fact, they'll argue that we can't have that conversation.
00:56:41.180
But that's what we have to have to know who we are.
00:56:43.940
Two, focus on solving problems rather than winning arguments.
00:56:51.700
And you can know whether you're dealing with somebody that is trying to win an argument
00:56:56.940
or solve a problem in the first three or four sentences you just sit down and talk to them.
00:57:01.400
Because if they sit down and say, okay, how can I help here?
00:57:10.640
If I'm sitting down to negotiate with somebody, the first thing I always do is say, look,
00:57:18.360
But let's talk about what we have in common first.
00:57:24.420
Because then we've got a foundation to build on.
00:57:26.400
Because I've never one time done that, that we didn't realize we have a lot more in common
00:57:36.740
Every time you have a conversation with somebody, if they're honest, if they're engaged in an honest
00:57:41.920
conversation, and as you say, not trying to win, it always works that way.
00:57:47.500
And you don't have to love everything about the other person to love that person.
00:57:51.640
You don't have to like everything they do in order to work with them.
00:57:56.500
Principle number three, don't reward bad behavior or support conduct you don't value.
00:58:01.420
These are so fundamental psychological principles.
00:58:07.160
If your kid's throwing a tantrum in the grocery store, do you go give him a piece of candy?
00:58:16.280
Why reelect politicians that aren't doing the job?
00:58:20.700
If you've got pit bulls walking around your neighborhood, jumping on people and chewing them
00:58:30.240
Number four, measure all actions based on results and all thoughts based on rationality.
00:58:37.780
That, we are told, you don't understand science.
00:58:48.540
No, it's real easy to check some of these things out.
00:58:54.240
And you know, rationality sounds like a big word.
00:59:04.920
Does it get you closer to what you want and need?
00:59:09.180
There are some simple building blocks to answer whether something's rational or not.
00:59:18.080
You can ask yourself, first off, is it based on verifiable fact?
00:59:26.740
Let's just take on an issue that I know you've gotten a lot of heat for.
00:59:31.860
The American in me says, look, dude, okay, that is not my deal.
00:59:41.420
But don't expect me to say, oh, look at that woman over there.
00:59:48.520
Don't expect me to say that I have to call you or treat you the same because, honestly, you have some sort of mental disorder.
01:00:01.380
If you actually think you're something trapped inside of your body, you're not.
01:00:09.020
And there might be, but how do you say those things in polite society?
01:00:17.560
Our American ethos has always been live and let live.
01:00:23.400
Look, if you want to identify as Glinda, that's up to you.
01:00:29.840
So, if you want me to say that sex is assigned at birth rather than defined at conception or chromosomally, I can't find the science to support that.
01:00:48.440
But if you want to identify differently, what business is it of mine to tell you that you can't?
01:00:56.800
But don't force me to say that it's normal or rational or teach my kids that this is something that you should pursue or even experiment with.
01:01:13.520
And, you know, I had Dr. Carol Hooven on my show, professor at Harvard, one of the most respected and popular professors.
01:01:25.980
And she taught a course in, I think it's, I forget the title, I think it was biogenetics.
01:01:35.220
She was on my show and Fox and Friends and nicest woman.
01:01:40.100
I mean, this is a sweetheart of a woman's spirit.
01:01:48.660
And we had a transgender guest on, someone who had transitioned to a male and was a coach and was very happy in that position.
01:02:08.760
And when I came to Dr. Hooven, who was going to talk about transgender athletes, I came to her to talk about that.
01:02:18.320
And before she could even answer, she was very emotional and she was talking to this coach and said, I'm so happy for you that you're happy.
01:02:43.200
And he said, you know, there's, you know, there are more than two sexes.
01:02:55.280
And she said, I'm just telling you, you'll never be able to get a biological male to compete fairly with females.
01:03:16.900
And he said, well, how many were in your study?
01:03:19.460
I did a meta-analysis of, I think, 54 studies that looked at all of this.
01:03:26.540
And even with testosterone blockers, like, for example, you can't change the wingspan.
01:03:34.280
You can't change all of these different things.
01:03:37.220
And you can modify it some, but you'll never get on a level playing field.
01:03:44.260
And she said, like, with swimming, you can get within 10%.
01:03:47.200
But most swimming events are timed in hundredths of a second.
01:03:51.620
And if it's a two-minute race, 10% would be 12 seconds.
01:03:57.780
You know, they'd be down there turning around while he's standing up there waiting.
01:04:05.400
She got back to Harvard, and they labeled her transphobic.
01:04:09.880
And drummed her out of that university after 20 years.
01:04:19.440
That is the most accepting woman you could ever meet.
01:04:25.120
But I personally don't know anybody when you saw Bruce Jenner.
01:04:29.840
And he told his story of, I've been like this my whole life.
01:04:34.520
I feel like I've, I mean, what he spoke about when he was 20, when we were watching him win gold,
01:04:40.880
what was going through his head, I felt horrible for him.
01:04:54.060
But you have to draw the line and say, look, if I'm bringing you to the hospital, I'm not going to tell them and argue with them that you're the most beautiful woman ever.
01:05:08.060
And it's important for them to know, because that's science.
01:05:11.040
And look, if they want to identify as a transgender female, I get it.
01:05:22.320
And I talk in the book, I say, look, I'm not sure that I'm describing this right.
01:05:28.940
And if not, please help me, because for a long time, they did not say that sex and gender were the same thing as I understood it.
01:05:48.880
And I'm worried about what's happening with children.
01:05:54.220
If they're pushing that agenda, I just, but look, I've seen the damage.
01:06:02.860
You know, my son, at probably seven or eight, was exposed to pornography.
01:06:16.520
He was in a rabbit hole where he was seeing hardcore stuff.
01:06:24.220
How can people say that that is good and natural for children to see?
01:06:31.880
I mean, little children we've known forever protect their innocence.
01:06:38.000
Yeah, I think there's a real problem with what some people are wanting to make available to kids and illustrated books and that sort of thing.
01:06:51.760
Consciously choose which voices are in your life and deserve the most attention.
01:07:01.640
Don't stay silent so others can remain comfortable.
01:07:06.300
Yeah, that's what we've got to get people to do, is be willing to speak up even if it makes other people upset.
01:07:15.960
If it's who you are and what you believe, you got to be willing to speak it.
01:07:19.100
Principle seven, actively live in support of meritocracy.
01:07:27.200
Principle eight, identify and build your consequential knowledge.
01:07:31.080
Here's the thing, and that's probably an awkward word choice, but I meant it to be because I want it to stick out in people's mind.
01:07:43.060
Consequential knowledge is knowledge you have, skill set that you have, talent you have, ability you have, where they can't replace you by noon tomorrow.
01:07:52.320
If you're working somewhere and your job is opening the gate or filling an order or whatever, they can replace you by tomorrow.
01:08:03.100
If you are a computer repair person or a brain surgeon or you're doing something where...
01:08:11.740
If you have the institutional knowledge, just the institutional knowledge is, you don't let that person go.
01:08:19.620
I've had the same assistant that runs my office for 45 years.
01:08:32.180
She knows everybody at every vendor, every account.
01:08:36.220
If, if something happens to her, we're just going to have to shut down.
01:08:40.540
I mean, that's institutional knowledge, but find out what you're good at and vertically develop at that.
01:08:48.820
There's nothing wrong with being a jack of all trades, but you better be a master of one, have something you're good at.
01:09:05.300
Number nine, work hard to understand the way others see things.
01:09:09.400
I can't think of, I mean, all of these are so perfect.
01:09:19.800
You know, I work with law enforcement some, and I do training with them on different things from different angles.
01:09:30.300
And you talk to the FBI, you talk to hostage negotiators, they'll tell you, you're never going to get a hostage out alive if you don't convince the hostage taker that you understand why they took that hostage to begin with.
01:09:47.440
That's the number one predictor of whether or not you're going to get those hostages out alive.
01:09:53.200
If they understand that, okay, he gets why I took these hostages to begin, whether it's for political reasons or you've been hurt or once they understand, Glenn understands why I did this.
01:10:06.680
Then you've got a chance of getting them out that, hey, he gets me, he understands why this happened.
01:10:14.940
They need to know they've been heard, that you have heard what they have to say, why they did what they did.
01:10:21.920
We need to work hard at making people understand we get them, we see things through their eyes.
01:10:30.900
And then the last one, number 10, is treat yourself and others with dignity and respect, which seems so simple.
01:10:39.160
I mean, you could look at that and say, what, did you run out of 10?
01:10:47.900
If you don't treat yourself with dignity and respect, you can't give it to other people.
01:10:57.880
And I feel as though some people are hostage to their own normalcy bias or confirmation bias, and you just can't get them out.
01:11:12.520
Let me take a situation that is political and don't make it political.
01:11:17.020
Donald Trump is seeing jury after jury after jury.
01:11:22.420
In Washington, D.C., he's going to face a jury pool.
01:11:31.280
If you were advising, how do you get them to understand the case from his perspective?
01:11:48.360
I would absolutely do what I call plaintiffing the defense.
01:12:09.920
You don't want to go in there and defend Donald Trump.
01:12:19.800
You need to—you're going to come off a whole lot better if you can flip the script and say,
01:12:29.040
we need to decide what the motives are of the other side of this case.
01:12:37.960
And I would be asking them to say, to ask themselves, you're a first draft historian here.
01:12:48.240
Are you going to put together a case here where you are bringing this case in a way that is actually going to alter the course of American politics,
01:13:07.280
or are you going to let the electorate make a decision?
01:13:12.780
And I think you have to put the other side on trial.
01:13:17.860
I think you'll do a whole lot better by plaintiffing the defense instead of defending the defense.
01:13:26.240
I think you're a lot better off if you put the other side on trial instead of defending your guy.
01:13:53.540
Well, it's Merritt Street Media, and it's a 24-hour-a-day, seven-day-a-week network.
01:14:10.380
It's going to be based on, you know, empirical fact
01:14:15.680
and let people decide whether they think it's good news or bad news.
01:14:37.880
She'll be kind of at the top of our true crime vertical.
01:14:50.220
We've got—there will be some of my shows from daytime.
01:14:55.740
We've got a really interesting show from the behavior panel.
01:15:02.320
They're guys I've worked with from law enforcement
01:15:04.280
that they're from the military, homeland security, law enforcement,
01:15:11.420
guys that are really experts in deception detection and interrogation and all.
01:15:19.900
And they're going to have a really good time analyzing people that—from politicians to—
01:15:39.560
I mean, they get down to pupil dilation, blink rate.
01:15:44.380
You know, there are some things—if you're interrogating somebody and they're lying,
01:15:47.980
90% of the time, their feet are pointed towards the exit.
01:15:51.880
There are just things that people just don't know.
01:15:54.040
If you're lying, your blink rate goes from an average of 15 to into the 70s or 80s.
01:16:00.500
There are so many indications and signs that even if you know them all, you can't control them all.
01:16:07.240
And it's really interesting what we get into with these guys.
01:16:17.300
And we think we're going to launch into somewhere between 75 and 90 million homes day one.
01:16:30.660
The call signs when you pull it up on DirecTV or Dish or whatever will be Merit.
01:16:35.540
And that wasn't chosen at random, I guess.
01:16:46.540
And I built it at 50, and it damn near broke me.
01:17:06.380
You know, Glenn, I really, it's, I tell people if you don't have a passion in your life, you need to find one.
01:17:23.400
I'm more excited about launching this network than I was when I launched Dr. Phil back in 2002.
01:17:32.200
Because I feel like this country is really in danger right now.
01:17:44.100
And I think we've got a lot of things going on right now that I have relevant information about.
01:18:00.980
And I remember right over here at the Four Seasons out in the.
01:18:08.560
That was the first interview I gave when we were getting ready to take the show out to sell.
01:18:14.000
And for people that don't know, when you're in syndication, they sell your show market by market.
01:18:27.000
But then the rest of it, you went market by market.
01:18:30.640
But I remember he said, all right, we're going to make a pitch reel for this show.
01:18:35.280
And the first question he asked is, all right, what's the show going to be about?
01:18:39.280
And I remember the first things I said on camera about it.
01:18:43.140
I said, I want to talk about things that matter to people who care.
01:18:54.780
And I want to deliver common sense, usable information to people's homes every day for free.
01:19:10.080
I mean, if you talk about things that matter to moms and dads and husbands and wives and, you know, whoever, and those things that matter have changed.
01:19:22.360
As I said, when smartphones came out, it changed.
01:19:24.680
And over the last few years, it's changed to include more social issues than it used to because so much is going on.
01:19:38.700
They're concerned about so many things that five years ago, it wasn't on their radar as much.
01:19:55.200
Well, I'm not one to, you know, like I said, be who you are on purpose and do it with intention.
01:20:03.600
You know, when I was asking those guys about that, I said, why haven't you talked about this?
01:20:08.460
He said, nobody's asked us this question in this pointed a fashion.
01:20:13.420
You know, I said, are we sending children off into prostitution with tax dollars?
01:20:22.720
I said, is the camera, you know, you're on camera, right?
01:20:28.480
In fact, in the clip, it says, I'm grateful you're asking.
01:20:39.220
So I'm asking questions that I think people need to hear the answer to.
01:20:42.700
You said you wanted to reach a more broad audience.
01:20:45.680
The things that are political in this day and age, just narrow.
01:21:04.220
What what audience was missing, first of all, that you didn't have?
01:21:08.320
Well, when you're on in daytime, you know, 90 percent of your audience is female.
01:21:12.620
You know, and a lot of people are working during that time and, you know, they can record.
01:21:23.160
And we were one of the few shows that people recorded and actually did watch.
01:21:27.880
Our numbers from live to live-plus-seven changed, you know, significantly.
01:21:32.280
But I think being on in primetime, I can speak a lot to the male audience.
01:21:38.260
I can speak a lot to those Americans that are out there working hard and now they're
01:21:48.660
And I like having a news department where if something's breaking, I can walk over there
01:21:58.800
I don't know anything about producing news, but I've got people in there that do.
01:22:02.600
I'm real good at surrounding myself with people that are a lot smarter than me on things that
01:22:07.720
And I don't have any trouble acknowledging that.
01:22:12.800
People are going to take pot shots at me when I talk about Israel and the border and stuff
01:22:17.360
But I've never had a need to be loved by strangers, which works out great, doesn't it?
01:22:29.320
You know, if you believe in what you believe in, that's all that matters to me.
01:22:35.040
And if if somebody checks my facts and I'm wrong, I'll say, hey, somebody checked my fact
01:22:43.260
And I remember my first interview with Roger Ailes at Fox, I'd had dinner with him a couple
01:22:51.840
of times and he was a delightful man and a great storyteller.
01:23:00.900
And he didn't talk to me at the table for maybe three minutes.
01:23:18.920
And I thought, well, I'm just going to tell the truth.
01:23:24.200
He didn't talk to me for another three, four minutes.
01:23:27.980
What do you think of the international relationship that was fostered by the Eisenhower administration?
01:23:41.140
And I sat there and I looked at him and I said.
01:23:51.220
And hope that you won't notice that I'm bluffing, but you're too smart for that.
01:23:58.700
Or I could shut this interview down right now and just tell you, I got no idea.
01:24:16.040
And then he just started pummeling me with questions that, I mean, I think I lost 20 pounds of sweat.
01:24:23.920
And I got up from the table and I thought, well, this has been a total train wreck of an evening.
01:24:29.200
And instead, he stood up and he shook my hand and he said, it's very rare that you get to meet a man who knows what he knows, knows what he doesn't, and is willing to tell you the truth.
01:24:43.320
And I don't think, I think people worry too much about other people.
01:24:50.420
And they want to be right or have the answers or look smart.
01:24:55.140
It's so much better when you don't, because smart people know when you're bluffing.
01:25:01.080
And I used to train witnesses that were CEOs of, you know, Fortune 100 companies.
01:25:08.520
And we'd put them up on the stand and cross-examine them before trial and ask them a question they didn't know.
01:25:16.640
And they'd try to say, well, blah, blah, blah, blah.
01:25:30.460
I've got people that do know, and I can make them available if you want, but I'm not involved in that, but I support the people who are.
01:25:44.580
Just saying, I don't know, but I support the people who do.
01:25:50.460
And they'll love the fact that a CEO says, I don't know, but I support those who do.
01:25:58.740
And once we got that beat through their heads and got them to realize it's CEO-itis, they were great witnesses.
01:26:08.780
And I think everybody has a little bit of that now.
01:26:12.540
Everybody feels like they have to know, or they do know, and they don't know.
01:26:22.900
It's going to be nice to have you as a neighbor.
01:26:24.280
I can't believe it's been this long before we've ever sat down and talked.
01:26:29.660
I was at the, it was the Paramount lot that you were on, right?
01:26:36.340
See this gigantic building, your face all across it.
01:26:54.660
We're still in business together, have a great relationship.
01:26:57.800
And we were the longest running show on the history of the Paramount lot.
01:27:10.380
And when I walked on the set for the first time,
01:27:13.900
they were getting ready to take down a picture of Lew Alcindor, now Kareem Abdul-Jabbar.
01:27:21.720
He had this picture up there, and it was kind of turned sideways.
01:27:26.920
And I asked them, how did it get there?
01:27:29.020
And they said, well, Arsenio Hall hung it when he was here.
01:27:36.540
He was the longest show to ever last on this stage, stage 29.
01:27:41.940
He was here for five years, longest run ever on this.
01:27:54.760
And it's now hanging on my new set at Merritt Street Media.
01:27:59.180
And when my son launched the doctors on stage 30 next to us,
01:28:03.620
I had one of those pictures made and hung it at the same angle on stage 30 right next to him.
01:28:11.580
So that's a, I tell you, you need to get one of those.
01:28:24.380
Just a reminder, I'd love you to rate and subscribe to the podcast
01:28:28.460
and pass this on to a friend so it can be discovered by other people.