U.S. Government Wiretapped Former Trump Campaign Chairman Paul Manafort. Was He Undercover? And Will He Be Charged? Glenn Beck breaks it all down and explains why this is a big deal. Plus, a listener's prediction about the end of the world.
00:00:15.640Like your crazy ex that just won't go away, Russia is back in the news.
00:00:20.780Last night, two bombshells in the Russian investigation.
00:00:24.100Number one, U.S. investigators wiretapped former campaign chairman Paul Manafort under secret FISA court orders before and after the election.
00:00:36.220This is huge, unprecedented, never been done before.
00:00:40.060Two, the Mueller prosecutors told Manafort that they had planned to indict him.
00:00:49.020According to the sources that spoke with CNN, Manafort became the subject of an FBI investigation in 2014 that looked into a number of consulting firms that did work with Ukraine's former ruling party.
00:01:02.200Manafort is a guy that long before he had anything to do with Donald Trump, we told you about.
00:01:08.360He was a bad guy and shouldn't be near any presidential politics, no matter which side it was on.
00:01:15.680The second surveillance warrant came after the government intercepted communications between Manafort and suspected Russian operatives.
00:01:24.920And number two, on top of investigating Trump associates' communications with Russian operatives, Mueller has Manafort under investigation for possible violation of tax laws, money laundering and requirements to disclose foreign lobbying.
00:33:28.920I'm one of those idiots that used to hop around at every single site.
00:33:32.760And then I would start questioning myself and my own judgment of whether I booked the right thing.
00:33:36.940And then I'd go through that buyer's remorse.
00:33:38.420After you'd book it, you'd find another flight that was at a better time for less.
00:33:41.900See, this is the thing that Upside is trying to do.
00:33:46.260They're trying to give you the confidence to trust that, the way they've put this thing together, when you put your travel and your hotels together and bundle everything, you're never going to get a better deal.
00:34:02.260And the biggest thing in bringing that time down is making sure people understand, compare, go ahead, go ahead.
00:34:10.160Look, you're not going to find a better deal.
00:34:13.400And as soon as people start to trust that Upside is giving you that best deal, that time is going to go through the floor.
00:34:20.000But they've already shaved tens of minutes off of this.
00:34:23.660This is going to be the easiest way for you to book travel.
00:34:36.000A customer had a flight that was delayed for two hours.
00:34:40.380Navigator arranged to have complimentary access to the executive lounge to pass the time.
00:34:46.580These are the kinds of things that Upside is going to do for you.
00:34:49.720And on top of it, as Stu said, they give you free stuff.
00:34:53.840If you go to Upside.com and you use the promo code back, when you book your next business travel, your company is going to save a buttload of money and you're going to get a minimum hundred dollar gift card to Amazon.com.
00:35:39.820The hurricane in Miami is pretty interesting in that, you know, the power is out everywhere and you think, well, there's so much new solar power.
00:35:55.300Power company rules and lobbying have made it impossible across Florida to buy a solar panel and power your individual home with it.
00:36:03.240You are instead legally mandated to connect your panels to your local electric grid.
00:36:08.660More egregiously, FPL, the power company, mandates that if the power goes out, your solar power system must power down along with the rest of the grid, robbing potentially needy people of power during major outages.
00:36:20.600They talk about how this is supposedly to protect people who are working on the grid because if there's energy going through it, they could have problems.
00:36:29.320However, the state rules also mandate that solar customers include a switch that cleanly disconnects their panels from the grid while keeping the rest of the home's power lines connected.
00:36:39.680But during a disaster like Irma, FPL customers aren't allowed to flip the switch and turn on the solar power in their homes because of all these ridiculous government rules.
00:36:50.900This is the way to just keep solar power down.
00:36:55.820I mean, why are you mandated to pump power back into Florida Power & Light?
00:37:08.800If I have extra juice that I'm not using, I have no problem taking that power and throwing it into the grid.
00:37:15.180I have no problem sharing it, but I'm going to put a panel on my roof and then I have to go power a private institution, and they take away my right.
00:37:52.000If you live by lawlessness, you are going to die by it.
00:37:55.040And that is the double edged sword that first the Democrats are starting to be cut by.
00:37:59.540But the Republicans will be cut by it just the same.
00:38:03.540Nancy Pelosi experienced it firsthand yesterday.
00:38:05.980At a Dreamer press conference in San Francisco.
00:38:08.780She looked helpless and even a little bit scared as pro-immigration activists shouted her down using the same tactics that Occupy Wall Street used on conservatives just a few years ago.
00:39:33.280We are going to wake up to some real problems if we don't start focusing on things that matter and start telling the truth.
00:39:45.560Trump, Schumer, Pelosi, they're all making deals.
00:39:48.800The problem is they're bad for both sides.
00:39:51.720The vast majority of those on both the left and the right want real, lasting change.
00:39:57.860What happens when everybody in the country that wants change has four to eight more years of the status quo?
00:40:06.840We just keep switching seats, switching sides of the argument, but nothing is changing.
00:40:13.480Maybe the better question is, who is going to eventually rise up that isn't rising up on violence or hatred?
00:40:24.580Who is it that is going to rise up and promise something different, promise a true choice, true freedom and real peace and prosperity?
00:40:35.300I have been wrestling with something for the last six months or so because, quite honestly, I'm disgusted by the things that I have built.
00:41:01.660And I don't think it has a lasting value.
00:41:06.680And I would hope that others are thinking the same thing.
00:41:10.400But as I go and I speak to media executives, they're not.
00:41:15.400They recognize the same thing, but they're not willing to do the things that might hurt them.
00:41:23.660And in some ways, I understand they have shareholders, et cetera, et cetera.
00:41:27.760But at some point, we all have to try to do the right thing.
00:41:35.280Right now, almost everything in our life was supposed to make our life easier.
00:41:40.940It was supposed to make it less chaotic.
00:41:43.080We were supposed to have access to more information, communicate easier.
00:43:31.580Well, I have to say, Glenn, I was also really moved by your interview with Dave Rubin, talking about how the race for attention when you were on television, and the race for good ratings, affected, you know, your own life.
00:43:45.560And I think this is the thing people miss about the tech industry is that no matter what good intentions Facebook, Google, you know, Snapchat has to improve people's lives, they're caught in this race for attention.
00:43:58.500And as I said in the TED talk, and you know so well, it's this race to the bottom of the brainstem for whatever works at getting attention.
00:44:04.940And there's no escaping that, because there's only so much attention, there's only so much time in people's lives, only so many hours in a day, and it's not growing, you know, so the race is only going to get more competitive.
00:44:17.920And as it gets more competitive, it becomes this race for figuring out what pushes the buttons in people's brains.
00:44:24.000And so we have to get out of this race for attention.
00:44:27.340And like you said, you can't ask anyone who's in the attention economy to not do what they're doing.
00:44:33.520You can't tell YouTube, hey, stop getting so much of people's attention.
00:44:37.280You can't tell Facebook, hey, stop making your product so addictive.
00:44:40.700You can't tell Snapchat, hey, stop manipulating the minds of teenagers to get them sending messages back and forth and hooking them because they're all caught in this race for attention, which is why we need to reform the system one level up.
00:45:06.680Well, it's sort of like, you know, the tragedy of the commons.
00:45:09.340So you can't ask any one of the actors to do something different than what they're doing.
00:45:14.020They need to be able to coordinate, you know, their race for attention.
00:45:18.520So, you know, one way to go one level up is to go to the government, which is not very pleasant as an idea.
00:45:25.740Another way to go one level up is actually to go to Apple.
00:45:29.240So Apple is kind of like the government of the attention economy because they create the device upon which everyone else is competing for attention.
00:45:39.820And Google is also sort of a mini government of the attention economy, because they govern who gets the best results when you search for something.
00:45:50.660And Facebook's kind of the government of the attention economy, too, because they choose who's at the top of your feed.
00:45:56.720And currently they're locked into their own race for attention.
00:46:01.400So one of the things is we have to decouple profit from attention because so long as those two things are, you know, one to one connected, it becomes this race to the bottom.
00:46:13.500And we actually did this with energy where there's only so much energy available to sell people.
00:46:20.200And energy companies used to have this incentive of I make more money, the more energy you use.
00:46:25.000So I actually want you to leave the lights on, leave the faucets on.
00:46:29.280And, you know, that created a problem where the more energy we waste, and the more we pollute the environment, the more money the companies make.
00:46:38.480And in the U.S., we went through a change called decoupling, which, through a little bit of self-regulation among the energy companies, basically capped how much money energy companies pocketed directly from the amount of energy people used.
00:46:56.420And then, when you use a lot of energy, all that extra energy is priced higher to disincentivize it.
00:47:03.540And then they actually use that extra profit not to capture it for themselves, but to collectively reinvest it into renewable energy infrastructure.
00:47:13.560And so I'm wondering whether or not something like that could happen for attention, where companies could profit from, you know, some amount of attention.
00:47:22.780But then beyond a certain point, what if everyone was reinvesting in the greater good of the attention economy?
00:47:29.940Because, you know, right now, two billion people's minds from the moment they wake up in the morning, you know, they're jacked into this environment, this digital environment that's controlled by three technology companies like Apple, Google and Facebook.
00:47:48.040I mean, A, I'm really glad somebody is thinking about this, because I think about this stuff all the time and I don't hear anybody really talking about it.
00:48:01.460And it's a little hair-raising, because what you're saying starts to roll into the, you know, Big Brother, Brave New World territory.
00:48:17.380I mean, it could so easily go there. We're just in this weird place that I don't know if mankind has ever been in before, where if we don't do this right, we're really going to screw ourselves.
00:48:31.840No, you know, you're so dialed into this, Glenn, you're totally right.
00:48:35.160And, you know, I studied this for three or four years.
00:48:37.940I was a design ethicist at Google, where literally I spent every single day studying: what does it mean to ethically steer people's attention?
00:48:45.680And it really is, like you said, the Brave New World scenario combined with the Big Brother scenario, because whether we want to admit it or not, you know, again, 2 billion people, from the moment they wake up, to every bathroom break, every coffee line, to going to bed, to every back of the Uber or public transportation, you know, people are glued to their phones.
00:49:07.420And, and again, because of this race for attention, these technology products are not neutral.
00:49:12.560Each one is trying to do whatever it can to get attention.
00:49:15.820So they deploy these different persuasive techniques, and it becomes this, you know, Amusing Ourselves to Death, Brave New World scenario.
00:49:24.500If you've seen the movie WALL-E, it's like, you know, it's a race to put people with a screen in front of their eyes for as many hours as possible, consuming as much as possible, because that's what's most profitable.
00:49:36.240So it does start to resemble something like the matrix.
00:49:39.100And I don't know if you know the book, Amusing Ourselves to Death.
00:49:42.380But in the beginning, Neil Postman, the author, contrasts Orwell's vision of the future, which we're all, you know, really ready to oppose because it's a form of tyranny.
00:49:53.020We don't want big brother, but then there is this subtler vision of brave new world that people forget to oppose because there's no face of it.
00:50:43.820So one thing that I think everybody who uses a smartphone in a family is aware of is how this is affecting their kids, especially if they're teenagers.
00:50:52.300So Snapchat is the number one way that teenagers in the United States communicate.
00:51:45.220And they give their password to five other friends when they go on vacation just because they don't want to lose it.
00:51:52.300And so it's like tying two kids' legs together, you know, on two separate treadmills, and then hitting start and watching them run like chickens with their heads cut off, passing the football back and forth just so they don't drop the streak.
00:52:07.160And this is, by the way, you know, from a playbook of persuasive techniques that people in the industry know are good at getting people to do things.
00:52:15.520And you can use it for good: you can set up a streak for the number of days you've gone to the gym, or the number of days you read five pages in a book, to make sure you keep that habit.
00:52:26.460But what they did is they took this powerful technique and then they applied it to a vulnerable population, and they applied it to children's sense of belonging with each other.
00:52:36.380Because now kids define the terms of their friendship based on whether or not they have a streak.
00:52:42.400It becomes the currency of their friendship.
00:53:00.100And what's different about this is that your phone in the 1970s didn't have thousands of engineers on the other side of the screen who knew how to kind of strategically tap two people on the shoulder and make them feel like they're missing out on each other's lives.
00:53:16.340And to show you, you know, to have the phone like light up and appear in your life exactly when you're most vulnerable.
00:53:22.140I mean, for example, it's never been easier to find out that you're missing out on what your friends are doing if you're a teenager.
00:53:29.260You know, Snapchat or Instagram benefit if they put that at the top of the feed, not at the bottom.
00:53:34.660In the same way that Facebook benefits by putting outrageous news at the top of the news feed, because it's better at getting attention.
00:53:44.840Well, so as you said, I don't want to be here dwelling on the problem.
00:53:49.960I first want to do this because it's important people understand the problem.
00:53:53.780And it's honestly one of the biggest problems of our time because it's infrastructure for solving every other problem.
00:54:00.220You know, every other problem, healthcare, climate change, all these things require, you know, us to be able to sustain attention and talk about a complex topic.
00:54:08.720And if we're just running around distracted all the time, and if the entire next generation is hooked and addicted to their devices, we don't develop the capacity for patience or complexity or sitting with each other.
00:56:08.180They have a gold price protection program for as little as twenty-five hundred dollars.
00:56:11.880With that, you get three months of price protection (if the price goes down, they make it up in gold), all the way up to a year of price protection for an investment of twenty-five thousand dollars.
00:56:20.100Make sure you read their important risk information and find out if it's right for you.
00:58:20.940The world changed during the Industrial Revolution in about a hundred year period.
00:58:25.480I have been saying for almost 15 years now that we're going to pass a threshold, and you're going to see the equivalent of the Industrial Revolution happen in about a 10 year period, where you will not recognize the way the world is structured, you know, 10 years from now.
00:58:43.220And I think we're either at that threshold or about to cross into that threshold to where everything is being redesigned and changed.
00:58:50.980And we all have to participate in this.
00:58:53.820There's no spectator sport on this one.
00:58:56.640And I'm very concerned that, especially in the media, they're playing the old games and it is truly the race to the bottom.
00:59:06.720Tristan Harris is the founder of Time Well Spent, former Google design ethicist.
00:59:11.940And this is the kind of stuff that he is working on, on how to get people to A, recognize the problem and B, companies to say, wait a minute, I have responsibility here and we should be thinking these things through.
00:59:27.880Tristan, I have talked to leaders from the left and the right, you know, all the way from the Huffington Post and, you know, AOL to the Wall Street Journal and Fox.
00:59:49.880And most of them don't understand what I'm even talking about, but those who do feel like they're trapped and they can't change it because it affects their traffic.
01:00:05.120And we are trying to redesign, I'm trying to break the mold and every single thing I suggest, my product team says, you're going to hurt traffic, you're going to hurt traffic.
01:00:18.680And I'm trying to balance how do I do the right thing and not play this game and still survive?
01:00:30.420Yeah, well, that's why it's such a good question.
01:00:33.460I'm so glad we're having the conversation, because, you know, like I said, you can't ask anyone who's got, you know, direct access to people's minds to say, hey, don't put the shiny stuff, the outrage stuff,
01:00:46.400the new subscribe to my newsletter at the top of the website, right?
01:00:49.980Everyone's in this race to capture attention.
01:00:51.760So you, with your website, can't just change what you're doing, because it'll mean you get less profit.
01:00:57.920OK, so I want to make sure I understand you because you're not asking me.
01:01:44.960Instead of that, that's going to work, you know, not as well as dripping them out one by one just to make it more like a slot machine, just dripping out the dopamine.
01:01:53.600And, you know, so one, there's some small things that companies can do, or that you can do with your website.
01:02:00.440One example, again, is to batch and deliver this one digest at the end of the day versus dripping things out.
01:02:06.820That's something that big social media companies can do.
01:02:09.440But again, that will mean that they're going to be a little bit less successful in terms of their traffic.
01:02:13.980But if they all decided to do that, then we'd actually be, you know, in a better place.
01:02:20.040And so that's why it's like, can we talk together?
01:02:23.540Can we get the media industry together to talk about what are the standards we want to have about the way we write headlines, the rules that we use about the photos that we put up?
01:02:32.780You know, everyone wants to put the photo next to the article that's, you know, the most surprising or outrageous photo of Mitch McConnell or of Trump with a big, you know, surprise on his face, whatever is going to make it seem most engaging.
01:02:47.840And we need to, you know, get people together to say, what are the rules that we would all be happier as a society and as a media industry to live by?
01:02:56.000That's one thing. Right now we have to be able to acknowledge that the human mind, you know, came from this long process of evolved instincts.
01:03:06.920And so there's certain things that always appeal to us.
01:03:09.300The fear of missing out is very powerful.
01:03:14.700So when Snapchat manipulates that, it has an influence on us.
01:03:17.880So instead, we need to say, these are the worst demons of our nature.
01:03:21.180What does it mean to appeal to the angels of our nature?
01:03:23.660And how can we create some standards around that?
01:03:26.280So one way is to have the tech companies get together and to self-regulate themselves by asking, what are the standards for all of our practices?
01:03:35.140It's sort of like building codes or building standards for buildings.
01:03:38.000You know, I could build a building with some bad materials, but it'll hurt society.
01:03:42.200It'll be more economically efficient to do that.
01:03:44.460But we'd all be better off if we paid a little bit higher price.
01:03:47.420And these companies are richer than God.
01:03:49.780I mean, these are really profitable companies that can afford to reinvest in better infrastructure.
01:03:55.220That's why the conversation is how we can reform the tech industry.
01:03:58.160See, I don't know what your political leaning is, and I don't care, and I don't want to get into it.
01:04:01.900But as a libertarian, I agree with you.
01:04:05.560And at the same time, I'm spooked to hell by that.
01:04:08.220I mean, you know, we're in this place now where I don't know what else to do.
01:04:13.680And I know you are, obviously, but I think this conversation is way ahead of the curve for the average person.
01:04:23.740By the time the average person really gets here, the wave is crashing on the beach.
01:04:31.000Can I switch the conversation a little bit to, as a parent, what do I need to know as a parent?
01:04:38.840Or, you know, because people just think, oh, my kids can X, Y, Z.
01:05:06.140And he was telling me, we were talking about it, and he said, Glenn, my mother made me watch that when I was so small that when I actually saw it later in life, I thought it was a dream.
01:05:37.980Yeah, well, for parents, I mean, I think it's challenging.
01:05:41.280I went to a conference on children and screens.
01:05:44.100And, you know, one of the things that came up is we often worry about the kids, we worry about the kids.
01:05:49.100But sometimes the thing that came up was how kids want their parents to stop using their devices so much, you know, because there's something called still face syndrome, where when a child is very young and they make eye contact with their parent, they need that parent to, you know, emote back to them, to open their eyes and smile.
01:06:09.600And use their face really expressively, so that the child sees that they're being seen.
01:06:15.360And when the kid, you know, as an infant is looking at the adult's face and the adult's face is sucked hypnotically into the screen with a totally still flat face, you know, it's doing something to the development of children.
01:06:31.260And so one thing is what can we as adults do?
01:06:33.560One thing that's really easy that's super trivial is just to turn off all notifications on your phone, because what most people don't realize is that many notifications are invented by machines at technology companies to try and lure you back.
01:06:49.740So if it says that, hey, these five friends liked your photo, it feels like that was your five friends who sent you that notification because it was their action that led to it.
01:07:10.500Did you go out on your own because you were both intellectually thrilled at what was happening, or could happen, and horrified at being a part of it as well?
01:07:29.640Yeah, well, I was a part of the tech industry for a long time.
01:07:32.900And, you know, my friends in college started Instagram.
01:07:36.020We both studied at this lab at Stanford that taught people this intersection between technology and psychology.
01:07:43.240And I became really alarmed because, I think, you know, technology was less and less about actually adding up to this benefit to our lives, and more and more just kind of filling up our lives with things that work, you know, on getting our minds to do stuff.
01:07:58.600So I got into this work by, you know, a year into being at Google, making this memo before the Google diversity memo.
01:08:06.340And it was a memo about four years ago, five years ago, about essentially how, you know, 50 twenty-to-35-year-old engineers and designers in San Francisco control what 2 billion people are thinking and doing every day when they make choices about how these screens work.
01:08:21.200And that we have a moral responsibility to do something about that.
01:08:25.360And to my surprise, I thought I was going to get in trouble, but actually it went viral throughout Google and was seen by about 10,000 people, and I talked to the CEO about it.
01:08:34.860And that led to my work researching this question of how you hold that responsibility, if you're someone in a technology company and your choices will impact what 2 billion people are thinking every day.
01:08:45.940And so I started working on that framework, but I couldn't get some of the changes through.
01:08:50.580And frankly, as you said, we're, you know, coming up fast on this brave new world situation that we don't want.
01:09:01.140It's not like we're going to sort of self-correct and we're just going to figure out how to live with these devices as it's commonly talked about.
01:09:07.920This is going to get more and more persuasive because it's personalized and it's pulling on our social psychology.
01:09:13.560And so I knew that I needed to leave and create a public conversation to raise awareness first, but I also have a lot of ideas about what we can do.
01:09:21.800And so that's why I'm out here, talking to you.
01:09:27.720That's literally just a public conversation, a good conversation that I think you and I both know is so important.
01:09:32.940Tristan, I can't thank you enough for coming on, because you don't have a reason to come on other than to do, you know, good.
01:09:41.700And I hope that we can invite you back, because I have so many questions, and I think you have many of the answers, and I really appreciate your time.
01:10:35.800When you read it, you're like, wait, holy cow.
01:10:38.880I didn't even, he is really pulling the curtain back and showing you, no, no, no.
01:10:44.980Everything is being done to get you sucked in and hold you there.
01:10:52.500And with the development of AI and, on top of it, virtual reality, I'm telling you, the Matrix is coming.
01:11:03.480If we don't wake up. And that's if we make it, if the news media continues to go down the road that we're all going down.
01:11:12.760And that is, I'm going to throw a fireball at you and you're going to throw one back and we're all going to get clicks.
01:11:19.340I'm telling you revolution will come and we may not have to worry about the tech problem.
01:11:25.380The other thing is, it would be really expensive to get him to consult for us, but if he's on the air, I mean, we just ask him questions and he answers them for free.
01:11:37.360That's why I didn't say it until right then.
01:11:39.760Our crisis, you know, whatever it is, the crisis that you have in your life, the crisis that we're all facing, it's not going to bring out a better version of you.
01:11:53.520You know this. How many times have you had to apologize because you're like, I am sorry,
01:11:57.580I'm under so much stress, blah, blah, blah.
01:12:39.640Anyway, signs are everywhere that things, you know, are going to get rough, and we'll make it through, and we'll make it through together, because we'll rise and be our best
01:12:47.880if we are prepared. My Patriot Supply can help you prepare: get their 70 serving survival food kit now for only 67 bucks, healthy food that lasts up to 25 years for less than a dollar per serving.
01:13:00.940And you're going to get breakfast, lunch and dinner. Call 800-977-0542, or you can order online at preparewithglenn.com.
01:13:08.960A prepared America is a strong America.
01:13:35.740Sarah, if we have time, could you grab the audio?
01:13:41.900How artificial intelligence will affect your life.
01:13:44.580We're in a really exciting period of time, but the world will look back and study us, and study our choices.
01:14:05.820And I am more convinced than ever that what I have, I shouldn't say feared, but what I have worried about, is going to come to pass.
01:14:23.740And we are going to be passengers on the road to hell if we don't wake up and say, wait, wait, wait, we want to steer.
01:14:47.280And we have to pay attention to that, but there's a lot going on.
01:14:50.820We're going to give you the recap of Donald Trump and what he said at the United Nations, some pretty stark and amazing statements that he made.
01:23:49.780I mean, this is crazy stuff so that I'm appealing to people not only to buy the book because you'll like it and have fun reading it, but because you need it to defend your communities against the PC far-left brigades that are on the march.
01:24:06.540You know, at the end, we talked about this on your show.
01:24:09.860You know what the end result here is, Beck, don't you?
01:24:12.060Yeah, the fundamental transformation and destruction of the country and the Western way of life.
01:25:16.520Yesterday, the Blaze ran a story about somebody on Facebook who posted something from Hobby Lobby.
01:25:22.160It was a vase with a cotton plant coming out like flowers.
01:25:28.640And the woman was really offended by it and said, you know, this is an ode to slavery and all of this nonsense.
01:25:35.480And the take on the blaze yesterday was, you know, this is an important story.
01:25:41.360And it's an important story because, if you look at Maslow's hierarchy of needs, nobody gets to "I'm really offended by that vase of cotton in Hobby Lobby" unless your life is pretty sweet.
01:25:56.520All of the things we're complaining about are embarrassing, embarrassing, when you look at what's really happening around the world and the struggle that is just around the corner for all of us.
01:26:08.020Well, again, it's the individual American that's got to decide, do you want to hold on to your country?
01:26:17.380Do you want to hold on to historical figures?
01:26:19.780Do you want to hold on to America's traditions?
01:26:21.720Or do you want to let these HUNS, H-U-N-S, HUNS, who are coming in to destroy everything win?
01:26:30.060And look, it extends into the media, because the media is sympathetic with this change.