The Glenn Beck Program - December 01, 2022


Best of the Program | Guest: Brendan Carr | 12/1/22


Episode Stats

Length

45 minutes

Words per Minute

163.9

Word Count

7,411

Sentence Count

586

Misogynist Sentences

3

Hate Speech Sentences

1


Summary

Glenn Beck is back with a new episode of the Glenn Beck Program! On today's show, Glenn is joined by FCC Commissioner Brendan Carr, who is responsible for a lot of great things, for a philosophical discussion about TikTok and freedom of speech. Plus, Elon Musk's Neuralink and the future that is coming right around the corner. Don't miss it!


Transcript

00:00:00.000 All right. Podcast is fantastic today, especially I think like the last hour and a half of the show
00:00:07.260 today. Last half of the podcast is something very, very different. We talk about, well,
00:00:14.240 we start in hour two with the FCC commissioner who is responsible for a lot of great things.
00:00:20.580 And we kind of got into a philosophical discussion about TikTok, Twitter, what's going on there,
00:00:26.540 and also freedom of speech. From there, we went to Elon Musk's Neuralink to a simulated wormhole for
00:00:36.760 the very first time with quantum computing and the future that is coming right around the corner.
00:00:43.120 All that and so much more on today's podcast. Don't forget Goldline is out there and they want
00:00:48.440 to talk to you about protecting your retirement account. If you have a retirement account or you
00:00:54.360 just have even money in the bank, make sure that you spread the risk out. The Fed came out yesterday
00:01:01.740 and they're still saying, nobody saw this inflation coming. You should all be fired if you didn't see
00:01:07.680 this coming. Anyway, gold or silver is a great way for you to be able to protect your money, what you
00:01:16.440 have. Somebody's got to have some money. Somebody's got to have things to rebuild our country as we go
00:01:24.900 through this mass transition. Please call Goldline now. Tell them that I sent you from the podcast
00:01:30.200 and give Goldline the code MYB, which represents Mind Your Business, the silver bar that they'll give
00:01:36.880 you for free just as a thank you for calling in. Request the information. You can go to their website,
00:01:41.680 goldline.com or call them. Tell them I sent you. 866-GOLD-LINE. 866-GOLD-LINE or goldline.com.
00:01:56.400 You're listening to the best of the Glenn Beck program.
00:02:03.620 So, Stu, yesterday I kind of had a bad day. I had a bad day. I would say that.
00:02:09.960 You're doing okay? Yeah, I'm doing fine. But I would say that was a bad day.
00:02:14.760 Sure. Yes, I would agree. But that bad day didn't include me losing
00:02:18.620 $10 billion for other people. No, you didn't lose any billions of dollars for other people that day.
00:02:26.160 No, because I don't know if I would describe that as having a bad month. You know, I had a bad month.
00:02:31.000 Lost $30 billion, $10 billion in other people's money. Right. That's a really bad month.
00:02:37.260 That's a bad month. I think it's above a bad month. But that is the way that Bankman-Fried
00:02:43.640 started the interview yesterday with the New York Times. He apologized and said,
00:02:50.260 I've just, I mean, here, let me, let me actually play it for you.
00:02:55.420 Um, let's play, uh, cut 11, please. I mean, look, I, I've had a bad month. Um, this is not
00:03:03.820 been any fun. But that's a real, that's not what matters here. Like what matters here is
00:03:09.900 the millions of customers. What matters here is all the stuff. I want to come back to that part of it
00:03:16.960 at the end, put a pin in that. But let's start with what he said. I, I just didn't know. Cut nine,
00:03:25.200 please. Was there commingling of funds? That's what it appears like. It appears like there's a
00:03:30.780 genuine commingling of the funds that are of FTX customers that were not supposed to be commingled
00:03:37.080 with your separate firm. I didn't knowingly commingle funds. And again, one piece of this,
00:03:44.900 you have the margin trading, you have, you know, customers borrowing from each other.
00:03:48.620 Alameda is one of those. I was frankly surprised by how big Alameda's position was, which points to
00:03:55.140 another failure of oversight on my part. Um, and, uh, failure to appoint someone to be chiefly in
00:04:02.800 charge of that. Uh, but, uh, I wasn't trying to commingle funds. Oh, okay. Well, there's all kinds
00:04:12.120 of evidence that, uh, Alameda, which was the hedge fund and FTX shared an account with their banking
00:04:19.380 partner. So, I mean, you're sharing an account at Silvergate. Uh, so not sure how you, you square
00:04:27.960 that circle or, you know, you weren't aware, but what he's saying here basically is I wasn't,
00:04:33.000 I wasn't aware of it. That's his girlfriend that's running it. So in other words, Hey, I'm,
00:04:38.680 I didn't look at her. His girlfriend is running it. He appointed her and he's still the owner of
00:04:45.700 Alameda. Right. He's, he still owns the, you know, that's kind of a big part of it. And, uh,
00:04:50.880 his incompetence, uh, slash fraudulent activities are what we're talking about. Yeah. Right. He's,
00:04:57.660 I, I wasn't running that and I didn't get involved cause I was nervous about the conflict of interest.
00:05:03.400 If I were too involved in that, you got the same bank account, dude, you have the same bank account.
00:05:10.920 All right. So let's go to cut, uh, 10. I personally don't think I have criminal liability.
00:05:16.360 How concerned are you about criminal liability at this point?
00:05:20.020 So I don't think that, I mean, obviously I don't, I don't personally think that I have,
00:05:24.180 uh, you know, uh, liability. I think the real answer is that's not, it sounds weird to say,
00:05:32.480 but, but I think the real answer is that's not what I'm focusing on. Um, it's, uh, there's going
00:05:40.340 to be a time and a place for me to sort of think about myself and my own future, but I don't think
00:05:48.120 this is it. Oh, so do, did you commit any crimes? Look, I, I don't think so, but it's not the time
00:05:59.260 or place to think about me. Uh, you can think about me later. What I'm concerned about are all
00:06:04.440 of the people who have lost their money. What an amazing answer. And I guess that's the best answer
00:06:13.300 you can give in this moment other than the correct answer, which is don't do the interview. That's
00:06:18.800 the, the first thing is so stay away. Correct. So you don't do an interview like this. Uh, Stu,
00:06:28.080 how many interviews, uh, in the course of my career have I been asked to do and everyone clearly said,
00:06:38.900 don't do it. You do not want to do it. This sounds familiar. Okay. I've had that conversation
00:06:47.560 a thousand times. A thousand times now to be clear, you have not lost $10 billion for investors
00:06:54.080 at any point. Correct. And never done anything wrong knowingly. Um, so, uh, however, people used
00:07:04.500 to say that watched me when I was on Bill O'Reilly, why do you continually go on his program? He kills
00:07:14.820 you. And that never was ever discussed except for the first time I went on Bill O'Reilly, right?
00:07:25.460 Bill is going to, you know, he's a wild card. You don't know what's going to happen. Okay. Yeah.
00:07:29.860 Why did I always not listen to that advice on Bill O'Reilly?
00:07:38.120 Uh, well, I mean, I think you guys actually had a good relationship and he would push you on things,
00:07:45.320 but you knew it was coming from a good place. Exactly. He said to me at one point, look,
00:07:51.840 if I think you're wrong, I'm going to tell you you're wrong. If I think you're out of line,
00:07:56.380 I'm going to tell you you're out of line, but most of the stuff that you do, I don't think you're
00:08:01.340 out of line. I may not agree with your conclusions, but I don't think it's out of line. It's a good
00:08:06.380 question. So Glenn, come on my show. I will ask you the hard questions, but it won't be a setup.
00:08:15.640 I'm not trying to destroy you. Right. Uh-huh.
00:08:18.540 Uh-huh. So why would Sam go on, yeah, go on with the New York Times against all advice
00:08:29.840 because he knew he was walking into a friendly room. They're going to ask you tough questions.
00:08:36.460 We're going to ask you tough questions. Uh-huh. But he knew it would not be a lynching. He knew that
00:08:46.180 there were friends at the New York Times and he could ask and then answer, this is not the time to
00:08:56.680 think about me. It's not the time. I mean, my question is, why aren't you in jail, dude?
00:09:04.420 What makes you different from Ken Lay? What makes you different from Enron? What makes you different
00:09:11.560 than Bernie Madoff? He then appeals to the people of the audience. I care about the people who
00:09:19.680 lost their money and I'm sure there's people in your audience that have lost money and they laugh
00:09:26.260 and they laugh. Wow. Somebody who lost $10 billion of investors' money and he shows up and people are
00:09:36.800 like, oh, that crazy kid. This is a whitewashing. This is money laundering, except it's reputation
00:09:49.220 laundering. They are just laundering him here. Hmm. That's interesting. I mean, look, I think
00:09:55.260 everybody wants this interview, right? Like this is, it's not like people are resisting talking to
00:10:01.620 this guy right now. I think, you know, so, but he's selecting who he's going. He's also going on
00:10:05.200 with George Stephanopoulos, apparently. Oh, George Stephanopoulos. Yes. Which is interesting,
00:10:09.080 kind of, I think, supports your thesis there. Uh-huh. Right? Um, you know, I, I'm not surprised that
00:10:16.480 the New York Times would take the interview or offer the interview. I'm sure every mainstream,
00:10:20.460 like financial journalist has offered this interview. He's selecting where he's going
00:10:25.280 though, right? And this is, you know, we were in that room, weren't we Glenn? This, I can't remember
00:10:30.920 what event we did there. We did something there. I remember doing it and it's an impressive,
00:10:35.720 impressive room, right? Like at the New York Times. It's like an incredible place for one of these
00:10:41.140 things. Yeah. We did just to piss him off. We did an event for the Blaze when we first launched
00:10:46.800 in that same room. That's what it was. That's right. It was the launch. It was a launch announcement
00:10:52.020 of the Blaze. That's right. And, uh, it was, it was pretty sweet. It was pretty sweet. And we walked
00:10:57.580 into the New York Times and everybody's like, good Lord, what are these people doing here?
00:11:01.280 They're just hoping that it was some arrest announcement. Don't worry. We're just doing an
00:11:05.560 exorcism. Don't worry about it. But it's, it's, you're right. I mean, it's like, I mean, I watched
00:11:12.440 a good chunk of this interview. It was over an hour and apparently he did almost two hours with
00:11:16.540 George Stephanopoulos. That's coming out partially today. Another guy who's not going to really press
00:11:23.480 on. So how much money were you giving to the democratic party? Yeah. Cause that's, you know,
00:11:28.980 that I don't, I didn't hear one question about that from the times. See that that's where they're,
00:11:33.680 that's where they're washing all of this. Just make sure that he's not tied to any of that.
00:11:41.800 Let's not get into any of how much money was going to the people who are now going to keep
00:11:46.860 him out of jail. Let's make sure we don't ask those questions. See, there's, there's two ways
00:11:52.620 this can go. I think one way is the way you're talking about where they will protect him. And I
00:11:58.160 think there's a real argument to be made that that's, that's the way it goes. I think it's
00:12:01.800 maybe the most likely way it goes that he will be protected because of all the money that he was
00:12:06.160 giving to Democrats. The only, there is that part though, where this does cause problems for
00:12:11.800 Democrats, right? Like it does expose them, you know, like you don't want pictures of you when
00:12:18.180 you're running for reelection with Bernie Madoff with your arm around him. And, and, and, and like
00:12:22.720 these sorts of problems are going to be real for Democrats going forward.
00:12:26.080 You don't have Bernie Madoff here. You don't have a mainstream press making him into Ken
00:12:34.100 Lay or Bernie Madoff. I don't know that he, this is really bad. I mean, you might be right,
00:12:38.720 but it's going to be hard to whitewash this guy. You might be able to, what it was like
00:12:42.740 with Bernie Madoff. They, how they just, Oh yeah. Hounded him all the time. Have you seen,
00:12:48.700 have you seen him being hounded in his Bahamas home? Have you seen that? Have you seen the
00:12:55.160 gaggle of, no, I haven't either. No, though I, you know, Bernie Madoff was walking down the streets
00:13:01.000 of New York where the footage was taken. This guy is not, but they were also staked out in front of
00:13:05.120 his house. They never let these guys rest because they were on a mission to make sure they showed how
00:13:12.300 evil these people were. It's true. They're not on that mission. So the P the people who are
00:13:18.040 average people are not hearing the Sam Bankman, uh, Fried jokes. Yeah, that's true. You know,
00:13:27.180 it's interesting. They, they don't seem to be on a mission. They seem to like they're the tone of
00:13:31.680 the coverage and I've watched a lot of it. They are on a mission. Yeah. That's a different story.
00:13:35.760 It's like the way the tone of it is like, we, we need to understand. Yeah. Almost like we need to
00:13:41.460 understand because this guy who we all said was so great may have done a couple of things wrong.
00:13:46.880 So let's come up with a reason to, or let's let him explain, give him an ample opportunity to explain
00:13:53.260 why this reason wasn't that he wanted a private jet and a, and a $30 million apartment in the Bahamas.
00:13:58.880 Imagine that I lost $10 million of people's, uh, investment and I was commingling funds and it was
00:14:08.700 an honest error. 10 million, not billion. Right. They would slaughter me. They'd be all over the
00:14:15.540 place. They'd have people parked outside these windows right now. Absolutely. They would. They
00:14:19.980 are on a mission. And one of the mission that there's to me, there are two reasons he's doing
00:14:27.340 interviews where he knows he won't be pushed on the tough questions. Keep this away from everyone
00:14:35.400 else. Contain this. Okay. So he's looking like a really good guy. Look, I'm just trying to help out.
00:14:43.500 I, I, I just believe in giving all this money away and it just got out of hand, but I wasn't part of
00:14:49.460 it. And don't ask any questions about Democrats. Don't ask any questions. Why all of a sudden
00:14:54.740 everything is different with this guy than Ken Lay or Enron. Um, and they want to contain it.
00:15:02.020 But the second thing they need to do is make sure America learns the lesson about how bad
00:15:12.200 these unregulated markets really are. I mean, it is so dangerous. We can't just have this
00:15:21.080 cryptocurrency out there. We need a central cryptocurrency. That's what he was for. He was
00:15:30.100 leading the band on saying, we got it. These people are out of control. We got to regulate all
00:15:36.080 of this. He was a major force in that. So they need to tell the story that cryptocurrency is bad,
00:15:45.680 bad, bad, bad, bad, bad, bad. That's why we need a Fed coin. And by the way, we weren't in league with
00:15:53.080 him on that or anything else. No, no, no. The media, the politicians, the Democrats. No,
00:15:59.780 no, no, no, no, no. What money? He gave us money. What? That's what's happening.
00:16:09.480 This is the best of the Glenn Beck program. And we really want to thank you for listening.
00:16:13.080 So this guy is the guy, they call him the FCC's 5G crusader. He's a guy who cut all of the red tape and
00:16:26.760 really pushed for the high speed networks to be built by private businesses. He is also the guy who is
00:16:38.700 one of the big forces behind telehealth, mainly for veterans and low income Americans to be able to get
00:16:50.400 to doctors on their smartphones or tablets or any other connected device, driving down the price and
00:16:56.200 driving up the access to medicine all around the country. And he also, like Mike Rowe and I believe
00:17:04.680 in apprenticeships and everything else. This is a, I think this guy is a real warrior for what we
00:17:12.320 believe are American truths. His name is Brendan Carr. He is a commissioner with the FCC. Brendan,
00:17:18.260 how are you, sir? Glenn, so good to join you. I really appreciate the chance to be with you. Big fan
00:17:23.880 of everything you're doing. And listen, if you ever get in trouble at the FCC, if anyone files a profanity or
00:17:28.840 indecency complaint against you, just don't mention you know me. It'll go a lot better. I know.
00:17:34.680 We were there for you. You and I never talked. That's your story going forward.
00:17:38.920 I know. I know. I know how this works. Anyway, I wanted to talk to you about two things. Let's start
00:17:45.780 with TikTok. Everybody in the tech industry seems to be against Twitter. I mean, it's crazy by letting
00:17:54.900 people talk how they are being accused of destroying free speech. It's an upside down world.
00:18:01.920 But TikTok, nobody seems to want to do anything about this. I've read your letter. I've read your
00:18:08.220 reports on this. TikTok is extraordinarily dangerous to Americans. Can you fill in why
00:18:16.840 it's a danger and why everybody in America seems to be focused on Twitter, including the White House,
00:18:23.840 and not TikTok? Well, it's quite amazing. And, you know, TikTok is an example of this. And as we
00:18:30.380 may get into Apple as well, when your product is, you know, for better or worse, immensely popular
00:18:34.840 with consumers, it's amazing what you can get away with. And I think TikTok is the prime example.
00:18:39.880 It's popular with millions and millions of Americans, including young Americans. And they
00:18:43.140 look at it and they think, well, that's just a fun platform for sharing videos and dance memes.
00:18:47.240 And the reality is that's just the sheep's clothing. Underneath, it operates as a very sophisticated
00:18:51.980 surveillance technology, right? In terms of service, they reserve the right to get your
00:18:56.120 biometrics, including face prints and voice prints, search and browsing history, keystroke patterns,
00:19:02.920 the list goes on from there. And for years, they said, don't worry, this is stored outside of Beijing,
00:19:07.460 not a big deal, even though our parent company is ByteDance, is based in Beijing. And well,
00:19:12.220 that's been revealed as nothing more than gaslighting. It turns out that according to internal
00:19:16.240 communications, quote, everything is seen inside China. And that's a massive, massive problem.
00:19:23.540 In fact, their COO was testifying in Congress a couple weeks ago, and was asked point blank,
00:19:28.820 do you transfer US user data to employees in Beijing who are themselves members of the CCP?
00:19:36.600 And the COO said that she declined to answer that particular question. So that's troubling.
00:19:42.200 There's also a new report that just came out that they had this Beijing-based operation,
00:19:45.880 that was attempting to surveil the location of specific Americans based on their usage of
00:19:52.540 the TikTok application. And that's not to mention, obviously, the concerns that come from the content
00:19:57.180 side, where Americans, including children as young as 10 years old, are being fed things like the blackout
00:20:03.500 challenge that's literally convincing kids to choke themselves. And some have done that and died as a result. So it's a
00:20:09.980 national security threat, and it's something that parents should be worried about as well.
00:20:13.120 So explain to, because I've tried to explain this to my family, you know, my kids are like,
00:20:19.460 yeah, right, dad, I got it. What is China going to do with my, you know, my face print and my
00:20:25.360 information? Can you explain why that's dangerous?
00:20:29.660 Yeah, it really is. And if you, one way to think about it is there's a version of TikTok,
00:20:33.420 TikTok itself isn't available in Beijing, but a version of it called Douyin, a sister app run by the
00:20:38.040 parent company. And that application shows kids science experiments, museum exhibits, educational
00:20:44.540 material. And again, here in the US, it's showing kids the blackout challenge. So that's where the
00:20:50.760 real danger comes. But also, if you step back, what's really happening when you're using TikTok,
00:20:56.020 every time you swipe or click or search, what you're doing is you're feeding, training and improving
00:21:01.460 China's artificial intelligence, their AI. And China has said, we want to dominate the world in AI
00:21:07.220 by 2030. And they're going to use it for authoritarian purposes, for surveillance, for
00:21:12.000 exporting their control. So even if you step back from your own self and your own kids, and even
00:21:17.160 TikTok itself, the idea that we're sending this data and these clicks back to Beijing is improving
00:21:22.480 their AI. And that's going to come around and bite us in ways that are, again, unrelated to TikTok
00:21:27.140 itself. So we have Google doing the same thing. I mean, that's why Google was was free, is they
00:21:34.480 wanted all that information to work on AI. So you're saying this is just another version of Google,
00:21:44.640 if you will, that's here in America, to be able to mine for all of that information.
00:21:51.220 Yeah, you're right. You know, China has a fundamental flaw, both in their system of government,
00:21:55.020 obviously, but it carries through to AI, which is they don't have feedback loops, they don't
00:22:00.040 understand sort of Western free thinking. And so they need Americans to be on TikTok, to be observing
00:22:07.120 their usage of data in order to create their AI and make it a healthy system. So the sooner we cut off
00:22:14.700 data flows back to Beijing, the sooner their version of AI starts to atrophy and go down a separate path
00:22:21.580 in which it's going to be a lot less successful. So I think we do need to look broadly, how do we
00:22:26.060 stop training China's artificial intelligence? But again, that's a piece of it. It's used for
00:22:30.480 surveillance, it can be used for blackmail, it can be used for foreign influence campaigns.
00:22:34.400 And where things are right now is this is in the court of the Biden administration, the Treasury
00:22:38.820 Department has a group called CFIUS, Committee on Foreign Investment. And they've been reviewing
00:22:43.540 TikTok for over a year at this point. And the New York Times reports that they've got a
00:22:48.400 preliminary deal in place to allow TikTok to continue to operate. And frankly, I think this
00:22:53.020 is a big IQ test for the administration. And it's sort of a pass fail at this point. And in fact,
00:22:58.700 you just had FBI Director Chris Wray testify last week in Congress and said that the FBI has serious
00:23:04.480 national security concerns. So I don't see how the Biden administration can go forward and bless
00:23:09.260 TikTok to continue to operate when you have the FBI, when you have Democrats, Senator Mark Warner,
00:23:14.060 chair of the Senate Intel Committee, saying that it is TikTok that scares the Dickens out of him.
00:23:18.800 But we may very well be heading towards that direction here.
00:23:21.460 Google Play Store, Apple App Store, I know you wrote a letter to both of them and said,
00:23:27.220 drop, drop this. This is really bad for the country.
00:23:31.920 Yeah, I mean, putting aside the content of what's in this application, Google and Apple have very clear
00:23:38.320 terms of service to stay in the App Store. And if data is being used for purposes that aren't being
00:23:44.260 disclosed, or if data is traveling to countries and being accessed from countries without that being
00:23:49.160 properly disclosed, there's precedent for Google and Apple to boot apps off the App Store for that
00:23:54.080 reason. And so I wrote them a letter and said, look, in light of the national security concerns,
00:23:57.640 in light of these clearly surreptitious data flows that we're now learning about,
00:24:01.220 just apply the terms of your App Store policies and boot them from the App Store.
00:24:05.500 Of course, they didn't do that. And that's why it's, you know, obviously highly ironic that there
00:24:10.200 was at least the concern this week that Apple might take action against TikTok. Because look,
00:24:15.540 if you're pulling advertising dollars or pulling support in Apple's case, potentially from Twitter,
00:24:22.060 while keeping your support or expanding your advertising on TikTok, you are sending quite the
00:24:28.080 signal about your brand value. I think it's very different than the one you think.
00:24:31.980 One last thing, because I've got something else I want to talk to you about. But one last thing on
00:24:37.140 this, you just kind of brushed over this, but I think it is critical. There was a new survey out
00:24:42.760 that showed, I can't remember, six or eight out of 10 children in China want to be astronauts and want
00:24:51.540 to be scientists. Here, eight in 10 want to be social media movers.
00:25:01.620 Influencers.
00:25:02.280 Influencers, yeah.
00:25:04.120 That's crazy. And part of that is because of TikTok. As you said,
00:25:10.240 they're saying the same thing under a different name over in China
00:25:13.980 is encouraging people to do crazy great things and science and and knowledge and education.
00:25:23.600 And this same platform is programmed here to really make you as dumb as a box of rocks.
00:25:30.880 I don't think that's I don't think that's just, oh, really? I didn't even notice that.
00:25:35.240 That's intentional.
00:25:36.140 Yeah, you're right. I mean, this is why I've talked about TikTok as China's digital
00:25:41.580 fentanyl, because it's effectively, you know, a pipe directly from Beijing, from the CCP
00:25:46.660 into the ears and eyes and minds of millions and millions of America's youth. And what they're
00:25:53.560 being served is divisive content. It's it's content that is, you know, increasing ADHD problems,
00:26:00.000 suicide ideations, body image issues. This is what is being fed to us. And that's that's deeply,
00:26:09.120 deeply concerning. And that's why I think, you know, it's incumbent on the Biden administration
00:26:13.020 to step in and take some tough action here.
00:26:15.720 So, Brendan, I have a philosophical question, and I'd like you, if you would noodle this out. I
00:26:23.760 I tried to contact you a few weeks ago because I was presented with a story of a book that was in
00:26:32.660 a school library and being read to kids in school. And it was one of the most vile things I have
00:26:41.400 ever read. And I've done this for 40 plus years. I know exactly what I can and can't say with the FCC.
00:26:49.340 Okay. And I've always understood those to be community standards, et cetera, et cetera.
00:26:56.580 Here's my here's my problem. There are times when things need to be heard by the general public,
00:27:02.920 and I know we can go online and do it, et cetera, et cetera. But why, when we are a community standards
00:27:09.060 based system, if if you can teach it to my children and have it in the classroom, why can't I
00:27:17.640 a program that is aimed at adults and during the day when kids should be in school? Why can't I
00:27:26.300 read that book on the air? Well, you're right. Look, we still have in place at the FCC rules that
00:27:35.000 apply to broadcast radio and broadcast television that regulate profanity and decency, similar content
00:27:42.440 like that. It obviously hasn't been enforced very much in the last few years, but they're still on
00:27:47.320 the books. And so you're right. There is a point at which potentially you reading things from across
00:27:53.900 the broadcast airwaves that may be found in a library somewhere could have issues under the FCC's
00:28:01.680 profanity and indecency regulations. Now, of course, there's tends to be a newsworthy exception to a lot
00:28:07.380 of that stuff so you can cover issues and things like that. But it's a challenge. And some people say,
00:28:12.400 you know, look, how do you generally square this pro speech, free speech view with that type of stuff?
00:28:18.720 And I would say, look, what what we can speak of as adults and talk about really is very different
00:28:24.340 than the content that should be, you know, stock in the shelves of, you know, school libraries for
00:28:29.460 for kindergartners. Yeah, my problem is, is this is a show that is based on information and opinion.
00:28:37.240 You may not like it, but we we take it. We take our job seriously. We try to be responsible. I've
00:28:43.780 always been responsible with the FCC. And it's not a, you know, a 1990s Howard Stern kind of thing,
00:28:52.800 which we're way past that. This is this is being read to our students in many schools all across the
00:29:02.120 country. And it is absolutely indecent. And I know it's indecent. But why do I get in trouble
00:29:11.920 for exposing this indecency? And the way to expose it is to make people understand by hearing it,
00:29:20.000 how unbelievably indecent it is.
00:29:25.880 Yeah, look, I think we've gone a long way recently in trying to address this issue by doing what
00:29:31.360 you're doing. You know, we've had instances where parents have tried to read books from their, you
00:29:35.980 know, again, kindergarten library at school board meetings, at city council meetings, and they've
00:29:42.620 been shut down and said, we can't allow that content to be spoken at the city council meetings. Yet,
00:29:48.160 you know, there it is in the in the kids classroom. And so I do think there's been some progress in
00:29:53.380 that. Now, from my perspective, I remember, you know, I growing up in high school, the famous Eminem
00:29:58.780 song, the FCC won't let me be. And it's quite ironic, after humming that song in high school that
00:30:04.700 I've ended up at the FCC. And look, we try to be very, you know, pro free speech about this stuff. But
00:30:09.420 this is an issue that we're dealing with as a cultural matter right now.
00:30:12.700 I would not have a problem if it were just me possibly losing my license. But it's not just my license;
00:30:20.920 anything I do could possibly jeopardize the license of every station in my chain. So there's no way,
00:30:28.360 there's no way I'm going to put people out of work to prove this. What do you recommend?
00:30:35.040 Right. Well, look, again, there's, you know, a newsworthy exception to discussing some of this
00:30:41.680 stuff. But, you know, look, if you think it's, it's, I mean, it could be good or bad. I don't
00:30:45.880 know. But, you know, if it's close to the line, you know, there, there still are background indecency,
00:30:50.400 profanity rules of the FCC, we do get complaints from time to time, we usually dismiss them or don't
00:30:55.720 address in the main, but yeah, you do potentially subject, subject yourself to FCC scrutiny in those cases.
00:31:00.860 My problem is I had some of the best attorneys in Washington on free speech and the FCC. I've
00:31:07.060 had them for 25 years. About three years ago, they called. They also represent Google
00:31:14.460 and Apple and Facebook, and they dropped me as a client in the middle of a case, because it made their
00:31:23.260 other clients uncomfortable and they had to make a choice. So I'm not sure you will see me and my
00:31:29.900 attorney at some point, because, you know, it's hard to get one if you hold my opinions
00:31:35.900 today. Brendan, thank you so much. I appreciate all that you do at the FCC. God bless.
00:31:42.000 Appreciate it. Thank you.
00:31:42.860 You bet. Brendan Carr, FCC Commissioner.
00:31:49.980 The best of the Glenn Beck Program.
00:31:51.900 So, Stu, there are two stories that I barely understand. Let me start with the one that I
00:32:03.440 really am a little foggy on. For any mammal, the loss of the Y chromosome should mean the loss of males
00:32:13.160 and the demise of the species. However, the Amami spiny rat manages without a Y chromosome and has
00:32:24.580 puzzled biologists for decades. Now, a Japanese scientist and her colleagues have shown that one
00:32:31.220 of the rat's normal chromosomes effectively evolved into a new male sex chromosome. I hate to get all
00:32:39.660 science-y because I don't know how these rats identify. I don't know any of their pronouns or anything
00:32:46.440 else. So the reason why this is important is because the Y chromosome seems to be getting weaker and weaker
00:32:59.040 in a lot of mammals, including man. And once you lose the Y, then what happens? You've only got females.
00:33:12.880 End of the species. So that's why they're looking into this because they believe that we are headed
00:33:20.420 for the same kind of thing.
00:33:22.720 End of the species.
00:33:23.760 Yeah.
00:33:23.940 I mean, I guess think of just all the car accidents.
00:33:27.400 Oh, my gosh. Only women drivers? It would be crazy. Oh, my gosh. And women presidents and CEOs.
00:33:33.160 Oh, gosh. Just shut the thing down.
00:33:35.260 Lord, please come now. Anyway. So.
00:33:40.060 Stupid.
00:33:42.080 That was largely just to piss off Sarah in the other room.
00:33:44.580 Oh, it is. Largely. 100%.
00:33:47.720 And, of course, the fact that it's true.
00:33:49.800 Right. So the next story is a quantum computer has simulated a wormhole for the first time.
00:33:59.480 Now, do you know what a wormhole is?
00:34:02.020 It's a space thing. It's like a science-y space thing.
00:34:04.980 Okay. So it's like you take a piece of paper and you fold it in half. Then you, I think,
00:34:10.400 fold it again.
00:34:11.100 Okay.
00:34:11.740 And you put a little hole in it. Okay. You would see that there would be two holes in the piece
00:34:18.880 of paper.
00:34:19.400 Yeah.
00:34:19.520 If you open it up.
00:34:20.560 It looks like a mask.
00:34:21.020 Okay.
00:34:21.260 With your eye holes.
00:34:22.300 In fact, it's almost the perfect mask. Okay. So, and probably Fauci would have me wear this.
00:34:28.500 Anyway. So, a wormhole is a way to collapse the distance in between those two holes. Okay.
00:34:36.680 In space. You go through one hole and you're right there because they're
00:34:42.800 next to each other.
00:34:43.580 Right. Instantly.
00:34:44.580 If space is folded. Okay. So, that's the idea of a wormhole. You could travel great distances
00:34:51.960 through that quickly. So, this has just been a theory. Scientists with a quantum computer
00:35:01.420 have just simulated a wormhole for the very first time. Now, it gets very complex because
00:35:09.880 they say it was holographic, but it's not exactly a hologram. They just simplified things
00:35:19.360 by taking gravity out of the equation, which gets into Einstein and the theory of relativity.
00:35:24.080 So, they had to have something that would take gravity out and see if they could simulate
00:35:29.080 this. Well, they did. And what this means is you could, without any wires, cables,
00:35:38.380 Wi-Fi, nothing, take something digitally and send it from, let's say, my desk to a desk
00:35:48.600 in London and it would exist in both places. And you could close one of the doors and it would
00:35:57.720 either come back to me and only be here or I could close my door and it would be in London.
00:36:04.800 They just did this. This changes everything. This changes everything. This is, you remember
00:36:13.420 Einstein when he was, they talked to him about quantum physics. He said, God doesn't play dice.
00:36:20.500 Meaning, there's no superposition of a particle, or a
00:36:29.820 qubit, as they're now called. It can't be both positive and negative. It can't
00:36:35.680 be both a one and a zero. But quantum says, yes, it can. That led him to say, God doesn't play
00:36:42.700 dice. It doesn't work that way. Remember, the theory of relativity is only a theory. It's
00:36:52.400 the best theory we have on how things work. Quantum comes up and says, I don't think the
00:36:58.560 basic soup really goes with any of those physics. I think it breaks down
00:37:06.840 at some point and starts behaving completely illogically. This shows that Einstein
00:37:15.280 may have been wrong. Maybe God is playing dice. The things that we have on
00:37:22.880 the horizon are so groundbreaking and just quantum computing. All of this stuff will change life
00:37:34.700 in ways we can't imagine. It's like we're standing in the 1200s and trying to imagine today, but it's going to
00:37:42.100 happen in the next 50 years. Do we have any idea where this would end up? Like, what
00:37:47.140 would be the end game of this type of technology, if it were to come to fruition?
00:37:50.500 The biggest thing with quantum computing is you will, you will probably solve cancer in a week.
00:37:56.700 You will solve these problems that cannot be solved because it can model a million different
00:38:02.620 things all at the same time. So remember, Edison said, you know,
00:38:10.520 I didn't fail a thousand times. I found a thousand ways the light bulb doesn't work.
00:38:16.760 With this, you'll fail and succeed one time, because you'll try all of the combinations
00:38:25.320 all at once. And you'll have the answer. It feels like there are so many things right now
00:38:33.580 on the fringes of science, like where scientists are really playing,
00:38:41.220 right? They're at the very edges of understanding where they can go, but see the
00:38:46.200 path forward. You know, some of these problems like this one are just beginning to be solved. And
00:38:50.780 there's so many different directions, whether it's, you know, we talked about the singularity
00:38:53.720 or whether it's quantum computing or all sorts of different technologies, that it feels like
00:39:01.300 one of these is going to hit in a way that totally changes the world almost immediately.
00:39:06.540 But in a way, let's, let's look at the telephone for a minute. Put yourself back at Alexander Graham
00:39:11.760 Bell's time. Alexander Graham Bell comes up with it and people think, oh, this is great. Look at this,
00:39:16.700 but nobody's going to have a telephone for a long time. Yeah. They say that about everything.
00:39:19.840 Yeah. And they think, oh, well, I'll just, right. I'll just go to,
00:39:23.440 you know, the town square that will have a telephone and I'll be able to call, you know,
00:39:28.500 Washington if I needed to talk to the president because it was an emergency. They were thinking
00:39:33.800 like that. They would have never thought of the phone today. It's not just cordless. I mean,
00:39:42.080 there's no longer a cord at all. Right. It doesn't work with wires.
00:39:48.480 It's a television. It's a camera. I mean, it's no longer really even for phone
00:39:53.080 conversations. Right. I mean, it's, and that's, I think a really interesting example of how this
00:39:57.840 goes. You think about the singularity for a second, right? Singularity being eventually we
00:40:01.880 merge with machines. Tell me if this is a terrible description, but my very terrible understanding of
00:40:05.920 it. Eventually we merge with computers where we are able to access information instantly because we
00:40:12.220 have maybe a chip in our head or whatever that allows us. Right. And we, we also have a nanobot
00:40:19.040 technology in us, in our bloodstream that is keeping you alive. You don't have to take medicine
00:40:23.720 anymore. The nanobots are programmed to take care of your body and it repairs itself through technology,
00:40:31.040 which is connected to AI, a giant machine outside of your body. Right. So you're one with AI,
00:40:39.400 you're one with the machine, you're a hybrid person. Who is that? That's the singularity. That's the
00:40:43.880 singularity. So if you think about, let's just say for information purposes, you want to get an
00:40:50.080 answer about something in, in this world of the singularity, you want to know who was, uh, you know,
00:40:56.400 the president of France in 2004, right? You, it would instantly, you'd be able to access that
00:41:02.220 information instantly inside your brain, basically. Yeah. Right now you have to go to, to Google,
00:41:07.380 open up Google and type in your question. Right. The singularity, the way it would be imagined to
00:41:13.760 be used at its highest level: oh, who was the president of France? Oh, it was so-and-so.
00:41:19.780 Right. Okay. No, the minute you think it, the answer is there. Right. Because you're connected
00:41:25.440 to everything. But in a way, what you're describing is essentially the same process,
00:41:32.720 just faster, right? Yes. Right now we have crossed a line to the point where, when we used
00:41:40.180 to do, when we talked about this on, on radio terms before the radio used to be really fun
00:41:45.240 because what you'd be able to do is come on the air and you'd say, Oh, what was that movie with
00:41:49.740 Corey Haim? Remember this? He was a guy, and you'd go, do you remember? There's two
00:41:57.360 Coreys in it. What was that movie? And then people would call in, they'd say, Oh,
00:42:01.300 it was Goonies. No, no, it wasn't Goonies. It was, I mean, you go through this whole thing
00:42:05.700 and you could do whole hours on this and people would reminisce about these memories and think
00:42:10.420 about these things and try to figure them out. And now that's all dead because everyone just goes,
00:42:16.760 Corey Haim types it in and looks at his IMDB page and knows the answer in five seconds. Right.
00:42:21.180 Correct. And so we have, wait, wait, wait, before you move on from there, what has that
00:42:26.260 also done to our memory? Right. Terrible. It's worse. You don't even think about, I have
00:42:32.240 to store that or I remember what's his name. Oh, I remember we were sitting in a room and
00:42:36.880 it was a so-and-so that said, okay, you don't do that. So your memory is weakened. I see it
00:42:43.320 with my kids when they want answers to things. They're like, uh, who's, you know, what
00:42:48.860 was the score of the, and they just ask the dumb device that I won't screw
00:42:53.480 all the people listening by naming. They'll ask the device, you know, without
00:42:58.600 trying to think about it for an hour. They just, they know the answer's there. And
00:43:03.320 that's the same concept of what the singularity could theoretically become. Right. So we,
00:43:07.500 so imagine, imagine if you're going to Italy and you want translation, you'll be able to
00:43:14.600 understand them instantly because it'll be there. The translator will be inside of you.
00:43:19.660 You'll probably butcher it because it requires your physical use of your mouth, but it will,
00:43:25.800 you will know how that is supposed to be said and you'll say it. But once that information is gone,
00:43:33.820 you can't communicate in that language anymore. If you're cut off, right. You know, you, there's
00:43:40.820 no memory of it. There's memory that you did it, but there's no memory. There's no muscle memory.
00:43:46.460 There's nothing. And this gets to the point that, you know, think about being de-platformed now.
00:43:51.500 What does that mean? Oh, I lose my Twitter account. What does it mean? If, if the singularity exists
00:43:55.940 and you're de-platformed from all of this knowledge that everyone else can access immediately?
00:43:59.900 Oh no, Ray Kurzweil said that would never, that would never happen.
00:44:02.780 That would never happen.
00:44:03.560 So, okay.
00:44:04.480 So that's, that's a totally different road. When we think about the innovations that happen
00:44:08.960 when these things kick in, we could talk about Alexander Graham Bell, but go back just a decade,
00:44:15.520 right? Before, go back to 2008, right?
00:44:18.640 Before the iPhone.
00:44:19.300 Before the iPhone. We've gone from literally no one having these things or maybe just for
00:44:24.620 occasional phone calls to the era where everyone expects to have one and is on these things five,
00:44:30.800 six hours a day. That is the merging of man and machine. It's already happening.
00:44:37.020 On the air, I said to you in the nineties that networks and watching shows, it's not going
00:44:44.780 to be Thursday night at eight o'clock. You'll just log on and download it and you'll
00:44:49.780 have all the episodes that you want. And that seemed completely insane. Insane in the nineties.
00:44:54.420 Yeah. And here we are. We all now expect it as normal TV and really put very little thought into what
00:45:01.240 it means or how, wait a minute. Right. Sounds like you're starting to make a point on this.
00:45:05.920 Yeah. You know, hang on. I just want to live in my fantasy world here for a second and not think
00:45:10.760 about that.
00:45:11.440 Na, na, na, na, na.