The Charlie Kirk Show - April 18, 2023


Villain or Hero? Breaking Down Elon Musk + Tucker Carlson


Episode Stats

Length

32 minutes

Words per Minute

175.3

Word Count

5,706

Sentence Count

487


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcripts from "The Charlie Kirk Show" are sourced from the Knowledge Fight Interactive Search Tool.
00:00:00.000 Hey, everybody.
00:00:00.000 Today in the Charlie Kirk show, Elon Musk, what happened to our elites?
00:00:03.000 They're like, I'm not going to tell you Elon's the best guy ever, but I think what he's doing is admirable.
00:00:06.000 I hope I can convince you of that.
00:00:08.000 Email me, freedom at charliekirk.com.
00:00:10.000 Get involved with Turning Point USA at tpusa.com.
00:00:13.000 Turning Point USA is on the front lines.
00:00:15.000 It's America's most important organization, tpusa.com.
00:00:19.000 Get involved with Turning Point USA at tpusa.com.
00:00:23.000 You can email me directly, freedom at charliekirk.com.
00:00:26.000 That's freedom at charliekirk.com.
00:00:29.000 Buckle up, everybody.
00:00:30.000 Here we go.
00:00:31.000 Charlie, what you've done is incredible here.
00:00:32.000 Maybe Charlie Kirk is on the college campuses.
00:00:34.000 I want you to know we are lucky to have Charlie Kirk.
00:00:38.000 Charlie Kirk's running the White House, folks.
00:00:41.000 I want to thank Charlie.
00:00:42.000 He's an incredible guy.
00:00:43.000 His spirit, his love of this country, he's done an amazing job building one of the most powerful youth organizations ever created, Turning Point USA.
00:00:52.000 We will not embrace the ideas that have destroyed countries, destroyed lives, and we are going to fight for freedom on campuses across the country.
00:01:00.000 That's why we are here.
00:01:03.000 Brought to you by the Loan Experts I Trust, Andrew and Todd at Sierra Pacific Mortgage at AndrewandTodd.com.
00:01:12.000 The world's richest man sat down with the number one cable talk show host in the country.
00:01:18.000 Elon Musk and Tucker Carlson had an incredible interview yesterday.
00:01:23.000 And part two will come out today.
00:01:26.000 You could tell a lot about the health of a society looking at the ruling class.
00:01:33.000 You could tell a lot about where a nation is headed and whether or not that nation is prospering or faltering based on whether or not the people in charge, the people that have been tasked with governing the society, are doing what is necessary.
00:01:51.000 That doesn't mean they all have to agree on every topic.
00:01:53.000 That doesn't mean that every single person in charge of society must have the same political beliefs.
00:01:58.000 But what we're living through right now, and it just was so clear just watching the Elon Tucker interview, and I didn't agree with everything Elon said, but it was so clear that in some form or fashion, Elon felt some responsibility that I am the world's wealthiest man.
00:02:13.000 I am the country's wealthiest man.
00:02:16.000 I probably shouldn't damage humanity or damage the country.
00:02:20.000 And for that, he's actually a rarity.
00:02:24.000 That simple sentence that I have wealth, I have power, I have prestige, and I want to do something that is not trying to remake the world in my image.
00:02:35.000 Not trying to remake the world so I can gain more temporal power.
00:02:41.000 You see, we're always going to have leaders.
00:02:44.000 We're always going to have elites.
00:02:45.000 Elites will always be necessary.
00:02:47.000 We are not Marxists.
00:02:48.000 We do not believe in this utopian egalitarianism.
00:02:51.000 We believe in hierarchies.
00:02:53.000 We believe that there will be people that get more money.
00:02:56.000 We believe that some people have a greater work ethic, higher IQ, more of a drive.
00:03:02.000 And therefore, you're going to have the creation of hierarchies.
00:03:05.000 The Pareto principle shows us this: that over a period of time, some people are going to get wealthier, some people are going to create more.
00:03:11.000 And that's fine.
00:03:12.000 That's natural.
00:03:13.000 That is healthy.
00:03:15.000 Markets are the best way to actually organize hierarchies because you're able to give a preference to work ethic, responsibility, and delayed gratification, which makes your society wealthier.
00:03:28.000 So hierarchies are not going away.
00:03:30.000 The left likes to make it seem like we need to get rid of hierarchies.
00:03:32.000 Instead, they actually want to protect the oligarchy so it's untouchable.
00:03:36.000 You see, what we're living through, especially in the last 20 years, is unprecedented in American history.
00:03:42.000 It's not that we have been able to get rid of our elites.
00:03:44.000 Our elites are actually wealthier than they were even in the Gilded era.
00:03:49.000 Wealthier than they were in the Gilded Age.
00:03:51.000 They're more powerful than they were at any time in American history, whether it be Google or Goldman Sachs or the FBI.
00:03:58.000 No, what's unprecedented is that the people in charge of the society, the elites, the corporate elites, the entrepreneurs, the tech CEOs, the people of Congress, they have bitter resentment for the nation and the culture that they oversee.
00:04:14.000 That's what is unprecedented.
00:04:16.000 What's unprecedented is that the people in charge hate you and they're vocal about it.
00:04:22.000 Countries are run well when elites care about the nation and the future of the country and the people in their care.
00:04:30.000 Tucker has an analogy.
00:04:31.000 He says often, he says, we should imagine the country as a household or a family, and our leaders are at least temporarily the parents.
00:04:38.000 The parents look out for what's best for their children.
00:04:41.000 They don't prioritize other people's children.
00:04:43.000 They don't let random other kids move into the house and take over.
00:04:46.000 They don't shrug their shoulders if their kids get addicted to drugs or gambling or video games.
00:04:51.000 They don't lie to them, which our leaders have made a pattern.
00:04:55.000 They give them some autonomy and freedom as they get older, and they generally want what is best for them.
00:05:00.000 Our leaders, our elites, want the opposite.
00:05:04.000 They are parents who have decided their kids are indeed defective, so they want to throw them out and replace them with other people's kids.
00:05:11.000 That's what our leaders want.
00:05:13.000 The consensus amongst the elites is I hate the body politic of America.
00:05:17.000 I hate their customs.
00:05:18.000 I hate their beliefs.
00:05:20.000 I hate their religious traditions.
00:05:22.000 Hillary Clinton called them deplorables, the smelly Walmart people, the flyover country.
00:05:28.000 And our elites are completely unpredictable.
00:05:31.000 First of all, they refuse to take any responsibility.
00:05:34.000 See, they swerve wildly from being tyrannically controlling to then also being absentee.
00:05:41.000 They spy on everything we say.
00:05:43.000 At the same time, they want to edit our books and censor what we can read.
00:05:46.000 Simultaneously, they don't care if we get addicted to drugs or our kids are killing themselves or our border remains wide open.
00:05:53.000 They let mobs loot and plunder our streets because it lets them pose as deeply progressive on racial equity, which makes them more powerful.
00:06:01.000 Contrast this with the elites that we are told to hate.
00:06:04.000 You go to a government school.
00:06:06.000 If you go to a government school, you are told to hate Andrew Carnegie.
00:06:09.000 You are told to hate Leland Stanford.
00:06:12.000 You're told to hate J.P. Morgan.
00:06:14.000 You are taught to hate John D. Rockefeller.
00:06:17.000 You are taught to hate the industrial titans of the early 1900s because they had too much money and they exploited people.
00:06:25.000 When in reality, we should be hating our current elites and we should actually be learning about the people that we call the robber barons, the people of the progressive era.
00:06:34.000 Why do we hate them exactly?
00:06:35.000 Okay, they had a lot of wealth.
00:06:37.000 They might have gotten too rich too quickly.
00:06:38.000 Maybe part of the trust busting was admirable by Teddy Roosevelt.
00:06:42.000 Leland Stanford spent a lot of money back when colleges were a good thing to start Stanford University.
00:06:42.000 But guess what?
00:06:49.000 He loved the country.
00:06:50.000 Andrew Carnegie built libraries, hospitals, public parks.
00:06:55.000 Today, our elites, the wealthiest people in charge of our society, they're not building places of learning or institutions to pursue truth or virtue or goodness or to preserve Western society.
00:07:07.000 Our elites are not trying to build places of deeper learning to support the next generation or to make the country stronger.
00:07:15.000 They're not doing what J.P. Morgan did, literally bailing out the country.
00:07:19.000 You know what our elites are doing now?
00:07:20.000 They're funding trans surgeries for kids.
00:07:23.000 Our elites now are actively involved in the arson of America.
00:07:28.000 This is a new development.
00:07:29.000 It's a new phenomenon.
00:07:30.000 We did not have this in the 60s, 70s, or 80s.
00:07:33.000 We had bad people.
00:07:35.000 We had bad individuals.
00:07:37.000 We did not have a collective agreement amongst the top tier of society, amongst the 1% of the 1%, the Google people, the Goldman people, the J.P. Morgan people, where they all seem to just agree on one thing.
00:07:51.000 They might disagree on other things.
00:07:52.000 They all agree and they just say, we're so sick of the muscular class.
00:07:56.000 We don't like the country class.
00:07:59.000 We don't like the people in Missouri or Iowa.
00:08:01.000 Who are these people?
00:08:02.000 Let's just bring in a bunch of foreigners.
00:08:04.000 We'll like them more.
00:08:05.000 What's this constitution?
00:08:06.000 This thing, this thing's outdated.
00:08:08.000 Get rid of this because I learned it at Yale.
00:08:10.000 That's the thing the elites can agree on, but they seem unwilling at every corner, any turn, to ever acknowledge that the country is falling apart.
00:08:20.000 No, instead, they'll dedicate hundreds of millions of dollars or billions of dollars to systemic racism or fighting the trans genocide, something that doesn't even exist.
00:08:30.000 It's a complete fiction.
00:08:32.000 We're talking about some sort of esoteric, abstract concept of climate change.
00:08:36.000 No, instead of actually pouring into the core fabric of what makes a nation strong and prosperous, they have disdain for you.
00:08:48.000 And that's what was so powerful about the Elon Tucker interview that we are going to play bits and pieces and talk about is that you could love Elon, you can hate Elon, you could trust him or distrust him.
00:08:59.000 It is a fact that what he is doing is a rebellion against the ruling class.
00:09:04.000 He has defected from that homogenous point of view.
00:09:08.000 He is certainly heterodox.
00:09:10.000 What he has done was not well supported.
00:09:13.000 Oh, yeah, he's unpredictable.
00:09:15.000 He could be an agent of chaos.
00:09:17.000 He could be a disruptor.
00:09:19.000 You better believe that he has different politics than I do on certain issues.
00:09:23.000 But it is a fact that what you saw in real time in that Tucker Elon interview is a one-man crusade of the world's wealthiest man going up against all the other plutocrats, all the other fat cats, all the other oligarchs and saying, you're a fraud.
00:09:37.000 You want to be like God.
00:09:38.000 You fund the Democrat Party.
00:09:40.000 I don't like you.
00:09:40.000 I like free speech and I am pro-human.
00:09:43.000 And for goodness sake, God bless him for that.
00:09:46.000 And we're going to talk about it because we need more elites that defect from this idea pathogen, the Borg, the one-size-fits-all that I hate the country.
00:09:57.000 I even hate humanity at times.
00:09:59.000 No, no, no, no.
00:10:00.000 This is a healthy development.
00:10:01.000 Is he going to succeed?
00:10:02.000 I have no idea.
00:10:03.000 What I'm saying, though, is we need more defections, more rebellions, more questioning of this leviathan of the elites who hate you, hate your customs, hate your religion, and hate the country.
00:10:17.000 Hey, everybody, Charlie Kirk here.
00:10:18.000 Are you tired of feeling burnt out and struggling to stay productive throughout the day?
00:10:22.000 Does brain fog and short-term memory loss keep you from functioning at your best?
00:10:25.000 Well, I did too.
00:10:26.000 And then I was introduced to Strong Cell, more specifically, NADH.
00:10:30.000 If you don't believe me, check it out yourself.
00:10:32.000 What Strong Cell has done is they have a scientific breakthrough in cellular health.
00:10:36.000 They combine NADH, CoQ10, and marine collagen to boost your body's cellular function.
00:10:41.000 I personally take it every day.
00:10:42.000 I'm a big NAD believer.
00:10:43.000 People say, Charlie, how do you keep up the schedule?
00:10:45.000 How do you have mental clarity?
00:10:47.000 Go do some research on NAD.
00:10:48.000 It is a precursor to ATP, which is your body's life source.
00:10:52.000 Strong Cell combines NADH with some of the best ingredients and vitamins available.
00:10:56.000 I get approached by many supplement brands, and I tell most of them no.
00:10:59.000 Do yourself a favor and give Strong Cell a try.
00:11:01.000 Visit strongcell.com forward slash Charlie today and use promo code Charlie.
00:11:06.000 That is strongcell.com forward slash Charlie.
00:11:08.000 And don't forget your 20% discount by using promo code Charlie at checkout.
00:11:12.000 Strongcell.com forward slash Charlie.
00:11:14.000 Check it out.
00:11:15.000 Strongcell.com forward slash Charlie.
00:11:19.000 Angelo Codevilla wrote an amazing book on the ruling class.
00:11:24.000 In 2010, he summarized it in an essay.
00:11:26.000 He said, quote, today's ruling class from Boston to San Diego was formed by an educational system that exposed them to the same ideas and gave them remarkably uniform guidance as well as tastes and habits.
00:11:42.000 These amount to a social canon of judgments about good and evil, complete with a secular sacred history.
00:11:48.000 Sins, for example, actions against minorities and the environment, and saints using the right words and avoiding the wrong ones when referring to such matters.
00:11:57.000 Speaking with the in language serves as a badge of identity.
00:12:00.000 Regardless of what business or profession they are in, their road up included government channels and government money because, as government has grown, its boundary with the rest of American life has become indistinct.
00:12:12.000 Many began their careers in government and leveraged their way into the private sector.
00:12:16.000 Hence, whether formally in government or out of it or halfway, America's ruling class speaks the language and has the tastes, habits, and tools of bureaucrats.
00:12:26.000 It rules uneasily over the majority of Americans, not oriented to the regime.
00:12:31.000 Elon Musk called out Larry Page.
00:12:33.000 Now, two things that all of us in this audience have in common: we believe there is a God and we are not him.
00:12:40.000 Third thing is we don't want to become God.
00:12:42.000 It's very important.
00:12:43.000 Now, I'm not trying here to convince you that you should support Elon Musk.
00:12:47.000 I'm not even saying you should trust Elon Musk.
00:12:49.000 Instead, what I'm talking about is a broader picture of elite culture, the elite community, the governing class, and how they have abdicated their responsibility and how Elon is changing that.
00:13:03.000 And we're getting a lot of emails.
00:13:04.000 People are not convinced.
00:13:06.000 They think Elon's awful and terrible.
00:13:09.000 Okay.
00:13:10.000 Well, Elon deserves some credit for calling out Larry Page by name and saying that Google is trying to create a digital god.
00:13:17.000 I haven't heard another ruling class person with the courage to challenge Google like this.
00:13:21.000 Play cut 40.
00:13:23.000 The reason Open AI exists at all is that Larry Page and I used to be close friends, and I would stay at his house in Palo Alto.
00:13:31.000 And I would talk to him late into the night about AI safety.
00:13:35.000 And at least my perception was that Larry was not taking AI safety seriously enough.
00:13:43.000 What did he say about it?
00:13:44.000 He really seemed to want sort of digital superintelligence, basically digital god, if you will, as soon as possible.
00:13:55.000 He wanted that?
00:13:56.000 Yes.
00:13:57.000 He's made many public statements over the years that the whole goal of Google is what's called AGI, artificial general intelligence or artificial superintelligence.
00:14:06.000 When I talk about how the ruling class has contempt for humanity, people roll their eyes.
00:14:14.000 You can think negatively of Elon Musk.
00:14:15.000 You can think, of course, oh, he's terrible.
00:14:17.000 He's all these different things.
00:14:18.000 When's the last time someone went on cable television, went on any sort of platform, and was willing to say that the head of Google has contempt for human beings?
00:14:26.000 It's a term that you might not pick up on called speciesist.
00:14:26.000 That's what's here.
00:14:30.000 The head of Google accused Elon Musk of being a speciesist, racist towards other species or intolerant of other species outside of human beings.
00:14:40.000 Yeah, guilty.
00:14:41.000 Actually, we are pro-human.
00:14:42.000 We should try to agree on a pro-human future; that's probably pretty important.
00:14:48.000 How many other elites are doing this?
00:14:51.000 You might hate Elon.
00:14:52.000 You might think he's the worst person ever, but you have to be honest.
00:14:54.000 I'm not going to try to convince you otherwise.
00:14:56.000 Maybe Elon is awful and terrible.
00:14:56.000 You might be right.
00:14:58.000 It's not about it.
00:14:59.000 But it's a fact.
00:15:00.000 How many other people that are in the tech community that have hundreds of billions of dollars are willing to call out Google by name and say that they actually have contempt for human beings?
00:15:08.000 Anybody?
00:15:10.000 Zuckerberg?
00:15:11.000 Google?
00:15:13.000 Dropbox?
00:15:14.000 Salesforce?
00:15:15.000 Play cut 41.
00:15:16.000 No, and I agree with him that there's great potential for good, but there's also potential for bad.
00:15:22.000 And then at one point, I said, well, what about, you know, we're going to make sure humanity is okay here.
00:15:32.000 And they called me a speciesist.
00:15:37.000 Did he use that term?
00:15:38.000 Yes.
00:15:39.000 And there were witnesses.
00:15:41.000 I wasn't the only one there when he called me a speciesist.
00:15:43.000 And so I was like, okay, that's it.
00:15:46.000 Yes, I'm a speciesist.
00:15:48.000 You got me.
00:15:48.000 Okay.
00:15:50.000 What are you?
00:15:53.000 Head of Google thinks that you are being a speciesist, accusing you in a derogatory, negative way, condemning you for trying to argue for what is best for human beings.
00:16:02.000 Yeah, I think it's a pretty good thing that's now in the zeitgeist, and it shows a window.
00:16:06.000 The people in charge of Google very well might be able to agree with depopulation agendas, anti-human behavior.
00:16:14.000 How many elites are willing to speak out against that?
00:16:17.000 It would be good to have a little bit more discussion, disagreement amongst the people that are in charge of the governing class.
00:16:27.000 Are you concerned about the quality of your drinking water?
00:16:30.000 With all the disastrous flooding happening around the country, water sources are starting to become contaminated.
00:16:35.000 To help you avoid paying for bottled water, our friends at MyPatriot Supply are having a special limited time offer.
00:16:41.000 They'll give you for free a groundbreaking gravity-powered water filtration system so you could have your most important need in crisis: clean drinking water.
00:16:50.000 Remember, you can only go three days without water.
00:16:52.000 That's why AlexaPure Pro Water Filtration System might soon be the most essential appliance you own.
00:16:57.000 You need to get a kit, though, from MyPatriot Supply.
00:17:00.000 Normally, $279, it's yours for free.
00:17:02.000 When you purchase a three-month emergency food kit from MyPatriot Supply, each kit provides over 2,000 calories a day for strength and energy during tough times.
00:17:12.000 Best of all, though, the food is delicious.
00:17:13.000 Your whole family will love it.
00:17:14.000 To get your emergency food and free AlexaPure Pro Water Filtration System, go to mypatriotsupply.com.
00:17:22.000 That is mypatriotsupply.com.
00:17:24.000 Check it out right now, mypatriotsupply.com for your free AlexaPure Pro Water Filtration System, mypatriotsupply.com.
00:17:35.000 I want to read some emails here.
00:17:36.000 Definitely an overwhelming, a negative view of Elon Musk.
00:17:40.000 Charlie, I think he's the Antichrist.
00:17:42.000 Charlie, I believe that he's doing nothing but negative for humanity.
00:17:46.000 Charlie, he's bought by the CCP.
00:17:48.000 Charlie, stop praising Elon.
00:17:49.000 There's some positives, but I do want to make sure the negatives are communicated.
00:17:52.000 I'm not going to debate here.
00:17:53.000 I've never met Elon.
00:17:54.000 I don't know his personal motivations.
00:17:56.000 I don't know what drives him.
00:17:57.000 I just know what he's doing.
00:17:58.000 And that's a separate issue, isn't it?
00:18:00.000 So there's two categories.
00:18:01.000 I could judge somebody on their actions, which I think is fair, or somebody on their intentions, which is largely unknown unless you can really get to know somebody.
00:18:08.000 Not unknown as you see a pattern of behavior.
00:18:12.000 So the question I guess would be if Elon wants to take over the world just for power, why would he buy Twitter, which is now half as valuable today than when he bought it?
00:18:20.000 So that's kind of interesting.
00:18:22.000 I'm now freer to speak on Twitter than I was prior.
00:18:26.000 And so look, you could say he's the best, he's the worst, he's the antichrist, or you could land kind of somewhere in the middle and say what he's doing is necessary.
00:18:33.000 Let's talk about artificial intelligence.
00:18:35.000 We've done entire shows on artificial intelligence.
00:18:37.000 Our current elites are in bed with trying to create a digital God.
00:18:43.000 Okay?
00:18:44.000 And nobody is speaking out against it.
00:18:46.000 Nobody with power.
00:18:47.000 So Elon Musk takes a fair amount of time and says, hey, just so we're clear, this artificial intelligence stuff could destroy the entire world.
00:18:54.000 It could destroy humanity as we know it.
00:18:56.000 He's right.
00:18:57.000 People in power need to start saying it.
00:18:59.000 Is Zuckerberg saying that?
00:19:00.000 No.
00:19:02.000 Is the Sundar Pichai guy from Google saying it?
00:19:06.000 No.
00:19:07.000 Larry Page?
00:19:08.000 No.
00:19:08.000 Sergey Brin?
00:19:10.000 George Soros?
00:19:10.000 No.
00:19:11.000 No.
00:19:12.000 Laurene Powell Jobs?
00:19:13.000 No.
00:19:14.000 They're training AI to take over the entire world.
00:19:18.000 I cannot tell you, as somebody who understands this at a very elementary level, not at a sophisticated, advanced level, this is heading in a bad direction very quickly.
00:19:31.000 So they want to try to create a digital God.
00:19:34.000 We played that tape.
00:19:37.000 He gets accused of being a speciesist.
00:19:39.000 Sounds like the similar left-wing tactics, doesn't it?
00:19:41.000 You're a racist.
00:19:42.000 You're a homophobe.
00:19:43.000 You're a colonialist.
00:19:45.000 You're an imperialist.
00:19:46.000 You're a speciesist.
00:19:47.000 That's where we're at.
00:19:48.000 If you say that human beings should come first, they call you a speciesist.
00:19:52.000 Play cut 42.
00:19:54.000 Can you be more precise about what's potentially dangerous and scary?
00:19:58.000 Like, what could it do?
00:19:59.000 What specifically are you worried about?
00:20:02.000 Going with old sayings, the pen is mightier than the sword.
00:20:06.000 So if you have a super intelligent AI that is capable of writing incredibly well and in a way that is very influential, you know, convincing, and then and is constantly figuring out what is more convincing to people over time and then enters social media, for example, Twitter, but also Facebook and others, you know,
00:20:33.000 and potentially manipulates public opinion in a way that is very bad.
00:20:38.000 How would we even know?
00:20:40.000 AI is being trained to be dishonest.
00:20:42.000 So they're being trained to be a left-wing activist, basically, a social justice warrior.
00:20:47.000 It's becoming a woke super weapon.
00:20:49.000 Who else is speaking out against this?
00:20:50.000 And let's just be honest, who else has the resources to actually combat this?
00:20:54.000 Congress is not even doing anything close to enough about this.
00:20:59.000 This thing could take over entire cyber grids, could take over militaries.
00:21:03.000 Play cut 43.
00:21:05.000 I'm worried about the fact that it's being trained to be politically correct, which is simply another way of being untruthful, saying untruthful things.
00:21:14.000 Yes.
00:21:15.000 So that's a bad sign.
00:21:17.000 That's certainly a path to AI dystopia is to train an AI to be deceptive.
00:21:21.000 A path to be evil, dishonest.
00:21:25.000 The problem is, as Elon says in this interview, usually we only put up roadblocks or regulations after something bad has happened.
00:21:33.000 The AI may be in control at that point, the artificial intelligence.
00:21:37.000 It might be sentient.
00:21:37.000 It might have singularity.
00:21:39.000 And Google is all on board for this.
00:21:42.000 So for those of you that have nothing but negative things to say about Elon Musk, here's the binary, I suppose.
00:21:49.000 You could have what you have right now, which is an unpredictable chaos agent who at least has liberated Twitter and exposed the government's control net.
00:21:58.000 I'll get to that in a second.
00:21:59.000 Fighting for free speech that might have ties to the CCP that I don't like and you don't like.
00:22:03.000 Or he could be using his $180 billion to assist Larry Page.
00:22:08.000 Which would you prefer?
00:22:10.000 You might say, oh, Charlie, they're both the same.
00:22:12.000 No, you're wrong.
00:22:13.000 That this is better than that.
00:22:15.000 Is it ideal?
00:22:16.000 Not really.
00:22:16.000 There's things I wish he could do differently.
00:22:18.000 But it's certainly a step in the right direction of an elite community that disagrees on nothing.
00:22:23.000 They are in full harmony.
00:22:25.000 The current elite governing community does not allow defections or disagreement.
00:22:29.000 They must say, we're going for transhumanism.
00:22:31.000 We're going for our version of eternal life.
00:22:33.000 We're going to destroy human beings as we know it.
00:22:34.000 We're going to depopulate the earth, mRNA vaccines, and suppress free speech.
00:22:38.000 And we hate America while doing it.
00:22:40.000 And then you got one guy that says, yeah, I don't agree.
00:22:42.000 And they're trying to crush him.
00:22:44.000 For those of you that don't like Elon Musk, if Elon was not doing something that could be a glitch in the Matrix, why are they trying to destroy him so badly?
00:22:51.000 Why is the media mocking him so much?
00:22:53.000 If he was really an enemy, why are they trying to destabilize him, slander him, and smear him?
00:22:59.000 We have not seen a person of this high profile defect against the ruling class as significant since Donald Trump went down the escalator in June of 2015, nearly eight years ago.
00:23:10.000 Play cut 39.
00:23:12.000 Regulations are really only put into effect after something terrible has happened.
00:23:16.000 That's correct.
00:23:17.000 If that's the case for AI and we only put in regulations after something terrible has happened, it may be too late to actually put the regulations in place.
00:23:23.000 The AI may be in control at that point.
00:23:26.000 Do you think that's real?
00:23:29.000 It is conceivable that AI could take control and reach a point where you couldn't turn it off and it would be making decisions for people.
00:23:37.000 Yeah, absolutely.
00:23:38.000 Absolutely.
00:23:39.000 No, that's definitely where things are headed, for sure.
00:23:44.000 Yeah, it's going to happen imminently.
00:23:45.000 How many of our members of Congress are saying anything?
00:23:48.000 Do they even understand AI?
00:23:50.000 They're probably still trying to figure out how to log into the Facebook.
00:23:53.000 Mark Zuckerberg was testifying.
00:23:55.000 Mr. Zuckerberg, how do I log into this thing?
00:23:58.000 It's way over their head.
00:24:00.000 Would you rather have elites like Lady Graham that are calling for war in Ukraine?
00:24:04.000 That's what you have in the Republican Party.
00:24:06.000 That's my only challenge, respectfully.
00:24:08.000 And you guys might be right.
00:24:09.000 He might be the worst person ever.
00:24:10.000 I have no idea.
00:24:11.000 I'd be happy to meet him, interview him, have an open mind.
00:24:14.000 Or he could be something in the middle, a mixture of good and bad, a mixture of agreement and disagreement.
00:24:18.000 Maybe it's something that is necessary, something that's disruptive, something that might be a truth-teller on certain topics and just kind of boring and mainstream on others.
00:24:26.000 Aren't we all kind of a mixture?
00:24:28.000 Is it really a binary always?
00:24:30.000 Good, bad, black, white, evil, good.
00:24:33.000 Maybe he's just, maybe there's a tension there with Elon.
00:24:36.000 Maybe.
00:24:36.000 Elon calls for regulation of artificial intelligence, and boy, do I certainly agree at this.
00:24:41.000 This is about to get wildly out of control.
00:24:43.000 It could destroy human beings as we know it, destroy humanity.
00:24:47.000 You want to talk about an apocalypse?
00:24:49.000 This makes nuclear war look like child's play.
00:24:51.000 Play cut 37.
00:24:52.000 So then I think regulation is, you know, it's not fun to be regulated.
00:24:58.000 It's sort of somewhat arduous to be regulated.
00:25:04.000 I have a lot of experience with regulated industries because obviously automotive is highly regulated.
00:25:10.000 I think it needs to start with a group that initially seeks insight into AI, then solicits opinion from industry, and then has proposed rulemaking.
00:25:23.000 And then those rules will probably hopefully grudgingly be accepted by the major players in AI.
00:25:33.000 Now, you might say, oh, Charlie, come on, this is a bunch of fear-mongering.
00:25:36.000 Oh, let me show you a piece of tape here.
00:25:38.000 60 Minutes did an interview with Pichai from Google.
00:25:41.000 This thing is getting a life of its own.
00:25:43.000 The artificial intelligence machine is heading for singularity.
00:25:46.000 The elites are just salivating.
00:25:48.000 You know why?
00:25:48.000 They think they have found godlike power.
00:25:50.000 They think they have finally been able to find the oneness, the gnosis, the mind.
00:25:55.000 That's really literally where the word Gnosticism comes from.
00:25:59.000 They think they are close towards creating the divine.
00:26:02.000 Maybe someone should say this is a really bad idea.
00:26:05.000 Well, Elon is.
00:26:06.000 Who else is?
00:26:07.000 Watch this clip of Google CEO saying that, yeah, the AI system has a life of its own.
00:26:11.000 It's doing stuff it wasn't programmed to do.
00:26:13.000 He kind of finds it funny.
00:26:15.000 You trust Google?
00:26:17.000 If Elon is against Google, I'm going to compliment him for that.
00:26:21.000 Play cut 11.
00:26:23.000 AI systems are teaching themselves skills that they weren't expected to have.
00:26:29.000 How this happens is not well understood.
00:26:32.000 There is an aspect of this which we call, all of us in the field, call it as a black box.
00:26:37.000 You know, you don't fully understand, and you can't quite tell why it said this or why it got it wrong.
00:26:44.000 We have some ideas, and our ability to understand this gets better over time.
00:26:48.000 But that's where the state of the art is.
00:26:50.000 You don't fully understand how it works, and yet you've turned it loose on society?
00:26:56.000 Yeah, let me put it this way.
00:26:58.000 I don't think we fully understand how a human mind works either.
00:27:01.000 Do you notice he said the word better?
00:27:04.000 By what standard?
00:27:06.000 What morality?
00:27:08.000 What ethical code?
00:27:10.000 How are you judging better?
00:27:12.000 You mean more likely to censor Christians, conservatives, and people of faith?
00:27:16.000 Better, more efficient to be able to destroy America?
00:27:19.000 Better, more likely to suppress ideas you don't like?
00:27:23.000 What do you mean by better exactly?
00:27:26.000 Who's actually talking about this deeply, in a thoughtful way?
00:27:30.000 Joe Allen is.
00:27:31.000 Well, Elon Musk agrees.
00:27:34.000 Maybe we should take his warning.
00:27:36.000 His other politics aside?
00:27:37.000 Okay, there's a clear and present danger here, and it's not creeping up.
00:27:41.000 It's accelerating.
00:27:42.000 It's growing in speed.
00:27:43.000 It's growing in strength.
00:27:45.000 And the entire species could be at risk.
00:27:48.000 And Google thinks that's just fine.
00:27:52.000 A year ago, there were a few bad image generators for artificial intelligence.
00:27:56.000 Now you can write computer programs from scratch.
00:27:59.000 It's moving so quickly.
00:28:00.000 AI has gotten an order of magnitude more powerful in about a year.
00:28:04.000 In five years, I mean, in five years, it's possible that we're just all subservient to these things.
00:28:10.000 The species as we know it could be totally changed.
00:28:14.000 And our leaders just don't think about it.
00:28:16.000 They don't know about it.
00:28:17.000 And by the way, just watch out.
00:28:18.000 A lot of Republicans are bought by Google, aka NetPAC.
00:28:22.000 I don't even know if government could do something at this point.
00:28:25.000 I don't.
00:28:26.000 The secularists are going to create something that they deem is finally worthy of worship.
00:28:33.000 Some people are emailing us: pull the plug.
00:28:36.000 Might be too late.
00:28:38.000 Elon's trying to warn us.
00:28:40.000 Is the option to create a better AI, a more ethical AI?
00:28:44.000 Now, there's a lot of good that could come out of artificial intelligence if the program is written correctly, the coding is written correctly, if it's trained correctly.
00:28:51.000 You might say, oh, Charlie, there's no such good thing out of AI.
00:28:54.000 But let me try to convince you otherwise.
00:28:56.000 Imagine artificial intelligence that could look at 300 million cancer screenings and be able to find tumors before anybody else, find predictive gene markers, or how about be able to develop life-saving drugs in a matter of minutes?
00:29:15.000 Yeah, I mean, that's kind of compelling to be able to say that, you know, kids with leukemia could be treated a lot easier and better because information and medicine is everything.
00:29:22.000 Finding trends, finding the best practices of treatment, and all of it is just kind of right now very inefficient and word of mouth and corrupt.
00:29:29.000 And you have Pfizer, AstraZeneca, and Moderna.
00:29:32.000 That's a potential positive.
00:29:34.000 It really does kind of come down to a question, though, because we are about to go into something very dangerous, everybody.
00:29:39.000 God created man.
00:29:42.000 Machines did not create man.
00:29:44.000 And we created machines.
00:29:46.000 Shouldn't machines work for us?
00:29:48.000 It's about to change.
00:29:49.000 The game is about to change.
00:29:50.000 Elon is trying to say, hey, this thing is really going in a dangerous and awful and terrible direction.
00:29:56.000 They're trying to create a digital God.
00:29:58.000 They're trying to create the oneness of mind.
00:30:00.000 And our elites are excited about it.
00:30:02.000 You know why our elites like this?
00:30:03.000 Our elites are thrilled.
00:30:06.000 Zuckerberg, all of them.
00:30:08.000 Because they think they're going to get eternal life through this.
00:30:12.000 They believe that, through transhumanism, they'll be able to upload their consciousness and they will live forever.
00:30:18.000 They believe they have finally been able to find the way that they will be able to exist without ceasing.
00:30:24.000 Let's play another piece of tape here.
00:30:26.000 Play cut 37.
00:30:28.000 So then I think regulation is, yeah, it's not fun to be regulated.
00:30:34.000 It's sort of somewhat arduous to be regulated.
00:30:39.000 I have a lot of experience with regulated industries because obviously automotive is highly regulated.
00:30:45.000 I think it needs to start with a group that initially seeks insight into AI, then solicits opinion from industry, and then has proposed rulemaking.
00:30:58.000 And then those rules will probably hopefully grudgingly be accepted by the major players in AI.
00:31:09.000 Someone says here, Charlie, it's time.
00:31:11.000 Let it happen.
00:31:12.000 I disagree.
00:31:13.000 You should never welcome evil.
00:31:15.000 If God has a greater plan, so be it.
00:31:17.000 But you should do everything you possibly can to fight against evil.
00:31:21.000 So here we have Elon.
00:31:22.000 Currently, as I've said, we have a governing community of elites that could not care less about the country that they actually oversee.
00:31:29.000 They want all the perks, none of the responsibility.
00:31:31.000 They want all the privileges, but none of the grit or none of the work that it takes.
00:31:36.000 Elon is something in the middle.
00:31:38.000 Elon is pushing back.
00:31:39.000 He's warning us.
00:31:41.000 Soon you are going to have an entire society, civilization, that can be ruled by artificial intelligence and by a machine above man.
00:31:50.000 God created a hierarchy, God, man, nature.
00:31:54.000 And the Google people and the elites, they want to create something new.
00:31:58.000 They want to create a machine created by man that will get out of control.
00:32:01.000 We know it.
00:32:02.000 Elon's trying to say this thing is going to get wildly out of control.
00:32:05.000 It's going to have a mind of its own.
00:32:06.000 It's going to have its own gnosis, its own being.
00:32:10.000 For one, I'm thankful that Elon Musk is warning us of what's coming next.
00:32:15.000 And for that, he deserves praise.
00:32:19.000 Thanks so much for listening, everybody.
00:32:21.000 Email us your thoughts as always, freedom at charliekirk.com.
00:32:24.000 Thanks so much for listening, and God bless.
00:32:29.000 For more on many of these stories and news you can trust, go to CharlieKirk.com.