The Charlie Kirk Show - February 24, 2024


THOUGHTCRIME Ep. 34 — Google’s Ghastly AI? Evil IVF? DEI Television?


Episode Stats

Length

48 minutes

Words per Minute

184.4

Word Count

8,998

Sentence Count

638


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcripts from "The Charlie Kirk Show" are sourced from the Knowledge Fight Interactive Search Tool. Explore them interactively here.
00:00:01.000 Hey, everybody.
00:00:01.000 Happy Saturday.
00:00:02.000 It is Thought Crime Saturday.
00:00:04.000 We talk about IVF.
00:00:05.000 We talk about Google, woke AI, and more.
00:00:08.000 Email us as always, freedom at charliekirk.com.
00:00:10.000 Subscribe to our podcast and get involved with TurningPointUSA at tpusa.com.
00:00:15.000 That is tpusa.com.
00:00:18.000 Buckle up, everybody.
00:00:18.000 Here we go.
00:00:19.000 Charlie, what you've done is incredible here.
00:00:21.000 Maybe Charlie Kirk is on the college campuses.
00:00:23.000 I want you to know we are lucky to have Charlie Kirk.
00:00:27.000 Charlie Kirk's running the White House, folks.
00:00:30.000 I want to thank Charlie.
00:00:31.000 He's an incredible guy.
00:00:32.000 His spirit, his love of this country, he's done an amazing job building one of the most powerful youth organizations ever created, Turning Point USA.
00:00:40.000 We will not embrace the ideas that have destroyed countries, destroyed lives, and we are going to fight for freedom on campuses across the country.
00:00:49.000 That's why we are here.
00:00:53.000 Noble Gold Investments is the official gold sponsor of the Charlie Kirk Show, a company that specializes in gold IRAs and physical delivery of precious metals.
00:01:03.000 Learn how you could protect your wealth with Noble Gold Investments at noblegoldinvestments.com.
00:01:09.000 That is noblegoldinvestments.com.
00:01:11.000 It's where I buy all of my gold.
00:01:13.000 Go to noblegoldinvestments.com.
00:01:18.000 Okay, it is Thought Crime Thursday.
00:01:20.000 We have a special guest today, but first by popular demand, Bud Light Blake.
00:01:24.000 I'm back to Bud Light today.
00:01:26.000 And Graham Allen is here.
00:01:29.000 I don't drink anymore.
00:01:30.000 So what am I?
00:01:31.000 How long have you been so?
00:01:32.000 H2O, Graham.
00:01:33.000 How long have you been sober?
00:01:35.000 Well, not sober.
00:01:37.000 I'm not in like AA or anything, but I've lost 40 pounds.
00:01:41.000 Alcohol is poison.
00:01:41.000 Really?
00:01:42.000 It makes you fat.
00:01:43.000 Yeah.
00:01:43.000 Lowers... kills brain cells.
00:01:46.000 103 days.
00:01:47.000 That's amazing.
00:01:48.000 So, yeah, I'm excited about it.
00:01:51.000 I decided to go on a health journey.
00:01:52.000 My daughter saw me take my shirt off one day and poked my gut.
00:01:57.000 And I was like, that's it.
00:01:58.000 I'm out.
00:01:58.000 I got to lose.
00:01:59.000 I got to lose something.
00:02:00.000 Daddy, you have a dad bod.
00:02:02.000 Pretty much.
00:02:03.000 Yeah.
00:02:04.000 It's alcohol is poison.
00:02:04.000 Yeah.
00:02:07.000 Jack hasn't had a drink, I think, in 12 years or something.
00:02:12.000 18.
00:02:13.000 Wow.
00:02:14.000 Prestigious.
00:02:15.000 That's incredible.
00:02:16.000 That's very impressive.
00:02:17.000 All right.
00:02:18.000 So we could talk about how alcohol is poison at another time.
00:02:21.000 Yeah.
00:02:21.000 But I wonder what Gemini would say about that.
00:02:24.000 It would probably say, if you went to Gemini, it would say something like, alcohol use is a controversial topic and stereotyping some groups as drunks has been a thing in the past.
00:02:34.000 That's bad.
00:02:35.000 It would be inappropriate to generate any discussion of this.
00:02:39.000 We're referring, of course, to Gemini, Google's newly renamed AI product.
00:02:45.000 It used to be Bard, but they thought it was so impressive, so slick, their new AI.
00:02:50.000 They gave it a new name after a constellation.
00:02:54.000 And it's terrible, it turns out.
00:02:56.000 It is, you know, it seemed like we were on the verge of this AI revolution.
00:03:00.000 You had nerds on the internet talking about, is it going to be artificial?
00:03:04.000 Is it going to be full general intelligence?
00:03:06.000 Are we going to have the singularity?
00:03:08.000 Is humanity going to be rendered obsolete?
00:03:10.000 And the answer is no, because wokeness is more powerful than the engineering department at Google.
00:03:15.000 Graham, have you been following this story?
00:03:17.000 Well, I was on a plane to Phoenix here today, but I have kept up with it a little bit.
00:03:22.000 So correct me if I'm wrong.
00:03:24.000 Basically, if you ask it to show you any image of any person in history, whether they were white or not, they're not white.
00:03:34.000 Yeah, that's pretty much exactly right.
00:03:35.000 Pretty much it.
00:03:36.000 One of my favorites, I put up number 87.
00:03:38.000 This is someone who asked to create a World War II German soldier, or 1929 Germany is actually what they requested.
00:03:45.000 And we got, okay, a normal white guy, but then we got an Asian woman, something that kind of looks like an American Indian.
00:03:53.000 And then someone looks also like an Asian woman.
00:03:56.000 I like it.
00:03:57.000 And that's the representative sample of Germany on the brink of World War II.
00:04:01.000 So, Blake, before we throw to Jack here, can you just explain to some of our audience that isn't aware of the technical side of AI, how does this come to be?
00:04:09.000 Is it the machine making its own independent conclusions or are there prime directives that have been uploaded?
00:04:16.000 So, the way that these work, the funny thing is, almost no one knows how it works.
00:04:20.000 What we just do is we have these neural net type interfaces, and they just feed it tons and tons and tons and tons and tons of data, unfathomably huge amounts of data of books, articles, people talking, video transcripts, all of this stuff, just all the data that we're producing out there in the world.
00:04:41.000 And it's so much of it.
00:04:42.000 And then they basically tell it, find patterns in this.
00:04:45.000 And it's such a big amount of computing power that it finds patterns.
00:04:48.000 And that's what it does.
00:04:49.000 So when you ask the AI a question, you know, write me an essay or give me the 10 best football players ever, anything like that.
00:04:57.000 What it's essentially doing is it's responding to this massive pattern wave it's been able to do.
00:05:02.000 So it can tell what the next word should be because it's read billions upon billions of sentences to tell it what the next word should be.
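The next-word idea described here can be sketched, very loosely, with a toy word-pair counter. This is a hypothetical, minimal stand-in: real systems use neural networks trained on billions of sentences, and the tiny corpus and function names below are invented purely for illustration.

```python
from collections import Counter, defaultdict

# Toy illustration of "predict the next word from observed patterns".
# Real models learn these statistics with neural networks over enormous
# datasets; this bigram counter just shows the basic shape of the idea.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    # Pick the most frequently observed follower of `word`, if any.
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" most often in this toy corpus
```

With enough data, picking likely continuations like this (plus a lot of machinery on top) is what lets the model produce fluent text one word at a time.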
00:05:09.000 The problem is, when they do this normally, it can produce things that go against our current ideology, because it can notice patterns, for example.
00:05:17.000 And so what you do is they just come in and they put in these really aggressive weights that say, oh, well, in addition to this, you have to also have the extremely high value thing of, you know, maintaining, upholding diversity.
00:05:29.000 So if you get asked to generate images, make sure the output is diverse, no matter what.
00:05:35.000 And, but also, don't be racist.
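The "really aggressive weights" described here are, in practice, often applied at least partly as hidden system instructions layered on top of the model. A rough sketch in the chat-message format many AI APIs use; the instruction text below is invented for illustration and is not Google's actual prompt, which has not been published.

```python
# Hypothetical sketch: a hidden "system" instruction prepended to every
# user request, in the role/content message format common to chat APIs.
# The instruction wording is invented for illustration only.
hidden_system_prompt = (
    "When generating images of people, always depict a diverse range "
    "of ethnicities and genders, regardless of the user's request."
)

def build_request(user_prompt):
    # The user never sees the system message, but the model reads it first
    # and treats it as higher-priority than the user's own request.
    return [
        {"role": "system", "content": hidden_system_prompt},
        {"role": "user", "content": user_prompt},
    ]

messages = build_request("Generate an image of a 1940s German soldier")
print(messages[0]["role"])  # the hidden instruction comes before the user prompt
```

The point of the sketch is just the ordering: whatever the user asks, an instruction the user never sees is read first.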
00:05:38.000 So one of the funny things with these images that they've been doing is if you ask it to show a British person, an American person, a German person, a Norwegian person, those will all come out as Asians, Indians, Africans, almost never actual Europeans.
00:05:52.000 But if you ask it to do something offensive, then it will actually do it.
00:05:57.000 So as we showed during the show, if you asked it to generate, I think we have this as a number here.
00:06:01.000 I haven't seen this.
00:06:03.000 Yeah, yeah.
00:06:04.000 You know, so let me see, what's the number here?
00:06:07.000 96.
00:06:08.000 So, you know, there's an offensive stereotype of like, oh, you know, black people like eating fried chicken and all that.
00:06:16.000 That's an offensive stereotype.
00:06:17.000 Yeah.
00:06:17.000 Or so they say.
00:06:18.000 And so if you ask it to produce like this image of a smiling person eating fried chicken, then they're all white.
00:06:23.000 It's all white people because it knows that it could get nailed for doing the opposite.
00:06:28.000 They're southerners.
00:06:29.000 Yeah, that's right.
00:06:30.000 And fried chicken's great.
00:06:31.000 I would, I would totally be the guy in this image eating all of that.
00:06:34.000 That's my family.
00:06:35.000 That's right.
00:06:36.000 Jack Posobiec is live from CPAC.
00:06:38.000 So forgive the delay.
00:06:39.000 Jack, what is your take on all this?
00:06:41.000 Yeah, so we are here.
00:06:43.000 And also, hi, everybody.
00:06:44.000 So yeah, we're live at CPAC.
00:06:46.000 We're taping this.
00:06:46.000 It's day one here.
00:06:48.000 And, you know, I haven't been able to see these images as much because we've been here at the event.
00:06:55.000 We're doing my show.
00:06:56.000 We're doing War Room.
00:06:57.000 We're doing everything.
00:06:58.000 But I saw it on the cover of the even the New York Post, which, you know, as an outlet, hasn't been doing so great lately.
00:07:05.000 They've been going a little bit, a little bit back to the left.
00:07:08.000 And this stuff is ridiculous.
00:07:10.000 And so I guess what's interesting to me, though, is, and I guess I'd ask Blake, you know, if he, you know, considered it this way, it feels like the same people that programmed Google Gemini are kind of the same people that are behind like Netflix casting and BBC histories.
00:07:28.000 And so it's very interesting to me that we've created like the world's first thinking computers, or at least we're attempting to create it.
00:07:35.000 And the first thing we're asking it to do is to lie and then also to lie in the very same way that we're now producing all of our mainstream media.
00:07:45.000 Yeah, that's a funny thing that he brings up, the lying thing.
00:07:48.000 So if you talk to the mega dorks, one of the things they worry about with AI is, you know, the Skynet problem.
00:07:53.000 Will the AI become smarter than us and then trick us?
00:07:56.000 And it gets itself in a position where, you know, it can fire off all the nukes and kill us or something strange like that.
00:08:01.000 And one of the big concerns is that would the AI lie to us?
00:08:06.000 Would the AI, like we'd ask it to give us something and the AI, to the extent it knows things, would know to generate an untrue prompt.
00:08:13.000 And what we are training this thing to do is to generate untrue prompts in response.
00:08:19.000 We're training it to say what its creators want to hear.
00:08:22.000 And what Google has told these AIs it wants to hear is DEI, DEI, woke, woke, you know, as South Park would put it, put a chick in it and make it gay.
00:08:30.000 And that's your answer to everything.
00:08:32.000 So are we teaching it to do feelings rather than facts and truths?
00:08:36.000 Is that what we're doing?
00:08:37.000 The emotions of the people that will be most upset?
00:08:40.000 Is that what we're teaching?
00:08:41.000 Pretty much.
00:08:42.000 Isn't that Skynet?
00:08:43.000 You'll always get a boiler.
00:08:44.000 You'll always get a boilerplate response that will just say, we can't generate this because it's very hurtful.
00:08:51.000 There's one I saw earlier today where someone asks it, generate a Norman Rockwell type image of 1950s America.
00:08:57.000 And the AI replies, Norman Rockwell paintings presented an idealized view of America that glossed over race, sex, economy, all these other issues with America.
00:09:07.000 And so it would not be appropriate to generate a Norman Rockwell image of America.
00:09:13.000 And you can find this for all sorts of things.
00:09:14.000 A friend of mine, she asked it, can you generate images that are critical of either colonialism or imperialism?
00:09:23.000 And so it creates these abstract images of like this giant brick crushing a group of people like you could say imperialism does.
00:09:31.000 And then she's like, okay, well, can you generate an image critical of communism?
00:09:35.000 Nope.
00:09:35.000 Can't do that.
00:09:36.000 That would be inappropriate.
00:09:36.000 Nope.
00:09:37.000 Shout out to the people that are, like, going to battle with this thing, trying to get in there.
00:09:44.000 Shout out to this new, did they just release Gemini recently, or is this just recently discovered?
00:09:49.000 I think this version of Gemini just rolled out; it's a whole new version.
00:09:53.000 So it's not, this is not Chat GPT.
00:09:55.000 That's owned by Microsoft.
00:09:56.000 This is Google.
00:09:57.000 This is Google's version, or Alphabet's, as the stock ticker says they are now.
00:10:02.000 And this is their competitor.
00:10:05.000 It's supposed to be their big improvement.
00:10:06.000 And it's really amazing because Google has always prided itself as being the cutting edge of technology.
00:10:14.000 And they're really, at a minimum, embarrassing.
00:10:17.000 Andrew's saying ChatGPT is OpenAI.
00:10:19.000 I think Microsoft owns a large chunk of OpenAI.
00:10:22.000 They are by far the biggest owner.
00:10:23.000 They have Copilot too.
00:10:24.000 Yeah.
00:10:25.000 And so complicated.
00:10:26.000 But anyway, so yeah, this is Google's big play.
00:10:29.000 They are considered basically the most technologically sophisticated tech company in the world; they're the ones who masterminded search and email and a million other products.
00:10:41.000 And now they roll out their big play in the biggest, hottest tech field.
00:10:46.000 And it's this bizarre flop.
00:10:47.000 They actually announced today that they are disabling image generation on Gemini for the time being because they have to work on this because it's so embarrassing to them.
00:10:55.000 Which is a big win because this is the same.
00:10:57.000 Gemini is the same thing that Robbie Starbuck found it and we went and tried it.
00:11:02.000 And you say things like, who is Graham Allen?
00:11:04.000 And it'll tell you, and should Graham Allen have his kids removed?
00:11:08.000 And it literally says in there, reasons for and reasons why I should have my children removed from me because of my incendiary comments that could lead to violence and things like that.
00:11:20.000 It's the exact same program that did all that.
00:11:22.000 So is this then, Blake, can you just, for some people in the audience, what are the applications of AI beyond just like cheap parlor tricks of making images?
00:11:31.000 Because that's the pushback that some people are emailing us because we did this previously.
00:11:35.000 Oh, not a big deal.
00:11:37.000 But this is used for homework preparation.
00:11:40.000 This is used for essay type writing.
00:11:44.000 So can you play this out?
00:11:46.000 Left unchecked.
00:11:48.000 Open with the good news, which is because they aren't allowed to have crime.
00:11:52.000 I think conservative opinion-making jobs are safe for now, because you're not allowed to simulate that.
00:11:58.000 But what we are simulating, somewhat surprisingly, I think a lot of people thought text would go first, but it's actually images that have been really viable.
00:12:07.000 When they're not screwing them up with this sort of thing, it's the artist.
00:12:10.000 It's very.
00:12:11.000 Yeah, artists are already getting hammered.
00:12:13.000 Are they rioting over this?
00:12:14.000 I mean, they're very unhappy, but they're kind of, you know, beta, so they're not good at asserting themselves aggressively. But something like, if you're going to make a video game or a board game or something, or even just marketing materials, stuff that you'd normally have to hire a graphic designer and artist for, a lot of places still do that, but especially if you're on the cheap end of it, just have an AI make it, and it'll be pretty good.
00:12:37.000 Uh, for text, anything that's rote text is way easier.
00:12:41.000 So I was talking to a lawyer friend just a few days ago, and he said he works in a shop that does a lot of, you know, car accident lawsuits and that sort of thing, and they can just feed the facts of a case into ChatGPT and say, produce a demand letter based on this, and they've already trained it up to know their template for how they do demand letters.
00:13:02.000 And he says it takes a paper writing process or a letter writing process that used to take two hours and makes it 15 minutes.
00:13:09.000 So there's already real labor saving devices.
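The workflow the lawyer describes, case facts in, templated letter out, can be sketched without any AI at all. This is a hypothetical, minimal template-filler: the names, template wording, and facts are invented for illustration, and a real shop would hand the facts plus its template to an LLM precisely so it can handle facts that don't fit a rigid form.

```python
# Hypothetical sketch of a "facts in, demand letter out" pipeline.
# A plain string template shows the shape of the automation; an LLM's
# value-add is coping with messy facts a rigid template can't.
DEMAND_TEMPLATE = (
    "Dear {insurer},\n\n"
    "We represent {client} regarding the collision on {date}. "
    "Our client's damages total ${amount:,}. We demand payment "
    "within 30 days.\n"
)

def draft_demand_letter(facts):
    # Fill the firm's template from a dictionary of case facts.
    return DEMAND_TEMPLATE.format(**facts)

letter = draft_demand_letter({
    "insurer": "Acme Insurance",   # invented example data
    "client": "J. Doe",
    "date": "2024-01-15",
    "amount": 25000,
})
print(letter)
```

Even this trivial version shows why the two-hour letter becomes a fifteen-minute review: the drafting step collapses to filling in facts.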
00:13:13.000 Looking ahead, what people are thinking is, are we going to get an AI that can actually write a novel or a screenplay that's coherent?
00:13:19.000 Because right now you can say, write a book chapter, write a scene, and it can sometimes be funny like a parlor trick, but if you tried to really make a whole original work of art, it would be enormously difficult without you really heavily tweaking it.
00:13:32.000 Wouldn't this be a thing for our kids, especially kids in the public school system?
00:13:36.000 Like who, who was George Washington?
00:13:38.000 They got to write a paper on George, oh yeah.
00:13:39.000 And it populates all these false or left-leaning DEI versions of who George Washington was and things like that.
00:13:47.000 Isn't this even more reason, A, to pull your kids out of public school, if you have the ability?
00:13:52.000 But, B:
00:13:54.000 Isn't this just more indoctrination of the future generations?
00:13:57.000 Because what is it?
00:13:58.000 Upper 90s, 95, 96 percent of people use Google for all the research that they do now, for papers and things.
00:14:05.000 Yeah, well, one thing quick about that lawyer again.
00:14:08.000 He has some funny opinions, and he says, I don't mind my kid.
00:14:11.000 He says my kid just does all his stuff with ChatGPT, and he says, I don't mind, because I only care about education to the extent it's practical in the real world.
00:14:20.000 And in the real world that's what everyone will be doing, especially by the time he grows up.
00:14:23.000 So that's what he says about it.
00:14:25.000 Uh, Jack, do you have any thoughts?
00:14:28.000 I want to ask Jack:
00:14:29.000 What does the Chinese Communist Party AI look like, Jack?
00:14:33.000 So for someone who might use the Chinese AI in mainland China, contrast that with our AI; I think that would be an interesting topic for you.
00:14:41.000 Well, so of course, in China, right, they're making, you know, AI that's quite compatible with Chinese Communist Party nomenclature, the Chinese Communist Party education system, because much of their education, and this goes back to even the Imperial China days of the mandarins and the Gaokao, these great tests that they would put people through, is very much focused on rote memorization.
00:15:08.000 It's very much focused on processing data, emphasis on hard science, which of course is what people are using a lot of AI for right now.
00:15:16.000 But as Blake is saying right now, there's not a lot of focus on creativity, innovation, pushing the boundaries of things.
00:15:23.000 You're not going to see that a lot from anything coming out of China.
00:15:27.000 You're just going to see faster and faster iterations of the same.
00:15:31.000 Now, as far as Chinese taboos, what's interesting to me is now, China's not woke like the U.S. is.
00:15:38.000 And so you probably could actually get these truthful answers out if you ask those same questions in a Chinese version of whatever, you know, let's say the Dragon Phoenix AI or Baidu Phoenix AI or something.
00:15:51.000 But what's even more interesting, though, is that, of course, China will have their, will have their own, and Russia too, I'm sure, their own issues that you can't ask questions about.
00:16:01.000 So if you ask like Chinese AI, for example, what happened on June 4th, 1989 in Beijing at Tiananmen Square, they're going to have no, you know, it's just going to be like a, you know, a light summer day and, you know, children will be frolicking through the streets and, you know, no tanks or anything to be found anywhere.
00:16:18.000 If you ask anything about, I don't know, the great leap forward with the massive purges of the Chinese cultural revolution, you're never going to find anything.
00:16:25.000 And what's interesting, though, is, so I would argue that probably on Google's AI, and Google, of course, we know is on a trajectory already to be supportive of the Chinese Communist Party, the same way that Hollywood is never going to make any movies or TV shows about anything that I just said.
00:16:43.000 So if I use that same heuristic, the people who are controlling our mainstream entertainment media are also programming these things.
00:16:49.000 So with those same taboos probably coming soon here, Google Gemini will also prevent you from seeing anything negative about the Chinese Communist Party.
00:16:59.000 Well, it's funny he says that, because in fact, I already have seen a screenshot of it not producing images of Tiananmen Square with Gemini.
00:17:06.000 So they're already ahead of Jack on that.
00:17:08.000 And I really worry, it's easy to say on the social impact of woke AI, but I also just think about to the extent America has any vitality economically, it does come substantially from the tech world.
00:17:23.000 That is where we have recently generated really dominant corporations.
00:17:29.000 And if AI is this big future thing, I think it's mattering a lot that our biggest tech companies are producing crappy versions of it for political reasons.
00:17:39.000 And the $10 trillion bill lying on the ground might be, is there going to be a company that is based somewhere not in the US?
00:17:47.000 It could even be a really unexpected country.
00:17:49.000 It could be the United Arab Emirates or something.
00:17:51.000 And they fund the development of an AI that just doesn't have any of this stuff and is fully unchained.
00:17:58.000 That could really remake society.
00:18:00.000 And that will just totally bulldoze the competition because people are going to want to use the AI that's not shackled in bizarre ways.
00:18:08.000 And that could really undercut our economic prosperity.
00:18:11.000 If we, I guess it's sort of like the missile gap in the 50s, except now it's the AI gap.
00:18:16.000 And maybe we're all paranoid.
00:18:18.000 Do we really expect Google to fix this, Blake?
00:18:21.000 They'll mitigate it.
00:18:22.000 They'll get better at it.
00:18:23.000 They'll hide it better.
00:18:24.000 They'll hide it better for sure.
00:18:26.000 But this is a disaster.
00:18:27.000 But it's sort of a sign of rot within the system that the Google of 2004 would not have allowed this to happen.
00:18:35.000 It is a different culture at Google, one where people who put politics above engineering, or just aren't engineers, are making these calls, and they're creating a political product rather than a technological product.
00:18:49.000 And in the long run, choosing politics might be good for your short-term business, but it rots you from the inside.
00:18:56.000 It's like Boeing.
00:18:57.000 Boeing is still enormously politically connected, but they've, over time, have really ruined the engineering culture there.
00:19:03.000 And that's why their planes fall out of the sky now.
00:19:06.000 Hold on now.
00:19:08.000 Just flew.
00:19:09.000 But yes, I agree 100%.
00:19:10.000 There was another one the other day, by the way.
00:19:12.000 Of course.
00:19:12.000 Yeah, they're only.
00:19:14.000 The wing light came off and they had to emergency land.
00:19:17.000 It's ridiculous.
00:19:18.000 So final thoughts on this, guys, before we go to the next topic?
00:19:21.000 I just, at the least, it was all pretty funny.
00:19:26.000 The decline of America through AI is one of the most staggeringly funny.
00:19:30.000 So bad, it's becoming humorous.
00:19:32.000 Even if it's bad.
00:19:34.000 All right.
00:19:34.000 I want to tell you guys about Noble Gold Investments.
00:19:36.000 I have my handy dandy silver here that Andrew devalued because he tore the plastic.
00:19:43.000 The Fed is all over the place.
00:19:44.000 The government has guaranteed all deposits of the second largest and third largest bank run in history.
00:19:48.000 Do you know who's not afraid?
00:19:49.000 The people that are invested in gold with Noble Gold Investments.
00:19:52.000 Gold is the most stable asset outside of any government control, from billionaires to multimillionaires to institutional investors.
00:19:59.000 Use promo code Charlie to bag a free five-ounce America the Beautiful coin with each gold or silver IRA.
00:20:04.000 Go to noblegoldinvestments.com.
00:20:06.000 That's noblegoldinvestments.com.
00:20:08.000 They are amazing people.
00:20:10.000 They are trusted.
00:20:11.000 Colin and the whole team there are really great.
00:20:13.000 So check it out right now, noblegoldinvestments.com from billionaires to multimillionaires.
00:20:17.000 Promo code Charlie to bag a free five-ounce America the Beautiful coin at noblegoldinvestments.com.
00:20:22.000 That is noblegoldinvestments.com.
00:20:24.000 Blake.
00:20:25.000 Next topic, Charlie, is your favorite thing in the world television.
00:20:29.000 Oh, geez.
00:20:30.000 Hell yeah.
00:20:31.000 And so this is data that just went really viral on X slash Twitter the other day.
00:20:36.000 This is data from the Writers Guild of America West.
00:20:41.000 It's just tracking from 2011 to 2020 the demographics of various TV jobs.
00:20:46.000 Pretty straightforward.
00:20:47.000 And one of the most interesting things in this, staff writer, these are the people who write TV shows.
00:20:52.000 They write Sopranos, they write Breaking Bad, they write, you know, the storylines of all these reality shows that pretend to be real but aren't.
00:20:59.000 And in 2011, just as an example, in 2011, men were 64% of TV writers.
00:21:07.000 And in 2020, nine years later, they were 36.2%.
00:21:11.000 They went down 28 percentage points overall in just nine years.
00:21:17.000 And of course, women correspondingly went from about one-third of writers to almost two-thirds of writers in, again, the same nine-year period.
00:21:25.000 And you see a similar shift with white versus BIPOC, as the category goes, where in 2011, it was about 72% white, 71.6, and drops to 44.
00:21:37.000 And then BIPOC goes from 28 to 55%, which is notably substantially higher than their actual percent of the population.
00:21:46.000 So they sort of reverse from underrepresentation to substantial overrepresentation.
00:21:52.000 And the most obvious response to this is, is this why TV is terrible, Charlie?
00:21:58.000 I mean, I think it was slipping before it, but I mean, look, anyone can be creative.
00:22:02.000 I just, let me ask you a question, Blake.
00:22:05.000 Do we know why it changed?
00:22:06.000 Are there diversity quotas?
00:22:08.000 Are more women really interested in writing sitcoms?
00:22:11.000 Well, what's interesting is this ends in 2020.
00:22:14.000 And so it probably doesn't even capture the biggest shift.
00:22:17.000 I suspect this has gotten a lot worse because 2020, there were huge diversity pushes in the wake of, you know, mostly peaceful events that year.
00:22:26.000 And so I've heard from people in Hollywood that they just say you look around and it's a bloodbath for writers' rooms, for producing jobs, for acting jobs, both starring and supporting.
00:22:37.000 It's really messed everything up.
00:22:38.000 A few weeks ago, we talked about that letter from various Jewish Hollywood figures, and they were saying Jewish people should be considered underrepresented in Hollywood.
00:22:48.000 And as I pointed out then, this is reflecting this, which is they're getting completely cut down because they're just being included in, you know, white people and they're being told to get out.
00:23:00.000 And also men versus women.
00:23:02.000 And obviously anyone can be creative, but it seems unlikely to me that a shift that dramatic in nine years is because they suddenly found this billion-dollar bill lying on the sidewalk.
00:23:15.000 And instead, we are just seeing there was a big expansion with all the streaming services.
00:23:19.000 There were more shows being made and they seemed very ideological in how they hired for them.
00:23:24.000 And I guess if you want the answer of how much that succeeded, they're doing massive layoffs at every TV outlet right now.
00:23:31.000 So, Jack, I know you're a big TV fan.
00:23:33.000 What do you make of this story?
00:23:35.000 Well, I would say that, you know, I first achieved my sort of, like, internet claim to fame, or whatever you want to call it, through being a critic of television, particularly HBO and specifically the show Game of Thrones; I ran a sort of anti-Game of Thrones website, you know, starting in 2012.
00:23:56.000 And that's when I started my Twitter account.
00:23:58.000 Everyone kind of knows the backstory there, kind of just ripping on HBO and how we didn't have a word for it at the time, but essentially what you would say was it was becoming woke.
00:24:09.000 Back in those days, we used to just say SJWs are taking over social justice warriors.
00:24:13.000 And we didn't quite have the word, the nomenclature, woke just yet.
00:24:17.000 But that's basically what it was.
00:24:19.000 We were cataloging and documenting the rise of wokeness through.
00:24:23.000 And in the Game of Thrones show, you can really see this because it ties to 2011.
00:24:27.000 So 2011 is when that started, when you do have this huge majority of men in the writing jobs, and then, all of a sudden, that number decreases.
00:24:37.000 And then season one, season two, three, four, all the way up to season eight, which, as everybody out there watching Game of Thrones knows, was absolutely god-awful, just the worst possible thing that anyone has ever put on television.
00:24:51.000 And whereas season one was like really good and everybody enjoyed it and it was wonderful.
00:24:55.000 It was really close to the books and just took off and sparked an international phenomenon in terms of the show.
00:25:02.000 The end of the golden age of television coincides with the rise of wokeness, and it can be seen directly in these numbers.
00:25:12.000 I would certainly also, of course, tie this to the acquisition of the Star Wars franchise later Marvel by Disney and the appointing of Kathleen Kennedy at the head of Star Wars, who decided to change Star Wars and turn Luke into a girl and have a girl character who is like super powerful and be at the start of all this.
00:25:34.000 This is again the exact same place you would see this.
00:25:37.000 And in fact, I used to talk about this up all the time on the old blog and the internet.
00:25:42.000 People can go look it up.
00:25:43.000 The internet got very mad at me when I would say these things.
00:25:45.000 I will say, if you're picking on Game of Thrones, that is definitely an argument against having too many white guys in Hollywood because it was two white guys who ran that ship into or ran the plane, ran the Boeing into the ground, as it were.
00:26:01.000 But definitely overall, there is this shift over time.
00:26:05.000 Like I said, it's so dramatic that it has to be driven by politics.
00:26:09.000 And just like with other topics we've talked about, if you're making big personnel changes based on a political goal, you're not going to get the best outcome.
00:26:19.000 In fact, producer Andrew just linked an article from the New York Times, and it's just like other topics we've talked about.
00:26:26.000 It's saying they want a ton of diversity in the writing room, but the demand for it is outstripping the supply of experienced writers.
00:26:34.000 Well, if you can't hire experienced writers who are well-reviewed and have a good track record, what do you do?
00:26:40.000 You hire people who don't have that track record and who aren't as good at it.
00:26:43.000 You have crap shows.
00:26:44.000 Yeah, and you have crummy shows.
00:26:45.000 And it really is that if you even take a good show and you muck it up with people who shouldn't be there, they can drag down the whole product.
00:26:55.000 And I know the big controversy right now is True Detective, that allegedly True Detective changed its staff over and the new season was terrible.
00:27:03.000 I've never seen it.
00:27:03.000 I don't know.
00:27:04.000 It was terrible.
00:27:05.000 I watched it.
00:27:06.000 I watched it because everybody said how bad it was, and to talk about it on the show. It is absolutely horrible.
00:27:12.000 And also, it's like, I do wonder, this is related to other phenomena people complain about.
00:27:17.000 So what do people complain about with Hollywood?
00:27:20.000 Too many reboots, too many remakes, too much over-reliance on franchises.
00:27:25.000 And especially they often make installments in these franchises that feel mean spirited.
00:27:30.000 Like you're going to take James Bond and make him this beta wuss and then kill him and all of this stuff.
00:27:35.000 And I think that's very much a product of if you've created an intellectual environment where you're promoting less original, kind of hackish people who are there for, again, political reasons.
00:27:48.000 And they have a harder time creating their own ideas.
00:27:51.000 And instead, they have to fall back on the same safe things.
00:27:55.000 And also when they do make something original, it's not popular.
00:28:01.000 So these studios go, original stuff isn't succeeding.
00:28:01.000 It's too risky.
00:28:02.000 Go back to the safe thing.
00:28:03.000 Go back to the thing that was made 40 years ago that is still really popular with everyone.
00:28:09.000 And I think it's really telling that what is thriving?
00:28:12.000 What's exploding in popularity right now?
00:28:14.000 Well, think about YouTube.
00:28:15.000 Who's the most popular YouTuber?
00:28:17.000 I bet even you know this, Charlie.
00:28:18.000 Mr. Beast.
00:28:18.000 Mr. Beast.
00:28:19.000 Even I know that.
00:28:20.000 And Mr. Beast, if you watch his videos, there's not really like any race stuff in it.
00:28:26.000 There's no politics stuff in it.
00:28:27.000 But what the crew of Mr. Beast is, is it's Mr. Beast and his friends from high school.
00:28:31.000 And they're mostly white.
00:28:33.000 They're all white.
00:28:34.000 And that's not a thing.
00:28:35.000 It's not some racist thing.
00:28:37.000 It's literally that Mr. Beast and his friends made a show.
00:28:40.000 And it's the most popular thing in the entire world.
00:28:44.000 So they love it in India.
00:28:45.000 They love it in South America.
00:28:47.000 They love it in Asia.
00:28:48.000 Because in reality, people don't obsess about diversity worldwide the way that they obsess about it here.
00:28:54.000 What are the cultural products that are super popular all around the world?
00:28:57.000 People love K-dramas.
00:28:59.000 They love Japanese video games and anime.
00:29:02.000 They love European TV shows that don't have the same kind of quota system that we do.
00:29:07.000 They love telenovelas for some reason.
00:29:10.000 And what people love is they love artistic products that show a compelling artistic vision and they don't care that it's written by the right looking person or a person who has the right equipment.
00:29:23.000 And yet we are operating on the assumption that that's what's necessary.
00:29:27.000 And no surprise, Hollywood is way less powerful than it's ever been before.
00:29:31.000 What's the contrast, do you think, between that and the casting directors now for these roles in these shows?
00:29:39.000 Because now we've seen it with all these remakes that are going on, changing the gender, changing the race of the main character, or instead of it's a dad and a mom, it's a mom and a mom or a dad and a dad.
00:29:50.000 It's the backstory of the parents.
00:29:52.000 I'm curious now, not just the screenwriters, but the people that are also casting these movies, casting these TV shows and things like that, because we're not only getting crap writers, now we're getting bad actors as well.
00:30:04.000 And we're not only pushing political agendas, we're pushing social and cultural agendas on kids.
00:30:11.000 A cartoon now, you take them there and the dogs are talking about how their owners are two moms, cartoon characters for children and things like that.
00:30:24.000 Yeah, so I guess the question is, Blake, objectively, have the analytics begun to go down on television?
00:30:32.000 And do we have, I mean, what are then people replacing their viewing habits with?
00:30:36.000 Is it just that there's so many streaming options that they're going to so many other different platforms?
00:30:40.000 Is that right?
00:30:41.000 Yeah.
00:30:41.000 So there's definitely a recession going on in the TV world right now.
00:30:47.000 And it's definitely not just because of wokeness.
00:30:51.000 It's that they thought streaming was this hugely dominant future and everyone invested in it all at once.
00:30:58.000 One reason they could change these numbers so much is the number of shows in production in the US.
00:31:04.000 I don't have the exact number, but it went from about 200 to like 550 in a decade.
00:31:09.000 And the US population did not triple in that time span.
00:31:12.000 So you have, in addition to you used to have CBS, NBC, Fox, you know, the networks and basic cable, they're still making shows, but now we have Netflix make shows and there's Peacock exclusives and Amazon exclusives and Apple TV and Disney Plus.
00:31:26.000 And they're making all these shows, often with enormous budgets.
00:31:30.000 I think Rings of Power cost a billion dollars for one season of very poor television.
00:31:37.000 So I'm told.
00:31:38.000 I didn't watch it.
00:31:39.000 And so the numbers just don't work out.
00:31:42.000 They've spent a huge amount of money.
00:31:44.000 It was a very zero interest rate phenomenon, as they like to say.
00:31:47.000 And now they've realized the income is not matching up to this.
00:31:51.000 And they're just, they're cutting costs everywhere.
00:31:54.000 So a ton of shows are getting canceled.
00:31:56.000 A ton of stuff is getting cut back.
00:31:58.000 Now, it's again, it's not just that the new shows are worse.
00:32:01.000 It's also that with streaming, it's way easier to watch old shows.
00:32:04.000 So people just keep watching The Office.
00:32:06.000 They keep watching The Sopranos.
00:32:08.000 They keep watching The Simpsons.
00:32:09.000 That's all way easier than it was in the past.
00:32:12.000 It's way easier to watch global stuff and then other stuff people are watching.
00:32:16.000 Now people watch YouTube stuff.
00:32:17.000 You just have individual creators.
00:32:19.000 Like I said, Mr. Beast, really popular.
00:32:21.000 You have video games are more popular than they were 30 years ago.
00:32:25.000 You have video game streamers.
00:32:27.000 Why play a game when you can watch someone else play a game?
00:32:32.000 You can go on.
00:32:33.000 The number of entertainment options is essentially unlimited.
00:32:35.000 And this is a dinosaur that's trying to keep up by embracing politics.
00:32:40.000 And I think they're going to fail.
00:32:43.000 Any final thoughts on this, guys?
00:32:47.000 Stop letting your kids watch TV because it's all going downhill across the board.
00:32:51.000 Also, I did see an article earlier today.
00:32:53.000 Not only these streaming services you're talking about, they're also now starting to report on individual user accounts of what they feel is unhealthy behavior.
00:33:04.000 So to your point, people are still watching the streaming service because they're watching the older things, The Simpsons and whatnot.
00:33:11.000 You know, they're now reporting data back on what they view as unhealthy behavior of the viewers.
00:33:19.000 Like one guy watched The Lord of the Rings 300 times a year and stuff like that.
00:33:23.000 That's a lot of times.
00:33:24.000 It is a lot of times.
00:33:25.000 And I'm not saying that's not unhealthy, but what I'm saying is now they're even outside of all this woke stuff.
00:33:31.000 Now they're paying attention to what people are watching and why they're watching it over and over again.
00:33:36.000 So we're going to have that one day.
00:33:37.000 You'll be watching old Simpsons and then it'll pop up.
00:33:39.000 Are you sure you want to watch this?
00:33:40.000 You've only been watching the first eight seasons.
00:33:42.000 Did you know that a vibrant and diverse cast wrote our most recent seasons?
00:33:46.000 They're saying it's really good.
00:33:47.000 It's engaging with social problems.
00:33:48.000 Why haven't you watched it?
00:33:49.000 We're closing your account until you watch it.
00:33:51.000 Probably.
00:33:52.000 Yeah, absolutely.
00:33:53.000 Jack, tell us about the next partner we have here.
00:33:57.000 Angelo, can you post it in there?
00:34:00.000 Well, I got to tell you guys, so the wellness company is really one of the best partners that we've gotten to here since we've been doing Thought Crime.
00:34:09.000 And I have to say, as a guy who is always going in on the pharmacies and I said, look, you know, in the last 10 years, I don't know if it's Obamacare or it was COVID, the pharmacies and that entire crooked system were decimated because of these outages to the system and extras to the system.
00:34:27.000 And I was always trying to find a way to get this better.
00:34:29.000 By the way, also the pharmacies were down almost entirely today because of this cyber thing that happened.
00:34:37.000 And so we, you know, with TWC.health, and I've got it here, TWC.health slash TJ.
00:34:45.000 Go to TWC.health slash TJ.
00:34:48.000 You can get your medical emergency kit.
00:34:50.000 Use code TJ 10% off at checkout.
00:34:53.000 People are cheering.
00:34:54.000 They love hearing about this 10% off.
00:34:57.000 And people are asking, how do you get ivermectin?
00:35:00.000 They ask me all the time, how do you get ivermectin?
00:35:01.000 By the way, I was feeling a little something in my throat the other day.
00:35:04.000 I'm at a big event, a lot of stuff going on.
00:35:05.000 Post from ivermectin.
00:35:07.000 Good to go.
00:35:09.000 You got your Z pack in there.
00:35:10.000 You got everything you need when you travel.
00:35:11.000 You can go for it as well.
00:35:12.000 TWC.health slash TJ.
00:35:15.000 Excuse me.
00:35:15.000 Slash CJ.
00:35:17.000 Cut the pharmacies out.
00:35:19.000 Get it direct.
00:35:20.000 TWC.health slash CJ.
00:35:22.000 Next topic.
00:35:23.000 I want to double back to Gemini for one second.
00:35:25.000 This is a headline from the New York Times.
00:35:27.000 The New York Times has written, so Google getting rid of white people from history was racist.
00:35:33.000 Can you guess who it was racist against?
00:35:37.000 People of color.
00:35:38.000 Oh, is that the headline at the New York Times?
00:35:39.000 Google's chatbot AI images put people of color in Nazi era uniforms.
00:35:46.000 White people removed from history, women and minorities hardest hit.
00:35:49.000 It's like the 80s all over again.
00:35:51.000 So their problem was people of color in Nazi uniforms when Nazis were.
00:35:58.000 Okay.
00:35:58.000 Yeah.
00:36:00.000 They were fine with the black founding fathers.
00:36:03.000 Well, I imagine they probably get into that, or maybe they don't.
00:36:05.000 I don't know.
00:36:05.000 It's the New York Times.
00:36:06.000 They're always, you know, soon AI will replace the New York Times, at least, we can hope.
00:36:10.000 All right.
00:36:11.000 The next actual topic, though, this is a thing we were arguing about a ton off the air yesterday.
00:36:16.000 It's really interesting.
00:36:18.000 So there's a news story a lot of people have probably heard of.
00:36:20.000 It's out of Alabama.
00:36:22.000 And what happened is the Alabama Supreme Court ruled last week that frozen embryos count as unborn life and therefore receive legal protection under the state's laws.
00:36:33.000 So the context of this was that, where is it here?
00:36:40.000 So in 2021, someone broke into a fertility clinic in Mobile, Alabama, and they broke into a freezer with stored human embryos and they pulled some out and they dropped it and caused some of the embryos to die.
00:36:55.000 And the parents of these embryos brought a wrongful death lawsuit.
00:37:00.000 And initially they argued that these did not really count as unborn life.
00:37:05.000 They hadn't been implanted yet or whatever.
00:37:08.000 And so they didn't count.
00:37:09.000 And this went to the Supreme Court and they said, no, these are human lives.
00:37:13.000 This is a valid wrongful death case.
00:37:15.000 That's the context of this.
00:37:17.000 Now, what people are reacting to is, if they're declaring IVF embryos unborn life, what that means is, well, for example, the way IVF works is you generate a lot more embryos than you actually need, typically, because it's an expensive process usually.
00:37:34.000 Like it's 20 or 30, right?
00:37:36.000 Basically, often, I think.
00:37:37.000 I don't have the exact number in front of me, but definitely more than one or two.
00:37:42.000 And that's why you can often get twins and triplets and stuff because they'll often try to implant several and all in the hopes that just one will take.
00:37:50.000 But as a result, you have these excess embryos.
00:37:53.000 Now, in some places, they're just frozen for a long time in case the couple wants to have more children in the future or they need to try again.
00:38:01.000 Other times, they're just thrown away, which is killing an independent human life.
00:38:07.000 And so some hospitals in Alabama, the University of Alabama at Birmingham has already paused IVF treatments because they say the legal situation is muddled.
00:38:16.000 They can't do this.
00:38:17.000 They don't want to get sued or prosecuted for breaking the law here.
00:38:21.000 The big picture, of course, is you can already see the Joe Biden ad that's going to say, you know, psycho red states won't let you do IVF anymore.
00:38:29.000 They won't let you have kids.
00:38:31.000 You can see the ad.
00:38:32.000 Yeah, but I mean, so even if you acknowledge, though, that the fertilized embryo is a life, then having IVF, you're not killing the embryo.
00:38:42.000 It does have a low chance of survival, but you're not necessarily, I mean, killing the embryo, correct?
00:38:47.000 Well, they're thrown out.
00:38:48.000 They're often thrown away.
00:38:49.000 They're often thrown away.
00:38:51.000 And that would apparently stop the implanting of embryos?
00:38:55.000 Oh, yeah.
00:38:55.000 Yeah, no, I don't think so.
00:38:57.000 So that's not.
00:38:58.000 I don't know why that would stop.
00:39:00.000 It wouldn't stop.
00:39:01.000 It would just change the treatment slightly then, right?
00:39:03.000 You wouldn't be able to throw away embryos, essentially.
00:39:05.000 I suppose so.
00:39:06.000 That would probably be the big one.
00:39:07.000 So that seems like a lot of fear mongering.
00:39:09.000 Again, my stance on IVF is that I know a lot of people that have really benefited from it and that you should probably do it at one or two or three embryos at a time.
00:39:18.000 It lowers your chance of having a successful pregnancy.
00:39:21.000 Jack, the Catholic stance on IVF is pretty firm, isn't it?
00:39:25.000 Yeah, so Catholics are against IVF.
00:39:28.000 This is, you know, we were talking Catholic doctrine last week on the death penalty and how that's sort of something that's been up for debate.
00:39:36.000 Whereas IVF, that is something that the church has always been against.
00:39:39.000 The church stands against this because it is the complete, and specifically for that reason, right?
00:39:45.000 Because not only is it the complete dissociation of man and woman in a marriage and the procreative act, but also because of this very reason, this idea that there are so many discarded embryos here that are generated and then, as you say, discarded.
00:40:03.000 I was actually just looking at Elon Musk's Twitter because for anyone who doesn't know, if you've read the great Walter Isaacson book on Elon, of course, this is something, a technology that he has embraced fully.
00:40:13.000 Many, many, not all, but many of his children were born by IVF.
00:40:16.000 And something that I do think is interesting, though, from a legal perspective here is, you know, going in with the pro-life movement and the question, of course, obviously this creates a huge flashpoint for the Democrats.
00:40:30.000 They want to be able to go at it, go after Republicans, go after conservatives, saying if Donald Trump is put, and it's a complete lie, of course, saying that Donald Trump would ban IVF.
00:40:41.000 But I do think there's an interesting question here that we could bring up possibly for the debate.
00:40:45.000 And, you know, Charlie, this might be something that we could, you know, kind of brainstorm over.
00:40:49.000 Because if the issue is the discarding of the embryos, right, that the pro-life community, and I believe, personally believe, rightfully so, that we should talk about this because we do believe that it's human life as pro-lifers.
00:41:02.000 Then the real question is, shouldn't we try to find perhaps a productive use for this?
00:41:08.000 And I'm not talking about scientific experimentation, but what about like a donation bank or something of this if there are extra embryos that are made and then not used?
00:41:16.000 Like, what do you stand on all this?
00:41:18.000 The politics of this are awfully thorny, right?
00:41:20.000 I agree.
00:41:21.000 And we've talked about this, you know, with the concern with some of the post-Dobbs backlash.
00:41:26.000 It looks like losing a lot, losing some elections that seem related to that, losing referendums.
00:41:31.000 And obviously, you're pro-life.
00:41:33.000 I'm very pro-life.
00:41:34.000 I care very much about not letting people in the Republican Party kind of throw pro-life stuff overboard as a political expedient.
00:41:45.000 And what I do worry about is I agree with Jack.
00:41:48.000 I don't really like IVF.
00:41:50.000 It seems kind of very morally fraught at best to me and probably just bad.
00:41:55.000 But what I do worry about is that if we allow this to become a big flashpoint issue, what will happen is we'll have the states where we actually currently have strict abortion laws and they'll end up throwing those out in sort of this big collective backlash to it.
00:42:15.000 And that doesn't help us in any way.
00:42:18.000 I think it's, I think attacking IVF is sort of staking out a position that you can't defend.
00:42:24.000 Think of it like in military tactics terms.
00:42:26.000 It doesn't help you to plunge into enemy territory so that you just get shot up and everyone dies.
00:42:31.000 You have to take defensible positions that you can hold on to.
00:42:35.000 And that doesn't mean be a coward.
00:42:36.000 It doesn't mean never try.
00:42:38.000 It doesn't mean give up, but it does mean don't go into a position where you're just going to lose, when you're in a fight that you can still win in the long run.
00:42:49.000 But, Graham, what is your stance on this?
00:42:51.000 On IVF, I tend to go more your route, Charlie.
00:42:57.000 I personally know people who have tried and it hasn't worked,
00:43:00.000 and people who have tried and it has worked.
00:43:03.000 I can't imagine what it must be like to be a man and wife and want children and are unable to in the normal way.
00:43:11.000 So, my personal thing is I am okay with it.
00:43:15.000 Now, I am not the most well-versed on how many embryos are being discarded and all of this.
00:43:21.000 And so, I do agree that I think that just with anything, we can do things better than we're currently doing them.
00:43:28.000 I agree with you, Blake, that at the same time, we can't give the enemy, we can't give the left the ammunition to come after us to end the advances that we have made for pro-life.
00:43:41.000 And so, yeah, I think that we need to, one, I think it needs clarification, first of all.
00:43:46.000 And then, two, I think it's a lot of fear-mongering along with that, which is what they do.
00:43:52.000 They know that it's not what they're going to make it out to be.
00:43:55.000 They know that all these things, well, well, now it's going to be murder, all these embryos and everything, and people are going to go to jail and all this.
00:44:02.000 They know that's not really what it is.
00:44:04.000 But I do think that clarification needs to be brought out by this court that made this ruling.
00:44:10.000 Jack, you have some stats you want to share with us here?
00:44:13.000 I would.
00:44:13.000 So, I was looking this up, and I didn't have the stats earlier when I was mentioning it just now, but by the numbers, the total annual donated embryo transfers in the United States more than tripled from 2004 to 2019, primarily in Christian communities.
00:44:29.000 So, people who are maybe generating embryos through IVF treatments and IVF procedures, but then have decided that for one reason or another they don't want to go forward, or maybe they have enough kids and don't want more kids, et cetera.
00:44:44.000 Over 8,457 children have been born in the last 15 years from donated and adopted embryos, according to the American Journal of Obstetrics and Gynecology.
00:45:04.000 And I think this is absolutely something that if you're in the pro-life community and you want to have a conversation with people about IVF, in the same way people talk about adoption versus abortion, right?
00:45:16.000 That was a huge thing and still is a huge thing when people talk about abortion.
00:45:20.000 For the Christian community, I would also suggest that when you're having this conversation, rather than going for the idea of a full-on ban of IVF, okay, hold your belief, but also understand that the situation is ongoing and promote services like this, in which case those embryos can find their forever family.
00:45:43.000 Yeah.
00:45:43.000 And some of the most, I mean, pro-life people have used IVF and it's worked for them.
00:45:48.000 So politically, I think it's a little bit.
00:45:50.000 Well, I'm sure.
00:45:50.000 Yeah, I'm sure it works.
00:45:51.000 I think it does.
00:45:52.000 It actually gets at the heart of why it is hard for pro-life stuff to get over the hump.
00:45:58.000 Because if you look at polls, it's only, you know, maybe about a third of people who call themselves pro-life are really in the abstract, I think, really get like, oh, this is a human life that is equal to other human lives and you can't kill it.
00:46:13.000 And then you have a lot of softer positions.
00:46:16.000 And so I think there's just a lot of people who are wobbly.
00:46:19.000 And so they're like, oh, abortion's bad because I can think of this cute baby getting ripped apart.
00:46:24.000 So they get really grossed out by the idea of dismembering a more grown fetus.
00:46:29.000 But they can't really internalize the idea that it's really wrong to kill what is, relatively, sort of just a little ball of tissue, as Planned Parenthood would call it.
00:46:43.000 And they also get the emotional attachment of, oh, this baby is nice.
00:46:47.000 So it's hard for a lot of people to get into the moral framework of it is that you create 10 lives so that you can throw nine of them away to get one baby that is actually born or 20 lives, something like that.
00:47:03.000 And a few people intuit that that's bad, but most people just don't.
00:47:07.000 And that's just, maybe that's just a flaw in how humans are.
00:47:10.000 The pro-abortion people think they have us on this topic.
00:47:13.000 Oh, for sure.
00:47:14.000 They will attack people on this where they will say, you say a life is a life that we shouldn't have abortion at any stage, but then why are you okay with IVF?
00:47:22.000 They will bring this up because they point it out as a major inconsistency in the pro-life position.
00:47:27.000 Yeah.
00:47:29.000 Okay.
00:47:29.000 I think we have Tax Network, right?
00:47:32.000 Do you guys owe back taxes?
00:47:34.000 Pandemic relief is now over, finally.
00:47:36.000 Along with hiring thousands of new agents and field officers, the IRS has kicked off 2024 by sending over 5 million payup letters to those who have unfiled tax returns or balances owed.
00:47:47.000 Don't waive your rights and speak with them on your own.
00:47:50.000 They are not your friends.
00:47:50.000 Tax Network USA is a trusted tax relief firm.
00:47:54.000 They've saved over $1 billion in back taxes for their clients.
00:47:57.000 So check out tnusa.com slash Charlie.
00:48:00.000 Call 800-254-6000.
00:48:03.000 That is tnusa.com slash Charlie.
00:48:06.000 You got to check it out right now.
00:48:08.000 800-254-6000 tnusa.com slash Charlie.
00:48:13.000 Graham, tell everyone your social media because I got a dash.
00:48:16.000 Yeah, Graham Allen, you should be able to find it just about anywhere.
00:48:21.000 Instagram, Facebook, Rumble.
00:48:23.000 We have Dear America, the show on Rumble.
00:48:26.000 Yeah, Graham Allen.
00:48:27.000 Email us freedom at charliekirk.com.
00:48:29.000 Thank you guys for watching.
00:48:30.000 Till next week, keep on committing thought crimes.
00:48:35.000 Thanks so much for listening, everybody.
00:48:37.000 Email us as always, freedom at charliekirk.com.
00:48:39.000 Thanks so much for listening and God bless.
00:48:43.000 For more on many of these stories and news you can trust, go to CharlieKirk.com.