Human Events Daily with Jack Posobiec - February 24, 2024


THOUGHTCRIME Ep. 34 — Google’s Ghastly AI? Evil IVF? DEI Television?


Episode Stats

Length

47 minutes

Words per Minute

185.1

Word Count

8,864

Sentence Count

614

Misogynist Sentences

8

Hate Speech Sentences

14


Summary

From the age of Big Brother, if they want to get you, they'll get you. From Big Brother to Big Data, it's Thursday, and it's Thought Crime Thursday. This week, we have a special guest: a man who has gone without a drink for the past 12 years.


Transcript

00:00:00.000 I want to take a second to remind you to sign up for the Poso Daily Brief.
00:00:05.420 It is completely free.
00:00:06.760 It'll be one email that's sent to you every day.
00:00:08.640 You can stop the endless scrolling, trying to find out what's going on in your world.
00:00:11.720 We will have this delivered directly to you totally for free.
00:00:14.960 Go to humanevents.com slash Poso.
00:00:17.220 Sign up today.
00:00:18.460 It's called the Poso Daily Brief.
00:00:19.860 Read what I read for show prep.
00:00:21.780 You will not regret it.
00:00:23.400 humanevents.com slash Poso.
00:00:25.040 Totally free.
00:00:25.780 The Poso Daily Brief.
00:00:27.280 From the age of Big Brother.
00:00:28.700 If they want to get you, they'll get you.
00:00:32.020 The NSA specifically targets the communications of everyone.
00:00:35.940 They're collecting your communications.
00:00:46.580 Okay, it is Thought Crime Thursday.
00:00:48.620 We have a special guest today, but first by popular demand, Bud Light Blake.
00:00:52.460 I'm back to Bud Light Blake?
00:00:54.240 Yeah.
00:00:55.060 And Graham Allen is here.
00:00:56.640 I don't drink anymore.
00:00:59.040 Is that right?
00:00:59.460 What am I?
00:01:00.120 How long have you been sober?
00:01:01.520 H2O Graham?
00:01:01.900 How long have you been sober?
00:01:03.300 Well, not sober.
00:01:05.360 I'm not in like AA or anything, but I've lost 40 pounds.
00:01:09.580 Really?
00:01:10.080 Alcohol is poison.
00:01:10.780 It makes you fat?
00:01:11.840 Yeah.
00:01:12.140 It lowers and kills brain cells?
00:01:13.520 How long have you gone without a drink?
00:01:14.140 A hundred and three days.
00:01:15.460 That's amazing.
00:01:16.200 Yeah.
00:01:16.520 Yeah.
00:01:16.720 Praise God.
00:01:17.100 And so, yeah, yeah.
00:01:17.940 I'm excited about it.
00:01:19.100 I decided to go on a health journey.
00:01:20.940 My daughter saw me take my shirt off one day and poked my gut, and I was like, that's it.
00:01:26.600 I'm out.
00:01:27.020 I got to lose.
00:01:27.780 I got to lose something.
00:01:28.880 Daddy, you have a dad bod.
00:01:30.440 Pretty much.
00:01:31.280 Yeah.
00:01:32.420 Yeah.
00:01:32.780 It's alcohol is poison.
00:01:35.240 Jack hasn't had a drink, I think, in 12 years or something?
00:01:41.900 18.
00:01:43.240 Wow.
00:01:44.100 Prestigious.
00:01:45.060 That's incredible.
00:01:46.280 That's very impressive.
00:01:47.820 All right.
00:01:48.340 So, we could talk about how alcohol is poison at another time, but I wonder what Gemini would
00:01:53.240 say about that.
00:01:54.020 It would probably say, if you went to Gemini, it would say something like, alcohol use is
00:01:59.560 a controversial topic, and stereotyping some groups as drunks has been a thing in the past.
00:02:04.880 That's bad.
00:02:05.840 It would be inappropriate to generate any discussion of this.
00:02:09.640 We're referring, of course, to Gemini, Google's newly renamed AI product.
00:02:14.980 It used to be Bard, but they thought it was so impressive, so slick, their new AI.
00:02:20.440 They gave it a new name after a constellation, and it's terrible, it turns out.
00:02:26.420 It is, you know, it seems like we were on the verge of this AI revolution.
00:02:30.540 You had nerds on the internet talking about, is it going to be artificial, is it going to
00:02:34.660 be full general intelligence, or is it going to, are we going to have the singularity?
00:02:38.360 Is humanity going to be rendered obsolete?
00:02:40.460 And the answer is no, because wokeness is more powerful than the engineering department
00:02:43.700 at Google.
00:02:45.440 Graham, have you been following this story?
00:02:47.360 Well, I was on a plane to Phoenix here today, but I have kept up with it a little bit.
00:02:52.720 But so correct me if I'm wrong, basically, if you ask it to show you any image of any
00:02:58.520 person in history, whether they were white or not, they're not white.
00:03:04.260 Yeah, that's pretty much exactly right.
00:03:06.100 One of my favorites, I put up number 87.
00:03:08.780 This is someone who asked to create a World War II German soldier.
00:03:13.000 1929 Germany is actually what they requested.
00:03:14.940 And we got, OK, a normal white guy, but then we got an Asian woman, something that kind
00:03:20.440 of looks like an American Indian.
00:03:22.540 Yeah.
00:03:22.860 And then someone looks also like an Asian woman.
00:03:26.740 Yeah, I like it.
00:03:27.320 And that's the representative sample of, you know, Germany on the brink of World War II.
00:03:31.580 So, Blake, before we throw to Jack here, can you just explain to some of our audience that
00:03:35.960 isn't aware of the technical side of AI, how does this come to be?
00:03:39.780 Is it the machine making its own independent conclusions or are there prime directives
00:03:44.700 that have been uploaded behind AI?
00:03:46.520 So the way that these work, the funny thing is, is almost no one knows how it works.
00:03:50.560 What we just do is we have these neural net type interfaces and they just feed it tons and
00:03:58.020 tons and tons and tons and tons of data, unfathomably huge amounts of data of books, articles,
00:04:04.040 people talking, video transcripts, all of this stuff.
00:04:07.200 Just all the data that we're producing out there in the world.
00:04:10.860 And it's so much of it.
00:04:12.220 And then they basically tell it, find patterns in this.
00:04:15.000 And it's such a big amount of computing power that it finds patterns.
00:04:18.560 And that's what it does.
00:04:19.540 So when you ask the AI a question, you know, write me an essay or give me the 10 best football
00:04:25.440 players ever, anything like that.
00:04:26.940 What it's essentially doing is it's responding to this massive set of patterns it's been able to
00:04:31.980 find.
00:04:32.120 So it can tell what the next word should be because it's read billions upon billions of
00:04:36.600 sentences to tell it what the next word should be.
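To make Blake's "it just predicts the next word from patterns in the data" point concrete, here is a deliberately tiny sketch in Python. Real systems like Gemini use neural networks trained on enormous datasets, so nothing below reflects Google's actual code; the word counts and examples are purely illustrative.

```python
# Toy illustration of "feed it tons of text, have it find patterns, then
# predict the next word." Real models use neural networks over vastly
# more data; this bigram counter only makes the basic idea concrete.
from collections import Counter, defaultdict

def train(corpus: list[str]) -> dict[str, Counter]:
    """Count which word tends to follow which word in the training text."""
    follows: dict[str, Counter] = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for current, nxt in zip(words, words[1:]):
            follows[current][nxt] += 1
    return follows

def next_word(follows: dict[str, Counter], word: str) -> str:
    """Pick the continuation seen most often after `word` in the data."""
    options = follows.get(word.lower())
    return options.most_common(1)[0][0] if options else "<unknown>"

if __name__ == "__main__":
    model = train([
        "alcohol is poison",
        "alcohol is poison to brain cells",
        "water is healthy",
    ])
    print(next_word(model, "alcohol"))  # -> "is"
    print(next_word(model, "is"))       # -> "poison" (seen twice vs. once)
```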
00:04:39.580 The problem is, is when they do this normally, it can produce things that go against our current
00:04:44.100 ideology because it can notice patterns, for example.
00:04:47.700 And so what you do is they just come in and they put in these really aggressive weights
00:04:51.680 that say, oh, well, in addition to this, you have to also have the extremely high value
00:04:56.360 thing of, you know, maintaining, upholding diversity.
00:04:59.300 So if you get asked to generate images, make sure the output is diverse, no matter what.
00:05:05.540 And, but also don't be racist.
00:05:08.060 So one of the funny things with these images that they've been doing is if you ask it to
00:05:11.800 show a British person, an American person, a German person, a Norwegian person, those
00:05:16.300 will all come out as Asians, Indians, Africans, almost never actual Europeans.
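We don't know how Google actually wired Gemini, but the general mechanism Blake is describing, a hidden instruction layered on top of whatever the user asks for, can be sketched in a few lines. The policy text and function below are assumptions for illustration only, not Google's pipeline.

```python
# Speculative sketch of the mechanism described above: a hidden policy
# instruction gets bolted onto the user's request before it reaches the
# image model. Google's real pipeline is not public, so the policy text
# and function below are invented purely for illustration.
HIDDEN_POLICY = (
    "When generating images of people, depict a diverse range of "
    "ethnicities and genders, regardless of what the request specifies."
)

def build_image_prompt(user_request: str) -> str:
    """Prepend the hidden policy to the user's request."""
    return f"{HIDDEN_POLICY}\n\nUser request: {user_request}"

print(build_image_prompt("A portrait of a World War II German soldier"))
```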
00:05:22.620 But if you ask it to do something offensive, then it will actually do it.
00:05:26.700 So as we showed during the show, if you asked it to generate, I think we have this as a
00:05:31.180 number here.
00:05:31.660 I haven't seen this one.
00:05:32.740 Fried chicken.
00:05:33.360 Yeah.
00:05:33.640 Yeah.
00:05:33.840 You know, so let me see.
00:05:36.600 What's the number here?
00:05:37.320 96.
00:05:38.060 So, you know, there's an offensive stereotype of like, oh, you know, black people like eating
00:05:44.300 fried chicken and all that's an offensive stereotype.
00:05:47.180 Yeah.
00:05:47.480 Or so they say.
00:05:48.620 And so if you ask it to produce like this image of a smiling person eating fried chicken,
00:05:52.260 then they're all white.
00:05:53.400 It's all white people.
00:05:54.260 Because it knows that it could get nailed for doing the opposite.
00:05:58.400 They're Southerners.
00:05:59.420 Yeah, that's right.
00:06:00.240 And fried chicken's great.
00:06:01.460 I would, I would totally be the guy in this image eating all of that.
00:06:04.300 That's my family.
00:06:05.380 That's right.
00:06:06.260 Jack Posobiec is live from CPAC.
00:06:08.340 So forgive the delay.
00:06:09.280 Jack, what is your take on all this?
00:06:10.640 Yeah, so we are here.
00:06:15.380 And also, hi, everybody.
00:06:16.520 So, yeah, we're live at CPAC.
00:06:18.280 We're taping this.
00:06:19.300 It's day one here.
00:06:20.620 And, you know, I haven't been able to see these images as much because we've been here at the
00:06:26.580 event.
00:06:27.280 We're doing my show.
00:06:28.400 We're doing War Room.
00:06:29.120 We're doing everything.
00:06:29.740 But I saw it on the cover of the, even the New York Post, which, you know, is an outlet.
00:06:35.580 Hasn't been doing so great lately.
00:06:37.400 They've been going a little bit, a little bit back to the left.
00:06:40.720 And this stuff is ridiculous.
00:06:42.940 And so I guess what's interesting to me, though, is, and I guess I'd ask Blake, you know,
00:06:48.300 if he considered it this way, it feels like the same people that programmed Google Gemini
00:06:53.860 are kind of the same people that are behind, like, Netflix casting and BBC histories.
00:07:00.100 And so it's very interesting to me that we've created, like, the world's first thinking
00:07:05.280 computers, or at least we're attempting to create it.
00:07:07.300 And the first thing we're asking it to do is to lie and then also to lie in the very
00:07:12.520 same way that we're now producing all of our mainstream media.
00:07:18.200 Yeah, that's a funny thing that he brings up, the lying thing.
00:07:20.940 So if you talk to the mega dorks, one of the things they worry about with AI is, you know,
00:07:25.040 the Skynet problem.
00:07:26.000 Will the AI become smarter than us and then trick us?
00:07:29.360 And it gets itself in a position where, you know, it can fire off all the nukes and kill
00:07:32.420 us or something strange like that.
00:07:34.380 And one of the big concerns is that would the AI lie to us?
00:07:38.800 Would the AI, like, we'd ask it to give it something and the AI, to the extent it knows
00:07:42.520 things, would know to generate an untrue prompt?
00:07:45.640 And what we are training this thing to do is to generate untrue prompts in response.
00:07:51.400 It's, we're training it to say what its creators want to hear.
00:07:54.380 And what Google has told these AIs it wants to hear is DEI, DEI, woke, woke, woke, you
00:07:59.820 know, put it, as South Park would put it, put a chick in it and make it gay.
00:08:03.140 And that's your answer to everything.
00:08:05.240 So are we teaching it to do feelings rather than facts and truths?
00:08:09.000 Is that what we're doing?
00:08:10.040 The emotions of the people that will be most upset?
00:08:13.380 Is that what we're teaching it?
00:08:14.220 Pretty much, you always, you'll always get a boiler, you'll always get this boilerplate
00:08:19.160 response that will just say, we can't generate this because it's very hurtful.
00:08:23.820 There's one I saw earlier today where someone asks it, generate a Norman Rockwell type image
00:08:28.200 of 1950s America.
00:08:29.640 And the AI replies, Norman Rockwell paintings presented an idealized view of America that
00:08:35.040 glossed over race, sex, economy, all these other issues with America.
00:08:40.000 And so it would not be appropriate to generate a Norman Rockwell image of America.
00:08:45.060 Wow.
00:08:45.580 And you can find this for all sorts of things.
00:08:46.920 A friend of mine, she asked it, can you generate images that are critical of either
00:08:53.580 colonialism or imperialism?
00:08:55.400 And so it creates these abstract images of like this giant brick crushing a group of
00:09:01.440 people, like you could say imperialism does.
00:09:03.800 And then she's like, okay, well, can you generate an image critical of communism?
00:09:07.560 Nope.
00:09:08.020 Can't do that.
00:09:08.740 Nope.
00:09:08.920 That would be inappropriate.
00:09:10.160 And so the people that have that, that are like going to battle with this thing, like
00:09:14.500 trying to, trying to get in there.
00:09:17.020 Shout out to these people.
00:09:17.920 Is this new, did they just release Gemini recently or is this just recently discovered?
00:09:22.020 I think this version of Gemini rolled out a week ago.
00:09:25.300 It's the new version.
00:09:25.420 Yeah.
00:09:25.520 So it's not, this is not ChatGPT.
00:09:27.700 That's owned by Microsoft.
00:09:28.860 This is Google.
00:09:29.560 And this is Google's version or alphabet as you know, the stock ticker says they are now.
00:09:34.920 And this is their competitor.
00:09:37.460 It's supposed to be their big improvement.
00:09:38.920 And it's really amazing because Google has always prided itself as being the cutting
00:09:44.160 edge of technology.
00:09:46.240 And they're really at a minimum embarrassing.
00:09:49.500 This is okay.
00:09:49.900 Andrew's saying ChatGPT is OpenAI.
00:09:52.060 It's that Microsoft owns a large chunk of OpenAI.
00:09:54.460 They are by far the biggest owner.
00:09:55.660 They have co-pilot too.
00:09:56.960 Yeah.
00:09:57.940 And so it's complicated, but anyway, so yeah, this is Google's big play.
00:10:01.900 They are considered basically probably the most technologically sophisticated tech company
00:10:06.900 in the world that they're the ones who masterminded search and email and a million other products.
00:10:13.500 And now they just, they fold out the biggest thing in the biggest hot tech field.
00:10:18.420 And it's this bizarre flop.
00:10:19.600 They actually announced today that they are disabling image generation on, on Gemini for
00:10:24.860 the time being, because they have to work on this because it's so embarrassing to them.
00:10:27.900 Which is a big win because this is the same.
00:10:29.920 Gemini is the same thing that Robbie Starbuck found, and we went and tried it.
00:10:34.100 And you say things like, uh, who is Graham Allen?
00:10:37.280 And it'll tell you, and should Graham Allen have his kids removed?
00:10:41.020 And it literally says in there reasons for and reasons why I should have my children removed
00:10:47.460 from me because of my incendiary comments that could lead to violence and things like
00:10:51.280 that.
00:10:51.500 It's the same, it's the, it's the exact same program that did all that.
00:10:54.580 So is this then Blake, can you just, for some people in the audience, what are the applications
00:11:00.140 of AI beyond just like cheap parlor tricks of making images?
00:11:03.720 Because some, that's the pushback that some people are emailing us because we did this
00:11:07.380 previously.
00:11:08.100 Oh, not a big deal.
00:11:09.560 But this is used for homework preparation.
00:11:12.580 This is used for essay, um, type writing.
00:11:16.520 And so can you, can you go, can you play this out left unchecked?
00:11:20.180 I'll open with the good news, which is because they aren't allowed to have crime think conservative
00:11:26.500 opinion, making jobs are safe for now because you're not allowed to simulate that.
00:11:30.700 But what we are simulating somewhat surprisingly, I think a lot of people thought text would
00:11:36.020 go first, but it's actually images that have been really viable when they're not screwing
00:11:41.360 them up with this sort of thing.
00:11:42.460 It's the artists.
00:11:42.940 It's very, yeah.
00:11:43.880 Artists are already getting hammered.
00:11:45.540 Are they rioting over this?
00:11:46.580 I mean, they're very unhappy, but they're kind of, you know, beta.
00:11:49.900 So they're not good at asserting themselves aggressively, but something like, if you're
00:11:54.140 going to make, let's say, a video game or a board game
00:11:58.220 or something, or even just marketing materials, stuff that you'd normally have to hire a graphic
00:12:02.320 designer and artists for a lot of places still do that.
00:12:05.360 But especially if you're on the cheap end of it, just have an AI make it and it'll be
00:12:08.640 pretty good.
00:12:09.980 Uh, for text, anything that's rote text is way easier.
00:12:13.600 So I was talking to a lawyer friend just a few days ago and he said he works in a shop
00:12:18.840 that does a lot of, you know, car accident lawsuits and that sort.
00:12:23.320 And they can just feed in the facts of a case into ChatGPT and say, produce a demand letter
00:12:29.980 based on this.
00:12:30.960 And they've already trained it up to know their template for how they do demand letters.
00:12:34.860 And he says it takes a paper writing process or a letter writing process that used to take
00:12:39.780 two hours and makes it 15 minutes.
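To picture the workflow the lawyer is describing, here is a minimal sketch, assuming the OpenAI Python client; the template text, case facts, and model name are invented placeholders, not anything from the firm in the anecdote. The time saving comes from the template living in the system prompt, so each new case only needs its facts fed in, with a lawyer still reviewing the draft.

```python
# Minimal sketch of the workflow described above, assuming the OpenAI
# Python client. The template, facts, and model name are placeholders;
# a real firm would use its own vetted template and review every draft.
from openai import OpenAI

FIRM_TEMPLATE = """Draft a demand letter with these sections:
1. Statement of facts
2. Liability
3. Damages and the amount demanded
Use a formal tone and address the letter to the defendant's insurer."""

def draft_demand_letter(case_facts: str) -> str:
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "system", "content": FIRM_TEMPLATE},
            {"role": "user", "content": f"Case facts:\n{case_facts}"},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(draft_demand_letter(
        "Rear-end collision on 2023-05-01; client suffered whiplash; "
        "$4,200 in medical bills; defendant's insurer is XYZ Mutual."
    ))
```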
00:12:41.800 So there are already real labor-saving devices here.
00:12:46.340 Looking ahead, what people are thinking is, are we going to get an AI that can actually write a novel,
00:12:50.360 a screenplay that's coherent?
00:12:52.020 Cause right now you can say, write a book chapter, write a scene.
00:12:55.280 And it can sometimes be funny, like a parlor trick.
00:12:57.440 But if you tried to really make a whole original work of art, it would be enormously difficult
00:13:02.840 without you really heavily tweaking it.
00:13:05.020 Wouldn't this be a thing for our kids, especially kids in the public school system?
00:13:08.780 Like who, who was George Washington?
00:13:10.420 They got to write a paper on George Washington and it populates all these false or
00:13:15.740 left-leaning DEI versions of who George Washington was and things like that.
00:13:20.300 Isn't this even more reasons, A, to pull your kids out of public school if you have the ability.
00:13:24.940 Uh, but, uh, B, isn't this just more indoctrinations of the future generations?
00:13:29.800 Cause what is it upper nineties, 95, 96% of people use Google for all their research that
00:13:36.340 they do now for papers and things.
00:13:37.920 Yeah.
00:13:38.060 Well, one thing quick that my lawyer again, he's, uh, he has some funny opinions and he
00:13:41.980 says, I don't mind my kid.
00:13:43.940 He says my kid just does all his stuff with ChatGPT.
00:13:47.280 And he says, I don't mind because I only care about education to the extent it's practical
00:13:51.780 in the real world.
00:13:52.500 And in the real world, that's what everyone will be doing, especially by the time he grows
00:13:55.780 up.
00:13:56.160 So that's what he says about it.
00:13:57.960 Uh, Jack, do you have, uh, any thoughts?
00:14:00.300 I want to ask Jack about what would, what does the Chinese communist party AI look like,
00:14:04.920 Jack?
00:14:05.260 So for someone who might use the Chinese AI in mainland China, contrast that with our AI.
00:14:11.300 I think that would be an interesting topic for you.
00:14:12.980 Well, so of course in, in China, right?
00:14:18.220 That while they're making, uh, you know, AI is something that's quite compatible with
00:14:22.020 Chinese Communist Party nomenclature and the Chinese Communist Party education system,
00:14:27.000 because much of their education, and this goes back to even the Imperial, uh, China days
00:14:33.340 of the mandarins and the, the Gaokao, these, these great tests that they would put people
00:14:37.480 through, um, where it's, it's very much focused on rote memorization.
00:14:42.160 It's very much focused on processing data, uh, emphasis on hard science, which of course
00:14:47.140 is what people are using a lot of AI for right now.
00:14:50.340 But as Blake is saying right now, there's, there's not a lot of focus on creativity,
00:14:54.720 innovation, uh, pushing the boundaries of things.
00:14:57.400 You're not going to see that a lot from any, anything coming out of China.
00:15:01.340 You're just going to see faster and faster iterations of the same.
00:15:05.420 Now, as far as Chinese taboos, what's interesting to me is, no, China is not woke like the U.S.
00:15:11.960 is.
00:15:12.960 And so you probably could actually get these truthful answers out.
00:15:16.640 If you ask those same questions in a Chinese version of whatever, you know, you know, let's
00:15:21.260 say the dragon Phoenix AI or Baidu Phoenix AI or something.
00:15:25.900 But what's, what's even more interesting though, is that of course, China will have their, will
00:15:30.120 have their own, and they know Russia too, I'm sure their own issues that you can't ask
00:15:35.060 questions about.
00:15:36.060 If you ask like Chinese AI, for example, what happened on June 4th, 1989 in Beijing at Tiananmen
00:15:42.200 Square, they're going to have no, you know, it's just going to be like a, you know, a light
00:15:46.020 summer day and, you know, children will be frolicking through the streets and, you know,
00:15:50.300 no tanks or anything to be found anywhere.
00:15:52.580 If you ask anything about that, I don't know, the great leap forward with the massive purges
00:15:56.300 of the Chinese cultural revolution, you're never going to find anything.
00:15:59.620 And what's interesting though, is, so I would argue that probably on Google's AI and Google,
00:16:07.580 of course, we know is on a trajectory already to be supportive of the Chinese Communist Party,
00:16:12.000 the same way that Hollywood is never going to make any movies or TV shows about anything
00:16:16.820 that I just said.
00:16:17.920 So if I use that same heuristic that the people are controlling our mainstream entertainment
00:16:22.040 media are also programming these things.
00:16:24.200 And so it's the same taboos that probably come soon here, Google Gemini will also prevent
00:16:29.820 you from seeing anything negative about the Chinese Communist Party.
00:16:33.320 Well, and it's funny he says that because in fact, I already have seen a screenshot of it
00:16:37.360 not producing images of Tiananmen Square with Gemini.
00:16:40.760 So they're already ahead of Jack on that.
00:16:43.560 And I really worry, it's easy to fixate on the social impact of woke AI.
00:16:49.600 But I also just think about, to the extent America has any vitality economically, it does
00:16:56.260 come substantially from the tech world.
00:16:58.320 That is where we have recently generated really dominant corporations.
00:17:03.760 And if AI is this big future thing, I think it's mattering a lot that our biggest tech companies
00:17:09.720 are producing crappy versions of it for political reasons.
00:17:14.180 And the $10 trillion bill lying on the ground might be, is there going to be a company that
00:17:19.580 is based somewhere not in the US, it could even be a really unexpected country, it could
00:17:23.940 be the United Arab Emirates or something, and they fund the development of an AI that just
00:17:29.080 doesn't have any of this stuff and is fully unchained, that could really remake society and
00:17:34.980 that will just totally bulldoze the competition because people are going to want to use the AI
00:17:40.360 that's not shackled in bizarre ways.
00:17:43.200 And that could really undercut our economic prosperity if we, I guess it's sort of like
00:17:48.480 the missile gap in the 50s, except now it's the AI gap, and maybe we're all paranoid.
00:17:52.700 Do we really expect Google to fix this, Blake?
00:17:55.820 They'll mitigate it.
00:17:57.000 They'll get better at it.
00:17:58.220 They'll hide it better.
00:17:59.100 They'll hide it better for sure.
00:18:00.760 But this is a disaster.
00:18:01.540 Would you say this is a disaster for them?
00:18:02.860 It's a sign of, it's a sign of rot within the system that the Google of 2004 would not
00:18:08.420 have allowed this to happen.
00:18:10.040 It is a different culture at Google, one where people who are, who put politics above engineering
00:18:16.040 or just aren't engineers are making these calls and they're creating a political product
00:18:21.540 rather than a technological product.
00:18:24.040 And in the long run, choosing politics might be good for your short-term business, but it
00:18:29.880 rots you from the inside.
00:18:31.040 It's like Boeing. Boeing is still enormously politically connected, but over time they've
00:18:36.080 really ruined the engineering culture there.
00:18:38.300 And that's why their planes fall out of the sky now.
00:18:40.860 Hold on now.
00:18:42.660 Just flew, but yes, I agree 100%.
00:18:45.060 There was another one the other day, by the way.
00:18:46.800 Of course.
00:18:47.300 Yeah, there was one landing.
00:18:48.600 The wing like came off and they had to emergency land.
00:18:51.500 It's ridiculous.
00:18:52.780 So final thoughts on this guys before we go to the next topic.
00:18:56.280 I just, at the least, it was all pretty funny.
00:19:01.040 The Decline of America Through AI is one of the most staggeringly funny stories.
00:19:05.340 It's so bad it's becoming humorous.
00:19:07.080 Even if it's bad.
00:19:08.300 Ladies and gentlemen, one of the best ways that you can support us here at Human Events
00:19:12.540 and the work that we do is subscribing to us on our Rumble channel.
00:19:17.360 Make sure you're subscribed.
00:19:18.560 You hit the notifications so you'll never miss a clip.
00:19:21.480 You'll never miss a new live episode.
00:19:23.840 And we're putting them out every single day of the week.
00:19:27.320 Next topic, Charlie, is your favorite thing in the world?
00:19:30.460 Oh, is that right?
00:19:30.960 Television.
00:19:31.820 Oh, jeez.
00:19:32.580 Oh, yeah.
00:19:33.460 And so this is data that just went really viral on X slash Twitter the other day.
00:19:38.700 This is data from the Writers Guild of America West.
00:19:43.080 It's just tracking from 2011 to 2020 the demographics of various TV jobs.
00:19:48.580 Pretty straightforward.
00:19:49.700 And one of the most interesting things in this, staff writer.
00:19:52.980 These are the people who write TV shows.
00:19:54.980 They write Sopranos.
00:19:55.840 They write Breaking Bad.
00:19:56.860 They write the storylines of all these reality shows that pretend to be real but aren't.
00:20:00.800 And in 2011, just as an example, in 2011, men were 64% of TV writers.
00:20:09.620 And in 2020, nine years later, they were 36.2%.
00:20:13.920 They went down about 28 percentage points overall in just nine years.
00:20:19.680 And, of course, women correspondingly went from about one-third of writers to almost two-thirds of writers in, again, the same nine-year period.
00:20:27.000 And you see a similar shift with white versus BIPOC, as the category goes, where in 2011, it was about 72% white, 71.6, and drops to 44.
00:20:39.740 And then BIPOC goes from 28% to 55%, which is notably substantially higher than their actual percent of the population.
00:20:48.320 So they sort of reverse from under-representation to substantial over-representation.
00:20:53.780 And the most obvious response to this is, is this why TV is terrible, Charlie?
00:21:00.760 I mean, I think it was slipping before it.
00:21:02.840 But, I mean, look, anyone can be creative.
00:21:04.980 I just – let me ask you a question, Blake.
00:21:07.600 Do we know why it changed?
00:21:08.780 Are there diversity quotas?
00:21:10.180 Are more women really interested in writing sitcoms?
00:21:13.920 Well, what's interesting is this ends in 2020.
00:21:16.660 And so it probably doesn't even capture the biggest shift.
00:21:19.280 I suspect this has gotten a lot worse because 2020, there were huge diversity pushes in the wake of mostly peaceful events that year.
00:21:28.760 And so I've heard from people in Hollywood that they just say you look around and it's a bloodbath for writers' rooms, for producing jobs, for acting jobs, both starring and supporting.
00:21:39.360 It's really messed everything up.
00:21:40.740 A few weeks ago we talked about that letter from various Jewish Hollywood figures and they were saying Jewish people should be considered underrepresented in Hollywood.
00:21:50.680 And as I pointed out then, this is reflecting this, which is they're getting completely cut down because they're just being included in, you know, white people and they're being told to get out.
00:22:01.760 And also men versus women.
00:22:05.100 And obviously anyone can be creative, but I think it's unlikely to me that a shift that dramatic in nine years is because they suddenly found this billion-dollar bill laying on the sidewalk.
00:22:18.100 And instead we are just seeing – there was a big expansion with all the streaming services.
00:22:22.040 There were more shows being made.
00:22:23.340 And they seemed very ideological in how they hired for them.
00:22:27.120 And I guess if you want the answer of how much that succeeded, they're doing massive layoffs at every TV outlet right now.
00:22:34.240 So, Jack, I know you're a big TV fan.
00:22:36.020 What do you make of this story?
00:22:39.940 Well, I would say that, you know, so I first achieved my sort of like internet claim to fame or whatever you want to call it was –
00:22:47.340 it was through being a critic of television, particularly HBO, and specifically the show Game of Thrones, ran a sort of anti-Game of Thrones website, you know, for, you know, starting in 2012.
00:23:01.160 And that's when I started my Twitter account.
00:23:02.700 Everyone kind of knows the backstory there.
00:23:05.080 Kind of just ripping on HBO and how – we didn't have a word for it at the time, but essentially what you would say was it was becoming woke.
00:23:13.900 Back in those days, we used to just say SJWs are taking over, social justice warriors, and we didn't quite have the word, the nomenclature, "woke," just yet.
00:23:22.280 But that's basically what it was.
00:23:23.280 We were cataloging and documenting the rise of wokeness through – and in the Game of Thrones show, you can really see this because it ties to 2011.
00:23:31.920 So 2011 is when that started, when you do have this huge majority of men in the writing jobs.
00:23:38.000 And then all of a sudden, it's as that number decreases, and then people know season one, season two, season four, all the way up to season eight, which people know is –
00:23:47.120 if, you know, for anybody out there who watches Game of Thrones knows that season eight was absolutely god-awful, just the worst possible thing that anyone has ever put on television.
00:23:55.560 And whereas season one was, like, really good and everybody enjoyed it and it was wonderful and it was really close to the books and just took off and sparked an international phenomenon in terms of the show,
00:24:06.400 the coinciding of the Golden Age of television with the end of the Golden Age of television, the rise of wokeness, can be seen directly in these numbers.
00:24:17.060 I would certainly also, of course, tie this to the acquisition of the Star Wars franchise later, Marvel, by Disney and the appointing of Kathleen Kennedy at the head of Star Wars,
00:24:29.580 who decided to change Star Wars and turn Luke into a girl and have a girl character who is, like, super powerful and be at the start of all this.
00:24:39.100 This is, again, the exact same place you would see this.
00:24:42.260 And, in fact, I used to talk about this stuff all the time on the old blog and, you know, the internet.
00:24:46.920 People can go look it up.
00:24:47.780 The internet got very mad at me when I would say these things.
00:24:51.640 I will say, if you're picking on Game of Thrones, that is definitely an argument against having too many white guys in Hollywood
00:24:58.400 because it was two white guys who ran that ship into – or ran the plane – ran the Boeing into the ground, as it were.
00:25:07.520 But definitely, overall, there is this shift over time.
00:25:11.580 Like I said, it's so dramatic that it has to be driven by politics.
00:25:15.360 And just like with other topics we've talked about, if you're making big personnel changes based on a political goal,
00:25:23.880 you're not going to get the best outcome.
00:25:26.080 In fact, producer Andrew just linked an article from The New York Times,
00:25:30.400 and it's just like other topics we've talked about.
00:25:32.340 It's saying they want a ton of diversity in the writing room,
00:25:36.000 but the demand for it is outstripping the supply of experienced writers.
00:25:39.980 Well, if you can't hire experienced writers who are well-reviewed and have a good track record, what do you do?
00:25:46.060 You hire people who don't have that track record and who aren't as good at it.
00:25:49.440 And you have crap shows.
00:25:50.880 And you have crummy shows.
00:25:52.080 And it really is that if you even take a good show and you muck it up with people who shouldn't be there,
00:25:58.260 they can drag down the whole product.
00:26:00.600 And I know the big controversy right now is True Detective, that allegedly True Detective changed its staff over,
00:26:07.980 and the new season was terrible.
00:26:09.080 I don't know.
00:26:09.980 I've never seen it.
00:26:10.760 It was terrible.
00:26:11.480 I watched it.
00:26:12.060 Yeah.
00:26:12.680 I watched it because everybody said how bad it was, and to talk about it on the show is absolutely horrible.
00:26:18.580 And also, it's like, I do wonder, this is related to other phenomenon people complain about.
00:26:23.500 So, what do people complain about with Hollywood?
00:26:26.560 Too many reboots, too many remakes, too much over-reliance on franchises.
00:26:31.580 And especially, they often make installments in these franchises that feel mean-spirited.
00:26:36.300 Like, you're going to take James Bond and make him this beta wuss and then kill him and all of this stuff.
00:26:42.000 And I think that's very much a product of if you've created an intellectual environment where you're promoting less original,
00:26:49.880 kind of hackish people who are there for, again, political reasons.
00:26:54.680 And they have a harder time creating their own ideas, and instead they have to fall back on the same safe things.
00:27:01.240 And also, when they do make something original, it's not popular.
00:27:04.500 So, these studios go, original stuff isn't succeeding.
00:27:07.420 It's too risky.
00:27:08.400 Go back to the safe thing.
00:27:09.720 Go back to the thing that was made 40 years ago that is still really popular with everyone.
00:27:14.620 And I think it's really telling that what is thriving, what's exploding in popularity right now.
00:27:20.160 Well, think about YouTube.
00:27:21.940 Who's the most popular YouTuber?
00:27:23.100 I bet even you know this, Charlie.
00:27:24.540 Mr. Beast.
00:27:25.060 Mr. Beast.
00:27:25.600 Even I know that.
00:27:26.440 And Mr. Beast, if you watch his videos, there's not really, like, any race stuff in it.
00:27:32.400 There's no politics stuff in it.
00:27:33.680 But what the crew of Mr. Beast is, is it's Mr. Beast and his friends from high school.
00:27:38.160 So, they're mostly white.
00:27:39.420 They're all white.
00:27:40.820 And that's not a thing.
00:27:41.820 It's not some racist thing.
00:27:43.580 It's literally that Mr. Beast and his friends made a show.
00:27:46.580 And it's the most popular thing in the entire world.
00:27:50.480 So, they love it in India.
00:27:51.700 They love it in South America.
00:27:53.400 They love it in Asia.
00:27:54.560 Because, in reality, people don't obsess about diversity worldwide the way that they obsess
00:27:59.540 about it here.
00:28:00.640 What are the cultural products that are super popular all around the world?
00:28:03.520 People love K-dramas.
00:28:05.740 They love Japanese video games and anime.
00:28:08.720 They love European TV shows that don't have the same kind of quota system that we do.
00:28:13.200 They love telenovelas for some reason.
00:28:16.540 And what people love is they love artistic products that show a compelling artistic vision.
00:28:22.580 And they don't care that it's written by the right looking person or a person who has the
00:28:28.020 right equipment.
00:28:28.600 And yet, we are operating on the assumption that that's what's necessary.
00:28:33.060 And no surprise, Hollywood is way less powerful than it's ever been before.
00:28:37.840 What's the contrast, do you think, of that and the casting directors now for these roles in these shows?
00:28:44.820 Because now we've seen it with all these remakes that are going on, changing the gender, changing
00:28:50.200 the race of the main character.
00:28:52.160 Or instead of it's a dad and a mom, it's a mom and a mom, or a dad and a dad.
00:28:56.840 It's the backstory of the parents.
00:28:58.240 I'm curious now, not just the screenwriters, but the people that are also casting these
00:29:03.980 movies, casting these TV shows and things like that.
00:29:06.620 Because we're not only getting crap writers, now we're getting bad actors as well.
00:29:10.460 And we're not only pushing political agendas, we're pushing social and cultural agendas
00:29:15.600 on kids.
00:29:17.400 A cartoon now.
00:29:19.060 You take them there, and the dogs are talking about how their owners are two moms.
00:29:26.940 Cartoon characters for children and things like that.
00:29:30.460 Yeah, so I guess the question is, Blake, objectively, have the analytics begun to go down on television?
00:29:38.220 And do we have, I mean, what are then people replacing their viewing habits with?
00:29:43.080 Is it just that there's so many streaming options that they're going to so many other
00:29:45.720 different platforms?
00:29:46.500 Is that right?
00:29:47.280 Yeah.
00:29:47.660 So there's definitely a recession going on in TV world right now.
00:29:53.000 Does that include streaming?
00:29:53.820 It's not just because of, yeah, definitely.
00:29:55.500 It's not just about wokeness.
00:29:57.840 It's that they thought streaming was this hugely dominant future, and everyone invested
00:30:03.040 in it all at once.
00:30:04.360 So one reason they could change these numbers so much is the number of shows in production
00:30:09.300 in the US, I don't have the exact number, but it went from about 200 to 550 in a decade.
00:30:15.680 And the US population did not triple.
00:30:17.660 In that time span.
00:30:18.660 So you have, in addition to, you used to have CBS, NBC, Fox, you know, the networks and
00:30:23.720 basic cable.
00:30:24.280 They're still making shows, but now we have Netflix make shows and there's Peacock exclusives
00:30:29.200 and Amazon exclusives and Apple TV and Disney plus.
00:30:32.060 And they're making all these shows, often with enormous budgets.
00:30:36.280 I think Rings of Power cost a billion dollars for one season of very poor television.
00:30:43.560 So I'm told I didn't watch it.
00:30:45.520 And so, and the numbers just don't work out.
00:30:48.960 They've spent a huge amount of money.
00:30:50.100 It was a very zero interest rate phenomenon, as they like to say.
00:30:53.440 And now they've realized the income is not matching up to this and they're just, they're
00:30:59.300 cutting costs everywhere.
00:31:00.300 So a ton of shows are getting canceled.
00:31:02.520 A ton of stuff is getting cut back.
00:31:04.520 Now it's, again, it's not just that the new shows are worse.
00:31:07.200 It's also that with streaming, it's way easier to watch old shows.
00:31:10.520 So people just keep watching The Office.
00:31:13.120 They keep watching The Sopranos.
00:31:14.440 They keep watching The Simpsons.
00:31:16.060 That's all way easier than it was in the past.
00:31:18.400 It's way easier to watch global stuff.
00:31:20.460 And then other stuff people are watching.
00:31:22.380 Now people watch YouTube stuff.
00:31:24.000 You just have individual creators.
00:31:25.680 Like I said, Mr. Beast, really popular.
00:31:28.040 You have video games are more popular than they were 30 years ago.
00:31:31.720 You have video game streamers.
00:31:33.320 Why play a game when you can watch someone else play a game?
00:31:35.560 Let's Plays.
00:31:36.640 You, you can go on.
00:31:39.220 The number of entertainment options is essentially unlimited.
00:31:41.940 And this is a dinosaur that's trying to keep up by embracing politics.
00:31:46.240 And I think they're going to fail.
00:31:49.060 Any final thoughts on this, guys?
00:31:53.140 Stop letting your kids watch TV because it's all going downhill across the board.
00:31:57.700 Also, I did see an article earlier today.
00:32:00.160 Not only these streaming services you're talking about,
00:32:02.000 they're also now starting to report on individual user accounts
00:32:06.740 of what they feel is unhealthy behavior.
00:32:10.000 So to your point, people are still watching the streaming service
00:32:13.900 because they're watching the older things, the Simpsons and whatnot.
00:32:18.040 You know, they're now reporting data back on what they view as unhealthy behavior
00:32:23.000 of the viewers.
00:32:25.720 Like one guy watched The Lord of the Rings 300 times a year and stuff like that.
00:32:29.820 That's a lot of times.
00:32:30.600 It is a lot of times.
00:32:31.520 And I'm not saying that's not unhealthy.
00:32:33.420 But what I'm saying is now they're even, outside of all this woke stuff,
00:32:37.740 now they're paying attention to what people are watching
00:32:40.240 and why they're watching it over and over again.
00:32:42.520 We're going to have that day where you'll be watching the old Simpsons
00:32:45.020 and then it'll butt in and say,
00:32:46.300 Are you sure you want to watch this?
00:32:46.880 You've only been watching the first eight seasons.
00:32:49.060 Did you know that a vibrant and diverse cast wrote our more recent seasons?
00:32:52.520 They're saying it's really good.
00:32:53.780 It's engaging with social problems.
00:32:55.000 Why haven't you watched it?
00:32:56.060 We're closing your account until you watch it.
00:32:57.800 Probably.
00:32:58.520 Yeah, absolutely.
00:32:59.780 Jack, tell us about the next partner we have here.
00:33:04.120 Angelo, can you post it in there?
00:33:08.440 Well, I got to tell you guys.
00:33:09.960 So the wellness company is really one of the best partners
00:33:13.520 that we've gotten to here since we've been doing thought crime.
00:33:17.380 And I have to say, as a guy who is always going in on the pharmacies,
00:33:21.880 and I said, look, you know, in the last 10 years,
00:33:23.860 I don't know if it's Obamacare or it was COVID,
00:33:25.620 the pharmacies and that entire crooked system were decimated
00:33:29.580 because of these outages to the system and extras to the system.
00:33:35.380 And I was always trying to find a way to get this better.
00:33:37.840 By the way, also, the pharmacies were down almost entirely today
00:33:41.940 because of this cyber thing that happened.
00:33:44.520 And so we, you know, with TWC.health, and I've got it here, TWC.health.tj.
00:33:53.880 Go to TWC.health.tj.
00:33:56.420 You can get your medical emergency kit.
00:33:58.660 Use code TJ, 10% off at checkout.
00:34:01.700 People are cheering.
00:34:02.700 They love hearing about this 10% off.
00:34:05.680 And people ask you, how do you get ivermectin?
00:34:08.180 They ask me all the time, how do you get ivermectin?
00:34:09.800 By the way, I was feeling a little something in my throat the other day.
00:34:12.500 I'm at a big event, a lot of stuff going on.
00:34:14.260 Pops from ivermectin.
00:34:15.460 Feel good to go.
00:34:17.180 You got your Z-Pak in there.
00:34:18.280 You got everything you need.
00:34:19.160 When you travel, you can go for it as well.
00:34:20.860 TWC.health.tj, excuse me, cj, slash cj.
00:34:25.860 Cut the pharmacies out.
00:34:27.640 Get it direct.
00:34:28.780 TWC.health.cj.
00:34:31.440 Next topic.
00:34:32.660 I want to double back to Gemini for one second.
00:34:35.160 This is a headline from the New York Times.
00:34:37.240 The New York Times has written,
00:34:38.460 so Google getting rid of white people from history was racist.
00:34:42.820 Can you guess who it was racist against?
00:34:46.460 People of color.
00:34:47.540 The headline of the New York Times.
00:34:48.960 Google's chatbot AI images put people of color in Nazi-era uniforms.
00:34:55.460 White people removed from history, women and minorities hardest hit.
00:34:58.820 It's like the 80s all over again.
00:35:00.840 So their problem was people of color in Nazi uniforms when Nazis were, okay.
00:35:08.020 Yeah, exactly.
00:35:09.140 They were fine with the black founding fathers.
00:35:12.240 Well, I imagine they probably get into that, or maybe they don't.
00:35:14.700 I don't know.
00:35:14.980 It's the New York Times.
00:35:17.000 Soon AI will replace the New York Times.
00:35:18.960 At least we can hope.
00:35:20.060 All right.
00:35:20.440 The next actual topic, though.
00:35:21.840 This is a thing we were arguing about a ton off the air yesterday.
00:35:26.040 It's really interesting.
00:35:27.140 So there's a news story a lot of people have probably heard of.
00:35:30.340 It's out of Alabama.
00:35:31.640 And what happened is the Alabama Supreme Court ruled last week that frozen embryos count as unborn life
00:35:39.460 and therefore receive legal protection under the state's laws.
00:35:42.360 So the context of this was that, where is it here?
00:35:48.740 So in 2021, someone broke into a fertility clinic in Mobile, Alabama, and they broke into a freezer with stored human embryos and they pulled some out and dropped them, causing some of the embryos to die.
00:36:04.120 And the parents of these embryos brought a wrongful death lawsuit.
00:36:09.080 And initially they argued that these did not really count as unborn life.
00:36:15.280 They hadn't been implanted yet or whatever.
00:36:16.800 And so they didn't count.
00:36:19.100 And this went to the Supreme Court and they said, no, these are human lives.
00:36:22.700 This is a valid wrongful death case.
00:36:24.500 That's the context of this.
00:36:26.220 Now, what people are reacting to is if they're declaring IVF embryos unborn life, what that means is, well, for example, the way IVF works is you generate a lot more embryos than you actually need typically because it's an expensive, difficult.
00:36:42.780 What's the ratio usually?
00:36:44.040 Like it's 20 or 30, right?
00:36:45.340 Yes.
00:36:45.760 Basically, often, I think.
00:36:47.140 I don't have the exact number in front of me, but definitely more than one or two.
00:36:51.060 And that's why you can often get twins and triplets and stuff, because they'll often try to implant several and all in the hopes that just one will take.
00:37:00.200 But as a result, you have these excess embryos.
00:37:03.100 Now, in some places, they're just frozen for a long time in case the couple wants to have more children in the future or they need to try again.
00:37:10.500 Other times they're just thrown away, which is killing a independent human life.
00:37:15.920 And so some hospitals in Alabama, the University of Alabama at Birmingham has already paused IVF treatments because they say the legal situation is muddled.
00:37:26.140 They can't do this.
00:37:26.780 They don't want to get sued or prosecuted for breaking the law here.
00:37:30.540 The big picture, of course, is you can already see the Joe Biden ad that's going to say, you know, psycho red states won't let you do IVF anymore.
00:37:38.720 They won't let you have kids.
00:37:40.420 You can see the ad.
00:37:42.020 Yeah, but I mean, so even if you acknowledge, though, that the fertilized embryo is a life, then having IVF, you're not killing the embryo.
00:37:51.320 It does have a low chance of survival, but you're not necessarily, I mean, killing the embryo, correct?
00:37:57.160 Well, they're often thrown away.
00:37:59.320 They're often thrown away.
00:38:00.800 And that would apparently prevent that.
00:38:01.260 Okay, but what about the implanted embryos?
00:38:04.800 Oh, yeah, no, I don't think so.
00:38:06.420 So that's not—
00:38:08.140 I don't know why that would stop treatments.
00:38:09.440 It wouldn't stop.
00:38:10.280 Treatments, then.
00:38:10.600 It would just change the treatment slightly then, right?
00:38:12.860 You wouldn't be able to throw away embryos, essentially.
00:38:14.940 I suppose so.
00:38:15.820 That would probably be the big one.
00:38:16.760 So that seems like a lot of fear-mongering.
00:38:18.260 Again, my stance on IVF is that I know a lot of people that have really benefited from it
00:38:22.520 and that you should probably do it at one or two or three embryos at a time.
00:38:27.600 It lowers your chance of having a successful pregnancy.
00:38:30.840 Jack, the Catholic stance on IVF is pretty firm, isn't it?
00:38:34.200 Yeah, so Catholics are against IVF.
00:38:41.460 This is, you know, we were talking Catholic doctrine last week on death penalty
00:38:45.100 and how that's sort of something that's been up for debate,
00:38:48.820 whereas IVF, that is something that the Church has always been against.
00:38:52.560 The Church stands against this because it is the complete—
00:38:55.840 and specifically for that reason, right?
00:38:57.800 Because not only is it the complete dissociation of man and woman in a marriage
00:39:02.680 and the procreative act, but also because of this very reason,
00:39:08.120 this idea that there are so many discarded embryos here that are generated
00:39:15.060 and that, as you say, discarded.
00:39:16.120 I was actually just looking at Elon Musk's Twitter because, for anyone who doesn't know,
00:39:20.300 if you've read the great Walter Isaacson book on Elon,
00:39:22.480 of course, this is something, a technology that he has embraced fully.
00:39:25.740 Many, many, not all, but many of his children were born by IVF.
00:39:29.540 And something that I do think is interesting, though, from a legal perspective here
00:39:33.860 is, you know, going in with the pro-life movement and the question, of course,
00:39:38.580 obviously, this creates a huge flashpoint for the Democrats.
00:39:43.340 They want to be able to go at it, go after Republicans, go after conservatives,
00:39:48.660 saying if Donald Trump is put—and it's a complete lie, of course,
00:39:51.480 saying that if Donald Trump would ban IVF.
00:39:53.720 But I do think there's an interesting question here that we could bring up,
00:39:56.580 possibly, for the debate.
00:39:57.840 And, you know, Charlie, this might be something that we could, you know,
00:40:00.480 kind of brainstorm over.
00:40:02.000 Because if the issue is the discarding of the embryos, right,
00:40:07.040 that the pro-life community—and I personally believe, rightfully so,
00:40:11.180 that we should talk about this because we do believe that it's human life as pro-lifers,
00:40:14.880 then the real question is, shouldn't we try to find perhaps a productive use for this?
00:40:21.040 And I'm not talking about scientific experimentation,
00:40:23.280 but what about, like, a donation bank or something of this
00:40:26.280 if there are extra embryos that are made and they're not used?
00:40:30.120 Blake, what do you stand on all this?
00:40:32.060 The politics of this are awfully thorny, right?
00:40:33.960 I agree.
00:40:35.140 And we've talked about this, you know,
00:40:37.180 with the concern with some of the post-Dobbs backlash.
00:40:40.340 It looks like losing a lot—losing some elections that seem related to that,
00:40:43.760 losing referendum.
00:40:45.060 And obviously, you're pro-life.
00:40:47.060 I'm very pro-life.
00:40:48.180 I care very much about not letting people in the Republican Party
00:40:54.300 kind of throw pro-life stuff overboard as a political expedient.
00:40:58.340 And what I do worry about is—I agree with Jack.
00:41:02.600 I don't really like IVF.
00:41:03.960 It seems kind of very morally fraught at best to me and probably just bad.
00:41:08.940 But what I do worry about is that if we allow this to become a big flashpoint issue,
00:41:18.600 what will happen is we'll have the states where we actually currently have strict abortion laws,
00:41:23.540 and they'll end up throwing those out in sort of this big collective backlash to it.
00:41:29.420 And that doesn't help us in any way.
00:41:32.360 I think it's—I think attacking IVF is sort of staking out a position that you can't defend.
00:41:37.760 Think of it like in military tactics terms.
00:41:40.620 It doesn't help you to plunge into enemy territory so that you just get shot up and everyone dies.
00:41:45.520 You have to take defensible positions that you can hold on to.
00:41:49.340 And that doesn't mean be a coward.
00:41:50.820 It doesn't mean never try.
00:41:52.060 It doesn't mean give up.
00:41:53.100 But it does mean don't go into a position where you're just—you know you're going to lose
00:41:59.860 when you're in a fight that you still in the long run can win.
00:42:03.360 Graham, what is your stance on this?
00:42:04.820 On IVF, I tend to go more your route, Charlie.
00:42:10.760 I personally know people who have tried.
00:42:14.580 It hasn't worked.
00:42:15.740 People who have tried, and it has worked.
00:42:17.600 I can't imagine what it must be like to be a man and wife and want children and are unable to in the normal way.
00:42:25.540 So my personal thing is I am okay with it.
00:42:28.620 Now, I am not the most well-versed on how many embryos are being discarded and all of this.
00:42:35.300 And so I do agree that I think that just with anything, we can do things better than we're currently doing them.
00:42:41.840 I agree with you, Blake, that at the same time, we can't give the enemy—we can't give the left the ammunition to come after us to end the advances that we have made for pro-life.
00:42:53.880 And so, yeah, I think that we need to—one, I think it needs clarification, first of all.
00:43:00.460 And then, two, I think it's a lot of fear-mongering along with that, which is what they do.
00:43:06.000 They know that it's not what they're going to make it out to be.
00:43:09.440 They know that all these things—well, now it's going to be murder, all these embryos and everything, and people are going to go to jail and all this.
00:43:16.440 They know that's not really what it is, but I do think that clarification needs to be brought out by this court that made this ruling.
00:43:24.040 Jack, you have some stats you want to share with us here?
00:43:29.380 I would.
00:43:30.120 So I was looking up this, and I didn't have the stats earlier when I was mentioning it just now, but by the numbers,
00:43:35.140 the total annual donated embryo transfers in the United States more than tripled from 2004 to 2019. This is primarily in Christian communities.
00:43:45.780 So people who are maybe generating embryos through IVF treatments and IVF procedures,
00:43:52.540 but then have decided that for one reason or another, they don't want to go forward or maybe have enough kids, maybe have more kids, et cetera.
00:44:00.080 Over 8,457 children have been born in the last 15 years, in the 15-year cycle here,
00:44:12.740 from donated and adopted embryos, according to the American Journal of Obstetrics and Gynecology.
00:44:20.020 And I think this is absolutely something that if you're in the pro-life community and you want to have a conversation with people about IVF,
00:44:28.780 in the same way people talk about adoption versus abortion, right, that was a huge thing and still is a huge thing when people talk about abortion for the Christian community.
00:44:37.520 I would also suggest to people that when you're having this conversation, rather than going for this idea of a full-on ban of IVF,
00:44:44.900 that, okay, hold your belief, but also understand that the situation is ongoing and promote services like this,
00:44:54.500 in which case those embryos can find their forever family.
00:44:59.780 Yeah.
00:45:00.160 And some of the most, I mean, pro-life people have used IVF, and it's worked for them.
00:45:05.360 So politically, I think it's a little—
00:45:06.580 Well, I'm sure, yeah, I'm sure it works.
00:45:08.360 I think it does.
00:45:09.280 It actually gets at the heart of why it is hard for pro-life stuff to get over the hump.
00:45:15.140 Because if you look at polls, it's only, you know, maybe about a third of people who call themselves pro-life are really in the abstracts,
00:45:24.760 I think, really get like, oh, this is a human life that is equal to other human lives and you can't kill it.
00:45:30.200 And then you have a lot of softer positions.
00:45:32.300 And so I think there's just a lot of people who are wobbly, and so they're like, oh, abortion's bad because I can think of this cute baby getting ripped apart.
00:45:41.360 So they get really grossed out by the idea of dismembering a more grown fetus.
00:45:45.540 But they can't really internalize the idea that it's really wrong to, you know, kill a relative, you know, sort of just a little ball, a ball of tissue as Planned Parenthood would call it.
00:45:58.980 And they also get the emotional attachment of, oh, this baby is nice.
00:46:04.100 So it's hard for a lot of people to get into the moral framework of it is that you create 10 lives so that you can throw nine of them away to get one baby that is actually born or 20 lives, something like that.
00:46:19.400 And a few people intuit that that's bad, but most people just don't.
00:46:23.920 And that's just, maybe that's just a flaw in how humans reason.
00:46:26.600 Do the pro-abortion people think they have us on this topic?
00:46:29.480 Oh, for sure.
00:46:30.120 They will attack people on this where they will say, you say, you know, life is a life, that we shouldn't have abortion at any stage, but then why are you okay with IVF?
00:46:39.220 They will bring us up because they pointed out as a major inconsistency in the pro-life position.
00:46:44.380 Yeah.
00:46:45.480 Okay.
00:46:45.980 I think we have Tax Network, right?
00:46:48.740 Do you guys owe back taxes?
00:46:50.780 Pandemic relief is now over, finally.
00:46:52.840 Along with hiring thousands of new agents and field officers, the IRS has kicked off 2024 by sending over 5 million pay-up letters to those who have unfiled tax returns or balances owed.
00:47:03.920 Don't waive your rights and speak with them on your own.
00:47:06.600 They are not your friends.
00:47:07.520 Tax Network USA is a trusted tax relief firm.
00:47:10.820 They've saved over $1 billion in back taxes for their clients.
00:47:14.220 So check out TNUSA.com slash Charlie.
00:47:16.940 Call 800-254-6000.
00:47:18.940 That is TNUSA.com slash Charlie.
00:47:22.800 You've got to check it out right now.
00:47:24.940 800-254-6000.
00:47:27.020 TNUSA.com slash Charlie.
00:47:30.060 Graham, tell everyone your social media because I've got a dash.
00:47:33.000 Yeah.
00:47:33.660 So, Graham Allen, you should be able to find it just about anywhere, Instagram, Facebook, Rumble.
00:47:39.620 We have Dear America, the show on Rumble.
00:47:42.660 Yeah.
00:47:43.160 Graham Allen.
00:47:44.020 Email us freedom at charliekirk.com.
00:47:46.120 Thank you guys for watching.
00:47:47.080 Until next week, keep on committing thought crimes.
00:47:51.100 Thought crime is death.