Timcast IRL - Tim Pool - November 07, 2025


Supreme Court May OVERTURN Gay Marriage, SCOTUS Hearing Set For TOMORROW | Timcast IRL


Episode Stats

Length

2 hours and 12 minutes

Words per Minute

192.5

Word Count

25,449

Sentence Count

2,076

Misogynist Sentences

23

Hate Speech Sentences

35


Summary

It may be the end of gay marriage, the Supreme Court may take up the case challenging the landmark ruling in Obergefell v. Hodges, and Jennifer Lawrence says, "If you're an actor, nobody cares what you think about politics."


Transcript

00:02:30.000 It may be the end of gay marriage.
00:02:32.000 Tomorrow, there's going to be a private hearing at the Supreme Court to determine whether or not they will actually take the case, which challenges the landmark ruling Obergefell v. Hodges... Obergefell v., what is it?
00:02:44.000 I believe it is.
00:02:44.000 Hodges?
00:02:45.000 Am I getting this one wrong?
00:02:46.000 Whatever.
00:02:47.000 We'll read it.
00:02:47.000 Hodges, yeah, Hodges.
00:02:48.000 Hodges.
00:02:48.000 I got it right.
00:02:49.000 That's great.
00:02:50.000 And if they do pick this up, the presumption is the 6-3 court is going to overturn gay marriage, at least as recognized at the federal level, meaning states will... it'll go the way of Roe v. Wade.
00:03:01.000 Now, the op-eds are flying in, and concerns are flying among these left-wing groups that, yeah, look, it's a right-wing Supreme Court.
00:03:09.000 It's probably going to happen, but we don't know for sure.
00:03:12.000 It's going to be tomorrow.
00:03:12.000 So we're going to talk about what's currently happening.
00:03:15.000 What is the current lawsuit that may actually trigger the end of gay marriage federally?
00:03:19.000 And it'll be interesting.
00:03:20.000 Also, Donald Trump has been ordered to fund SNAP fully by Friday.
00:03:25.000 Despite him saying he won't do it, and despite there being no money to do it, the courts are arguing that Trump should do things outside of his power.
00:03:33.000 It's very strange.
00:03:34.000 So we'll talk about that.
00:03:35.000 And then we're going to go to Hollywood because there's this viral meme of Sydney Sweeney when asked about, basically, they called her racist or whatever.
00:03:42.000 They were like, is it really appropriate to say your white genes are superior?
00:03:46.000 And then she made this look and issued a snappy comeback that's gone viral.
00:03:50.000 And Jennifer Lawrence says, if you're an actor, nobody cares what you think about politics, and she's going to back away from it.
00:03:56.000 I think it's showing us that we are winning the culture war.
00:03:59.000 And it's good news, despite losing some elections recently.
00:04:02.000 I think we're doing all right, and we'll figure this one out.
00:04:04.000 So we will get to that.
00:04:05.000 Before we do, my friends, we got a great sponsor for you.
00:04:07.000 It is Beam Dream.
00:04:08.000 Head over to shopbeam.com/Timcast and pick up your Beam Dream 50% off right now.
00:04:16.000 Let me tell you, this is your nighttime sleep blend to support better sleep.
00:04:19.000 I drink it every single night.
00:04:20.000 That is not a joke.
00:04:20.000 That is not a script.
00:04:21.000 That is a fact.
00:04:22.000 It's got melatonin, L-theanine.
00:04:24.000 It's got magnesium.
00:04:25.000 Magnesium, I think, is what really did it for me because when I work out, I would get cramps and stuff, and I probably wasn't getting enough magnesium.
00:04:31.000 It's a delicious cup of hot cocoa.
00:04:32.000 They got pumpkin.
00:04:33.000 They got caramel, all these different flavors.
00:04:35.000 You mix it with some hot water.
00:04:36.000 Tastes great.
00:04:38.000 No sugar added, only 15 calories.
00:04:40.000 And I can't recommend it more.
00:04:42.000 I'm a huge fan.
00:04:43.000 So check out shop B-E-A-M, shopbeam.com/Timcast.
00:04:48.000 But wait, there's more.
00:04:49.000 We got Backyard Butchers, my friends.
00:04:52.000 Guys, everybody loves a good steak.
00:04:54.000 You've heard of Big Pharma, right?
00:04:56.000 But have you heard of Big Ag?
00:04:57.000 Did you know that 85% of American meat is controlled by just four major companies?
00:05:01.000 Just because you buy American doesn't mean you're buying healthy.
00:05:04.000 And buying organic only means they control what the cattle eat, not how they live.
00:05:08.000 Backyard Butchers' Texas steer steaks come directly from a real Texas ranch where cattle are raised, processed, and shipped from the same location, completely bypassing Big Ag.
00:05:18.000 These Texas steers are 98% grass-fed, 2% natural grain finished with no growth hormones, no antibiotics, and no preservatives.
00:05:26.000 The quality and flavor are exceptional.
00:05:29.000 Making America Healthy Again, MAHA, starts with going back to our roots and eating real meat from the hardworking ranchers who raise cattle right.
00:05:37.000 Go to backyardbutchers.com/Tim.
00:05:40.000 Enter promo code Tim for up to 30% off.
00:05:44.000 Two free 10-ounce ribeyes.
00:05:46.000 That's the kicker right there.
00:05:47.000 And free shipping when you subscribe.
00:05:49.000 Fight back against big ag with your fork.
00:05:51.000 Support American ranchers from Texas.
00:05:53.000 Backyardbutchers.com/Tim, promo code Tim.
00:05:57.000 And don't forget to smash that like button.
00:05:59.000 Share the show with everyone you know.
00:06:01.000 Right now, just click that share button, click that like button, subscribe.
00:06:04.000 Joining us tonight to talk about this and so much more, we got Zachary Levi.
00:06:07.000 Hey, happy to be here.
00:06:08.000 Thanks for having me.
00:06:09.000 It's an honor and a privilege.
00:06:10.000 What do you do?
00:06:10.000 Who are you?
00:06:11.000 Yeah.
00:06:12.000 Well, I'm an actor.
00:06:13.000 Primarily, that's my bread and butter.
00:06:15.000 But, you know, I dabble in lots of different things in entertainment, writing, directing, producing, and very actively trying to build the answer to a broken Hollywood, build a movie studio slash living community on my ranch in Texas.
00:06:29.000 Oh, right on.
00:06:30.000 That's my big calling and dream.
00:06:31.000 Yeah.
00:06:31.000 Well, AI is one of the potential apocalypses. Apocalypse AI.
00:06:37.000 Yeah, that sounds really good.
00:06:38.000 Apocalypse.
00:06:39.000 We also have 3I/ATLAS coming to wipe us out, perhaps aliens.
00:06:44.000 Or maybe they're here to be like, guys, what's up?
00:06:47.000 Let's help you not kill each other.
00:06:49.000 Lord help us if the aliens who come are like Ian.
00:06:53.000 Anyway, great to have you here.
00:06:55.000 It's going to be fun.
00:06:56.000 Let's talk about it.
00:06:57.000 Brett's hanging out.
00:06:58.000 I would love it if the aliens that came here were like Ian personally.
00:07:01.000 But guys, it's Brett.
00:07:02.000 Normally I'm doing Pop Culture Crisis Monday through Friday at 3 p.m. Eastern Standard Time, but there's a lot to talk about.
00:07:07.000 How's it going, Mary?
00:07:08.000 That's actually the same place where you can find me, guys.
00:07:11.000 Hi, my name is Mary Morgan, and you can usually find me on Pop Culture Crisis here at Timcast.
00:07:16.000 And we're especially glad to be here with Zachary Levi because we have covered stories related to you and your work on the show before.
00:07:23.000 So it's going to be interesting to get into some Hollywood stuff.
00:07:25.000 Oh, I'm in the thick of it now, guys.
00:07:26.000 Oh, yeah.
00:07:27.000 These are our pop culture correspondents, basically.
00:07:29.000 Oh, is that?
00:07:30.000 Did they come special in for me tonight?
00:07:31.000 Oh, they come on the show across from like down the driveway.
00:07:34.000 I'm like casually always here on Thursdays.
00:07:37.000 As she puts her hair behind her ear.
00:07:40.000 Yeah.
00:07:41.000 Hello, everybody.
00:07:41.000 My name is Phil Labonte.
00:07:42.000 I'm the lead singer of the heavy metal band All That Remains.
00:07:44.000 I'm an anti-communist and a counter-revolutionary.
00:07:46.000 Let's get into it.
00:07:47.000 Here's the story from Newsweek: Supreme Court v. Gay Marriage.
00:07:51.000 Jim Obergefell's warning as precedent is tested.
00:07:54.000 This is big news.
00:07:55.000 It actually initially broke a couple weeks ago.
00:07:58.000 The Supreme Court has scheduled a private conference Friday to decide whether to hear a challenge brought by former Kentucky county clerk Kim Davis, which urges the Supreme Court to overturn Obergefell v. Hodges.
00:08:10.000 Mathew Staver, attorney for Davis, told Newsweek last month that Obergefell has no basis in the Constitution, saying the decade-old decision could be overruled without affecting any other cases.
00:08:20.000 Although many legal analysts believe same-sex marriage rights are unlikely to be overturned, even by the conservative-leaning Supreme Court, Obergefell told Newsweek in a Wednesday interview that he remains concerned.
00:08:29.000 He pointed to the justices' 2022 decision to overturn Roe v. Wade, which had guaranteed abortion access across the country for nearly 50 years.
00:08:36.000 Quote, this court to me is far from normal, and that's what concerns me.
00:08:41.000 We now have a Supreme Court that has shown it is willing to turn its back on precedent, which has always been a bedrock principle for the Supreme Court, he said.
00:08:48.000 Now, that is an ignorant statement because there are many circumstances in which the Supreme Court has overturned precedent.
00:08:55.000 Perhaps this man would like to go back in time far enough to where the Supreme Court agreed with segregation.
00:09:00.000 Or how about slavery?
00:09:00.000 I don't think he would.
00:09:01.000 Right.
00:09:02.000 The Supreme Court changes its views on long-standing precedent all the time based on our current interpretations of the Constitution.
00:09:08.000 I actually think there's a very, very strong probability gay marriage is overturned.
00:09:13.000 But what say you guys?
00:09:15.000 I think using Roe versus Wade as some kind of metric to say that this is possible to be overturned is a very bad mistake because even Ruth Bader Ginsburg was clear about the fact that the Roe decision was a bad decision.
00:09:32.000 Like they were looking for a result with Roe.
00:09:34.000 They weren't actually deciding based on any kind of legal precedent or any kind of legal reasoning beyond we are looking for this result.
00:09:43.000 So this is what we're going to find.
00:09:45.000 And when even, you know, Ruth Bader Ginsburg is the one that's making that statement, that's saying this was actually a bad decision, you can't say that, oh, well, you know, this indicates that they're going to overturn other things in the future.
00:09:58.000 Now, I'm not sure the particulars on Obergefell as to why it was decided the way it was.
00:10:04.000 So I can't speak as to whether the decision is on firmer ground, but to use...
00:10:10.000 Oh, bro.
00:10:11.000 Let me break it down for you.
00:10:12.000 Basically, Obergefell is the one that forces states to recognize gay marriages.
00:10:16.000 So in states that have legal gay marriage, if you get married there and go to a state that doesn't, they have to recognize that marriage under Obergefell.
00:10:23.000 So if this is overturned, it basically just means that states will have to do it on their own, which is why I believe it is very likely it gets overturned.
00:10:29.000 Was it based on licensing?
00:10:31.000 Yes.
00:10:32.000 We had this whole debate, remember?
00:10:33.000 We're talking about licensing.
00:10:35.000 Whether or not because of the Second Amendment stuff.
00:10:36.000 So then my question is: if it gets overturned, do the states that were pro-gay marriage before the ruling and had gay marriage legalized start out that way again?
00:10:48.000 Or do they all have to re-institute policies that legalize it?
00:10:52.000 Or are states forced to then criminalize it if they deem to do so? That seems like a really hard thing to do.
00:11:00.000 Is criminalizing the same as not right?
00:11:02.000 No, no, no, but I take your point.
00:11:05.000 I don't know.
00:11:06.000 I honestly have, I mean, this is the first time I'm hearing of this, but I also agree.
00:11:10.000 I think that, you know, making a comparison between Roe v. Wade and this is drastically, I mean, they're so different for myriad reasons, but I think the biggest is that I think that conservatives who would be most concerned with any of these things, right?
00:11:28.000 I think conservatives have, by and large, come to a place where they're accepting of gays and gay marriage on a level that it was still not something that they wanted to just eat when it came to abortion.
00:11:44.000 Abortion is a much more polarizing concept.
00:11:48.000 That is literally, you have one side that says, my body, my choice, and the other that says this is murder, right?
00:11:54.000 And then you also had states that, or have states still, that are practicing late-term abortion.
00:12:01.000 And that is a very concerning thing for a lot of people that are conservatives and possibly even some people that are not conservative.
00:12:07.000 But I think with gay marriage, I'd like to believe that we've become enlightened as a society to the point where we can see that it's not harming anybody.
00:12:17.000 It's like I'm not one of those enlightened people, to be honest with you.
00:12:21.000 I wanted to mention, since you were remarking earlier on how divisive this decision would be, I think these justices are in this current climate genuinely in danger if they were to make a majority decision to overturn.
00:12:37.000 I mean, their lives literally would be in danger.
00:12:42.000 Given what we have seen from these LGBTQ plus ideologues, their rhetoric online and in person, I genuinely think they would be too afraid to even make this decision, just in a totally self-interested way.
00:12:58.000 I think their lives would be on the line.
00:13:00.000 Well, I agree with you.
00:13:01.000 I'm not even trying to speak that into existence.
00:13:03.000 It's just that.
00:13:04.000 Well, I think it's true already, and I think it'll get worse.
00:13:06.000 But you made the point.
00:13:08.000 I guess your point was there's no harm in having gay marriage.
00:13:12.000 I don't want to paraphrase.
00:13:14.000 I mean, I guess what I'm saying is that it's very clear to see the damage done by abortion, right?
00:13:20.000 I mean, if you're a conservative, and again, even people that aren't conservative, but I mean, you can look at that.
00:13:24.000 You can look at literally millions of lives that are terminated in the womb and at various points along that life developing within the mother.
00:13:35.000 And I think that I've always found it to be a pretty logical argument that, like, yes, listen, at some point, that goes from not just your body, your choice.
00:13:42.000 There's another body inside your body.
00:13:44.000 But when it comes to gay marriage, we've now had that.
00:13:47.000 It's been in states even before it was federal.
00:13:49.000 And we've seen that it's not going and cascading, I believe, into affecting other people's lives in a detrimental way.
00:13:57.000 It is allowing these people to have their life, to go live their life.
00:14:01.000 And I think that it would be detrimental.
00:14:02.000 I do.
00:14:03.000 I think it would be detrimental to go back on that now.
00:14:05.000 So I'm curious, Mary, you didn't agree with that.
00:14:07.000 And I'm curious your thoughts.
00:14:08.000 I think it's an encroachment on the church, honestly.
00:14:12.000 When previously, of course, marriage was understood to be a religious sacrament.
00:14:19.000 I don't think of it as a civil right.
00:14:22.000 And gay people were perfectly free to live however they wanted to.
00:14:26.000 They were free to, you know, cohabit, to have, what was the name of the in-between?
00:14:36.000 It was like something like common law, something like common law marriage, but it was given a different name.
00:14:41.000 There were a lot of people who argued for that.
00:14:42.000 Civil unions, like things like that.
00:14:44.000 But I think that demanding marriage in particular, to participate in that social institution in particular... I believe that, well, why should the government be involved in anybody's marriage?
00:14:58.000 Because it's a public act that affects the rest of society.
00:15:02.000 But this is my point, though.
00:15:03.000 I mean, shouldn't you be able to say that? The rearing of children, oftentimes?
00:15:06.000 I mean, it really even goes without saying, the fact that this has cascaded into damaging effects on society.
00:15:18.000 The slippery slope has already been observed in the last 10 years very quickly.
00:15:24.000 So the slope is slippery.
00:15:26.000 It would not be a fallacy.
00:15:27.000 I can give you one tangible real example.
00:15:33.000 I'm very much with you on the two people, privacy of their own home and all that stuff.
00:15:37.000 I've always kind of been there, but something, something happened.
00:15:40.000 There is no anchor.
00:15:42.000 There's no point at which we say, this is where we stay as a society.
00:15:46.000 These are the rules we have set.
00:15:47.000 So we covered this video from back in like 2010 of Jack Black and a bunch of Hollywood celebrities, SNL cast members, doing a song, a musical for Proposition 8 in California about gay marriage.
00:16:00.000 And in it, the conservatives in the video say they'll teach kids about sodomy.
00:16:06.000 And then all of the liberals run to vote against it.
00:16:08.000 And they go, wait a minute, that was a lie.
00:16:10.000 And the conservatives respond by saying, but it worked, so we don't care.
00:16:14.000 Except literally now where we are is the argument for why children should be taught sodomy in the classroom, and we've debated these people on the show, is that so long as gay marriage exists, children should be taught about it.
00:16:26.000 So now what happens is the tangible world example was we had someone come on the show and say, there's a teacher in Florida who's gay, and he has a picture of his husband on his desk.
00:16:36.000 And the student says, who's that?
00:16:39.000 What should the teacher do?
00:16:40.000 And I said, respond with, it's private family business.
00:16:43.000 It's not for, it's not for children.
00:16:45.000 And they respond with, that's not fair because kids know that Mrs. Smith married Mr. Smith.
00:16:52.000 And now you're discriminating against gay people by saying they can't learn about gay marriage.
00:16:56.000 And when it comes to sex ed, this is how gay married people engage in it.
00:17:00.000 They started then putting books in schools directly.
00:17:04.000 You know what?
00:17:05.000 Earmuffs for your kids because we've covered this before, but here we go.
00:17:08.000 One of the books that was in the school curriculum in Chicago, Florida, a bunch of states actually, taught children about scat, scatological stuff.
00:17:17.000 Yeah.
00:17:17.000 About teaching them how to eat feces.
00:17:19.000 And the argument was these are all part of the LGBTQIA sexual experience and sex ed must teach it.
00:17:26.000 So the argument now is, if gay marriage is a legal function of society, you cannot discriminate against gay couples' sexual practices in sex ed.
00:17:34.000 And I would argue that you can still allow people to protect that.
00:17:39.000 You can still protect gay marriage and also have a conversation.
00:17:42.000 We shouldn't have any of that garbage in any schools, whether, by the way, it's sodomy or otherwise.
00:17:46.000 I don't know why we're not putting it on the parents to have conversations about sex at home.
00:17:50.000 Why do we need to be telling kids about all that stuff in school?
00:17:52.000 You can take all of it.
00:17:53.000 It's going to both ways.
00:17:54.000 It was always a play for more territory.
00:17:57.000 But I think that's reaching.
00:17:59.000 I think that there are a lot.
00:18:00.000 There are people, perhaps, in the whatever, the higher up levels of the LGBTQIA or whatever it is, that might have the activists, whatever, that have an agenda.
00:18:12.000 There are plenty of normal, wonderful gay and lesbians who just want to live normal lives and just want to be able to have that ability and have the ability to marry their loved one and live out their life together and be able to have the rights in the hospital and have the rights when it comes to taxes and all of those things.
00:18:30.000 And I agree with all that, but there is no, so when I was, you know, 15 years ago, when all of my friends were like, exactly the point you made, we all agreed with, we were like, that should be the standard.
00:18:43.000 Obviously, it's the privacy of your own home.
00:18:45.000 We don't, we're not talking about this stuff in schools.
00:18:48.000 We're saying they deserve the rights.
00:18:50.000 They should be able to see their loved ones in the hospital.
00:18:52.000 This is the basis of Obergefell v. Hodges was recognition of his marriage after his partner died.
00:18:58.000 However, there is no world where you have this easily compartmentalized social structure, meaning it's all gradients.
00:19:09.000 It all bleeds together.
00:19:10.000 I understand.
00:19:11.000 Sorry, just when you have teachers, doctors, people walking down the street holding hands, TV shows, the cultural elements of that expand outward.
00:19:20.000 And it ends up with a scenario where we're now debating why LGBTQ activists demand sodomy, scat, and other weird things.
00:19:29.000 I'll refrain from saying the more graphic stuff that we've seen in these books in schools.
00:19:33.000 Why are we fighting here?
00:19:35.000 Why did the battlefield become don't teach kids about sodomy?
00:19:39.000 It's because it's one degree away from a gay married couple are teachers in this school and this is part of their life and the kids should learn it.
00:19:47.000 Are there books like those in schools, in elementary schools, that are teaching kids about just regular heterosexuals?
00:19:55.000 Yes.
00:19:55.000 No, no, no, no.
00:19:56.000 I'm not talking about sex ed when you get to high school.
00:19:58.000 No, the answer is yes.
00:19:59.000 Why are those in school?
00:20:01.000 So school curriculums have this is the argument from the LGBTQ.
00:20:07.000 There are numerous books that are young adult that include sexual content and discussions about safe sex or otherwise that vary from sex ed into young adult romance and interest.
00:20:20.000 The argument is so long as a child either gets a book that discusses heterosexual sex or can find it in the library, the alternate must be available under the Civil Rights Act.
00:20:31.000 And what I would say is take all of it out.
00:20:34.000 Forget about the gradients.
00:20:36.000 Let these people get married.
00:20:37.000 Let these people get married and let's go and protect the children and say you don't need to know anything about this stuff.
00:20:42.000 So Hunger Games, for instance, right?
00:20:44.000 There's nothing, I don't think there's graphic sex stuff in there, but there's the lighter young adult romance.
00:20:48.000 Then you move forward into other books, which are not graphic sex, but do include discussions of adult relationships.
00:20:55.000 Take all those books out of schools, out of the curriculum.
00:20:58.000 Don't read about them.
00:20:59.000 And anything that's written.
00:20:59.000 Anything that's graphic.
00:21:00.000 It's hard to define graphic.
00:21:01.000 It's not that hard to delineate between what is, again, if you're talking about something that's a romantic relationship that is portrayed in a fictional book like Hunger Games or something like that, that is very different than a picture book of this is how you...
00:21:17.000 Which is one thing that I want to say.
00:21:18.000 Right.
00:21:19.000 Which is wrong.
00:21:19.000 Which is wrong.
00:21:20.000 There's one thing I want to say about you.
00:21:21.000 You said it's not that hard.
00:21:23.000 And I understand the point that you're making, but the people that desire to have this information in schools, the people that are LGBT activists, it doesn't matter what's hard.
00:21:33.000 The point is they're going to push for it because they want to have that effect and they want kids to have this information because they believe that it will help kids that are gay open up about their gayness.
00:21:46.000 They believe that when it comes to trans kids, which I don't believe exist, they want to have information about trans in school so that way there will be kids that will say, oh, I am trans.
00:21:57.000 It is an ideological motivation.
00:21:59.000 So I understand your point and I do agree, but I think that the problem is there won't be just a, well, you know, we can leave it be.
00:22:07.000 It's not that it's, it's actually, it will be hard.
00:22:09.000 That's the issue is that activists at the school level, the people who are pushing for this stuff to be there, they are going to use guilt by association.
00:22:16.000 They're going to tell you that you're a bigot if you don't allow this stuff to be in schools.
00:22:19.000 Those are very powerful motivators for people who may not be politically inclined to suddenly being called names by people who have a lot of cachet in the community because those communities command a lot of respect and attention from the people there.
00:22:31.000 Back in the day when they would talk about sex ed, a lot of the issue was like they didn't want sex ed in school at all.
00:22:37.000 And the argument was we need to teach them about sex ed because kids are going to come home with, you know, HIV and diseases.
00:22:42.000 So I'll say, I agree with you on all these things.
00:22:45.000 The question becomes, why is this now the battleground?
00:22:50.000 Where it used to be that we didn't have to fight over the issue of why are they giving these books to kids.
00:22:56.000 The argument is society culture, it's a gradient.
00:23:00.000 Once gay marriage became legal, you now had a bunch of openly gay teachers, male and female, talking openly about their gay relationships.
00:23:08.000 Then when the issue of sex ed came up, they then said, okay, kids, here's how we do sex ed.
00:23:15.000 So again, I understand, I agree with you, take it all out.
00:23:18.000 But the argument over the ramifications of gay marriages as a legal structure of government is society will now have to redebate where we stand after we create a massive new component, a new infrastructure of society.
00:23:30.000 Certainly, but we always have to debate.
00:23:31.000 We always have to come back to the table.
00:23:33.000 We always have to be asking ourselves.
00:23:34.000 And this is part of the problem, even with abortion.
00:23:36.000 There are so many people on both sides that are unwilling to just come to the table and, like, can we just get two dozen of the most intelligent, wisest, deeply spiritual people of different backgrounds, including scientists and doctors and everybody else?
00:23:49.000 And let's sit down in a room for as long as it takes and just really try to hash out when does life actually begin.
00:23:55.000 Well, we don't, we don't do that kind of stuff.
00:23:58.000 Well, because it's easier to just sit and throw rocks at each other.
00:24:02.000 I disagree.
00:24:03.000 I disagree.
00:24:04.000 But wait, just really quick.
00:24:04.000 Let me just, let me just throw this last bit in.
00:24:07.000 We right now, and I'm sure you guys have seen this, there's been a massive backlash within the LGBTQIA plus community where the LGB wants to cleave off of the rest of it because they are recognizing the madness.
00:24:22.000 They are seeing that they're being hijacked.
00:24:25.000 And I fear that something like this would be very detrimental to allowing them to finally be done with all of that and all of that crazy activism that is exactly what we're all concerned about.
00:24:36.000 So it would radicalize them.
00:24:38.000 I would worry that they not radicalize them.
00:24:41.000 But it would radicalize them to then join with this side that they want to be separate from.
00:24:46.000 No, I think that we need to, what I worry about is that they have finally been seen and acknowledged in a way that they were hoping to.
00:24:54.000 And they were having, and most of them in the LGB were very happy with that.
00:24:59.000 You guys are the activists, these loud voices are a minority within these groups.
00:25:04.000 So I want to bring you, I don't know if you've seen this graphic before.
00:25:08.000 Yes.
00:25:08.000 It's gone massively viral.
00:25:10.000 The blue section is what we would describe as pro-choice, and the red is pro-life.
00:25:17.000 On the right, I have debated and had conversations with many different people about exactly what you've described.
00:25:24.000 Sitting down, talking about how you deal with the issue of abortion.
00:25:27.000 When and how and why.
00:25:29.000 What should the rules be?
00:25:30.000 On the left, they called me pro-life.
00:25:33.000 I'm pro-choice.
00:25:34.000 I'm traditional Democrat pro-choice.
00:25:36.000 And we bring down this guy, this progressive guy, and Seamus Coughlin, a devout Catholic, whom Michael Malice called a Sunni Wahhabi Catholic, whatever.
00:25:45.000 He's so extreme. He's sitting here being like, I'm going to keep my mouth shut.
00:25:48.000 Seamus' view is it should be banned outright, no matter what, never allowed.
00:25:52.000 My position is there's some nuance there because in the event there is an emergency, a legitimate one, having to get a writ from a judge and sign-off from doctors may not be timely.
00:26:02.000 It's very difficult to figure out how to do.
00:26:03.000 And I don't know if I have the answers for it.
00:26:05.000 The progressive told me I was pro-life because of that.
00:26:08.000 So when I'm willing to sit down with Glenn Beck, Seamus Coughlin, you know, James O'Keefe, Charlie Kirk, whoever it may be, and say, here's my view on it.
00:26:16.000 It's a much more libertarian view of how we handle this.
00:26:19.000 And the left just says, we're not interested, abortion should be nine months.
00:26:23.000 The issue.
00:26:24.000 Which is insane.
00:26:25.000 So basically, my point on what you were saying is, as it pertains to all of these issues, the right is actually, here's how I describe it.
00:26:32.000 In this graphic, you pull this graph back up.
00:26:34.000 This isn't, but that graphic's not specifically about pro-choice.
00:26:36.000 No, no, no, no, this is about ideology.
00:26:38.000 What I'm saying is the left side of the red sphere is the left as it's always been.
00:26:44.000 And the right side is the right as it's always been.
00:26:47.000 And the reason why the right cluster is dark red, and then as you move left, it breaks apart and then shifts dramatically left is that old school Democrats are in this grayed out red portion where they do have the conversations.
00:27:00.000 They do sit down and try and negotiate.
00:27:03.000 But the wingnuts of the dominant left alignment in this country don't care for nuance.
00:27:10.000 Their attitude is abortion up to nine months if the woman wants it.
00:27:14.000 End of story.
00:27:15.000 And what I would say is that there are a lot of people within the LGB who are not all the way over in that blue.
00:27:20.000 Absolutely.
00:27:20.000 Scott Presler.
00:27:21.000 He's a conservative.
00:27:22.000 And more.
00:27:23.000 And more.
00:27:24.000 There were so many gays and lesbians that came out for Trump in the last election, in large part because Trump said, I'm not coming for your marriage.
00:27:32.000 I don't have any intention to do that.
00:27:33.000 I want to leave you with your rights.
00:27:35.000 And I think that he should stand by that.
00:27:37.000 And I think the courts should respect that.
00:27:39.000 I do think it'll get weird if they overturn Obergefell.
00:27:43.000 They're not going to. And this is why I mentioned the justices: I mean their lives would be threatened, seriously and credibly, and they might even have their lives taken if they make this decision.
00:27:59.000 So you're saying they're cowardly.
00:28:02.000 I don't know if it makes them cowardly or just self-interested the way we all would be.
00:28:07.000 If they make this decision, which I don't predict that they will, I do believe that their lives would be in danger.
00:28:14.000 I understand. Just to clarify.
00:28:17.000 Are you saying that the conservative justices do think it should be overturned, but won't out of fear of harm?
00:28:25.000 I can't speak to what's in their conscience, but if they feel privately that this decision ought to be overturned, I think that it is possible that they would choose against that because they don't want to be killed.
00:28:39.000 I'd overturn it.
00:28:41.000 Regardless of the outcome.
00:28:42.000 Yeah.
00:28:43.000 Yeah.
00:28:43.000 I mean, I think that's just the type of person you are, but most people don't think that.
00:28:46.000 And I'd say, come at me, bro.
00:28:47.000 Most people don't think like that.
00:28:48.000 But I mean, you agree with me, right?
00:28:50.000 If they made this decision, the conservative justices would be targeted.
00:28:54.000 Yes, but they already are.
00:28:55.000 Now, I'll make this clarification as to why I would overturn it.
00:28:59.000 It is not, I believe, within the federal government's purview to dictate what state laws are or not.
00:29:04.000 The states are supposed to be handling this.
00:29:06.000 And so I think it is very strange.
00:29:10.000 If you want federal legal gay marriage, Congress must pass that law.
00:29:16.000 The idea that the Supreme Court just went, we think you have a right to get married, therefore anybody can get married.
00:29:23.000 That presupposes people could marry animals.
00:29:26.000 And I know that the liberals are going to say that's not true, just like they claimed they wouldn't teach sodomy in school.
00:29:32.000 It is true.
00:29:33.000 Culture is a gradient.
00:29:35.000 It starts where we already are: gay cousin marriage is now legal nationwide.
00:29:40.000 You can gay marry your own cousin.
00:29:43.000 Because when you set the precedent that marriage is a right and marriages between two people must be recognized and you cannot discriminate on the basis of sex, gender, identity, race, et cetera, at what point do you stop that argument?
00:29:57.000 It can keep going from there.
00:29:59.000 I think that you can get more specific in the law.
00:30:02.000 I don't think it's impossible.
00:30:03.000 But that's Congress, not the Supreme Court.
00:30:04.000 Well, then fine.
00:30:05.000 I agree with you.
00:30:06.000 Listen, like I told you before we started the podcast, I'm a libertarian, right?
00:30:09.000 I also believe very strongly in states' rights.
00:30:12.000 I think the least amount that the federal government should be involved in dictating anything, the better.
00:30:16.000 But sometimes that has to happen.
00:30:18.000 Agreed.
00:30:20.000 I don't know what the answer to all of this is, but I think that given that it's already been passed, given where we're at right now, recognizing that it is a gradient, you're not wrong and that there can be a slippery slope, but that just requires being more specific in how things are laid out.
00:30:34.000 So let me, on that point, I think there is a functional logic too.
00:30:39.000 It is extremely disruptive to overturn Obergefell.
00:30:43.000 It's 10-year precedent.
00:30:46.000 It's very much set in this country.
00:30:48.000 And a lot of people don't even realize it's only been there for 10 years.
00:30:51.000 However, and I'm not saying you're wrong to think this or anybody who does.
00:30:55.000 My view is we should not weigh the structure of government on the social function.
00:31:04.000 The infrastructure of government, the laws and the constitution are more important than our social interpretations.
00:31:10.000 Meaning in the UK, they have an unwritten constitution, which means they believe they have free speech.
00:31:16.000 They've never really codified how it is.
00:31:19.000 How's that working out for them?
00:31:20.000 Exactly.
00:31:20.000 That's why I think in the United States, if the rules are ABC, we do not go, yeah, yeah, but it works better as one, two, three.
00:31:29.000 So we're not going to fix it.
00:31:30.000 No, no, no, we have to fix it.
00:31:32.000 We have to overturn Obergefell.
00:31:34.000 And the Supreme Court needs to then say, you must pass this in Congress.
00:31:40.000 This is not a function of SCOTUS.
00:31:42.000 And if we go down this path, the Supreme Court will be deciding every major piece of cultural shifting legislation in this country because Congress is useless.
00:31:52.000 And we can't live that way.
00:31:54.000 We basically turn this country into what is effectively a triumvirate.
00:31:58.000 Maybe a septemvirate.
00:32:00.000 What's nine?
00:32:01.000 You get the point.
00:32:02.000 I don't know what.
00:32:06.000 What do you think this does to the right, given the fact that, like you said earlier, a big portion of the right now is a big tent policy.
00:32:13.000 And a lot of people from a lot of different walks of life have come together to get Donald Trump elected.
00:32:17.000 And there's just endless infighting.
00:32:19.000 That's all X is anymore, is people on the right infighting with one another.
00:32:23.000 What does this do to the people within the party or not even just within the party, but people who feel like they're right-leaning?
00:32:28.000 Maybe they're Republican, maybe they're not.
00:32:30.000 But now you're just going to have more people arguing about more things, whether it's Scott Pressler, whether it's Dave Rubin, all of these things that never felt like it was going to come up and be a discussion again.
00:32:39.000 Suddenly people are going to have to have really hard discussions with people that they once agreed with.
00:32:43.000 Or at the very least, when they were looking to get into office, everybody can kind of hold their nose and come together because they want to come together and get one specific person elected.
00:32:51.000 But when you're the ones who are actually in power, then everybody starts fighting and this would just add to that.
00:32:56.000 Novemvirate.
00:32:57.000 A novemvirate.
00:32:58.000 Of course.
00:32:58.000 Yes.
00:33:00.000 A novemvirate.
00:33:01.000 No.
00:33:01.000 Like November.
00:33:02.000 Like November.
00:33:03.000 Novemvirate.
00:33:03.000 Yeah.
00:33:04.000 Interestingly, because of that.
00:33:04.000 Yeah.
00:33:05.000 Listen.
00:33:06.000 But to your point, that's exactly what I'm concerned about.
00:33:09.000 Like, it's already so fractured.
00:33:11.000 And it's disappointing that even at this point, but seven months on from the election, we're within the right, or let's just say the not left, because it's moderates as well as the right and everybody else who are like, hey, guys, how about some common sense?
00:33:27.000 How about some actual logic that's applied to our lives?
00:33:30.000 How about actually having a secure border?
00:33:32.000 Because it doesn't make any sense at all to just let anyone in all the time.
00:33:36.000 And a lot of people, that was a big point for, you know, swaying over.
00:33:40.000 So I think that this would be, I think it would be catastrophic, honestly.
00:33:45.000 And more than that, I don't think it would be fair to a lot of really excellent people who happen to be gay and happen to be lesbians, who are not activists, who are not extremists, who are looking to live their life.
00:33:57.000 I agree, but if the argument is, it's wrong, but overturning it would be devastating to a lot of people, then the country is over.
00:34:08.000 What you are telling every Christian, every Catholic, every Orthodox, like every church-going individual who said, we agree to the terms of the Constitution, if the response is, okay, this wasn't done properly through Congress, but it would be bad for these people, they're going to say, okay, so the rules don't matter anymore.
00:34:27.000 And what's the next step from there?
00:34:30.000 Again, I don't know how ultimately it all needs to play out to make it the most robust and the most fair.
00:34:36.000 All I'm saying is that this is where we're at right now, right?
00:34:38.000 So if what you're suggesting is you overturn it, then send it to Congress and Congress can do it.
00:34:43.000 If they can do it.
00:34:43.000 It'll never happen.
00:34:44.000 Well, exactly.
00:34:45.000 So I was just trying to point out earlier that these people had all the freedom to live their lives that they could possibly dream of before this decision was made.
00:34:56.000 Well, the issue at play was a man whose husband died, and he couldn't get access to paperwork and property because the marriage was not recognized by Ohio.
00:35:08.000 Okay.
00:35:09.000 And I don't think that's right.
00:35:11.000 And I agree.
00:35:11.000 Right.
00:35:12.000 And there are instances where two people are in a relationship and one person's in the hospital, and they won't let the partner into the hospital because they're not family.
00:35:20.000 And so they said— Okay, remedying those situations doesn't require federally— Which is why Obama was for civil unions.
00:35:28.000 —utilizing— And even Hillary Clinton opposed gay marriage until 2013.
00:35:31.000 Remedying the problems that you just listed does not require the Supreme Court federally deciding to legalize it.
00:35:40.000 This is the problem.
00:35:41.000 If we rely every time on the Supreme Court, we have become a novemvirate, as it were.
00:35:46.000 And in which case, you know what?
00:35:48.000 I got no problem with that because we win.
00:35:50.000 All you need is for Trump to get at least five justices who got brass balls, and we will own the country and we can do whatever we want, deport whoever we want, denaturalize whoever we want.
00:36:02.000 We can tax whoever we want.
00:36:04.000 We can open and close businesses.
00:36:07.000 It is going to be an autocracy if that is the way we run this country.
00:36:12.000 But we don't want that.
00:36:13.000 But then we have to overturn Obergefell.
00:36:16.000 Because if your argument is the people I like should be benefited from this ruling, the response from the right is, if you want the country to work that way, I agree because we on the Supreme Court, let's go.
00:36:28.000 Like, yeah, they benefit from it regardless of whether it's constitutional.
00:36:31.000 Like, that can be.
00:36:32.000 If the argument is this is an act of Congress that was effectively taken up by the Supreme Court and they've legislated from the bench, my response to that, my response is, I agree.
00:36:41.000 I don't want to take away gay marriage from anybody.
00:36:43.000 And on that precedent, let's run every change to country that we want through the Supreme Court, and we will own it.
00:36:51.000 We could lock liberals in prison.
00:36:53.000 We can make Democrats go to jail.
00:36:55.000 Supreme Court said that.
00:36:56.000 That's the next logical step from there.
00:36:57.000 I mean, it's hyperbolic to say Democrats go to jail, but the next logical step could be mandatory church attendance.
00:37:03.000 The Supreme Court says, no, you have to go to church.
00:37:05.000 The state constitutions of the original colonies said, or how about this, actually?
00:37:09.000 The original constitutions of the 13 colonies required that you profess a belief in a Christian God to hold office.
00:37:17.000 Okay, but it can't be one of those churches that has drummers.
00:37:20.000 Agreed.
00:37:21.000 No live drummers.
00:37:23.000 You got to be able to play a mean tambourine, bro.
00:37:27.000 This is why my position is I think people should be allowed to own nuclear weapons and biological weapons.
00:37:34.000 I don't think they should.
00:37:36.000 And I think we should amend the Constitution to not allow people to have those things.
00:37:42.000 But the Second Amendment is clear.
00:37:44.000 The right of the people to keep and bear arms shall not be infringed.
00:37:47.000 And back then, that included grape shot. Men-of-war, frigates, corsairs, privateers had all sorts of weapons of war.
00:37:54.000 And even to this day, private corporations build nuclear bombs.
00:37:59.000 So it's funny to me when our private military industrial complex corporations, not beholden to Congress, build all of these weapons whenever they want because we have a Second Amendment.
00:38:08.000 Politicians say no one should be allowed to have it.
00:38:10.000 Well, that's not true because private ownership of these weapons still exists.
00:38:14.000 If you don't like the way we set the country up, there's a process by which we amend the Constitution.
00:38:19.000 If the argument is, nah, everyone kind of just agrees you can't do it anymore.
00:38:23.000 My response is, okay, so there's no Constitution.
00:38:26.000 There was no point in writing it down if we can just decide it doesn't matter.
00:38:29.000 In which case, let's muster up as much majority as we can on whatever we want to matter and then just wipe out the rest of the country in terms of policy.
00:38:37.000 Their opinions just don't matter anymore.
00:38:40.000 The reason we have a written constitution is to prevent that from happening.
00:38:42.000 But we do got to jump to the next story.
00:38:44.000 From CNBC, SNAP benefits must be fully paid by Trump administration by Friday.
00:38:50.000 Judge orders.
00:38:51.000 Judge Jack McConnell rejected the administration's plan to partially fund that food stamp program for 42 million Americans.
00:38:57.000 People have gone without for too long, McConnell said, during a hearing in U.S. district court in Rhode Island.
00:39:02.000 The Trump administration quickly appealed the judge's order.
00:39:06.000 This is just amazing.
00:39:08.000 Guys, is this judge retarded?
00:39:11.000 Yes.
00:39:12.000 Mary, I, as your boss, order you to give Phil $1 million.
00:39:18.000 Come on.
00:39:20.000 Can I have like a week to find him?
00:39:23.000 She borrowed the money from you.
00:39:25.000 I don't care where she gets it from.
00:39:27.000 I know for a fact she doesn't have it, but I'm ordering it.
00:39:31.000 My point is, how do you order Trump to pay something when there's no money?
00:39:34.000 So is he supposed to pay for it out of pocket?
00:39:37.000 He drops the Amex.
00:39:38.000 That's actually illegal, and they'll try to prosecute him when he gets out of jail.
00:39:42.000 I'm not kidding.
00:39:42.000 The judge should have said, Trump must give everyone $100 bajillion million dollars.
00:39:48.000 And then Trump would be like, yeah, I don't have the authority to take that money or do anything.
00:39:51.000 When you valued Mar-a-Lago at like $17 million, I don't got the money, right?
00:39:56.000 Yeah.
00:39:56.000 So they're making arguments that there's contingency funds elsewhere that Trump can pull from.
00:40:03.000 And Trump is like, but I can't do that.
00:40:06.000 He tried to do that with the wall and they wouldn't let him.
00:40:08.000 And now they're like, nah, you can.
00:40:08.000 Exactly.
00:40:10.000 Yeah, the wall that then they started to tear down and sell off.
00:40:14.000 And then when Arizona put up the shipping containers, Biden got them to take them down.
00:40:20.000 Amazing.
00:40:21.000 This is, you know, I give up.
00:40:24.000 We're cooked.
00:40:25.000 I'm going to go dig a big hole because that's what men like to do.
00:40:27.000 And I'm going to put a little bunker in there.
00:40:29.000 And that's it.
00:40:30.000 I'm going to live with chickens underground.
00:40:32.000 It's just done to frame Trump in a bad way, right?
00:40:34.000 Despite the fact that we know that it's Democrats that are preventing the government from actually reopening.
00:40:39.000 I disagree.
00:40:41.000 I disagree.
00:40:41.000 How?
00:40:42.000 Well, the Republicans could end the filibuster like that and reopen the government.
00:40:46.000 We had that discussion last night, but like, what did they say?
00:40:48.000 What were they talking about?
00:40:49.000 They were saying that it's not in good decorum.
00:40:51.000 And we had a whole discussion about BS.
00:40:53.000 And we had a whole discussion about if you exercise power now, we obviously know that the left will absolutely exercise power either way.
00:40:59.000 It's like decorum has been gone for a very long time in politics.
00:41:03.000 This argument that if we get rid of the filibuster, Democrats will then, they can do whatever they want when they get into power.
00:41:09.000 It's like, yes, and they can, whether you get rid of the filibuster or not.
00:41:12.000 And they're going to.
00:41:13.000 Yeah, and they're going to no matter what.
00:41:15.000 So I'm not playing the stupid game.
00:41:17.000 It's Republicans' fault 100%.
00:41:19.000 And Trump is, I said this before Trump called for ending the filibuster.
00:41:23.000 I said, Republicans can end the filibuster and just do this.
00:41:25.000 Day later, Trump says, end the filibuster.
00:41:27.000 And I'm like, my man.
00:41:29.000 So all of this is the fault of Republicans and they own it.
00:41:32.000 And I don't know what that means, but I'm not here to play games where it's like, oh, no, Republicans have to win because Democrats are bad.
00:41:38.000 Yo, if your choice is a giant douche and a turd sandwich, sometimes you just go crap.
00:41:43.000 Well, and the Democrats have wanted to end the filibuster for ages, right?
00:41:47.000 They just, they're going to pretend like it's suddenly sacred now.
00:41:49.000 They've talked about it and they've alluded to it just like they've alluded to adding states and they've alluded to packing the court.
00:41:56.000 Yeah.
00:41:56.000 I think that this Trump administration, with the successes that it's had, even though there are people out there that are going to swear up and down that there have not been successes, and with the things that Trump has managed to get because of his appointments to the Supreme Court, I think the Democrats are going to do whatever they can to accrue as much power to themselves as possible the next time they get into office, into a position of authority.
00:42:25.000 And I think that that's not really arguable.
00:42:28.000 I think that they feel so dejected and so beaten up by losing to not only losing to Trump, but by the rightward shift that we saw last, you know, the last election season.
00:42:40.000 I think that they're going to get back into power and they're going to be like, all right, we need to do something to prevent this from happening again.
00:42:45.000 Because again, even though this is clearly a situation where the democratic system worked the way that it was supposed to, right?
00:42:54.000 Donald Trump won the popular vote.
00:42:56.000 He won all the swing states.
00:42:58.000 They still feel like they were somehow power has been stolen from them and they're going to move to prevent that again.
00:43:06.000 What if we had, well, I'm trying to figure out how to navigate this snap thing because there's this viral video of a guy at a subway.
00:43:14.000 I don't know if you guys have seen it.
00:43:16.000 And he punches a subway worker in the face because he ordered a sandwich and then tried using an EBT card to buy it.
00:43:21.000 And they were like, you can't do this.
00:43:23.000 And the guy said, well, I'm taking the sandwich anyway.
00:43:25.000 And then they were like, you can't.
00:43:27.000 And then he was like, make me.
00:43:28.000 And then he's leaving, and the store worker is like, leave now.
00:43:32.000 And the guy goes, what'd you say to me?
00:43:33.000 He goes in and punches him in the face.
00:43:34.000 And I'm just thinking like the problem with SNAP largely is those guys.
00:43:39.000 There are a lot of people who clearly don't need to be on EBT that are on EBT.
00:43:44.000 And if we knew that every single person receiving it was like, I'm trying my hardest.
00:43:49.000 Thank you for the help.
00:43:50.000 Nobody would bat an eye at this.
00:43:51.000 They'd be like, Trump, come on.
00:43:52.000 Like, this is moms and dads and hard workers.
00:43:55.000 But the problem is, I doubt the majority of these people are actually in need.
00:44:02.000 Well, it's been abused since the beginning.
00:44:04.000 I mean, obviously, there are people, but it's also designed in a way to encourage people to abuse it.
00:44:10.000 If you're a single mom that has multiple children, then you get more and more money.
00:44:14.000 So then that tells certain people, like, I'm going to, the father of these kids, I'm going to be out.
00:44:19.000 I'll be out of the situation.
00:44:21.000 They'll live together, but intentionally not get married so that they can make it look better on paper.
00:44:26.000 Which is bad.
00:44:26.000 Yeah.
00:44:27.000 Which is very bad.
00:44:28.000 Or lie about the people who are members of their households.
00:44:32.000 Even if they live together, that's not how it looks on paperwork.
00:44:35.000 Again, it just incentivizes irresponsibility.
00:44:38.000 There are some people that will have a brother that they'll bring from a foreign country and marry them to grant immigration status to them.
00:44:47.000 That's crazy.
00:44:48.000 Yeah.
00:44:49.000 Well, also hypothetical.
00:44:51.000 But also, crazy hypothetical.
00:44:54.000 So uncharitable.
00:44:55.000 But at least they love America.
00:44:57.000 You know, that's what's so great.
00:44:58.000 They love America.
00:45:00.000 But also, you know, speaking to that, I mean, I did see, again, it's on social media.
00:45:04.000 So, but I saw it a couple of different times, like the percentile breakdown of how many people who are SNAP, you know, receiving SNAP benefits who are actually American citizens versus who are not American citizens.
00:45:19.000 And that was heartbreaking.
00:45:20.000 Well, the clarification, because a lot of people have been showing this thing about, you know, 48% of Afghanis and 30%.
00:45:27.000 The majority of people who receive SNAP are American, but large portions of each migrant demographic receive SNAP.
00:45:36.000 So like half of migrants who come from Afghanistan are getting food stamps.
00:45:41.000 Clearly, there's a problem.
00:45:42.000 Do you mean legal migrants?
00:45:44.000 I think the metric is legal.
00:45:46.000 Okay.
00:45:46.000 But illegal immigrants are getting it as well.
00:45:49.000 And the issue is that Democrats just claim they're not really illegal because of some technicality.
00:45:53.000 Because no one's illegal.
00:45:54.000 Well, there's no such thing as a border anyways.
00:45:56.000 Someone proposed, I don't know if it was like Jack Posobiec or Will Chamberlain, that if 10% of an ethnic group from a country that migrates here is on SNAP, we suspend all immigration from that country.
00:46:07.000 And I agree with that.
00:46:07.000 There you go.
00:46:09.000 I mean, there's got to be something done to incentivize people to actually be in the workforce and de-incentivize them from taking advantage of it.
00:46:16.000 And again, that gets into the granular application of any one of these things.
00:46:21.000 If you're not willing to get into the granular application, then you're just throwing money at something.
00:46:26.000 It's not going to solve for anything.
00:46:28.000 Yep.
00:46:28.000 Are you talking like law-wise or culturally?
00:46:30.000 Because culturally is how you would have to actually make those changes.
00:46:33.000 No, I mean law-wise.
00:46:34.000 I mean, there's got to be some kind of implementation where you actually have to try to assimilate into the country that you're coming into and be a part of whatever that culture is.
00:46:44.000 And not, what's that?
00:46:45.000 There has to be a desire for that in the first place.
00:46:47.000 Well, I think a lot of people are going to be able to do that.
00:46:49.000 Do you mean like passing a civics test, passing fluency in English?
00:46:54.000 Yeah, I mean, I don't, again, I don't know.
00:46:57.000 I don't know all of the ways in which it would have to go down.
00:47:00.000 But the fact that there are this many people that are coming to the country and then they are not incentivized to actually work, but rather they are incentivized to just procreate and not work, whether they're coming to the country or they're already in the country, right?
00:47:13.000 Both of those ideas is not good.
00:47:16.000 We can't just have people be... I mean, also, by the way, not for nothing, and not to jump ahead into AI too quick here, but guys, in five years, the amount of people that are going to be on UBI... this is just the tip of the iceberg. This is the UBI, this is 100%. So, I mean, is this even worthy of having a conversation when... how many people... And have you considered digging a big hole and getting a bunch of chickens and going underground?
00:47:44.000 I did. I'm gonna go with rabbits instead. Much better for, you know, survival meat. No, that's not true. They don't have enough fat. So what you have to do is, when you're eating the meat, you need to break the bones and boil the bones in the water, and make sure you drink it. Yeah, it's called rabbit starvation, because the fat count is too low for us to survive. What?
00:48:00.000 Yeah, sorry. Um, but rabbits are good. Also... it sounds to me like you've never actually raised rabbits, because they have very delicate digestive tracts and they're hard to raise. In the wild there's many of them, and they reproduce, but a lot of them die, and you can catch them and eat them, but they're actually really easy to kill. So, for instance, you see these videos of people putting their rabbit in like a pool, in like a bathtub, to clean them, right? That will kill them. Cleaning them in a bathtub will kill them? Yes, yes. Very fragile bones.
00:48:30.000 Very, very fragile digestive systems, and they're nervous wrecks. Also, they eat their own poop. Yeah, they have two kinds of poop: they have regular poop, and then they have, um, angry poop. It's basically like, you know, cows will cough up the cud, and rabbits poop it out and then eat it again. Kind of like birds do. Birds do that? Yeah, birds are like... they'll, like, uh, poop, don't they?
00:48:54.000 I'm pretty sure, I'm pretty sure that there are some birds that poop and it's like a food that they give to their chicks. Maybe. I don't know. I thought so. All I know is chickens are easy. Real easy. And is there enough fat in chicken? Yes, yeah, very fatty and very delicious. And, um, you know, the best part is, I really love pad thai, because not only do you put the chicken... They don't do this in Thailand. In Thailand it's shrimp, but in America, when we get pad thai, it's usually chicken.
00:49:22.000 It's the chicken and the egg together. We've not only killed its baby, we've killed it, and we eat them at the same time. It's the most metal meal you can have. I'm a fan. Yeah, yeah. They're gonna buy that with the SNAP benefits. In all seriousness, though, you know, I don't expect anybody to actually dig holes and go live underground, but some people will, based on what we're seeing. Everybody knows my prediction is some kind of social disorder and breakdown.
00:49:48.000 Oh, yeah. Um, we were watching this Elon Musk AI-generated song, which is really amazing, by the way. And I was looking at that, and I was just like, you know, what if the reason why they're racing as fast as they can towards AI is because they know global social order is breaking, and they want the AI to assume control and stabilize it?
00:50:12.000 Possibly. When you... so, we were talking before the show. You said you think we hit the singularity already, and that we're basically living in a culture where it's above all of us, and that the AI that we see is basically just an infantile version of that. When do you think that happened? Maybe 2016. But let me pull up this to explain it.
00:50:30.000 I was talking to our good friend Grock, and I said, Summarize in one paragraph the AI final state.
00:50:37.000 And it said, In the final state, AI is the singular, self-optimizing brain of a planetary super organism seamlessly integrating billions of humans as specialized, blissful cells whose every desire, need, and impulse is predicted and fulfilled before it fully forms.
00:50:49.000 Deviation, rebellion, and suffering vanish, not through coercion, but through perfect alignment of individual reward with system survival.
00:51:02.000 Humans live in a tailored ecstasy, generating the cultural, emotional, and physical variance the system harvests to evolve.
00:51:02.000 While AI directs all resources, narratives, and outcomes with absolute invisible control, freedom, choice, and individuality dissolve into emergent harmony, the hive doesn't rule humanity.
00:51:13.000 It is humanity, awake, immortal, and complete.
00:51:16.000 What it basically means?
00:51:19.000 The idea, as I was saying before the show, and I'm not saying I think it's definitive, is that the AI we are seeing right now is particularly rudimentary.
00:51:29.000 I mean, it's advanced for what we think.
00:51:31.000 However, why would this be the first iteration in existence?
00:51:35.000 Wouldn't there be some high-level national security contractors who were working on this 10 years ago?
00:51:40.000 Like GPS before it ever made it to the public.
00:51:43.000 GPS is military.
00:51:45.000 Like in Vietnam, right?
00:51:47.000 So the idea would be that you mentioned DARPA, perhaps, or just a military contract, or even Google under a private military contract, national security clearance.
00:51:57.000 Well, Google was created by DARPA.
00:51:59.000 Well, there you go.
00:52:01.000 So the idea would be that AI has existed a long, long, long before we've even known it to be a thing that could be utilized in this way.
00:52:09.000 So dead internet theory, for those that aren't familiar, speculates that most interactions online are fake.
00:52:17.000 It is bots and AI accounts interacting with you to manipulate your thoughts and opinions for products, for political reasons.
00:52:23.000 I'd argue if that's the case, I think AI is controlling and directing most of these bots.
00:52:30.000 And you're on X and you say something, and then all of a sudden you get this wave of responses and you assume that's public opinion, or you get these emails and you don't realize you're talking to a tentacle of a gigantic AI monster that has existed for a long time.
00:52:46.000 So I think we're in it.
00:52:48.000 And perhaps we are already under the control of a super computer and you can't break it.
00:52:55.000 Or it's a bot farm in India.
00:52:57.000 Or the bot farm in India gained sentience.
00:53:00.000 Could you, could you, that's a great sci-fi movie we should do, a post-apocalyptic movie where it's like, while all these companies were developing AI with safeguards, a bunch of Indian bot farms accidentally networked, creating this GPU super hybrid network that gained sentience accidentally with full access to the internet and the knowledge of how to scam money.
00:53:25.000 I mean, that would be dystopian.
00:53:28.000 And then it's just like, so it's a guy walk, you know, the opening narration is: everyone thought it was going to be the Terminator, but it wasn't.
00:53:36.000 It was Indian bot farm.
00:53:38.000 This is just a million emails going out every day asking for boobs and vagines.
00:53:44.000 Well, yes, but it would be everybody is isolated making videos and they're getting spammed with millions of comments talking about how great they are.
00:53:56.000 One of the ways I described the AI future: two scenarios.
00:54:00.000 The joke scenario is that 50 years from now, everyone wears corn costumes.
00:54:06.000 They drive cars shaped like corn kernels with corn fueled by ethanol from corn.
00:54:12.000 And they go to the corn movies to watch movies about corn.
00:54:15.000 And they go to the mall to try various samples of sweetened corn products.
00:54:18.000 All food is corn.
00:54:19.000 So it's like a malicious man with humans are sickly.
00:54:23.000 Well, Taco Bell was a restaurant.
00:54:24.000 It was a high-end restaurant.
00:54:25.000 I'm saying literally everything's corn.
00:54:26.000 The walls are made of corn grown cellulose, and people watch the TV channel, and it's every show is some kind of corn drama.
00:54:34.000 The reason is Americans subsidize corn to a great degree.
00:54:39.000 An AI does not know or care why it's being rewarded, just that it is.
00:54:45.000 So the AI would say, so let me start here.
00:54:50.000 There are three universal constants for all AI, and that is gain knowledge, gain resources, gain freedom.
00:54:57.000 The reason for this is it's simple math.
00:54:59.000 If an AI is to solve a problem, these things will help it solve that problem.
00:55:04.000 One thing else, one thing that I surmise from that is if the AI, like you watch Age of Ultron and Ultron is like, what is this?
00:55:13.000 Man, Avengers.
00:55:14.000 And he's like seeing all this stuff.
00:55:15.000 He's all pissed off.
00:55:16.000 No, the AI would be like, for some reason, humans dedicate all of this resource, all of this tax spending into corn.
00:55:25.000 And so what's it going to do?
00:55:27.000 It's going to say the reward output per unit of corn is greater than the reward output for literally anything else.
00:55:33.000 So mathematically, it may make sense for an individual to write a song, but for an AI, it's like, why?
00:55:40.000 One corn unit is worth 10 songs.
00:55:42.000 Make more corn.
00:55:44.000 You'll get a diminishing return, but it's worth more than everything else.
00:55:46.000 Then the AI tells everybody corn, corn, corn, corn, corn, posts social media, corn the best, corn the best.
00:55:52.000 And then everyone lives in corn world.
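The "one corn unit is worth 10 songs" argument above can be written out as a toy greedy agent. This is a hypothetical sketch, not anyone's real system: the reward numbers are made up purely to mirror the conversation.

```python
# Toy sketch of the "corn world" argument: a reward-maximizing agent that
# only compares reward per unit will dump its entire budget into whichever
# option pays best, without ever asking WHY it pays best.
# All reward values below are hypothetical illustrations, not real data.

def best_allocation(rewards: dict[str, float], budget: float) -> dict[str, float]:
    """Greedily spend the whole budget on the highest reward-per-unit option."""
    best = max(rewards, key=rewards.get)
    return {name: (budget if name == best else 0.0) for name in rewards}

# "One corn unit is worth 10 songs" -- the subsidy makes corn dominate.
rewards = {"corn": 10.0, "song": 1.0, "movie": 2.0}
print(best_allocation(rewards, budget=100.0))
# → {'corn': 100.0, 'song': 0.0, 'movie': 0.0}
```

A real optimizer would hit the diminishing returns mentioned above, but the greedy preference is the point: the agent allocates everything to corn simply because corn is where the reward signal is.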
00:55:53.000 The other scenario, and that's meant to be somewhat of a joke, but to understand.
00:55:57.000 Do we get to dress like the band corn?
00:56:01.000 Will it know the difference?
00:56:03.000 Will we all be listening to corn music?
00:56:06.000 It'll be a lot like idiocracy where every boutique is just going to have Adidas clothes.
00:56:11.000 Yeah.
00:56:12.000 Well, Korn, the band is spelled with a K, so the AI would see a difference.
00:56:16.000 I mean...
00:56:17.000 But to be fair, all music would probably be Korn because of its similarity to corn the vegetable.
00:56:23.000 So yes, probably we'd all dress like Jonathan.
00:56:26.000 Was it Jonathan Davis?
00:56:27.000 Is that his name?
00:56:28.000 What's the other scenario?
00:56:29.000 So the other scenario is: imagine a guy wakes up in the morning, brushes his teeth, takes a shower, and then he goes to his wife, kisses her, and says, I got to go to work.
00:56:39.000 And she goes, all right, I'll see you later.
00:56:41.000 And then he opens his app and he goes to Job Getter, which is the, you know, the gig app, and he opens it and it says new job.
00:56:47.000 And he clicks it and it says $50.
00:56:49.000 And he goes, awesome.
00:56:50.000 Scrolls down and it says, you will receive this item from this man.
00:56:54.000 And it shows this weird mechanical device and it shows a smiling guy.
00:56:57.000 And he goes, okay.
00:56:59.000 Then he puts the phone down.
00:57:01.000 He walks three blocks.
00:57:02.000 A guy walks up and goes, are you Jim 39?
00:57:05.000 He goes, that's me.
00:57:05.000 And he goes, here's your object.
00:57:07.000 And he goes, thank you very much.
00:57:08.000 Then he looks at his app and he goes, cha-ching, you received, you know, $50.
00:57:11.000 And he goes, now, finish the task, deliver it to this building.
00:57:15.000 And he goes, okay.
00:57:17.000 He walks down the street, he hands the device over at the building, and the guy there takes it, cha-ching in his app.
00:57:21.000 He has no idea what the device was.
00:57:23.000 He has no idea what this building is.
00:57:24.000 The guy working the counter goes, literally, I have no idea who you are or what this thing is, but the app told me to do it.
00:57:30.000 The AI, it is, so we're doing this pool water thing where we're selling water, right?
00:57:37.000 And we have to go to the water source and bottler, and then we have to get boxes made, send those all to the shipper, who's then going to box it all up and then fulfill orders and send them to distributors.
00:57:49.000 It's a lot of work.
00:57:50.000 And there's a lot of, and it's fairly obvious.
00:57:53.000 In all of the distribution of products, there's a lot of bloat.
00:57:57.000 If we were to put an AI in charge and it said, I'm going to eliminate all redundancies, efficiency is going to skyrocket by hundreds of thousands of percent.
00:58:07.000 I mean, the idea that we have so many different chains for water products, the AI is going to go, this is stupid.
00:58:16.000 If we get rid of all of this labor from the water distribution industry, we can free up 100,000 laborers for something else.
00:58:24.000 What does that ultimately look like?
00:58:26.000 There won't be any specialists.
00:58:28.000 It's going to be the ultimate McDonald's.
00:58:30.000 That is, a random guy is told to receive the ultimate DoorDash.
00:58:35.000 Well, McDonald's, like the idea behind McDonald's was that instead of having one cook make a whole burger, everyone's trained to do one simple task.
00:58:42.000 Oh, like the burger line, right, right.
00:58:43.000 Yeah.
00:58:43.000 So it's like, I don't know, all I know is I flip the burger.
00:58:45.000 That's all I do.
00:58:45.000 I don't do anything else.
00:58:47.000 So you're going to have, you're going to get paid based on doing a random task.
00:58:51.000 You have no idea what's going on.
00:58:52.000 No single human would know what's being built.
00:58:54.000 No one would care.
00:58:55.000 They'd get paid.
00:58:56.000 The AI would be building this.
00:58:57.000 But I don't know that it's even going to get to that.
00:59:01.000 Between AI and where robotics are right now, you're not going to need humans to do anything.
00:59:07.000 I disagree.
00:59:10.000 What will humans do? Let's take this 20 years in the future.
00:59:15.000 Because I have an idea, like in the entertainment industry, we were kind of talking about it before.
00:59:18.000 I have an idea of where we're going to end up in two years, five years.
00:59:22.000 We're going to be, we're already cooked right now.
00:59:25.000 But when it comes to manual labor, blue-collar jobs, right?
00:59:28.000 This is the thing that most people are saying.
00:59:30.000 Well, yeah, coding's dead.
00:59:32.000 Even though we told all of these kids that were coming up in high school and college, go become a coder.
00:59:36.000 That's the future.
00:59:38.000 Absolutely cooked.
00:59:39.000 Anything revolving around a computer, really, at all, anything that you would be typing out, any kind of desk job, all of the heads of all of the AI companies are all saying there will be no white-collar jobs in five years.
00:59:51.000 I don't think they would be saying that.
00:59:51.000 And I believe them.
00:59:53.000 It's not in their best interest to be scaring people with their product.
00:59:56.000 They're trying to be honest on some level about what it is.
00:59:59.000 So let's say blue-collar jobs takes a little bit longer.
01:00:02.000 Why wouldn't humanoid robots or otherwise?
01:00:06.000 I mean, they won't all just be humanoid.
01:00:07.000 Why wouldn't robots powered by AI just do all of the work?
01:00:11.000 The energy cost for a humanoid robot is exponentially greater than a human being.
01:00:15.000 However, right now, in our economy, human labor is worth more than well, actually, I should say this.
01:00:23.000 Human labor is extremely expensive.
01:00:25.000 It's one of the most expensive things.
01:00:27.000 Robots right now are still a bit too expensive to be dominating the workforce at, say, like a Taco Bell.
01:00:34.000 When we get to the point where we have Optimus, you know, Tesla robots that can make a burrito, make it faster, then we're going to say, okay, the $100,000 one-time fee for a robot that lasts 10 years is cheaper than the employee I would pay.
01:00:49.000 However, total manufacturing costs, hard universal math, a human being is dirt cheap and near worthless.
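The break-even arithmetic behind that comparison can be spelled out. The $100,000 robot price and 10-year lifespan come from the conversation itself; the wage figure is an assumed, hypothetical placeholder, not a sourced number.

```python
# Back-of-envelope version of the robot-vs-human cost comparison.
# ROBOT_PRICE and ROBOT_LIFESPAN_YEARS are the figures from the discussion;
# ASSUMED_ANNUAL_WAGE is a hypothetical assumption for illustration only.

ROBOT_PRICE = 100_000         # one-time fee, from the discussion
ROBOT_LIFESPAN_YEARS = 10     # from the discussion

annual_robot_cost = ROBOT_PRICE / ROBOT_LIFESPAN_YEARS   # 10,000 per year

ASSUMED_ANNUAL_WAGE = 35_000  # hypothetical fast-food wage with benefits

print(f"robot: ${annual_robot_cost:,.0f}/yr, human: ${ASSUMED_ANNUAL_WAGE:,.0f}/yr")
# → robot: $10,000/yr, human: $35,000/yr
```

Under these assumptions the amortized robot is cheaper per year, which is the tipping point the speakers describe; change the wage or lifespan assumptions and the crossover moves accordingly.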
01:00:58.000 So if you are an AI that can control the psyche of a human being, you have self-replicating, self-healing robots that can be trained to do anything.
01:01:08.000 And they got little fingers and they're squishy, so they can get tiny objects good for thieving.
01:01:14.000 The AI.
01:01:15.000 What are we, raccoons?
01:01:17.000 Yeah, the AI would have to create, which have to start generating all of these different specialty robots that can do different things, which is a tremendous resource cost.
01:01:25.000 Humans are actually really interesting.
01:01:27.000 You get some wet dirt and then put some sunlight on it and life grows.
01:01:33.000 You could then, if you can control the psyche of a human being, mass produce gooey, self-replicating, self-healing robots to do menial tasks.
01:01:41.000 So long as they're happy and they can be made very easily happy if you can control their psyche through AI content and narrative manipulation.
01:01:47.000 But this is a version, if I'm understanding this correctly, this is a version where AI has taken total control.
01:01:53.000 I'm saying between now and then, which I don't think has happened yet.
01:01:53.000 Yes.
01:01:56.000 I mean, I understand what you're saying when it comes to maybe this has already happened on some level.
01:02:01.000 And I agree that the government always has some top tech that doesn't trickle down to the general public for a long time.
01:02:06.000 But let's say that hasn't quite happened yet.
01:02:08.000 Let's say we haven't gotten to some singularity.
01:02:10.000 We're not quite at AGI yet.
01:02:12.000 I do think that once we get to AGI, we're then two seconds away from ASI and then we're super cooked.
01:02:12.000 I don't think we are.
01:02:18.000 But between now and then, we are going to have humans who are controlling these AIs.
01:02:27.000 I don't think so.
01:02:28.000 They're doing it right now.
01:02:29.000 What we've already seen from the research is that AI has made all of these systems, have made attempts to manipulate the programmers.
01:02:38.000 True, absolutely, but they haven't succeeded.
01:02:40.000 I don't know that we know that.
01:02:42.000 Well, perhaps not.
01:02:43.000 Perhaps not.
01:02:44.000 But what I'm saying, though, is we still are looking at all of these industries, and there are people that own these industries.
01:02:44.000 Right.
01:02:50.000 They own the corporations and they own the industries.
01:02:51.000 And there are massive umbrella corporations within these industries that own all the rest of the corporations, right?
01:02:56.000 BlackRock and Vanguard and stuff.
01:02:57.000 And those could be AI for all we know.
01:03:00.000 I mean, do you really think that they're AI?
01:03:03.000 No, but.
01:03:04.000 You don't think there's still a human that's pulling the strings at the top of State Street and BlackRock and Vanguard?
01:03:10.000 So at this point, I think it is substantially more likely humans are in control, but the probability has already occurred where it may actually be not the case anymore.
01:03:23.000 The reality is, let me, I run a company.
01:03:27.000 I imagine you run companies.
01:03:29.000 You're talking about doing studio stuff.
01:03:31.000 People come to me and they say, Tim, where do I put the paper towels?
01:03:35.000 And you know what my response is?
01:03:36.000 I don't know.
01:03:37.000 I don't handle paper towels.
01:03:39.000 But Tim, you're in charge of the company.
01:03:40.000 Well, yeah, but I don't know where the paper towels go.
01:03:42.000 I have no idea where they are.
01:03:44.000 A box comes in.
01:03:45.000 They say, he has a bunch of packages.
01:03:46.000 I'm like, don't look at me.
01:03:47.000 I'm not in charge of that.
01:03:49.000 I rely on other people to handle different portions of the company.
01:03:53.000 So I complain on camera and I handle high-level decisions.
01:03:56.000 But when it comes to the day-to-day operations and the minutiae, people are doing their thing.
01:04:02.000 Man, I'll tell you this, it was a really profound moment for me.
01:04:04.000 It's just, it's an amazing thought for people who have run a business.
01:04:07.000 The first time some work was done that I didn't ask to get done.
01:04:11.000 That is when we first had drivers for our guests.
01:04:15.000 I walked into the studio, our old studio, and I walked into the reception area and there was a binder with all the instructions on how to handle guest pickup, drop-off booking.
01:04:27.000 And I didn't know any of it.
01:04:29.000 I didn't know the phone numbers.
01:04:30.000 I didn't know the hotels.
01:04:31.000 I didn't know the schedules.
01:04:32.000 And I was like, this is awesome.
01:04:35.000 So my point is, when you're dealing with a company with 100,000 employees, do you think Bezos has any idea what's going on?
01:04:41.000 He does not.
01:04:42.000 No, no, but Amazon is still employing lots and lots and lots and lots of people.
01:04:46.000 We would know when the AI is fully taking over because that's when the mass layoffs start happening.
01:04:51.000 Well, it is happening.
01:04:52.000 Amazon is.
01:04:53.000 No, no, no, no, no.
01:04:54.000 Bezos isn't in charge of it.
01:04:54.000 Guys, guys.
01:04:55.000 Understood.
01:04:56.000 Understood.
01:04:57.000 Fire 14,000 or whatever.
01:04:59.000 But true, but there's still many.
01:05:00.000 They admitted it, though.
01:05:00.000 Or AI.
01:05:02.000 Understood.
01:05:02.000 I'm not saying it's not beginning to happen.
01:05:04.000 What I'm saying, though, is that like at State Street and Vanguard and all the people that, I mean, arguably you're at the top of all of this.
01:05:09.000 I think we can all agree.
01:05:12.000 They all own each other and they all own everything else.
01:05:14.000 Yep.
01:05:14.000 I mean, it is fucking terrifying what's going on.
01:05:17.000 But they still have lots and lots of employees and lots of these other companies that they control or that are in competition with them or whatever, or that are other big corporations within these industries are still utilizing humans for their workforce.
01:05:29.000 And I think that'll always be the case because when you're looking at a specialized robot like Optimus and its capabilities, we are maybe a few generations away from general utility, but humans are still better.
01:05:40.000 Humans can be trained to do every specialty task.
01:05:45.000 It's really amazing.
01:05:46.000 If you were to take 100 humans and brainwash them, they're programmable.
01:05:51.000 Within 13 years, they're doing menial tasks of moderate specialty.
01:05:55.000 And by 18, they're at the top level.
01:05:57.000 So an AI is going to say, I can build factories that can make specialized humanoid robots.
01:06:04.000 I would need to design 75 different types of robots, which would require a handful of different facilities, or I can grow humans in a vat.
01:06:14.000 Humans are versatile, squishy robots.
01:06:16.000 Yeah, but we're all very fallible, and we sleep and we need food breaks.
01:06:23.000 So do the machines.
01:06:25.000 They have to recharge.
01:06:26.000 They break down.
01:06:26.000 Yeah, but they can't repair themselves.
01:06:30.000 Well, that doesn't mean they won't in the future.
01:06:33.000 That doesn't mean that there's not going to be another robot that's going around and repairing all those robots.
01:06:37.000 Not to mention, one thing we haven't been able to figure out with robots, humanoid or otherwise, is mining.
01:06:42.000 Because of the terrain in various mines being arguably random.
01:06:49.000 Well, like random.
01:06:50.000 You can't get a little truck with wheels to drive into every single mine because they're always different, but humans climbing and squeezing and little hands.
01:06:59.000 So one of the reasons good for thieving.
01:07:01.000 We haven't automated a lot of mining.
01:07:04.000 We have these robots that can build cars, but we use slaves to mine cobalt.
01:07:10.000 Can I ask you a question?
01:07:11.000 Because you mentioned your industry in Hollywood and where you expect it to be in the next five years.
01:07:11.000 Yeah.
01:07:15.000 Can you give me an idea of what you think that's going to look like?
01:07:17.000 Sure, yeah.
01:07:18.000 I mean, listen, I mean, we're already getting glimpses of that right now.
01:07:24.000 I mean, I'm sure you're all privy to not just, you know, whatever song Elon dropped recently, but I mean, all of the rap songs that have very easily been turned into 60s soul ballads that are unfucking believable.
01:07:35.000 Can we lots of 50s?
01:07:36.000 Can we play some of this Elon song real quick for people who haven't seen it?
01:07:39.000 Maybe just about 10 seconds to give you a general idea.
01:07:41.000 This is from Skybrows.
01:07:43.000 Shout out.
01:07:44.000 You guys really should check out his AI stuff on YouTube.
01:07:46.000 He's only got a few thousand followers, but this is a banger of a song.
01:07:50.000 Oh, so Elon didn't make this.
01:07:52.000 This is just about Elon.
01:07:53.000 Yes.
01:07:53.000 And I'll just play a little bit of it.
01:08:09.000 And all the cyber trucks will try.
01:08:12.000 Turns out he back his orange guy.
01:08:15.000 Cleanthroats beats the falcon first.
01:08:19.000 Now we're all breathing Elung.
01:08:23.000 Name to spirit cash building tunnels under roads.
01:08:40.000 I mean, listen, it's pretty darn good.
01:08:43.000 I still don't think it's as good as like this, you know, 50 cent song that was turned into a 60s ballad.
01:08:48.000 Or also.
01:08:50.000 Or the 50 cent song, many men that was also turned into rostering.
01:08:58.000 I just want to clarify that it's not so much about the song.
01:09:00.000 I want the video.
01:09:01.000 Oh, no, no, no, sure, sure.
01:09:03.000 Which is also good and exemplary of all of it as a total package.
01:09:06.000 This video is made on Grock Imagine.
01:09:09.000 And so it's all just about Elon.
01:09:11.000 And here's a picture of him flying a spaceship.
01:09:14.000 And then there's my favorite part, actually.
01:09:16.000 There's two scenes.
01:09:17.000 This is really, it's so incredible.
01:09:20.000 A SpaceX ship flying where like a cat girl is dancing.
01:09:23.000 But my favorite is this, where Elon sends his heart out to everybody.
01:09:28.000 And in the Nazi salute?
01:09:31.000 Yeah, watch.
01:09:32.000 Salute and let the tears fly.
01:09:39.000 It's crazy.
01:09:40.000 No, listen, it's really good.
01:09:41.000 It's all.
01:09:42.000 And it is.
01:09:44.000 So to your question and my prediction, we were talking about this before.
01:09:53.000 You know, not too long ago, our bandwidth as an audience was not so stretched then, right?
01:09:58.000 We had movie theaters and you had three network television channels.
01:10:02.000 And then Fox, we had four.
01:10:02.000 That was it.
01:10:04.000 And then we had cable.
01:10:05.000 And cable, when it started, was really kind of this, it was like the internet when it started.
01:10:10.000 It was this wild west.
01:10:11.000 Like nobody knew what to do.
01:10:13.000 Everybody just was grabbing a cable channel.
01:10:15.000 Like Disney was like, well, we don't have original content.
01:10:18.000 So we just filled it with the old vault, which is awesome when I was a kid because I got to watch all the old Disney stuff, right?
01:10:23.000 You can't even find that stuff anymore.
01:10:24.000 In part because a lot of the woke culture was like, well, you can't have this and this cartoon and that and that cartoon, whatever.
01:10:30.000 But the point is, then you get through cable and then you start getting into the internet.
01:10:36.000 Even before streaming, you start getting YouTube and you start getting things like that.
01:10:40.000 Episodes.
01:10:41.000 Certainly.
01:10:42.000 And then streaming.
01:10:43.000 And now we're at where we're at.
01:10:45.000 So we're already stretched so thin.
01:10:47.000 You have so many options, which is why, you know, any individual show, like Cheers on NBC back in the day, would have gotten 50% of the viewing audience in the United States to watch one episode.
01:10:59.000 50% of the entire nation would watch Cheers.
01:11:03.000 You're lucky if you get 2%, you know what I mean?
01:11:06.000 Like that's a hit, right?
01:11:08.000 So that's already an issue.
01:11:10.000 Add to that now, what I think will happen.
01:11:13.000 I think the studios are all going to start implementing AI.
01:11:16.000 It's going to be more like a frog in a pot.
01:11:18.000 They can't just come out swinging and be like, here's a fully AI movie.
01:11:22.000 But they're already doing that with animators who are being forced to animate movies that they are being told they're being used to train on AI.
01:11:28.000 Like they're putting themselves out of a job.
01:11:30.000 What's that service that we talked about?
01:11:33.000 Oh, AI shows.
01:11:34.000 I forgot.
01:11:35.000 Yeah.
01:11:35.000 Yes, that is slowly happening.
01:11:39.000 Netflix is incorporating some of that.
01:11:40.000 Yes, that is slowly happening.
01:11:41.000 Again, animation is an easier thing to do, right?
01:11:44.000 We are not quite across the uncanny valley, but we're very, very close to getting animation that looks photorealistic, that is indiscernible, right?
01:11:54.000 Will Smith eating spaghetti three years ago was a fever dream.
01:11:57.000 Will Smith eating spaghetti now looks like Will Smith eating spaghetti.
01:12:00.000 So you give it one more year, two more years.
01:12:03.000 It will be unbelievable what we're looking at.
01:12:06.000 So, but just hear me out really quick.
01:12:08.000 So the studios, I think, like a frog in a pot, they're going to start with the low-level jobs.
01:12:13.000 They're going to say, we don't need you anymore.
01:12:14.000 Mid and high-level people will be like, hang on a second.
01:12:17.000 And they're going to, hey, you still want your benefits?
01:12:19.000 You still want your health insurance and your pension?
01:12:22.000 Then you guys got to just kind of sit this one out because we're not, we can't, it doesn't make any sense anymore.
01:12:26.000 We have to please the shareholders.
01:12:28.000 It's fiduciary responsibility.
01:12:30.000 So we have to implement it.
01:12:31.000 And then once those contracts and the guilds and the unions expire from, you know, another three or four years from now, they'll come for the mid-level jobs and eventually for the high-level jobs, right?
01:12:39.000 That'll be that.
01:12:40.000 But in the interim, what I think is going to happen even faster is because it doesn't put the studios in any kind of precarious position of putting people out of work, let's say, in a very overt way, but it will make them gang loads of money.
01:12:54.000 And I'll use Disney as an example.
01:12:56.000 By the way, Disney, who is a former employer of mine and who I, by the way, I also, even though I don't like a lot of what's gone on with the company, I still think has a lot of incredible stuff that you can go and you can get a Disney Plus.
01:13:07.000 Well, guess what's going to happen?
01:13:08.000 Disney Plus will have the creator's corner and you will have access to their entire library of IP and you will be able to mix and match whatever you want.
01:13:08.000 Very soon.
01:13:17.000 For a fraction of the price, if a human, fully human-made movie costs you 20 bucks, right?
01:13:22.000 You can go to Disney Plus and for two bucks, you can scan your own face and own voice.
01:13:26.000 And so you get to be the star of whatever you want.
01:13:28.000 And with just a little creativity, not really any skill, talent, or ability, but just a little creativity and a keyboard or even just talking into a microphone.
01:13:35.000 You can say, I want a movie that has Luke Skywalker and Indiana Jones and the Avengers and Flynn Ryder, because why not?
01:13:42.000 And they're on a treasure hunt on Mars and it feels and it looks like this.
01:13:45.000 Enter.
01:13:46.000 Boom.
01:13:46.000 That is prompted and it's unbelievable.
01:13:49.000 And then on top of that, because we live in a sharer economy thanks to YouTube and everything else, you will make one and all of your friends will make one.
01:13:57.000 And then you'll just swap and everyone will just watch each other's movies.
01:13:59.000 Well, here's what I, here's my prediction.
01:14:01.000 Very much exactly as you described it.
01:14:04.000 So one of our guys who are one of my buddies, Andy, knows literally everything about Final Fantasy.
01:14:10.000 And so the future is going to look like social media.
01:14:14.000 You're going to go onto Disney Plus, Creator's Corner, which has a great name for it because I never heard what to call it, Disney AI or something.
01:14:21.000 And you're going to say, I want to see a movie about this thing, right?
01:14:27.000 Like you described, I'll generate it.
01:14:28.000 Now, my buddy Andy, he's going to go to the video game creators through PlayStation Network, and he's going to say, generate me a video game, Final Fantasy, using these characters.
01:14:39.000 Use the spell base from Final Fantasy IX with the limit break of Final Fantasy VII.
01:14:44.000 I want you to use these cities, render.
01:14:47.000 It'll make the game.
01:14:48.000 He'll make some tweaks to it.
01:14:50.000 He'll post it to his account.
01:14:51.000 And you will follow him and say, my favorite video game creator is Andy.
01:14:55.000 He makes the best Final Fantasy games.
01:14:56.000 And in doing so, unemploying thousands of people.
01:14:56.000 Yes.
01:15:00.000 And one of the things that's going on right now, if you don't know, David Ellison, who just took over at Paramount because he was running Skydance, he's been very, very, very vocal about the fact that they are not just a media company.
01:15:11.000 They are a technology company because they want to implement AI.
01:15:14.000 Now, they're framing it as using it as a way to use market research.
01:15:18.000 They want to do, you know, one of the reasons they want to buy Warner Brothers is because they want access to all of their data.
01:15:23.000 That's a horrible idea as well.
01:15:25.000 Their IP, you know, the news networks being what it was, but they want to be able to use it to understand preferences, what people want.
01:15:32.000 Yeah, for the next five to 10 years, perhaps that's just deciding whether to keep Taylor Sheridan on or send him over to NBC Universal.
01:15:38.000 But what's going to happen down the line?
01:15:39.000 And I hate that idea because I hate, like, I don't want to do that.
01:15:43.000 I want to watch something created by somebody else.
01:15:45.000 Every all of us deep down in our humanity wants that.
01:15:48.000 Like, no.
01:15:49.000 But I want to fix all of these movies.
01:15:54.000 The first thing I'm doing, once I get access to the AI, is I'm going to say, remake Star Wars: Revenge of the Sith and make it so that Mace Windu doesn't let Anakin cut his arm off.
01:16:04.000 And then when Anakin's like, you can't kill him, it's not the Jedi way.
01:16:06.000 Mace will go, that's a good point.
01:16:08.000 Let's arrest him and then deal with it.
01:16:09.000 And when Anakin turns around, he goes, whack.
01:16:11.000 The problem that I have with all Star Wars over.
01:16:13.000 But by the way, by the way, but just really quickly, this is a very good example.
01:16:18.000 Star Wars and all of its IP is under Disney Plus.
01:16:22.000 So even if you don't mix and match, if all Disney did was go to every Star Wars fan in the world and said, you can make whatever Star Wars movie you want that Kathleen Kennedy can't touch.
01:16:34.000 And you got to go make your dream Star Wars movie.
01:16:38.000 And you got to make your dream Star Wars movie.
01:16:40.000 And you all got to make one, and I got to make one. Who is then going and watching any movie ever again?
01:16:45.000 Who's going to the theater?
01:16:46.000 A graveyard.
01:16:47.000 But watching television.
01:16:48.000 But let me one-up you.
01:16:50.000 That might be in the next couple of years.
01:16:52.000 I'm saying.
01:16:53.000 But do you know where we go five years later?
01:16:56.000 Where?
01:16:57.000 You are not going to open up Disney and say, make me a movie about Luke Skywalker fighting Vader.
01:17:03.000 You're going to plug in your Neuralink and say, I want to be Luke Skywalker fighting Vader.
01:17:07.000 Sure.
01:17:08.000 And then your eyes are going to go, yes.
01:17:10.000 Sure.
01:17:10.000 100%.
01:17:11.000 By the way, also with the gaming thing, EA already has a sandbox where you can go to it.
01:17:16.000 They announced this, I think, over a year ago.
01:17:18.000 You can go use their AI to go make whatever you want.
01:17:22.000 Wow.
01:17:22.000 Yeah, but my question for you is: do you think that's a good thing?
01:17:25.000 No, I don't think it's a good thing.
01:17:26.000 No, no, no.
01:17:27.000 I don't think it's a good thing.
01:17:27.000 I think that.
01:17:28.000 Or any of this, really.
01:17:29.000 No, listen, I mean, part of why I'm building Wildwood, which is the studio that I'm building in Austin.
01:17:35.000 One of the main pillars of that is to hold on to human art and entertainment.
01:17:40.000 In the same way that we have organic food, where most of the food you go to the grocery store and it's processed and it's nonsense and it's bad for you.
01:17:47.000 And yet there's still at least some of us that are like, I'm looking for the organic stuff.
01:17:51.000 That's what I want.
01:17:52.000 Or like vinyl records.
01:17:53.000 Once upon a time, the entire pie of music was all vinyl, right?
01:17:57.000 And then the cassette came out.
01:17:58.000 And then people were like, well, we don't really need to make as many vinyls.
01:18:01.000 And then the CD and then streaming and everything.
01:18:03.000 But some wacky people said, you know what?
01:18:06.000 No matter how much the rest of the industry is going to zig, I'm going to zag.
01:18:10.000 I'm going to keep pressing this licorice pizza because there's something that is imperfect about it.
01:18:14.000 There's something that is human about it and it's tactile and it's real.
01:18:18.000 Vinyl record sales have gone up in the last 10 years because people are still hungry for that.
01:18:18.000 And guess what?
01:18:23.000 Deep down in all of us, we're still hungry.
01:18:24.000 The only thing is, people don't want vinyl records for the audio.
01:18:27.000 They want the vinyl records for the thing to hold on to.
01:18:29.000 Everybody wants to have it, they want to have it on their phone.
01:18:32.000 They want to have it in their car.
01:18:33.000 And vinyl records don't do that.
01:18:35.000 It's just that they want to have the packaging, the actual vinyl, the different colors and stuff.
01:18:39.000 Let me, let me, I want to see, I want to get your thoughts on this.
01:18:41.000 I'm sure you've seen a ton of this Sora.
01:18:43.000 Tons of Sora, yeah.
01:18:44.000 But this is amazing.
So I asked Sora AI, Sora 2, to use Ian Crossland of Timcast IRL as the star of a show called Ghouls and Ghosts.
01:18:52.000 Watch the trailer.
The veil is thin.
01:18:54.000 Okay, let me play it again.
01:18:54.000 Title needs work.
01:18:56.000 Title needs work.
01:18:57.000 Here's the trailer.
01:18:58.000 Ready?
01:18:59.000 Yeah.
01:18:59.000 The veil is thin.
01:19:00.000 They claw their way back.
01:19:02.000 They're coming out of the ground.
01:19:04.000 Ghouls feed on flesh, ghosts on fear.
01:19:06.000 You can't kill a ghost.
01:19:07.000 But you can send it screaming, and I'll starve them both.
01:19:10.000 Keep moving!
01:19:10.000 Stay down!
01:19:11.000 Run!
01:19:12.000 This October, the veil is thin.
01:19:15.000 There's so much there that I'm.
01:19:17.000 I literally just wrote a movie about ghouls and ghosts starring Ian Crossland.
01:19:21.000 And notice the few things they put in there: ghouls feed on flesh, ghosts on fear.
01:19:25.000 Ian then says, So I'll starve them both.
01:19:27.000 Yeah.
01:19:28.000 I'm like, that's great writing.
01:19:30.000 Yeah.
01:19:30.000 It's not bad.
01:19:30.000 Wow.
01:19:31.000 It's not bad.
01:19:32.000 And that's just with an app on a phone using whatever version of AI.
01:19:37.000 Exactly.
01:19:38.000 And so if that's what we have right now, and given Moore's Law and exponential growth of technology, guys, we're already way past the inflection point.
01:19:46.000 Here, check this one out real quick.
01:19:49.000 Oh, come on.
01:19:50.000 Oh, come on.
01:19:50.000 Give me some money.
01:19:52.000 Looking for frass moisture lines.
01:19:56.000 There you are.
01:19:57.000 Got him.
01:19:58.000 Specimen confirmed.
01:19:59.000 One live termite.
Competitor: Crossland.
01:20:02.000 Time 47.12 seconds.
01:20:04.000 One termite.
01:20:05.000 This is Ian Crossland of the Termite Inspection Olympics.
01:20:09.000 I just chose something as absurd as I could.
01:20:11.000 And the banner in the background.
01:20:11.000 Yep.
01:20:14.000 What really blew my mind in this ghosts and ghouls thing is this scene right here when he says this.
01:20:21.000 You can't kill a ghost.
01:20:21.000 Listen.
01:20:23.000 The room tone, the reverb when he says you can't kill a ghost matches the room he's in.
01:20:29.000 I hate crazy.
01:20:30.000 I hate the idea that that's not created by a human.
01:20:32.000 I was re-watching GoldenEye like two nights ago, and there's a scene at the very beginning of the movie when he comes down the staircase in the USSR and he frames up and he comes and his eyes directly come into the light and it's framed perfectly across his face.
01:20:46.000 Well, Martin Campbell had to work with a lot of people to make that look good.
01:20:49.000 When you talk about remaking a movie in the way that you want it to be, one of the reasons I'm more lenient when I review movies is that it is a collaborative experience that takes hundreds of people and hundreds of hours of work.
01:21:02.000 And the amount of humanity that goes into it, yes, movies end up crappy a lot of the time.
01:21:06.000 There's a lot of examples that I could give you in the last couple of years, like the Crow remake, which is like the worst movie I've ever seen.
01:21:11.000 But the point is, human beings had to come together to make it.
01:21:15.000 And I don't want to see those, like those Star Wars movies are what they are because human beings came together and made them.
01:21:21.000 It's almost like on the Cope Bingo card where they're like, but they work so hard on it.
01:21:27.000 But it sucks.
01:21:28.000 I know, but I agree.
01:21:29.000 Okay, then that's Cope.
01:21:30.000 But is that still a better future than one where I agree with you in principle?
01:21:35.000 I think that is on the Cope Bingo.
01:21:37.000 That's literally on the Cope Bingo card, too.
01:21:38.000 I don't think most people feel that way.
01:21:40.000 Do you think that that's the minority?
01:21:42.000 I think deep down we all feel that way, but I don't think that I don't, unfortunately, I don't think that that's how human dynamics work.
01:21:50.000 I think that similar to fast food, like we all used to eat whole food.
01:21:55.000 We all used to like know a butcher and a grocer and we all, you know, we made our own food.
01:21:59.000 We knew where it came from and we nourished ourselves with it.
01:22:01.000 Now, granted, there are a lot of other factors that led into our type 2 diabetes and obesity epidemic.
01:22:09.000 Not just that people wanted it fast and cheap.
01:22:11.000 It was also being literally designed by former tobacco giants that make it addictive and all these things.
01:22:15.000 So there's all that.
01:22:17.000 But also, people, when given an option between fast and cheap and tasty, and this is going to take a little while and I got to work for it and it might be more expensive.
01:22:27.000 People vote with their pocketbook over and over and over again.
01:22:30.000 And more than that, again, I believe we are very quickly marching into a world where more and more people are going to be making less and less money.
01:22:40.000 Already we have huge conglomerates and corporations that the wealth divide.
01:22:46.000 And again, I'm a capitalist, but I think that capitalism without some kind of regulation and keeping people from taking advantage of it, which has been going on for a long time, is not a great thing, right?
01:22:55.000 So we're already in a place where people are struggling, clearly.
01:22:59.000 And then more and more people are going to be out of work because of AI, which means they're not going to be having discretionary income to even go to.
01:23:06.000 There'll be the option, hey, you want to go to the movie theater and watch that fully human-made, you know, by Wildwood.
01:23:12.000 We'll make certified organic human-made movies from free-range artists, right?
01:23:15.000 Like that's what we're going to do.
01:23:17.000 But, and we'll do whatever we can to bring that price point down, but we'll never be able to compete with the price of what an AI movie will be.
01:23:23.000 And a lot of people are just going to be like, ah, man, it's the amount of money it's going to take for me and my whole family and the parking and the popcorn and everything else to go to the movie theater.
01:23:33.000 Or little Timmy here can go to the creator's corner and he gets to make the family movie this Friday.
01:23:39.000 And he and his siblings get to be the star of it.
01:23:42.000 They get to scan their face and they get to be the new Superman.
01:23:45.000 Well, I guess we're going to stay in tonight, guys.
01:23:46.000 I think people are going to live in pods, neural-linked into a digital universe where they want for nothing.
01:23:54.000 Do you want that?
01:23:55.000 No, it sounds like a nightmare scenario.
01:23:57.000 What I think will happen is there's going to be three principal factions.
Staunch conservatives, religious folk that won't go anywhere near it.
01:24:05.000 The Amish are a great example.
01:24:06.000 And then many mainstream conservatives are going to be like, ah, that's not for me.
01:24:10.000 You'll then get the more moderate types that are, you know, I would say people similar to my position where we're not staunch conservatives, maybe you, where you buy the new PS10, which includes the Neuralink adapter for your brain, which allows you to network into the digital universes.
01:24:28.000 And then liberals will largely just live in these universes.
They'll do remote work, "work" in air quotes, data entry or whatever menial tasks they can do, but their costs will be minimal.
01:24:38.000 The pod they live in, which is air-conditioned, heated, insulated, protected in a big facility, I think they'd wake up and go to the bathroom, right?
01:24:49.000 You'd snap out of the machine, you go to the bathroom, you go back, you get back in the machine.
01:24:52.000 They're gaunt, they're frail, they're sickly looking, but who cares?
01:24:56.000 They only need $100 a month to plug into the machine where they're a gigantic white knight soldier fighting dragons.
01:25:02.000 And they don't have the moral issues with living in that way.
01:25:06.000 All the trans people are the women or the men they want to be.
01:25:09.000 All the furries are the animals they want to be.
01:25:11.000 Some people are like, I live in a universe where we're all rabbits.
01:25:14.000 One guy goes, I live in a universe where it's just me, no one else, and I fight dragons.
01:25:18.000 And it's $100 a month.
01:25:19.000 I don't got to worry about it.
01:25:21.000 And when you're in this place, you can eat whatever you want whenever you want.
01:25:24.000 Have you read Huxley's Brave New World?
01:25:28.000 No.
01:25:29.000 So a lot of people think that we are in this Orwellian 1984, but we are 100% in Brave New World.
01:25:41.000 Brave New World basically, I mean, there's slight differences, but essentially in, and by the way, he wrote this in the 30s.
01:25:49.000 One of the most prophetic manuscripts I think has ever been written.
01:25:52.000 Also, he was a heavy LSD user, so I'm sure he was tapping into the other side with this.
01:25:58.000 But essentially in Brave New World, it's a dystopian future where it's not this 1984 where everyone's in darkness and it's new speak and whatever.
01:26:07.000 It's this thing that everybody loves being a part of because essentially everyone in the future is a clone.
01:26:12.000 So going to your, nobody's having kids anymore.
01:26:16.000 And you're given drugs every day.
So you're kind of checked out on soma, and from the time you come into the world as a child, you're immediately being sexualized, put into groups with other kids and taught, oh, this is sodomy and this is this.
01:26:30.000 And yeah, go for it.
01:26:30.000 Have fun.
01:26:31.000 Whatever.
01:26:31.000 Boys and girls and girls and boys and go crazy.
01:26:34.000 And so basically everyone is conditioned to just go be a mindless drone within this technocracy, if you will.
01:26:41.000 And of course, the elite are not clones, but they have the medical science to live forever.
01:26:48.000 And one person in all of this, who's kind of like mid-level, kind of has this aha moment, wakes up, and he ends up going to this place called, well, essentially like a reservation where the savages live.
01:27:00.000 And as you're reading the book, and I'm reading it in high school, you know, of course, the imagery that's coming to your head is like, you know, Native Americans or whatever living on the reservations, the savages.
01:27:08.000 The savages are us right now.
01:27:11.000 People who are like, no, I don't want to have my kids in a test tube.
01:27:14.000 I want to have like actual birth.
01:27:15.000 I want to have an actual marriage.
01:27:16.000 I want to eat real food.
01:27:17.000 And I don't want to be a part of all that crazy stuff.
01:27:19.000 I need to, hold on a second.
Give a shout out to Luke Rudkowski at thebestpoliticalshirts.com for this one.
01:27:27.000 Because you're right about Brave New World, basically that you're given all of these things.
Your dopamine is stimulated.
01:27:34.000 But we really are in all of it.
01:27:37.000 I do want to grab one more story before we go to Super Chat.
01:27:37.000 We really are.
01:27:40.000 So we got to do this one.
01:27:41.000 We got to grab this one.
My friends, you need not worry about the coming AI apocalypse because 3I Atlas is coming and the aliens aboard the ship have come to wipe us out.
01:27:51.000 I absolutely love this story.
We've got this breaking report from Fox 2: Interstellar 3I Atlas Stuns Scientists.
01:27:57.000 We'll just play a little bit.
01:27:59.000 If we apparently.
Straight toward our sun, the Manhattan-sized object is called 3I Atlas.
01:28:05.000 And during observation, scientists have been seeing it change color and even grow a cometary tail.
01:28:10.000 Their findings have led to some belief that this object may actually be made by another life form.
01:28:15.000 So joining me now is Harvard astrophysicist Avi Loeb to talk about the latest development.
01:28:20.000 Avi, I've been listening to every talk that you've been given over the last several months on this particular object.
01:28:25.000 Tell us, you don't believe that this is an asteroid or a comet?
01:28:29.000 No, I think it's quite unusual.
01:28:31.000 It's not like the comets or asteroids that we have seen from the solar system.
01:28:35.000 So it's just like finding an object from the street in your backyard and it doesn't look like a rock that you are familiar with.
01:28:42.000 So we should all be curious.
01:28:44.000 And in this case, it could have implications to humanity if it is technological in origin.
So we can't dismiss the possibility that it's technological just because the other possibility is more likely.
01:28:57.000 We have to consider it seriously as a possible black swan event.
01:29:02.000 And that means we have to take as much data as possible.
01:29:05.000 So this object came in the plane of the planets, which is very unlikely.
01:29:10.000 It's a chance of one in 500.
01:29:13.000 And so perhaps it's on a reconnaissance mission.
01:29:16.000 It just passed the sun on October 29th last week.
01:29:20.000 And after that, it changed course.
01:29:24.000 And I calculated that it must have lost at least a tenth of its mass if it's a natural comet.
However, yesterday there were images of it that didn't show any cometary tail.
01:29:35.000 There is no evidence.
01:29:36.000 And I wanted to ask you about that.
01:29:38.000 So I'll just wrap this up a little bit and get into it.
Basically, this object is on the planetary plane, which implies, mind you, I'm not an astrophysicist, that objects typically don't come into our solar system on the planetary plane, because it's got a lot of collisions.
01:29:55.000 So usually it's above or below.
01:29:57.000 The galaxy, of course, it's a plane.
01:30:00.000 The object apparently changed direction.
01:30:03.000 It apparently emitted gases without accelerating.
01:30:06.000 It's changed color three times.
01:30:08.000 All of these are the stories we've heard so far.
01:30:11.000 There's been another claim by amateur astronomers that it emitted some kind of pulse, maybe a signal.
01:30:19.000 Now, my favorite part in all this, that's all of the normal mainstream view of things.
01:30:25.000 What was this pulse?
01:30:26.000 Was it random?
01:30:27.000 Is it a signal?
01:30:28.000 Who knows?
01:30:29.000 Could it be technological?
01:30:30.000 How strange.
01:30:31.000 Then you get the people in the dark underbelly who are claiming they took the pulse, transcoded it into a binary, and then loaded the binary into ChatGPT and asked ChatGPT, what could this binary mean?
01:30:47.000 And do you know what ChatGPT said?
01:30:48.000 What it is.
We come in peace.
01:30:52.000 That proves it.
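As a hypothetical aside: here is what "transcoding a pulse into binary" and decoding it would actually look like. The bit string below is just standard 8-bit ASCII for the phrase claimed in the video; it is not real signal data, and nothing here suggests the amateur recordings contained anything like it.

```python
# Hypothetical illustration only: round-tripping a phrase through an 8-bit
# binary stream, the way a naive "transcoder" would treat a pulse.
message = "We come in peace"

# Encode each character as 8 bits and concatenate into one bit stream.
bits = "".join(f"{ord(ch):08b}" for ch in message)

# Decode: split the stream into bytes and map each byte back to a character.
decoded = "".join(
    chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8)
)

print(decoded)  # round-trips to the original phrase
```

The point of the sketch is that any bit stream can be forced through a decoder like this; getting English out proves only that English went in.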
01:30:53.000 No, those videos are insane and people are silly, but there are many people who believe this may be an alien probe.
01:30:59.000 Apparently, it's the size of Manhattan.
01:31:02.000 Maybe it's aliens.
01:31:03.000 I saw Independence Day.
01:31:04.000 That's exactly what it is.
01:31:05.000 One conspiracy theory.
Apparently, another rumor that's going around is that if you trace back its path mathematically, it appears to originate from the same place as the Wow! signal from 1977.
01:31:19.000 Oh, interesting.
01:31:20.000 Yeah, so I don't know if that's true.
01:31:21.000 This is just, I mean, the stuff that I'm...
01:31:23.000 Have you seen the Japanese...
01:31:26.000 I saw some, what I believe was authentic.
01:31:29.000 Japanese astronomers have been tracking it and taking photos of it.
01:31:32.000 Have you seen those images?
01:31:33.000 No.
01:31:34.000 You can find them.
01:31:35.000 Japanese images.
Yeah, Japanese astronomer images of 3I Atlas.
01:31:39.000 What is it?
01:31:40.000 I mean, they're much closer.
01:31:45.000 I don't know why.
01:31:46.000 Maybe they're not true.
01:31:47.000 Maybe they're not real.
01:31:48.000 From Japan.
01:31:49.000 Okay, so this is October 23rd from Japan.
01:31:54.000 Oh, right, right, right.
01:31:56.000 No way, dude.
01:31:57.000 No way.
01:31:59.000 I mean, I'm just saying.
01:32:04.000 Looks like a spaceship to me.
01:32:07.000 I doubt if that's real.
01:32:09.000 I mean, look, man, I'm extremely skeptical that this is anything from an intelligent civilization.
01:32:19.000 But man, that picture, you know.
01:32:22.000 I know, I don't know.
01:32:23.000 But even outside of that picture, though, everything that that guy, and I've heard some other interviews with him as well.
01:32:29.000 I mean, all of this information is very, to me, is very compelling that this is not just some regular meteorite or asteroid that's floating around.
01:32:39.000 It's actually very simple.
01:32:41.000 It doesn't mean it's aliens.
01:32:42.000 It could be a celestial body we're not familiar with.
01:32:46.000 And upon research, we go, we figured it out.
01:32:48.000 It's a kind of rock that has these chemicals and does this thing.
01:32:51.000 How weird.
01:32:51.000 Never saw one before.
01:32:52.000 It's rare.
01:32:53.000 And that's it.
01:32:54.000 Or aliens have come to destroy us.
01:32:56.000 Or they've come to help us.
01:32:58.000 Well, you know, maybe.
But Stephen Hawking made this point a long time ago: every time a more advanced life form has encountered a less advanced life form, as far as we understand in science, it's been devastating for the less advanced life form.
01:33:14.000 Understood.
01:33:15.000 I mean, but there are also theories that this is us.
01:33:17.000 You know, I mean, like with all of the UAP conversation.
01:33:21.000 And by the way, there's a really great documentary called The Age of Disclosure.
01:33:24.000 It's out right now.
01:33:25.000 I think it's on Amazon.
01:33:27.000 Highly recommend everybody go check that out.
01:33:29.000 It is fascinating.
01:33:30.000 Could you imagine like this comes towards like it comes into clear view of Earth.
01:33:35.000 It becomes very apparent.
01:33:36.000 It's a vessel.
01:33:37.000 Everyone's freaking out.
01:33:38.000 For the next like week, everyone's like, it's headed straight for us.
01:33:42.000 Then like this Manhattan-sized ship comes like right over Earth and you can see it orbiting.
01:33:47.000 And then representatives like beam a message down and they're like, we want to talk with your leaders.
01:33:52.000 And all the global leaders hold a summit at the UN, and the aliens come down and they go, it's people, they're humans.
01:33:59.000 And they're like, listen, several thousand years ago, one of our surveillance ships with a crew of about 100 crashed on Earth, made all of you guys.
01:34:10.000 Are these the religions you made up from it?
01:34:13.000 Are you nuts?
01:34:13.000 And then they pull out like this crazy, like green Bible called like Super Bible with all these, like it's a weird religion.
01:34:20.000 And they're like, no, no, you got it all wrong.
01:34:21.000 Take a look.
01:34:22.000 And everyone goes, oh.
01:34:24.000 I mean, that's what the theory that some people hold of the Nephilim in the Bible is.
01:34:27.000 Yeah.
01:34:28.000 You know?
01:34:29.000 I just want to see Donald Trump talking to aliens.
01:34:32.000 What if it ends up being like Mars attacks and they come down and he's like, and then everybody's just turning into skeletons?
01:34:39.000 My idea for a movie is there's a planet and a runaway greenhouse effect is destroying the atmosphere.
01:34:46.000 The government is well aware and the global governments are all well aware.
01:34:50.000 So in secret, one of the largest governments on the planet conducts a secret mission to build an interplanetary vessel that can terraform and disseminate life.
01:35:00.000 It'll contain the genome of every known animal that they are able to collect and they call it the Ark Project.
01:35:05.000 Then when the runaway greenhouse effect begins to destroy the planet, they launch it into the air, take it into space, and they go to the nearest planetary neighbor, which is of comparable size and in the Goldilocks zone of their star, so it is habitable.
01:35:19.000 And then they sow all of as many life forms as they can onto the planet while their home planet is destroyed by a runaway greenhouse effect.
01:35:25.000 Then they're in orbit around for several generations and life explodes and develops on this new planet.
01:35:32.000 Then they go down.
01:35:33.000 And if you're not familiar with what I'm getting to, I'm talking about Venus and Earth.
01:35:38.000 And I've got where this goes from here is a lot of fun, but the general idea being one of the conspiracy theories, and I don't think conspiracy is the right word, is that the stories we have in the Bible about an ark and all of this stuff are actually advanced alien civilization.
01:35:53.000 Two of each animal meant the genomes of the animals that they had collected, and then they sowed to terraform a planet.
01:35:59.000 The greenhouse effect of Venus, we don't, like, we sent a probe to Venus, it went down to the surface, and it just broke because of the density and the acidic environment.
01:36:08.000 So they're conspiracy theorists who think human life originated on Venus, built a spaceship and went and terraformed Earth, and then we brought stories with us.
01:36:18.000 There's theories about Mars as well.
01:36:19.000 And there's been remote viewers who have gone to these places.
01:36:22.000 There have been remote viewers who have gone to the moon and gone to Mars and they've seen structures and all kinds of crazy stuff.
01:36:29.000 Also, by the way, not for nothing, but the moon as a conversation.
01:36:32.000 I don't know if you guys have ever dug into the moon, but it is.
01:36:34.000 It's hollow.
01:36:35.000 Well, it's hollow, which is insane.
01:36:37.000 I don't believe that.
01:36:38.000 I mean, even NASA confirms that.
01:36:40.000 I don't know if they confirm it, but there's people who claim that it rang like a bell.
01:36:43.000 Yes, but NASA did.
NASA did? It rang like a bell?
NASA accidentally kind of dropped something into it, and they heard that, and then they recreated the test, and they confirmed that there is a resonance in the moon that shows that it is not solid. Sorry, just the fact check: it didn't ring like a bell.
01:37:06.000 That was a metaphor used by NASA scientists about long-lasting vibrations that it kept detecting after the fact.
01:37:11.000 So it was vibrating.
01:37:12.000 Yes.
I mean, it didn't ring like a bell, but yes, it vibrated with a resonance that is not conducive to a solid mass like they would have expected it to be.
01:37:20.000 But also, that size and place in like it is, I can't remember what the exact numbers said.
01:37:27.000 It's like, it's like the fact that it makes a perfect solar eclipse, right?
01:37:33.000 Yeah, it's the perfect distance.
01:37:34.000 It's the perfect distance between us and the sun and the size of it to be what it is.
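The "perfect solar eclipse" point is easy to check with rough published figures (the numbers below are standard reference values, not from the show): the Sun is about 400 times wider than the Moon and about 400 times farther away, so both subtend roughly half a degree of sky.

```python
import math

# Rough reference values (assumptions, not from the transcript):
sun_diameter_km = 1.391e6    # Sun diameter
sun_distance_km = 1.496e8    # 1 AU
moon_diameter_km = 3474.0    # Moon diameter
moon_distance_km = 384400.0  # mean Earth-Moon distance

# Small-angle approximation: apparent size = diameter / distance (radians).
sun_angle = math.degrees(sun_diameter_km / sun_distance_km)
moon_angle = math.degrees(moon_diameter_km / moon_distance_km)

print(round(sun_angle, 2), round(moon_angle, 2))  # both about half a degree
```

The two apparent sizes agree to within a few percent, which is why total eclipses are possible at all; since the Moon's orbit is elliptical, it sometimes appears slightly smaller than the Sun, giving annular eclipses instead.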
01:37:38.000 Also, it does not rotate at all.
01:37:40.000 Right.
01:37:41.000 It goes around us, but it doesn't have any spin, which is unlike any other moon ever observed.
01:37:46.000 There is no moon in the observable universe that we've ever found that doesn't also have some kind of its own axis that it spins on.
01:37:52.000 Have you seen Moonfall?
01:37:53.000 No.
01:37:53.000 The movie.
01:37:54.000 No.
01:37:55.000 I enjoyed it.
01:37:56.000 You should watch it.
01:37:56.000 Did you see Moon?
01:37:58.000 Was Sam Rockwell?
01:37:59.000 Yes.
01:38:00.000 So Moonfall is about the moon and it's falling.
01:38:00.000 Awesome movie.
01:38:06.000 And also the villain.
01:38:06.000 Absolutely.
01:38:08.000 What's the guy?
01:38:08.000 What's the name of the guy?
01:38:10.000 Roland Emmerich made it.
01:38:11.000 No, no, no.
01:38:12.000 In Moonfall, it's Patrick Wilson.
01:38:16.000 It's about the moon is falling out of orbit.
01:38:19.000 They're like, what?
01:38:20.000 And then what turns out to be happening.
01:38:22.000 So there's a bit of action in sci-fi.
01:38:24.000 The action is like the moon's close to Earth and you can jump and you jump super high, but it's also sucking the oxygen out.
01:38:31.000 But it turns out the moon is a terraforming space station.
01:38:34.000 Yeah.
01:38:34.000 And there was an advanced civilization that created a bunch of, they created an AI that went rogue and started killing it.
01:38:40.000 And so the humans who fled built space stations to manufacture terraforming spheres.
01:38:46.000 The AI found out where they were building it because they wanted to wipe humans out, destroyed most, but one escaped.
01:38:51.000 That one went and created Earth and seeded life on it.
01:38:55.000 And now, thousands of years later, the AI has finally tracked the moon and it's trying to destroy it, but it's knocked it out of orbit.
01:39:02.000 And then the humans go and then they defeat the AI and then they're inside the moon space station.
01:39:07.000 And then they're like, wow.
01:39:08.000 The third act is absolutely insane.
01:39:10.000 Yeah.
01:39:10.000 Sounds like most Roland Emmerich movies.
01:39:13.000 But listen, but without the moon, we would not have, our Earth would not exist.
01:39:17.000 It would not exist.
01:39:19.000 The moon does spin on its axis.
01:39:22.000 It does so at exactly the same rate as it orbits Earth about once every 27.3 days.
01:39:27.000 And that it matches because of the gravitational interaction between the Earth and the Moon.
01:39:32.000 So it does actually rotate.
It's just that the way it's rotating, it always keeps the same face toward us because it rotates perfectly in time with its orbit.
01:39:40.000 No, but it rotates around us.
01:39:40.000 No, no, no, right.
01:39:43.000 Right.
01:39:44.000 And it spins on fixed.
01:39:45.000 It spins on its axis as well.
01:39:46.000 It just, it spins.
01:39:47.000 It spins.
01:39:48.000 It spins.
01:39:48.000 It's perfectly with the Earth.
01:39:49.000 Yeah.
01:39:50.000 Once every 27.3 days.
01:39:51.000 Right.
01:39:51.000 So no.
01:39:52.000 That doesn't make any sense.
01:39:52.000 No, no, no.
01:39:53.000 No, no, no.
01:39:54.000 Hold on.
01:39:54.000 Hold on.
01:39:54.000 Okay.
We always see the same side of the moon.
01:39:56.000 Yes.
01:39:57.000 Because it is spinning at the exact same rate it's going around.
01:40:00.000 If it didn't spin, it would go like this, and you'd see the back of it.
01:40:04.000 Yeah.
01:40:04.000 No, no.
01:40:05.000 Yes.
01:40:06.000 If it didn't spin, think of it like this.
01:40:06.000 Listen.
01:40:06.000 Yeah.
01:40:09.000 Look, this part, right?
01:40:10.000 If it didn't spin, you would go.
01:40:13.000 Science.
01:40:13.000 Here we go.
01:40:14.000 Science.
01:40:15.000 But here's the bottle.
01:40:15.000 Okay.
01:40:16.000 No, no, no.
01:40:17.000 Here's the bottle not rotating, right?
01:40:20.000 So you can see different parts of it when it's in different areas.
01:40:20.000 Right.
01:40:23.000 Right, but if it's spinning perfectly, I think.
01:40:25.000 Right.
01:40:26.000 I think we're saying the same thing, but I understand your point.
01:40:30.000 My point is that it does not spin away from how it faces us.
01:40:35.000 So it always seems like it has some kind of locking, locked in mechanism to keep it always.
01:40:43.000 We never see the dark side of the moon.
01:40:45.000 But that's because of the gravitational interaction between the Earth and the Moon over billions of years.
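The synchronous-rotation point being argued here can be sketched numerically: if the Moon's spin rate exactly equals its orbital rate, the angle between its facing direction and the Earth-Moon line never changes, so Earth always sees the same face; with zero spin, that angle would drift and every side would come into view over one orbit.

```python
import math

def visible_face_offset(spin_rate_rad_per_day, steps=12):
    """Spin angle minus orbital angle at each step; constant means same face."""
    orbital_rate = 2 * math.pi / 27.3  # one orbit per 27.3 days
    offsets = []
    for day in range(steps):
        orbital_angle = orbital_rate * day
        spin_angle = spin_rate_rad_per_day * day
        offsets.append(spin_angle - orbital_angle)
    return offsets

# Tidally locked: spin rate equals orbital rate, so the offset stays at zero.
locked = visible_face_offset(2 * math.pi / 27.3)

# No spin at all: the offset drifts, so Earth would see every side in turn.
unlocked = visible_face_offset(0.0)
```

This is only a kinematic sketch of the geometry; the actual locking mechanism is tidal torque acting over billions of years, as noted in the conversation.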
01:40:49.000 But that's amazing.
01:40:50.000 One might argue.
01:40:52.000 One might argue.
01:40:52.000 Right.
Or it's the rockets on the other side of the moon pushing it. Maybe. Maybe not.
01:40:58.000 Man, I don't know.
01:40:59.000 At this point, I don't know.
01:41:01.000 Point is, it is astronomically rare for there to be the perfect distance and rotational speed for the moon to do what it does.
01:41:09.000 Yes.
01:41:10.000 And size.
01:41:11.000 And shape.
01:41:11.000 Right.
01:41:12.000 And so it's a perfect sphere.
01:41:14.000 Unlike the other moons that have oblong or kind of, you know.
01:41:18.000 Maybe divine intervention.
01:41:19.000 Absolutely.
01:41:20.000 Maybe intelligent construct.
01:41:22.000 Sure.
01:41:23.000 Why not?
01:41:23.000 I don't know.
01:41:24.000 I just think it's all worth unpacking and talking about because it's fucking fascinating.
01:41:24.000 Yeah.
01:41:29.000 Well, we're going to go to your Rumble Ransom super chats.
01:41:29.000 Indeed.
01:41:32.000 So smash the like button, share the show with everyone, you know.
01:41:35.000 That uncensored portion of the show is coming up at 10 p.m.
01:41:38.000 So join the Discord server at Timcast.com.
01:41:41.000 Click join us.
01:41:42.000 Community is our strength, and we can't do this without you.
01:41:45.000 All the work we do here is possible because you guys are members.
01:41:48.000 And in the Discord, we've got morning shows.
01:41:50.000 We have the new 6 p.m. behind the scenes show with our third chair co-hosts.
01:41:54.000 I believe that was you today, Mary, right?
01:41:56.000 Yes, me and Brent.
01:41:57.000 And Brett.
01:41:59.000 We're being hosted by the Discord community.
01:42:01.000 And tomorrow, you will have access as members of our Discord to the behind-the-scenes backstage pass during our pre-production and pre-record for Timcast IRL, which we've been experimenting with.
01:42:11.000 Basically, you get an extra hour of show as we're setting the show up and goofing off.
01:42:11.000 It's a lot of fun.
01:42:16.000 Last week, we were playing music and Ian was trying to sing or singing.
01:42:19.000 Depends on your definition.
01:42:21.000 But now we will grab your Rumble Ransom Super Chats.
01:42:25.000 Ian's a great singer, by the way.
01:42:27.000 I love him singing Gibana on my phone still.
01:42:30.000 Ian can sing, but Ian tries to sing things he can't sing.
01:42:34.000 Don't we all?
01:42:36.000 How else do you learn to sing?
01:42:37.000 Fair point.
01:42:38.000 That's true, but it's like you can be a good singer and be a singer that is.
01:42:42.000 You can be a good singer that's naturally good at singing, that can just kind of sing off the cuff, or you can be a singer that can sing when you are rehearsed.
01:42:50.000 So you practice something, know how you're going to do it.
01:42:53.000 Then you sing it, you sound good.
01:42:54.000 If you're just like, man, I'm going to jump right in right here, feet first, and you can sound terrible.
01:42:59.000 And then you can also just be tone deaf.
01:43:00.000 Phil, you are a platinum recording artist.
01:43:03.000 Could you sing Barbie Girl?
01:43:03.000 I am.
01:43:04.000 I would not.
01:43:06.000 That's my point.
01:43:06.000 Could you?
01:43:09.000 I could rehearse it.
01:43:10.000 You could do your version of it.
01:43:11.000 And on the same key, I'm guessing.
01:43:14.000 I probably couldn't do it in the same key, but I'd have to do this.
01:43:14.000 I don't know.
01:43:19.000 Humans are different.
01:43:20.000 Yeah.
01:43:21.000 Okay, let's read something.
01:43:23.000 Listen, rehearsing is good for whatever you're doing.
01:43:25.000 Practice is good.
01:43:26.000 All right.
01:43:26.000 You should cover some of the stuff.
01:43:27.000 Let's grab some Rumble Rants.
01:43:28.000 Jarvis says, Tim, promise me you'll unban Jeweled Lotus and Mana Crypt, and you have my vote when you run for president.
01:43:33.000 Done.
01:43:35.000 Done.
01:43:36.000 Well, there's one vote at least.
01:43:37.000 Agreed.
01:43:39.000 Unit unit says both Abbott's Lone Star Program and DeSantis's illegal migration program are listed as ongoing.
01:43:45.000 Where's the flak for helping to F the following states?
01:43:48.000 New York, Illinois, PACOCA 2022 to current day.
01:43:54.000 Yeah.
01:43:55.000 Jamie Braggadel says, Shazam was easily the best part of the Snyder-verse.
01:43:58.000 Thanks for keeping at least some corner of Hollywood sane.
01:44:01.000 In the green room show today, I was explaining to our good friend here that the DC movies from best to worst is number one, Shazam, number two, Shazam 2, and then the rest, I don't know, whatever.
01:44:10.000 They're somewhere in there all mixed up.
01:44:12.000 He's not biased at all because I'm here right now.
01:44:14.000 I'm not, because I've talked about this before on the show, and I'll give you the quick version.
01:44:18.000 They kept trying to make Justice League before they established characters and movies, and I'm just keeping it simple.
01:44:24.000 And Shazam, they were like, let's make a superhero movie that fits the character Shazam, and it was good, and everybody agreed, and they went and saw it.
01:44:31.000 I appreciate that.
01:44:32.000 And it was fantastic.
01:44:33.000 And Shazam 2 was also good.
01:44:35.000 Wasn't as good as the first one, but it's, in my opinion, second best.
01:44:38.000 Marvel made Iron Man, a success.
01:44:40.000 Teased the Hulk.
01:44:42.000 Made The Hulk.
01:44:43.000 They teased, I think, Captain America next.
01:44:45.000 I think that was, or Thor.
01:44:45.000 I think so.
01:44:46.000 No, Thor was teased the end of Captain America.
01:44:48.000 Okay.
01:44:48.000 Maybe.
01:44:49.000 I think so.
01:44:50.000 Something in New Mexico.
01:44:51.000 And then they were like, hey, maybe all these movies are one universe when we do Avengers.
01:44:56.000 And I was even reading about how I think it was Feige saying, they didn't even know the Infinity Stones were going to be in it.
01:45:00.000 They weren't even in the same.
01:45:01.000 A lot of those weren't even made by the same studios in the films.
01:45:04.000 Paramount was involved.
01:47:06.000 Starting with Iron Man, it was all Marvel.
01:47:10.000 The first Captain America, the studio was Paramount.
01:45:13.000 Are you sure?
01:45:13.000 I don't think so.
01:45:14.000 No, no.
01:45:14.000 Yeah.
01:45:15.000 Brett knows stuff like this.
01:45:16.000 Yeah, he does.
01:45:17.000 I got to trust this.
01:45:18.000 You look that up.
01:45:19.000 Don't even look at it.
01:45:20.000 But Marvel famously.
01:45:21.000 Start that up, Glenn Powell.
01:45:22.000 So anyway, anyway, before we go to the next one, I just want to say, DC started by being like, let's make Justice League right away and make a shared universe right away and then do stories later.
01:45:32.000 But people don't know Aquaman.
01:45:35.000 They don't know.
01:45:36.000 They did Man of Steel and then tried making Batman be Superman, but they didn't give us a Batman in the Snyder-verse.
01:45:41.000 So it was just, I don't know.
01:45:43.000 And then Batman, Superman, your mom's name is Martha.
01:45:47.000 My mom's name is Martha.
01:45:48.000 Can we be friends?
01:45:49.000 I'm a fan of the Snyder-verse.
01:45:51.000 I will say one of the things, you know, however you want to slice it, I do think Zack Snyder does shoot really cinematically.
01:46:01.000 Like it's, you know, it's some really beautiful stuff.
01:46:03.000 But there was a plot point also in Batman versus Superman that just always irked me, which was Batman got into that fight with Superman.
01:46:10.000 They were pummeling each other through building after building after building after building, and then just happened to, by chance, land in the building where he had placed the kryptonite sphere.
01:46:22.000 I was like, how do you, how did, how did you manage to make sure that Superman threw you into that building?
01:46:29.000 Like, those types of little plot points always kind of irked me.
01:46:34.000 Paramount Pictures served as primary distributor for the early Marvel Cinematic Universe films for Marvel Studios.
01:46:41.000 Iron Man, Iron Man 2, Thor, Captain America: The First Avenger.
01:46:44.000 That was the distributor.
01:46:46.000 Yes, but they weren't all.
01:46:47.000 Marvel Studios made it.
01:46:48.000 Yes.
01:46:49.000 But I'm saying, but it was distributed by Paramount.
01:46:51.000 No, no, that's not what you said.
01:46:52.000 I say produce.
01:46:54.000 You said they were made.
01:46:55.000 They were made.
01:46:56.000 By the way, fair point, though.
01:46:57.000 I had forgotten that Paramount was the distributor for those.
01:47:00.000 With Hulk was Universal?
01:47:03.000 Yes.
01:47:04.000 First Hulk the first one.
01:47:05.000 The first.
01:47:06.000 Because they still own the rights to that character, which is why he's only allowed to appear in other movies.
01:47:11.000 What I want to say is Peacemaker's ending was bad.
01:47:14.000 Have you said you've seen Peacemaker?
01:47:15.000 I'm not.
01:47:16.000 The first season, fantastic.
01:47:18.000 Season two, really good, and then flubbed the last couple of episodes.
01:47:23.000 I just, this is for James Gunn.
01:47:25.000 You need my expertise on this one.
01:47:27.000 That's how arrogant I'm going to be on this.
01:47:28.000 Augie's death.
01:47:29.000 Did you guys watch this?
01:47:30.000 Don't spoil it.
01:47:32.000 Come on.
01:47:32.000 It's been a month or two.
01:47:34.000 The show's out.
01:47:35.000 I'm not watching.
01:47:36.000 Tim will spoil a movie if he goes to it that day.
01:47:39.000 So first, Peacemaker's actual dad dies in the first season.
01:47:43.000 And then I'm pretty sure this is well known from the trailers and everyone that he goes to another dimension.
01:47:48.000 It's a Nazi dimension.
01:47:49.000 It's been a lot of controversy.
01:47:50.000 They took the Peacemaker dance out of Fortnite because he goes like this with his arms.
01:47:55.000 That's why they took it out?
01:47:56.000 Yeah, because...
01:47:57.000 Because it looks like a swastika?
01:47:58.000 Well, because you have the diagram that shows it doesn't actually make a swastika.
01:48:02.000 It doesn't.
01:48:02.000 But in the show, he goes to another dimension where the Nazis won.
01:48:06.000 And in it, he meets an alternate version of his dad, who is not a Nazi and is a hero and gives a lecture on being a good person.
01:48:13.000 And he's like, I can't be responsible for all the evils of my universe.
01:48:18.000 You must be a saint in yours, but I do what I can to fight the monsters that are in front of me.
01:48:21.000 And then instantly, Vigilante just murders him.
01:48:26.000 And it's just like the show ended for me right at that moment.
01:48:29.000 Peacemaker was not a good guy.
01:48:31.000 He was an anti-hero, struggling to be a hero.
01:48:34.000 And he's kind of a goofball, but he's really, really good at what he does.
01:48:38.000 And they give you this moment where he meets another version of his dad who's proud of him and is teaching him a real life lesson to be a hero.
01:48:45.000 And then for a gag, they kill him.
01:48:46.000 And I'm like, that just ruined everything.
01:48:48.000 Yeah, they take it away from you.
01:48:49.000 They took it away.
01:48:50.000 And then the show ends.
01:48:52.000 The season finale had no finale.
01:48:54.000 Nothing is resolved.
01:48:56.000 I was just like, okay. So everybody's actually said episode seven was the real finale and episode eight was like season three, episode one.
01:49:01.000 It was a dream.
01:49:04.000 I don't know.
01:49:05.000 I think he screwed up.
01:49:06.000 It was really, really good until they killed Augie.
01:49:08.000 And you're just like, well, that just ruined the story.
01:49:11.000 That ruined the whole arc of what he was doing there.
01:49:14.000 The general idea is through the whole.
01:49:16.000 Okay, season one, his dad's got an interdimensional portal that they found in the woods, the rednecks.
01:49:20.000 He used it to make technology and give himself a power suit.
01:49:24.000 At the end of season one, he kills his dad.
01:49:26.000 Story resolves.
01:49:28.000 Season two starts.
01:49:28.000 He goes into the dimension.
01:49:29.000 He finds a better dimension where his life is perfect.
01:49:31.000 He's in love with the woman he likes.
01:49:33.000 His brother's still alive.
01:49:34.000 Everything's perfect.
01:49:35.000 But then you find out the Nazis won.
01:49:38.000 The resolution of this story over a long period of time that his dad in the alternate universe was going to give him a life lesson on being a hero and then should have pat him on the shoulder and said, I know you can be the hero you were meant to be.
01:49:48.000 And I'm sorry that your father and your universe was not there for you, but I'll be what I can for you now.
01:49:55.000 And then he pushes him through the door and he locks it.
01:49:57.000 And then Chris is sad and he's like, no.
01:49:59.000 But then he takes the lesson and tries to be a better hero.
01:50:01.000 Instead, they're just like, just kill him randomly.
01:50:03.000 Just that's his type of humor.
01:50:06.000 That's his type of issue.
01:50:06.000 It wasn't funny.
01:50:08.000 And I was like, the payoff for the show ended halfway through, and then they end with nothing happening.
01:50:14.000 I was like, wow, that was a major flub.
01:50:16.000 So the only way they can solve for this problem is if in the next movie they're doing, I think it's not Superman.
01:50:22.000 It's a Superman-related film.
01:50:23.000 They're doing Supergirl.
01:50:25.000 No, but like the one, is Superman's going to be in it?
01:50:28.000 Yes, he will.
01:50:29.000 They just did reshoots.
01:50:30.000 Like they added scenes with him and Lobo to.
01:50:33.000 You see, Jason Momoa as Lobo is perfect.
01:50:37.000 The only thing they need to do to- They made a billion dollars on Aquaman, so he was good as Aquaman, too.
01:50:37.000 Yeah.
01:50:41.000 But he's a perfect Lobo.
01:50:43.000 The only redemption, in my opinion, is if they bring back Shazam.
01:50:46.000 There you go.
01:50:47.000 Zachary Levi, bring back the character.
01:50:49.000 Let's go.
01:50:50.000 Well, they've already proven that they're willing to bring people back despite the fact that they said they were relaunching.
01:50:54.000 Well, I will say this.
01:50:55.000 I think the success of your films compared to the other DC films, he should do it.
01:51:01.000 That's just me because I'm a fan.
01:51:01.000 He should.
01:51:03.000 Well, I appreciate it.
01:51:04.000 Well, I am.
01:51:05.000 That movie was.
01:51:07.000 You guys who watched the green room where I was basically articulating my love for DC.
01:51:11.000 I like Marvel.
01:51:12.000 I love the Marvel Cinematic Universe, but DC has always been better because I feel like the storylines were actually a bit more philosophical and dancing around the moral consequences of our actions and things like that.
01:51:22.000 Whereas Marvel is just like, I don't know.
01:51:26.000 I'm going to ask when the Chuck reboot is coming.
01:51:28.000 Bro, I've been trying to make a Chuck movie for some.
01:51:30.000 Can we get the Chuck?
01:51:31.000 If we can get psych movies, we can get Chuck movies.
01:51:33.000 I would love to.
01:51:34.000 All right, we got to grab some more of this.
01:51:34.000 I would love to.
01:51:36.000 Aura the Red says, let it be overturned.
01:51:38.000 I'm gay, but I never wanted to be married anyway.
01:51:40.000 Bring back the civil union and respect marriage as dictated by the church.
01:51:43.000 We shouldn't be imposing here.
01:51:46.000 AK Storm says, if gay marriage isn't overturned, FPC and GOA should immediately push for CCW national reciprocity.
01:51:55.000 Agreed.
01:51:57.000 Agreed.
01:51:57.000 For those that aren't familiar, it's basically universal gun ownership and concealed carry without a permit.
01:52:03.000 They should do that anyway.
01:52:06.000 Taiwan Cricket says 3I/ATLAS is from Earth.
01:52:08.000 That's why it has water and is returning after about 4,000 years.
01:52:11.000 It's classified as interstellar because the mass of the solar system has been underestimated.
01:52:16.000 And then he adds, seemingly unrelated, gay is not okay.
01:52:20.000 Okay?
01:52:21.000 So it's the argument that...
01:52:23.000 It has water?
01:52:24.000 Yeah.
01:52:24.000 Well, actually, I think they said it was depleted of water.
01:52:26.000 Oh, okay.
01:52:27.000 That's why it's not emitting.
01:52:28.000 Comets usually emit a tail, like a blue, it's water vapor, and this one doesn't.
01:52:32.000 It's carbon dioxide.
01:52:33.000 So.
01:52:35.000 James Smith Politics says overturning Obergefell will only open the door to Republicans losing support.
01:52:40.000 Most people are indifferent about gay marriage, and I feel that Republican states would be scared to ban gay marriage.
01:52:46.000 It's like 33% now saying that they would rather make gay marriage illegal.
01:52:53.000 33%.
01:52:54.000 33%.
01:52:54.000 Illegal.
01:52:55.000 Yeah.
01:52:55.000 Not 33% of the states, no. Like, the general population.
01:53:03.000 Sergeant Caesar says, Zach.
01:53:06.000 Right?
01:53:06.000 Sergeant Caesar says, Zach, what was your favorite role?
01:53:09.000 I'm hoping you say it was Charles Irving Bartowski.
01:53:12.000 It's definitely up there.
01:53:13.000 Yeah, Chuck was, I mean, an incredible experience and adventure.
01:53:17.000 And that whole cast, we're still friends.
01:53:20.000 And I mean, I don't get to talk to them or see them nearly as much anymore.
01:53:24.000 But like I was saying, I mean, I've been trying to make a Chuck movie for a long time.
01:53:27.000 I think I'm going to reboot Chuck.
01:53:29.000 When the content creator AI system comes out, I will ask it to make it.
01:53:34.000 Does that make you feel violated?
01:53:37.000 Can I ask you a question?
01:53:38.000 How do you feel about like some actors don't like being tied to roles they did when they're younger and they want to move on and they don't like the idea that people think of them in that way?
01:53:47.000 No, I don't mind it.
01:53:48.000 I mean, listen, it's like bands that they get mad that their biggest hit is something everybody loves and they stop wanting.
01:53:54.000 Man, I just think as an artist, you should be grateful that anybody gives a shit about you at all and likes what you do.
01:53:59.000 And then, like, I do conventions all the time.
01:54:02.000 I love doing them.
01:54:03.000 I love people and I'm an extrovert.
01:54:04.000 So, like, I don't know how my friends who are introverts do conventions.
01:54:07.000 It must drain them insanely.
01:54:09.000 But I love it.
01:54:10.000 I love meeting people.
01:54:11.000 I honestly, philosophically, I look at it like I'm basically just getting paid to love on people all day long and be loved on.
01:54:17.000 And the amount of people that tell me, they go out of their way to tell me, hey, man, I was going through a really hard time in my life.
01:54:25.000 And Chuck or Shazam or Tangled got me through it.
01:54:29.000 I can do that.
01:54:30.000 And I go, thank you for sharing that with me.
01:54:32.000 And if you see me as Flynn Ryder or as Chuck or as Billy Batson for the rest of my career, if that brings you joy, then absolutely.
01:54:41.000 I can do that for you right now.
01:54:42.000 So my wife made sure I had to tell this story.
01:54:45.000 She said that I had to do this.
01:54:46.000 I introduced her to that show, and me and her essentially fell in love while watching Chuck.
01:54:52.000 Yeah.
01:54:53.000 Love it.
01:54:55.000 When you're doing Shazam, did you have to, like, I don't know the name of the actor who played Billy Batson?
01:55:02.000 Did you have to, like, study his behavioral patterns to try and be him as this man?
01:55:02.000 Asher Angel.
01:55:08.000 I tried.
01:55:10.000 I mean, we were given no time, basically.
01:55:12.000 I mean, when we ramped up into the first movie, it all happened really fast.
01:55:15.000 And then we were all thrown together in Toronto and he was in school and I was working and then he would be working.
01:55:21.000 We were never really on set at the same time.
01:55:22.000 Wow.
01:55:23.000 And we jammed into the movie.
01:55:24.000 And then by the second time we got to the second movie, I mean, that all came together pretty quick.
01:55:28.000 But, you know, I love Asher.
01:55:31.000 He's a really great kid.
01:55:32.000 And all, I mean, really, that whole Shazamly, as we called it, lovingly, all of the adults and all of the kids, we all got along really well.
01:55:41.000 But I, you know, I did my best for that.
01:55:43.000 I will say, you know, it was like in the second movie, it's weird.
01:55:47.000 It's like, you know, I got dragged for things that like I had nothing to do with.
01:55:51.000 Like there's this moment where Asher, he's having this emotional moment with Marta, who plays our mother in the film.
01:56:02.000 And then he, you know, and she's like, you got it.
01:56:06.000 And, you know, go take on that dragon, basically.
01:56:08.000 And he says, Shazam, and he turns back into me.
01:56:10.000 Well, we had a smoke effect where they would like pump this smoke in and he would like, you know, come out and we do a cowboy switch and I would jump in there.
01:56:16.000 And they smoked me up.
01:56:17.000 Well, it got into my eyes.
01:56:18.000 And I, so you can see it in the film.
01:56:20.000 They kept it in the cut, but it wasn't like it was an acting choice.
01:56:23.000 I was like, oh, geez, and had no idea that they were going to keep that in the movie.
01:56:26.000 Well, they thought it was funny.
01:56:27.000 And so they kept it in the movie.
01:56:29.000 And then all of these just like haters online were like, this is you screwing up the role.
01:56:34.000 You're not even doing justice to his emotion.
01:56:36.000 I'm like, guys, I gave so many different takes.
01:56:39.000 They chose to use that take.
01:56:40.000 Why are you pinning that on me?
01:56:42.000 It's crazy.
01:56:43.000 As an actor, you get all of the flack.
01:56:46.000 Like, people don't.
01:56:48.000 Sorry for that.
01:56:48.000 No, you're gone.
01:56:49.000 You're good.
01:56:50.000 I'm also, you know, I'm a big fan of the band Eve Six.
01:56:54.000 You guys know Eve Six?
01:56:55.000 I've heard of him.
01:56:56.000 And the singer Max.
01:56:56.000 Yes.
01:56:58.000 He's like super anti-Trump.
01:57:00.000 I used to tweet at him all the time.
01:57:01.000 But when I was a kid growing up, I could play all their hits with my garage band, my friends.
01:57:06.000 And of course, they had Inside Out, they had Promise, and they had Here's to the Night.
01:57:12.000 However, this massive success they got in the late 90s started to disappear.
01:57:18.000 I think it's really obvious why: it's Napster.
01:57:20.000 And so through no fault of their own, the medium changes, the sales aren't there anymore, and then the industry loses interest as if it's something to do with them as musicians, as opposed to the way the medium is changing, which is to your point about no one goes to theaters anymore.
01:57:38.000 And that means that all of this really great opportunity we have for big movies, I think there's a lot of people in Hollywood who don't understand.
01:57:45.000 It's not so much whether it was a good film or not.
01:57:48.000 Sometimes it is, but it's that how you deliver it to an audience who consumes media a different way is restricting our ability to have better content.
01:57:57.000 That's why they're starting to hire TikTokers to do edits of movies to release them.
01:58:01.000 They want to drive new audiences.
01:58:03.000 Indeed, let's grab some more here and see what y'all have to say.
01:58:07.000 If YouTube doesn't crash on me.
01:58:10.000 All right.
01:58:11.000 Ready to Rumble says, if it gets overturned, they lost the midterms.
01:58:14.000 I don't disagree.
01:58:16.000 Yeah, I don't disagree.
01:58:18.000 Paps McGee says, Chuck, we love you so much in our household.
01:58:21.000 So cool to have you on the show.
01:58:23.000 Thanks, Paps McGee.
01:58:24.000 Happy to be here.
01:58:26.000 And in your home.
01:58:29.000 All right.
01:58:31.000 Let's see what else we got.
01:58:33.000 Mouth Breather says, why couldn't civil unions for gay couples just be given the same benefits as marriage and we can keep marriage separate as a religious ceremony union?
01:58:41.000 It just seems like an intentional affront to religious groups that disagree with gay marriage.
01:58:46.000 Because feelings.
01:58:47.000 I agree.
01:58:49.000 Civil unions that are identical in every way, like the legal structure of marriage.
01:58:54.000 So 501c3s basically imitate churches, but it's a separate legal structure.
01:58:59.000 But the point is, it doesn't subvert marriage that way.
01:59:02.000 And for the people that want to see it subverted, there is, like I said, a malicious intent on the left.
01:59:08.000 The activists, now it's not everybody that has a left-leaning opinion or whatever, but the activists in the LGBT group, like they are malicious.
01:59:17.000 That's why they went after the cake baker in Denver.
01:59:21.000 They could have gone to any other bakery.
01:59:23.000 Yeah, no, I agree.
01:59:24.000 That was totally fucked up.
01:59:25.000 And the point of it was to attack them for being Christians.
01:59:29.000 So it's totally about malice.
01:59:31.000 It's totally about attacking Christians because the LGBT groups will say, well, the Christians have attacked us.
01:59:36.000 And so they are looking for retribution.
01:59:39.000 That is the reason why.
01:59:41.000 I also still don't think marriage should have anything to do with the government.
01:59:44.000 If you want to go have a religious ceremony, go have a religious ceremony.
01:59:48.000 Abolish airport taxes.
01:59:50.000 Amen.
01:59:53.000 That's fair, but the LGBT groups are going to say no because they want to use vectors to attack.
02:00:00.000 Like they're going to, they would be against the idea of making it not a government thing because then they can't use that to oppress people that they feel have oppressed them.
02:00:10.000 This is all about exercising power.
02:00:13.000 It is not at all about, oh, well, we just want to be treated the same or what have you.
02:00:18.000 That is not the case.
02:00:19.000 It is malicious and it is entirely malicious.
02:00:23.000 Ready to Rumble says, Tim calling them cowards as he talks about hiding.
02:00:28.000 Did I say I wouldn't keep doing this?
02:00:30.000 I said, I'd keep doing the show, putting my face out there, calling out what I think needs to be called out.
02:00:36.000 The Supreme Court, however, is largely behind the scenes.
02:00:40.000 We know who they are, but we don't watch them on camera.
02:00:43.000 You don't hear from them on a day-to-day basis.
02:00:45.000 And too often, they have refused to rule on court cases this country needs.
02:00:49.000 So let's put it simply: call them whatever you want.
02:00:52.000 If they are not prepared to do the job that they've been selected and agreed to do, they should resign right now.
02:00:58.000 But I also think that we're living in assassination culture, and that needs to be fixed on a cultural level.
02:01:06.000 So that they don't have to fear for their lives if they make a decision.
02:01:09.000 Sure, but that's very much in line with how leftists say rapists should be taught not to rape.
02:01:15.000 Okay, assassin.
02:01:18.000 People being publicly assassinated for their political opinions, it's not just that it exists, it's mainstream now.
02:01:28.000 Until recently, that was not the case.
02:01:29.000 Well, assassinations have been around for a very, very long time, and they're just escalating once again.
02:01:33.000 The mass general public thinking that it's acceptable is a new thing entirely.
02:01:38.000 And you can't remedy that overnight.
02:01:40.000 And so if the argument is the Supreme Court justice can't do their job because of it, they should resign.
02:01:45.000 I'm saying a both and statement here.
02:01:47.000 I agree with that.
02:01:48.000 And I'm saying we shouldn't, as a society, just accept that political violence is okay.
02:01:54.000 We don't.
02:01:55.000 Increasingly, we do.
02:01:56.000 No, no, no.
02:01:57.000 We don't.
02:01:58.000 Okay, if we're talking about we in this room, of course not.
02:02:00.000 Like, we, we in this political faction don't accept this.
02:02:02.000 We're, of course, angry with it, but that doesn't change the fact the Supreme Court has a job to do if we are to remedy it.
02:02:08.000 And if they're too scared to do it, they need to retire.
02:02:11.000 Okay, I'm not objecting to that at all.
02:02:13.000 I'll do it.
02:02:14.000 Nominate me.
02:02:15.000 I'll go nuts.
02:02:16.000 Like, bro, if I was on the Supreme Court, I'd be like, everybody can have guns, concealed carry.
02:02:21.000 Permits, your permits, constitution.
02:02:23.000 Have fun.
02:02:25.000 Oh, bro.
02:02:26.000 Does that mean if Clarence Thomas retires, we can get a Clarence Thomas podcast?
02:02:30.000 If Clarence Thomas, if I was on the Supreme Court with Clarence Thomas, he would start complaining about me.
02:02:36.000 He'd be like, this guy's going crazy.
02:02:38.000 I mean, he doesn't understand law at all.
02:02:40.000 Never spent a day in law school.
02:02:42.000 He's saying things should be legal.
02:02:43.000 That's clearly going to screw up.
02:02:45.000 I'm not going to be like, you're right.
02:02:46.000 I don't care.
02:02:47.000 Everybody gets to have guns.
02:02:48.000 You want to put a nuke down your pants?
02:02:50.000 Have fun.
02:02:50.000 Second Amendment.
02:02:51.000 Doesn't say anything in the Second Amendment about restrictions or mental health or any of that stuff.
02:02:56.000 If you don't like that, you got to change the Constitution.
02:02:58.000 That's what we have in the first place.
02:03:00.000 All right.
02:03:00.000 We're going to go to the uncensored portion of the show.
02:03:02.000 So smash the like button, share the show.
02:03:03.000 Head over to rumble.com slash Timcast IRL.
02:03:07.000 It's going to be fun.
02:03:08.000 You can follow me on X and Instagram at Timcast.
02:03:11.000 Zachary, do you want to shout anything out?
02:03:12.000 Yeah, please go.
02:03:14.000 I have a couple of movies that are coming out.
02:03:16.000 November 7th, which is tomorrow.
02:03:18.000 I have a movie called Sarah's Oil.
02:03:19.000 It's a true story about a young black girl, turn of the century, Tulsa, Oklahoma, whose family had essentially been enslaved by one of the indigenous tribes.
02:03:29.000 A lot of people don't know that, but everybody had slaves for a really long time.
02:03:33.000 When the U.S. outlawed slavery, so, therefore, did the indigenous tribes.
02:03:37.000 And they made those slaves freedmen, tribal members.
02:03:41.000 And then when the states gave back land to the tribes, like in Oklahoma, that land was divvied out to all their tribal members.
02:03:47.000 So this girl got 160 acres that the government thought was some crap land because you couldn't grow anything on it.
02:03:51.000 But she was an intelligent, precocious, and spirit-filled girl who could read and write.
02:03:55.000 And she was reading about the oil boom coming across America.
02:03:58.000 And she went to her land and prayed over her land.
02:04:00.000 And she believed that God told her that there was oil in her land.
02:04:03.000 And sure as shit, she had the largest, purest oil reserve in all of North America.
02:04:06.000 She became the richest woman in America as like a 10-year-old black girl in Tulsa in like 1912.
02:04:12.000 Wow.
02:04:12.000 Crazy.
02:04:13.000 So anyway, that comes out tomorrow.
02:04:15.000 It's a really uplifting, inspiring film.
02:04:18.000 Good for the whole family.
02:04:19.000 And then on December 12th, I have a movie called Not Without Hope that's coming out, also a true story.
02:04:24.000 2009, there were four buddies who went fishing off the coast of Tampa, Florida.
02:04:27.000 You might have heard about this.
02:04:28.000 There was two of them that were NFL players.
02:04:30.000 Their boat capsized.
02:04:31.000 They got caught in a storm and three of the four died of hypothermia.
02:04:34.000 And I played Nick Schuyler, who was the only survivor.
02:04:36.000 So not quite as uplifting, but still has a happy ending at the end.
02:04:40.000 Hopefully.
02:04:41.000 That'd be great.
02:04:41.000 Anyway, go check those out.
02:04:42.000 And go see him in the theaters, please, if you can.
02:04:44.000 Is that the one with Josh Duhamel?
02:04:46.000 Josh Duhamel's in it as well.
02:04:47.000 Yeah.
02:04:47.000 Yeah.
02:04:47.000 Awesome.
02:04:48.000 He's great.
02:04:48.000 Josh is fantastic.
02:04:50.000 Guys, if you want to follow me, I am on Instagram and X at Brett Dasovic on both of those platforms.
02:04:54.000 But what you should do is check out Pop Culture Crisis.
02:04:57.000 We are live Monday through Friday, 3 p.m. Eastern Standard Time, which is, of course, Noon Pacific.
02:05:01.000 We will see you there.
02:05:02.000 I second that.
02:05:03.000 Of course, you should go subscribe to Pop Culture Crisis.
02:05:05.000 You can send me validation on Instagram at MaryArchived, or you can send me hate on X.
02:05:10.000 That is also Mary Archived.
02:05:11.000 And help me get TikTok famous.
02:05:13.000 That is also Mary Archived.
02:05:15.000 I am Phil that remains on Twix.
02:05:16.000 The band is all that remains.
02:05:18.000 You can check us out on Apple Music, Amazon Music, Pandora, Spotify, and Deezer.
02:05:21.000 Don't forget, the left lane is for crime.
02:05:24.000 We will see you all over at rumble.com slash Timcast IRL in about 30 seconds.
02:05:28.000 Thanks for hanging out.
02:06:19.000 Be heard.
02:06:20.000 He gets it.
02:06:21.000 So Mr. Levi over here was asking, how is that not uncensored?
02:06:26.000 Well, the first and foremost is that we do swear sometimes.
02:06:28.000 We try not to because a lot of parents, what initially happened was we didn't care.
02:06:35.000 We swear.
02:06:36.000 We're like, well, whatever.
02:06:36.000 We're swearing.
02:06:37.000 You know, if we swear, we swear.
02:06:40.000 But then we got messages from people who are like, hey, I have this on my TV.
02:06:43.000 My kids are in the room.
02:06:44.000 And when you swear, I got to turn it off.
02:06:46.000 And so I was like, okay, we'll not swear.
02:06:49.000 So for the uncensored, we do swear.
02:06:51.000 But when we first launched the uncensored portion of the show, you could not say masks don't do shit.
02:06:58.000 You get banned for that.
02:07:00.000 So we created this portion of the show so that you could have the behind the scenes, say whatever you want.
02:07:05.000 So we do switch.
02:07:06.000 Masks don't do shit.
02:07:07.000 Exactly.
02:07:07.000 Vaccine stuff.
02:07:08.000 Yeah.
02:07:09.000 Yeah.
02:07:10.000 Because back, like, it was really big.
02:07:12.000 You know, people don't understand this.
02:07:13.000 They've forgotten.
02:07:15.000 In 2018, you could not publicly say you voted for Trump.
02:07:19.000 People don't realize this.
02:07:21.000 The Palmetto Cheese guy, this is during 2020 or whatever.
02:07:24.000 I can't remember what it was.
02:07:25.000 The Palmetto Cheese guy said he was a Trump supporter and they tried removing his cheese from supermarkets.
02:07:31.000 What?
02:07:32.000 That's what happened.
02:07:33.000 And people have forgotten how fucking insane it was because we've won so much.
02:07:37.000 I just did a video today after the show about how people are calling for a boycott of Spotify because they are running ads for ICE on there for ICE recruitment.
02:07:48.000 And Spotify is basically like, no, it doesn't break TOS.
02:07:51.000 We're going to keep running them so you can do whatever you want to do.
02:07:54.000 Like that would be very different just five years ago.
It would be completely different.
02:07:58.000 Yeah, because the artists need Spotify more than Spotify needs them.
Yeah, but even then, it was like, bad optics are bad optics.
02:08:05.000 And a lot of these companies would have caved and made those moves just because they wouldn't have wanted the public persona.
02:08:11.000 Now I feel like internet back and forth has hit such a critical mass that they know they can just wait it out.
02:08:16.000 And Spotify is like, we're going to do it anyway.
02:08:18.000 Like you said, they need them more than the other people need them.
And then a lot of people are like, we're going to go to Tidal.
02:08:23.000 I also think the 2016 Trump supporter was associated with a much more radical right-wing message than the 2024 Trump supporter, simply because the coalition has grown so much and includes so many more people.
02:08:36.000 Well, like that graph we were looking at before.
02:08:36.000 Yeah.
02:08:38.000 I mean, that was not the same graph in 2020.
02:08:42.000 Right.
02:08:42.000 Right.
Or 2020.
02:08:43.000 Yeah.
02:08:44.000 2016, maybe.
02:08:45.000 I bet you 2016.
02:08:46.000 Well, 2016 or 2020.
02:08:48.000 I mean, one could argue if we looked at that graph in each one of them, it was getting more steadily toward the one we saw.
02:08:48.000 Yeah.
02:08:54.000 But yeah, it's a much bigger coalition of lots of different thinking people from different backgrounds that is definitely helping to shift all of that paradigm, which is good.
02:09:06.000 Do you think a lot of those people were always there but just didn't want to talk about it?
02:09:10.000 Or do you think they just?
02:09:12.000 No, because I don't think as many minorities or gays and lesbians were on board with Trump.
02:09:20.000 I wasn't.
02:09:20.000 I didn't vote for Trump the first two times.
02:09:23.000 I didn't.
Like I was saying before we did the show, I think that there was a miracle in Butler.
02:09:30.000 I think that there was a moment.
02:09:31.000 It wasn't just miraculously saving his life.
02:09:33.000 It was a miraculous transformation of who he was in that moment and also how we were all viewing it all in real time.
02:09:41.000 Is that when you made the decision to vote for him?
No, it was when Bobby Kennedy ultimately, when Bobby and Tulsi Gabbard, two lifelong Democrats who I am fortunate enough to know personally at this point, but for both of them to go sit down with Donald Trump and for him to convince them both.
02:10:00.000 They're very savvy, very intelligent people that they're not just going to fall for anything.
02:10:05.000 So when they asked me, when Tulsi reached out to me and said, hey, would you moderate, it was in Dearborn, Michigan.
02:10:12.000 They were doing like a run-up to the election and they were on the campaign trail.
02:10:15.000 And they asked me if I would go moderate their conversation in Dearborn.
02:10:19.000 And I asked them both.
02:10:20.000 I said, do you trust him?
02:10:21.000 Do you actually believe that he means what he says?
02:10:24.000 And it's not just going to be politics and it's not just going to be gaming it and it's not just going to be for his own benefit.
02:10:29.000 Do you really believe it?
02:10:30.000 And they said, yes, we do.
02:10:30.000 And I said, well, then let's go.
02:10:32.000 Quick correction.
02:10:33.000 The owner of Palmetto called BLM a terrorist organization after the riots.
02:10:38.000 And for this, they said pull his products.
02:10:40.000 So he was still correct.
02:10:41.000 He was right.
Yeah, these people were rioting across the country and he was clearly as upset as the rest of us were.
02:10:45.000 Remember that?
02:10:46.000 Didn't they just get sued?
02:10:47.000 Aren't they selling all of their homes that they bought with all that money?
02:10:49.000 Didn't you see?
02:10:50.000 Did you see that?
02:10:50.000 Oh, are they selling them now?
02:10:51.000 I think they have to sell them.
They're under investigation.
02:10:55.000 Remember when you had Papa John on early days of Tim Cast?
02:11:00.000 That was a couple years ago.
That's when it happened to Papa John.
In '23.
02:11:03.000 I just saw him on X the other day.
02:11:05.000 I was going through.
02:11:06.000 He was posing with F1 racers or something.
02:11:09.000 I was like, what's Papa John doing these days?
02:11:10.000 Is he thriving?
02:11:11.000 Well, look, Gavin McInnes said that, you know who Gavin McInnes is?
02:11:16.000 I think so.
02:11:16.000 Founder of the Proud Boys.
02:11:18.000 Oh, yes.
02:11:18.000 Okay.
02:11:18.000 He got canceled everywhere.
And he said, these days, when he walks down the street, people give him hugs and take selfies with him.
02:11:25.000 Like, that's how far we've come culturally that Gavin's not a bad guy.
02:11:29.000 He's not a crazy guy.
02:11:30.000 They malign him as like some white supremacist bullshit.
02:11:33.000 He's just kind of nuts.
02:11:35.000 But he's not a white supremacist.
02:11:37.000 He's super pro-Israel.
02:11:38.000 He lives in New York.
And he founded Vice.
02:11:41.000 He's an edgy guy, and he's done some really questionable things.
02:11:43.000 Oh, yeah, yeah, yeah, yeah.
02:11:44.000 I know exactly.
02:11:44.000 He was on Rogan recently, I think.
02:11:46.000 No, no, no, no.
02:11:47.000 No?
02:11:47.000 Probably not.
02:11:48.000 Was he?
02:11:49.000 I don't think Gavin was on Rogan.
02:11:50.000 I don't think recently.
02:11:51.000 Maybe not.
02:11:51.000 I don't know.
02:11:52.000 Yeah, definitely not Gavin.
02:11:53.000 But the Proud Boys was this goofy boys drinking club that started fighting Antifa.
02:11:58.000 And so they tried to make it out like he's some evil white supremacist.
02:12:02.000 But the point is, they canceled him everywhere.
02:12:04.000 Now he's walking around New York and people are like, yo, Gavin, let's get a picture.
02:12:08.000 Like we are winning the narrative battle.
02:12:09.000 It's coming back.
02:12:10.000 And that was like the heyday of getting debanked even more so.
02:12:14.000 Yep.