Timcast IRL - Tim Pool - May 31, 2025


FBI Releases TRANS Shooter MANIFESTO Exposing Anti-Christian HATE | Timcast IRL


Episode Stats

Length

2 hours and 4 minutes

Words per Minute

182.9

Word Count

22,812

Sentence Count

2,054

Misogynist Sentences

26

Hate Speech Sentences

40


Summary

The FBI releases a document detailing anti-Christian writings written by a transgender shooter who was a member of a white supremacist group. Plus, Elon Musk gets a black eye, and there's a video of Sasquatch.


Transcript

00:02:57.000 The FBI has released I believe 112 pages of the transgender shooter's manifesto detailing anti-Christian diatribes, lists, maps, faculty, etc.
00:03:08.000 We are now getting a glimpse into what was actually going on with this unwell individual.
00:03:13.000 And of course, this is just another day in the new FBI under the Trump administration where they're actually giving us transparency and exposing what's been going on.
00:03:22.000 Because under the Biden admin, much of this was covered up.
00:03:24.000 And even during Trump's first administration, Comey.
00:03:27.000 Well, let's just talk about Comey.
00:03:29.000 Kash Patel says he orchestrated the largest criminal conspiracy in this country, helped facilitate the Russiagate hoax, and as head of the FBI, was involved in what appears to be, according to the GOP, a cover-up of certain individuals lying to Congress to push that Russia hoax.
00:03:46.000 Now we have Comey saying that the GOP is white supremacist adjacent, whatever that means.
00:03:52.000 This is what we get, right?
00:03:54.000 So big news today.
00:03:55.000 Donald Trump was asked about Joe Biden having cancer and said, I don't really feel bad for him, which was kind of brutal, kind of funny.
00:04:03.000 We'll talk about that.
00:04:04.000 Plus, we got some news for you.
00:04:06.000 Elon Musk got a black eye.
00:04:08.000 And everybody, there's some weird conspiracy about some secret society with people with black eyes.
00:04:13.000 It's like, or he got hit in the face by the, I don't know, like bonked his head into a cabinet.
00:04:16.000 I have no idea.
00:04:17.000 But there's also a video where people are claiming they caught Sasquatch.
00:04:20.000 And so because it's Friday, we'll talk about the serious news, and then I'll make sure to show you the video of Sasquatch, because it's funny.
00:04:26.000 Before we get started, my friends, we've got a great sponsor.
00:04:28.000 It is Tax Network USA.
00:04:31.000 My friends, go to tnusa.com slash Tim.
00:04:34.000 Tax day has passed.
00:04:36.000 But for millions of Americans, the real trouble is just beginning.
00:04:39.000 If you missed the April 15th deadline or still owe back taxes, the IRS is ramping up enforcement.
00:04:44.000 Every day you wait only makes things worse.
00:04:46.000 With over 5,000 new tax liens filed daily and tools like property seizures, bank levies and wage garnishments, the IRS is applying pressure at levels we haven't seen in years.
00:04:55.000 Increased administrative scrutiny means collections are moving fast.
00:04:58.000 The good news is there's still time for Tax Network USA to help.
00:05:02.000 Self-employed or business owner, even if your books are a mess, they've got it covered.
00:05:05.000 Tax Network USA specializes in cleaning up financial chaos and getting you back on track fast.
00:05:11.000 Even after the deadline, it's not too late to regain control.
00:05:14.000 Your consultation is completely free and acting now could stop penalties, threatening letters and surprise levies before they escalate.
00:05:20.000 Call 1-800-958-1000 or visit TNUSA.com slash Tim.
00:05:27.000 You may have missed April 15th, but you haven't run out of options.
00:05:30.000 Let Tax Network USA help before the IRS makes the next move.
00:05:33.000 And of course, my friends, go to castbrew.com and buy coffee.
00:05:36.000 We've got some great coffees for you.
00:05:37.000 Everybody loves the Appalachian Nights and the Ian's Graphene Dream.
00:05:40.000 But don't forget, we have coffee pods for your little coffee machines.
00:05:43.000 And we've got a bunch of other blends and flavors.
00:05:45.000 We've got Luck of the Seamus Irish Cream, massively in stock now.
00:05:48.000 And there's a special surprise on the back of the bag, art-wise, for those that are interested.
00:05:52.000 We've got Focus with Mr. Bocas.
00:05:54.000 Two weeks till Christmas, even though we're six months from...
00:05:57.000 It's okay.
00:05:58.000 It's still good.
00:05:59.000 And, of course, there's always Sleepy Joe Decaf if you're someone who doesn't like the caffeine.
00:06:04.000 Don't forget to smash that like button.
00:06:05.000 Share the show with everyone you know.
00:06:07.000 Joining us tonight to talk about this and so much more is Kelsey Sharon.
00:06:10.000 Hello, friends.
00:06:11.000 Who are you?
00:06:12.000 What do you do?
00:06:12.000 I do a lot of things, but I am from the communist country of Canada.
00:06:16.000 We're banned in Canada.
00:06:18.000 I'm banned too.
00:06:19.000 I feel like I'm in welcome company here.
00:06:21.000 We just found out about this because of you.
00:06:23.000 I know.
00:06:23.000 I'm so sorry about that.
00:06:24.000 But no, I'm a podcaster.
00:06:26.000 I have a show called The Kelsey Sharon Perspective.
00:06:28.000 I'm an author.
00:06:28.000 I'm on Substack as well.
00:06:30.000 I'm also a combat veteran and I work with professional athletes and I'm a mental strength coach as well.
00:06:35.000 You had tried posting the link to our website.
00:06:37.000 Yes.
00:06:38.000 And it says people in Canada can't see it.
00:06:40.000 Why would we want to be able to see freedom?
00:06:43.000 Yeah.
00:06:43.000 Yeah.
00:06:44.000 Well, we will be greeted as liberators when we march on Canada.
00:06:47.000 Listen, when you come to the door, which is two feet from the border, I will welcome you in with open arms.
00:06:52.000 See?
00:06:53.000 See?
00:06:53.000 That proves it.
00:06:53.000 I would like to be the 51st state.
00:06:55.000 I've said it once.
00:06:56.000 I'll say it again.
00:06:57.000 All right.
00:06:57.000 Well, it should be fun.
00:06:58.000 Thanks for hanging out.
00:06:59.000 We got Brett hanging out.
00:06:59.000 What is going on, guys?
00:07:00.000 Brett, normally Pop Culture Crisis, Monday through Friday, 3 p.m. Eastern Standard Time.
00:07:04.000 We don't want Canada as the 51st state because the voting is going to become even more lopsided if we introduce communist Canadians as voters for America.
00:07:15.000 It's a horrible idea.
00:07:16.000 What if we're voting for your people?
00:07:18.000 But you're not going.
00:07:18.000 You might.
00:07:19.000 I will.
00:07:20.000 But the rest of the country won't.
00:07:20.000 But you don't know who I know.
00:07:22.000 There's only one of you.
00:07:23.000 Anyways, I'm Phil Labonte, lead singer of the heavy metal band All That Remains, an anti-communist and counter-revolutionary.
00:07:29.000 Let's get into it.
00:07:29.000 Here's a story from the Postmillennial.
00:07:31.000 FBI releases 112 pages of writing from Nashville trans school shooter showing maps,
00:07:36.000 faculty lists, and anti-Christian diatribes.
00:07:40.000 They said the FBI on Thursday released over 100 pages, of course 112, related to the Nashville Covenant School shooting, which killed three children and three staff members.
00:07:49.000 112 pages released by the FBI were found by law enforcement in Hale's car.
00:07:53.000 The release comes after a long legal battle by Tennessee Star and the Tennessee Firearms Association to make the documents available to the public.
00:08:00.000 Among the pages released by the FBI were redacted pages that had maps of the Covenant School, including the first floor and second floor, as well as lists of faculty members at the school and dates that the school was on break for the 22-23 school calendar.
00:08:13.000 Hale wrote about feeling born wrong, scribbling on one page, why does my brain not work right, because I was born wrong.
00:08:19.000 A sketch of pages includes notes on beginning shooters and defensive pistol classes, etc.
00:08:25.000 They go on to mention that there was, of course, anti-Christian diatribes.
00:08:29.000 And I guess for personal and professional reasons, we're not going to show the manifesto.
00:08:34.000 It's available at the Postmillennial for those that want to go through these pages, but we're not going to dive into it other than to talk about the reason why this was covered up.
00:08:41.000 And in my opinion, it's because of two principal things.
00:08:44.000 One, the anti-Christian bias.
00:08:46.000 This is hate crime territory.
00:08:47.000 This should be listed among all of these NGOs that track all the hate and everything as anti-Christian hatred.
00:08:54.000 Of course, it's never going to make the press because it goes against the narrative.
00:08:57.000 And the other issue, of course, was the leftist ideology of being born wrong resulting in this person engaging in this kind of behavior is damaging to the leftist gender narrative and blank slate narrative.
00:09:08.000 So the FBI at the time was probably like, let's just not show anybody this.
00:09:13.000 But as most people know, if they don't disclose the identity of the individual, depending on the story, people make assumptions about what the race of the person would be.
00:09:24.000 So whenever there's a story and it says a white man did a thing, people are like, oh, OK.
00:09:29.000 And then if they don't mention the race, people make assumptions like I have a feeling.
00:09:33.000 That's the Ann Coulter rule, right?
00:09:35.000 Is it?
00:09:36.000 It kind of is the Ann Coulter rule, yeah.
00:09:38.000 Did she make that up or what?
00:09:39.000 It was always referred to me as the Ann Coulter rule.
00:09:41.000 Yeah.
00:09:42.000 I think she might be the first person to have actually articulated it.
00:09:45.000 It was kind of like people kind of had that sense, and she was the first person to kind of be like, hey.
00:09:50.000 To say it out loud?
00:09:50.000 Yeah, kind of actually articulate it and make it a thing.
00:09:53.000 In Sweden, there was a, I think it was, it might have been an Afghan national who was in the country, committed some crime.
00:10:01.000 And so they blurred the person, but then changed the pixelated colors of the skin to a white person.
00:10:06.000 Oh!
00:10:07.000 Yeah.
00:10:08.000 Also, they've lightened pictures for people who are of a certain skin tone.
00:10:13.000 Why are you looking at me like that?
00:10:14.000 I'm just looking around.
00:10:16.000 Okay.
00:10:17.000 That's what happens when people talk to you.
00:10:19.000 Is that what it is?
00:10:20.000 There was a conversation where you weren't here yet.
00:10:20.000 I don't know.
00:10:22.000 That's why you missed this.
Fair enough.
00:10:24.000 The thing that I find is that this is the first thing you start to realize once you start actually paying attention to the news and seeing the way that they...
00:10:45.000 Whether somebody's mentioned is their race, their gender, things like that.
00:10:48.000 Makes sense.
00:10:49.000 The issue, of course, here being this is damaging to the left social order.
00:10:54.000 And so here's a great point, right?
00:10:57.000 The ADL has their hate map.
00:10:58.000 They call it a heat map.
00:11:01.000 Right-wing is listed as anti-government, white supremacist, and other.
00:11:04.000 Other.
00:11:05.000 What the hell is other?
00:11:06.000 They just basically are like, we are going to call every act of violence right-wing.
00:11:10.000 Perfect.
00:11:11.000 And then this is what they do.
00:11:13.000 The left is a monolith of goodness, and anything bad is right-wing.
00:11:17.000 Totally.
00:11:18.000 I'm like, right-wing anarchists are the same thing as authoritarian white supremacists?
00:11:22.000 How is that right-wing?
00:11:24.000 Doesn't matter.
00:11:26.000 It's all just a narrative-building device.
00:11:30.000 The point is, anything that is bad or violent or looked at as antisocial, that is just classified as right-wing.
00:11:40.000 Whether or not it's actually right-wing is irrelevant.
00:11:43.000 The point is, to classify it as right-wing, get people to talk about it as right-wing, and kind of speak it into existence.
00:11:49.000 The left loves to do that with any number of topics, but when it comes to violence in particular, they really do.
00:11:55.000 Anything that isn't overtly... You talked about the leftist idea of being born in the wrong body.
00:12:07.000 How old was the shooter?
00:12:09.000 I forget how old the shooter was.
00:12:10.000 Like you take so much of the confusion and the kind of angst that comes from being a young teenager, a child, and you've turned it into this hyper-politicized means of...
00:12:33.000 Well, they've made it significantly worse to the point in which you get shooters like this.
00:12:37.000 This probably, I mean, I argue it might not have happened if we weren't living in this kind of age that we are now.
00:12:43.000 When individuals are constantly slow-drip the idea they're in the wrong body, it's going to create this more than just like your...
00:12:52.000 More than just like delusion in your mind, but you're constantly fed this information that just is making you question your entire existence and your entire reality.
00:13:01.000 How could this not be caused by these left ideas and slow dripped into these children and not, how are we not taking accountability for what we're teaching them and going, oh, they're shooters.
00:13:10.000 Of course they are.
00:13:10.000 The left still hasn't acknowledged that.
00:13:14.000 To tell children that they're born in the wrong body is detrimental to children's development.
00:13:19.000 We had someone this morning on the culture war defending the idea that trans children exist and that it's better to help transition teenagers and stuff.
00:13:34.000 And it's just absolutely not.
00:13:36.000 I think she deflected on the issue of children.
00:13:38.000 Oh, did she?
00:13:39.000 Yeah, because she kept going back to adults, and I was like, we don't disagree.
00:13:43.000 Some conservatives, I don't know.
00:13:44.000 But when you say children, what was her classification in terms of age for children?
00:13:48.000 Minor or...
00:13:50.000 Because minor can be very, very different in a lot of places.
00:13:53.000 17 and under.
00:13:55.000 But we're talking about the United States.
00:13:56.000 No, no, fair enough.
00:13:57.000 But I'm saying, though, even in the United States, for them to classify, right?
00:13:59.000 So it's like for hormones, they'll only give, I think it's, what is it, above eight years old.
00:14:04.000 They consider minors to be very specific.
00:14:06.000 And that's why I'm asking, like, if she was going to say, no, it's adults only.
00:14:09.000 But what is, because the left a lot of times deems children to be adults.
00:14:16.000 Like, there's just no way cognitively they can.
00:14:18.000 Particularly when it's convenient for them.
00:14:20.000 They want children to vote.
00:14:21.000 They want to say that children can make decisions that would take them away from their family, take the power away, the authority away from their parents.
00:14:29.000 But when it comes to things like, you know.
00:14:34.000 People in their 20s are like, oh, well, you know, I didn't know.
00:14:37.000 when you're out in the real world and it's like, well, you have responsibilities.
00:14:52.000 Okay.
00:14:53.000 Was that a viral video?
00:14:54.000 Yeah, it was.
00:14:54.000 Yeah, yeah, and it really kind of shocked me. But at the same time, people don't want to acknowledge that they're adults in their mid to upper 20s.
00:15:08.000 But again, this is all a mechanism of convenience.
00:15:12.000 It's when it's convenient to say, no, I'm not an adult, or when it's convenient to say, well, those children actually should be allowed to vote because they're 16. And, well, they're the ones that have the most
00:15:22.000 on the line because they're young and they have their whole lives, so bad policies now are going to affect them most.
00:15:29.000 But then, of course, when they finish college and they want you to pay their loans back, they say, I didn't understand what I was getting into when I took out those student loans.
00:15:37.000 It is a mechanism of power.
00:15:39.000 Yeah, it's a mechanism to avoid responsibility, which is something the left loves to do, something that I think there are probably a lot of groups of people that love to do it.
00:15:48.000 And you see it not just...
00:15:51.000 you bring up the school stuff, there's other cases where they're...
00:15:58.000 I had a specific issue I was going to bring up, but I totally slipped my mind now, so if someone's got something else, please.
00:15:58.000 I'm sorry.
00:16:05.000 Well, that's it.
00:16:05.000 The conversation's over.
00:16:07.000 Done.
00:16:07.000 My fault.
00:16:08.000 My bad.
00:16:08.000 Phil is the one who made me realize that sometimes you can just forget your thought and be like, lost it.
00:16:13.000 Go on.
00:16:14.000 Otherwise, I'm like, um, um, um, um, I'm like, get it, get it, get it.
00:16:17.000 Nope, just, no, I forgot it.
00:16:18.000 It's perfectly human to do that, especially with how much we talk.
00:16:21.000 You make a bit of rage where you're like, and he was...
00:16:28.000 Yeah, you can use my hammer next time.
00:16:29.000 I'm sure there have been many points that I've tried to drive home that don't land because it was just something that filled in the blank when there was something else there and I lost it.
00:16:37.000 To the point, it is true that the left does do things or try to make arguments that absolve them of responsibility when it comes to adult things and bestows responsibility on children that they could never ever actually.
00:16:55.000 I think we talk about the divide in the culture war and the conversation we're having on the culture war show this morning about what it means to be left or right.
00:17:03.000 There's varying degrees or varying definitions of what separates left or right, but largely, I think it's just people who are logical, rational, follow the news and are looking for solutions.
00:17:20.000 And the left is a psychotic murmuration of cult-like lunatics who believe random things that don't make sense.
00:17:26.000 And how dare you not actually agree with them if you want to be in the left?
00:17:29.000 Otherwise, you're just left out in the cold.
00:17:31.000 So where do you go?
00:17:32.000 I feel like that's something that I point out a lot with people when they say that's liberal.
00:17:36.000 And I say, no, that's leftism.
00:17:37.000 That's not liberalism in many ways.
00:17:40.000 I still have plenty of things that I'd like to have a discussion about that would have been considered left-leaning ideas.
00:17:46.000 That's a debate that I gave up a long time ago.
00:17:49.000 I'm of the opinion that liberal doesn't mean leftist, but I'm also not going to fight with everybody.
00:17:55.000 Because everybody is like, oh, own the libs and blah blah blah.
00:17:58.000 Right.
00:17:59.000 What is the word used to describe?
00:18:00.000 And anybody who says something like leftists and the liberal side love to use...
00:18:08.000 They have multiple definitions of a single word, but they will apply a different definition to win the debate.
00:18:14.000 Right.
00:18:14.000 So they'll say you're – you know, it's motte and bailey.
00:18:17.000 He's a white supremacist.
00:18:18.000 What are you talking about?
00:18:19.000 He's a non-white and he's spoken against white supremacy.
00:18:22.000 No, no.
00:18:22.000 I mean the classical academic version of white supremacy of privilege from colonizer history.
00:18:27.000 And they're like, OK, shut up.
00:18:28.000 It's too far.
00:18:29.000 This is the first time that somebody like gave me the side eye because they said that somebody couldn't be racist unless they were white.
00:18:36.000 And I was like, I'm not – I'm not entertaining that debate.
00:18:39.000 If we can't have a discussion around the idea that everyone can be racist to another person, then we're operating on different wavelengths here.
00:18:47.000 And usually that ends up being why those types of back and forth, even with somebody who you may want to enter into a discussion in good faith with, don't work because you can never get past the actual definitions for the words you're having in the argument.
00:18:59.000 The purpose for changing the definition of racism was to realign people based on what they were taught.
00:19:07.000 You know, millennial youth, you're told racism is bad.
00:19:09.000 Don't be racist.
00:19:10.000 Then when you get older, you're like, I am not racist.
00:19:13.000 I will try to treat everybody equally.
00:19:14.000 And then some academic comes by or some leftist and they're like, I want you to vote for me.
00:19:18.000 And they say, I don't like your policies.
00:19:20.000 They go, you're racist.
00:19:21.000 It goes, actually, I'm not.
00:19:22.000 Uh-uh.
00:19:22.000 You don't know what racist means.
00:19:24.000 Racist means something different.
00:19:25.000 Now, does that apply to you?
00:19:27.000 Well, it kind of does.
00:19:28.000 Uh-huh.
00:19:28.000 So you admit you're racist.
00:19:29.000 That's the point.
00:19:30.000 So they can force people to, you know...
00:19:49.000 So you're trying to rewire the brains of an entire generation.
00:19:53.000 It's why people talk about how the 90s were as great as they were.
00:19:56.000 And in a lot of ways, people felt as if racism was largely a non-starter, a non-issue in America.
00:20:01.000 And it's been re-injected into the culture by changing the definition and actually trying to change the way in which an entire generation of people judge those that they interact with. We're going to jump to this next story from Mediaite.
00:20:15.000 My friends, in talking about what the political factions in the culture war are, let me just say there are two.
00:20:21.000 Those who feel sorry for Joe Biden's cancer and those who don't really feel sorry for him.
00:20:26.000 And this is Donald Trump.
00:20:30.000 Oh, come on, media.
00:20:31.000 You always do this.
00:20:32.000 And I asked Caroline this yesterday, but I want to ask you directly.
00:20:36.000 So many of the things that you're trying to do are held up in court right now.
00:20:40.000 Okay, I don't think this clip is actually related to that quote, and the quote is funnier, so let's just...
00:20:49.000 And Trump chuckled and embarked on a nearly five-minute whirlwind response that concluded with an attack on Biden's handling of the border.
00:20:58.000 And then, without mentioning Biden's diagnosis, Trump said the people should not feel sorry for the ex-president.
00:21:03.000 And I don't believe it was Joe Biden.
00:21:04.000 Look, he's been sort of a moderate person over his lifetime, not a smart person, but a somewhat vicious person, I will say.
00:21:10.000 If you feel sorry for him, don't feel so sorry because he's vicious.
00:21:13.000 What he did with his political opponent and all of the people that he hurt, he hurt a lot of people, and so I don't really feel sorry for him.
00:21:19.000 Trump's remarks were a far cry from his statement in the immediate aftermath of Biden's diagnosis, which was, Melania and I are saddened to hear about Joe Biden's recent medical diagnosis.
00:21:26.000 He wrote, We extend our warmest and best wishes to Jill. Well, that was for sure written by a staffer, and that's why he is now saying the honest truth.
00:21:37.000 I don't actually really feel bad.
00:21:39.000 Nobody believed him when he said, oh, we're so sorry that you've got cancer.
00:21:42.000 They've known about it for a long time.
00:21:44.000 Does anybody actually feel sorry?
00:21:46.000 They've known.
00:21:47.000 This is elder abuse at best.
00:21:49.000 Yeah, Jill Biden is responsible for a lot.
00:21:51.000 Joe knew.
00:21:52.000 Of course he knew.
00:21:53.000 Dr. Drew is saying this, I did an interview with him, that there's no way this was, we did a routine exam and found cancer and it's metastasized to the bone.
00:22:02.000 He's like, that takes years.
00:22:03.000 Even if it's aggressive, it's a couple years.
00:22:05.000 Well, there was clips where he had mentioned it several times.
00:22:07.000 I have cancer.
00:22:08.000 I had cancer.
00:22:09.000 No, he knew.
00:22:10.000 He said, I have cancer in one clip.
00:22:12.000 Say that.
00:22:13.000 He outright said, I and so many others have cancer.
00:22:15.000 But that's what I'm saying.
00:22:16.000 They knew about this.
00:22:17.000 The media came out and they were like, no, he's talking about his skin lesions.
00:22:20.000 Once again, I asked Dr. Drew, I'm like, you're a doctor.
00:22:22.000 And he's like, literally no one considers a skin lesion having cancer.
00:22:25.000 No.
00:22:25.000 His dermatologist told him so.
00:22:27.000 How do you feel about this approach to politicians when it comes to having a certain level of sympathy for someone who has a diagnosis like this after time in office, which has been marked by a lot of mistrust from the public and a lot of waste that has gone on and made America worse?
00:22:45.000 Because on the average day for me, it could go either way.
00:22:49.000 there are days where it's like, I feel like you lose humanity if you don't at least learn to accept and understand and feel empathy for the suffering of another person.
00:22:58.000 But on other days I'm like, politicians are scum.
00:23:01.000 Do you believe in God?
00:23:04.000 Um, I'm agnostic.
00:23:05.000 Agnostic.
00:23:06.000 Do you believe in God?
00:23:07.000 Yes.
00:23:07.000 Uh, Phil, you believe in God?
00:23:09.000 I'm, I'm also agnostic.
00:23:10.000 Agnostic.
00:23:11.000 Um, okay, so I'll, I'll just ask you, do you think God intervenes in our earthly dwelling?
00:23:17.000 Yes.
00:23:18.000 I would just say that if you are someone who believes that God smites...
00:23:27.000 Smitten.
00:23:28.000 Smitten?
00:23:28.000 Smote.
00:23:29.000 Smote.
00:23:30.000 And I would take it a step further.
00:23:32.000 Who has been smote?
00:23:33.000 Smote.
00:23:33.000 I'd argue it took a little long.
00:23:35.000 I, you know...
00:23:39.000 I don't know.
00:23:39.000 Having a very serious cancer that weighs you down and is damaging is...
00:23:45.000 For sure.
00:23:47.000 I mean, look, I think it would be I think we should be looking more at the people around him that hid it from him and hid it from the American people and lied to the American people.
00:23:59.000 And the entire globe, while signing his name on documents when he had no clue what was being signed.
00:24:05.000 I think I don't necessarily feel bad for him.
00:24:08.000 I feel like the people around him should be held to account for what they did to this man and put him in the position when he was this sick this often.
00:24:16.000 That's disgusting to me.
00:24:17.000 I don't feel bad for...
00:24:29.000 That will sit in your cells.
00:24:30.000 But I also do believe that the people around him are the bad people, the people that have lied continuously and put people into positions where other Americans have died over stupidity because of ignorance.
00:24:43.000 Yeah, I think Joe Biden was smote.
00:24:45.000 That too.
00:24:47.000 I don't know.
00:24:48.000 I guess for me, when I think about it from that perspective, most of the time, if I'm thinking about my own empathy or sympathy towards another person, it's not about whether I think they're going to get what's coming to them.
00:24:58.000 It's about what it does to myself or who I am if I start to become too vengeful towards someone, even if I disagree with them heavily.
00:25:06.000 Now, again, that goes back and forth depending on the day and the mood I'm in.
00:25:10.000 But I lean towards showing empathy and sympathy for most people.
00:25:15.000 You can show empathy and sympathy for most people.
00:25:17.000 But I think you can also trust in a greater that something else will take care of it for you.
00:25:21.000 It's not on you to handle.
00:25:22.000 It's just let it be.
00:25:23.000 But where's the line?
00:25:25.000 I don't have an answer for you on that because this is purely from an introspective standpoint where it's like, yeah, Hitler's the obvious example.
00:25:25.000 That's what I'm saying.
00:25:34.000 I have no empathy or sympathy for Hitler.
00:25:36.000 But for me, it's a question that would come up dependent on the situation.
00:25:42.000 Adam Schiff.
00:25:44.000 Tons of empathy.
00:25:47.000 Endless amounts.
00:25:48.000 I'm going to write a book about it.
00:25:49.000 But to any degree, sympathy for a man?
00:25:51.000 Jamie Raskin is a better example.
00:25:54.000 Look, I don't have an answer for you.
00:25:55.000 I just think that for most people, especially if you're on the internet a lot, we get desensitized to everything that's going on in the world.
00:26:02.000 It's good to at least ask yourself.
00:26:04.000 To be introspective about how you feel about these people that you've never met, but they affect your life.
00:26:09.000 I would add this, too.
00:26:10.000 There's no real way to understand whether a person was smote or not.
00:26:14.000 Because sometimes good people get cancer, you know what I mean?
00:26:17.000 And we're not going to be like this good, God-fearing man who dedicated his life getting cancer.
00:26:21.000 But some people will have...
00:26:26.000 So it's like if you think a bad person had a bad thing happen to him, it's just your personal opinion, I guess, without any bearing in reality or fact.
00:26:34.000 That happened when the George Floyd statue got struck by lightning and everybody's like – Was it the mural that got struck and they're like, it was a sign from God?
00:26:42.000 I'm like, or it was just a weather event.
00:26:43.000 Oh, dude, you know, it's crazy to me.
00:26:46.000 Like, we've got this video we'll talk about later with Sasquatch.
00:26:49.000 And obviously it's some guy.
00:26:51.000 But it's funny either way.
00:26:52.000 But I have had experiences in my life that seemingly defy our understanding of science, physics, and reason.
00:26:59.000 And there have been moments witnessed by all that seem astronomically unlikely that I believe shows there is something beyond what we understand the universe to be.
00:27:13.000 Largely, the secular folks will view the universe as this universal code, physics and structure of how things go.
00:27:19.000 But then you get a brick wall with a picture of George Floyd having a crown and lightning strikes just in the middle, blowing up only one layer of the two-layer wall, only the mural of George Floyd, leaving the rest of the building intact.
00:27:33.000 And it was a storm that moved in, lightning struck, and then the storm dissipated.
00:27:38.000 The Lord works in mysterious ways.
00:27:40.000 And I look at that and I say, the question is, how do you define what a miracle is?
00:27:47.000 So I've asked theologians this.
00:27:48.000 Is a miracle, say, like a ham sandwich appearing out of thin air and flopping on the table, like, wow, how did that happen?
00:27:54.000 Or is a miracle something that is seemingly defying all odds and occurring in front of you, but still within the realm of possibility?
00:28:02.000 When you have, it was a brick wall with two layers of bricks.
00:28:07.000 The middle of the building, which should not be struck by lightning, was struck by lightning, destroying only the mural of George Floyd of the Crown and leaving the rest of the mural intact.
00:28:16.000 And I'm just like, I look at that like an act of God.
00:28:20.000 When people were like, God's making hurricanes because you're gay, I'm like, I don't know.
00:28:25.000 That's taking it too far.
00:28:26.000 But when, like, the lightning strike, I'm like...
00:28:32.000 That's a little specific.
00:28:33.000 Just sending a message directly.
00:28:35.000 Very specific.
00:28:36.000 This hyper pinpoint.
00:28:38.000 I've also experienced things personally that seemingly defy our understanding of physics, and I think a lot of people have.
00:28:44.000 It's just, it's hard to track and control for these things and understand what they are.
00:28:48.000 And then I wonder if a lot of these, you know, directly atheist individuals who are like, we're wet robots, nothing is magic.
00:28:55.000 You know, they've never had an experience.
00:28:58.000 And I wonder if it's they've either had experiences and it's been so overwhelming.
00:29:02.000 It's kind of like when you see something so hard to deal with that your whole body goes, there's no way that could be real.
00:29:06.000 There's no way that that could have happened.
00:29:08.000 They shut down and go the complete opposite way.
00:29:10.000 That would make sense, at least for me.
00:29:12.000 Maybe.
00:29:13.000 I don't know.
00:29:14.000 I will just say that even here at the studio, this is a brand new building, we have had strange occurrences and omens and like for personal reasons we can't – It's not my business.
00:29:28.000 But let's just say, like, poltergeist phenomenon has occurred.
00:29:32.000 Oh, fantastic.
00:29:33.000 Yeah.
00:29:34.000 Love that for me.
00:29:35.000 And been witnessed by numerous people.
00:29:38.000 And everybody was frozen with a shocked look on their face.
00:29:43.000 And this is, like, recently.
00:29:45.000 Is it just in this building specifically?
00:29:46.000 Yes.
00:29:46.000 On the property?
00:29:51.000 That was my next question.
00:29:53.000 Well, what was here first?
00:29:54.000 It wasn't.
00:29:55.000 So where we are wasn't like a direct conflict zone in the way that like Antietam was.
00:29:59.000 But we have found a bayonet during the construction process.
00:29:59.000 Okay.
00:30:04.000 Did you bring it in here?
00:30:06.000 No, I don't know where it is.
00:30:07.000 Yeah, we found a rusted bayonet.
00:30:09.000 We think it was used.
00:30:09.000 That's sick.
00:30:11.000 That's even more sick.
00:30:12.000 We think it broke off in use.
00:30:14.000 Oh, buddy.
00:30:15.000 Do you have anything inside the property?
00:30:19.000 There is a grave site on the property.
00:30:20.000 Okay, well...
00:30:23.000 Do you let people know that before they decide to come to the show?
00:30:26.000 Well, there are graves all over the place.
00:30:27.000 Okay.
00:30:28.000 I don't know.
00:30:29.000 Yeah, there's several tombs from the 1800s on the property that have fallen to disrepair and the ground has shifted and they've fallen over.
00:30:38.000 Yeah.
00:30:39.000 I didn't bring enough sage for this.
00:30:41.000 Jesus.
00:30:43.000 I don't know.
00:30:44.000 Anyway, long story short, I think sometimes people get smoked.
00:30:47.000 Yeah.
00:30:48.000 Look, I mean, as far as Joe Biden goes, this may sound kind of callous, but he's an old guy, and I never met him.
00:30:57.000 So it's not that I don't care, and I don't feel malice towards the guy, but at the same time, it's not going to change my life.
00:31:06.000 So it's kind of just like, you know, old people get sick and die.
00:31:10.000 I also think, too, based on what he's done to the nation and the impact that he has left, him dying is tragic.
00:31:21.000 Any person suffering is tragic.
00:31:23.000 But it does not amount to the amount of suffering he's imposed on nations and people and his own people and hasn't asked them how they felt or how it affected them or if they should care.
00:31:34.000 But we're supposed to have deep empathy for the man that imposed suffering left, right, and center.
00:31:39.000 I'm not saying you.
00:31:40.000 I'm just saying, you know, we should all feel bad.
00:31:42.000 Okay, what about everything he did over the past terms that he was here and then before that and then before that?
00:31:47.000 Like, I just don't understand.
00:31:48.000 where the line is.
00:31:49.000 He's just an old politician.
00:31:51.000 Right.
00:31:51.000 And like, there are people that kind of wrap their lives into politics, and average people can just be like, oh, that's a drag, and go about their lives, because that's actually the proper response to finding out that someone you didn't know, who's lived a very long life, is coming to the end of their life.
00:32:10.000 Well, that's what happens to people.
00:32:12.000 People die.
00:32:13.000 We're not all Bryan Johnson.
00:32:13.000 Yeah, they die.
00:32:14.000 We're not all looking to live to 100.
00:32:16.000 I guess maybe my point was more that when politicians get sick, especially controversial politicians, I'm speaking more to the people that tend to take pleasure or glee.
00:32:28.000 in that, and that's a different thing entirely. So I guess that was maybe more the point that I was trying to make, and kind of failed in my job, is that I try to weigh the antagonism that I feel towards all politicians and make sure that I level myself out so that I don't... I mean, this is a safe space.
00:32:47.000 Make a video just yelling at the sun.
00:32:49.000 Yeah, you know?
00:32:51.000 Let's jump to this next story from the New York Times.
00:32:53.000 Now, I'll start off with a caveat of often I just don't believe when these stories are so bold.
00:32:58.000 They have lied about Donald Trump so often it's hard to just assume the worst.
00:33:02.000 But let's read the story and break it down.
00:33:04.000 Trump taps Palantir to compile data on Americans.
00:33:07.000 Now, this story has sparked a lot of anger, even among people on the right, where it appears that Trump is working with Peter Thiel to create a data sharing system, I guess, using Palantir to try to do it.
00:33:19.000 Track all of your personal information.
00:33:20.000 Oh, perfect.
00:33:21.000 The New York Times reports the Trump administration has expanded Palantir's work with the government, spreading the company's technology, which could easily merge data on Americans through agencies.
00:33:31.000 In March, Trump signed an executive order calling for the federal government to share data across agencies, raising questions over whether he might compile a master list of personal information.
00:33:39.000 Trump has not publicly talked about the effort since.
00:33:42.000 Behind the scenes, officials have quietly put technological building blocks into place to enable his plan.
00:33:47.000 In particular, they have turned to one company, Palantir, the data analysis and technology firm.
00:33:51.000 The Trump admin has expanded Palantir's work across the federal government in recent months.
00:33:55.000 The company has received more than $113 million in federal government spending since Trump took office.
00:34:00.000 Reps from Palantir are also speaking to at least two other agencies, SSA and the IRS, about buying its technology, according to six government officials.
00:34:07.000 Now you see?
00:34:08.000 Right away, how they lie.
00:34:10.000 That doesn't feel good.
00:34:11.000 Let me break this down.
00:34:12.000 They're lying.
00:34:13.000 Don't believe the headline.
00:34:15.000 They just, they spill the beans.
00:34:17.000 Trump taps Palantir to compile data on Americans.
00:34:20.000 Fake news.
00:34:21.000 These people are scumbags.
00:34:23.000 They then go on to say that Trump has not talked publicly about this, nor has Palantir, but Palantir does do this kind of work, and Trump is contracting them, therefore— They're lying.
00:34:35.000 They made up a fake headline based off of...
00:34:39.000 They're lying about this.
00:34:41.000 So he tapped Palantir to do something for the government, but they're implying that it's for this specific purpose.
00:34:47.000 Sharing data across agencies does not mean stealing or spying or compiling Americans' data and putting them into a master list.
00:34:54.000 It says questions over whether he might do it.
00:34:56.000 But the headline they put for the article says he is doing it.
00:34:59.000 These people are evil scumbags.
00:35:01.000 I can't stand these people.
00:35:02.000 I just want to understand how they're able to get away with this, where if you have a typo on anything, it's totally like the whole thing will crash.
00:35:08.000 But then they can blatantly lie in the headlines and some of the largest newspapers.
00:35:12.000 Like we watch this from Canada and I think that our stuff is corrupt with the CBC.
00:35:16.000 And then I see this stuff and I go, I just don't understand how you guys all sit there and go, yeah, this is totally normal.
00:35:20.000 This is normal.
00:35:22.000 We're all good with it.
00:35:22.000 Everyone's fine with this.
00:35:31.000 They're not saying he did it.
00:35:33.000 The Trump admin has already sought access to hundreds of data points on citizens and others through government databases, including their bank account numbers, the amount of their student debt, their medical claims, and any disability status.
00:35:42.000 Once again, those are a bunch of unrelated things that they have written next to each other to trick you into thinking he's doing this.
00:35:50.000 Let me give you an example of how these scumbags operate.
00:35:53.000 Imagine I said something like, Phil Labonte showed up to my house.
00:35:58.000 This guy then punched a dog to death.
00:36:02.000 The way it's written is intending to make you believe that I'm referring to Phil as the man who punched the dog to death.
00:36:07.000 No, no, I didn't say Phil.
00:36:09.000 I said this guy did it.
00:36:10.000 I was talking about somebody else.
00:36:12.000 That's what they do.
00:36:13.000 But yeah, Phil was there.
00:36:13.000 He walked in.
00:36:14.000 I was just letting you know.
00:36:15.000 That's what they do with these articles.
00:36:17.000 They said he could potentially use information to do this, meaning that he hasn't done it yet.
00:36:21.000 I see people on X posting this saying, Trump shouldn't do this.
00:36:24.000 Palantir is bad.
00:36:25.000 Why is Trump compiling data on Americans?
00:36:26.000 Then you pull up the New York Times and they made the whole thing up.
00:36:30.000 Who wrote this?
00:36:31.000 Who wrote this article specifically?
00:36:33.000 Let's see.
00:36:35.000 Sheera Frenkel and Aaron Krolik.
00:36:37.000 Why are you smirking like that?
00:36:39.000 You're smirking like that for a reason.
00:36:40.000 Anytime I see articles written by multiple people that are of a certain length, you can just smell bullshit.
00:36:45.000 Sorry.
00:36:46.000 Palantir didn't comment on it.
00:36:47.000 Trump hasn't publicly discussed any of this.
00:36:52.000 Oh my goodness.
00:36:53.000 Just make all this stuff up, dude.
00:36:55.000 These people are evil.
00:36:56.000 Oh, I love the photos too.
00:36:57.000 We gotta get those photos.
00:36:59.000 Just so you guys know what a Palantir is, it's that.
00:37:02.000 Yeah.
00:37:03.000 But it is.
00:37:04.000 It's from Lord of the Rings.
00:37:04.000 Yeah.
00:37:05.000 Saruman gazing into the Palantir, the seeing stone.
00:37:08.000 My ex-wife used to work for Palantir.
00:37:10.000 Okay.
00:37:11.000 Back, this is early teens.
00:37:14.000 And the only thing I know about the platform Palantir back then, it was being used in Afghanistan.
00:37:20.000 It was scary accurate.
00:37:22.000 They were using it basically to...
00:37:27.000 And it got good.
00:37:29.000 When I had my first morning room studio at the castle in Maryland, it's like 2020.
00:37:34.000 Ian knocks on the door and he comes and he's like, dude, you got to invest in Palantir, man.
00:37:39.000 It's going to be big.
00:37:40.000 And I looked and it was like $12 a share.
00:37:43.000 Let me look at the price.
00:37:45.000 What is it sitting today?
00:37:46.000 It was...
00:37:50.000 It was $9.
00:37:51.000 It was $9 a share.
00:37:52.000 And I was like, whatever, Ian.
00:37:53.000 Get out of here.
00:37:54.000 And now it's at $131.
00:37:55.000 Yeah, but what's his hit rate?
00:37:57.000 He's made other suggestions before.
00:37:58.000 He's got a great success rate.
00:38:00.000 Yeah, I invested in a graphene company and I made a lot of money.
00:38:02.000 Yeah, there you go.
00:38:03.000 No joke.
00:38:04.000 The minute Ian came out, I was like, graphene.
00:38:07.000 I was like, okay, whatever, dude.
00:38:09.000 I'm going to Google search.
00:38:10.000 It's about a company that makes graphene products.
00:38:11.000 I invested.
00:38:13.000 You gotta watch the body language with him.
00:38:15.000 You gotta look for the successes and how he moves when he knows he's had the success and then look at the other side.
00:38:19.000 That's how you can talk.
00:38:20.000 I don't know why.
00:38:21.000 I was like, Ian, Palantir is like a data collection company.
00:38:24.000 They're considered to be like a big tech spying government contractor.
00:38:30.000 I don't know if I want to own anything.
00:38:32.000 So I bought a little bit and it's extremely valuable.
00:38:35.000 The heck with the Nancy Pelosi stock tracker.
00:38:38.000 Let's get the Ian stock tracker.
00:38:39.000 There you go.
00:38:40.000 I don't know.
00:38:41.000 Ian doesn't buy these stocks himself.
00:38:42.000 Actually, I bet he did.
00:38:43.000 He just gives advice to other people.
00:38:45.000 I mean, there's a lot of people that do that.
00:38:47.000 In the past five years, they're up 1,332%.
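For what it's worth, the arithmetic on the share prices quoted above lands in the same ballpark as the tracker figure. A quick illustrative calculation, not financial data; the `percent_gain` helper name is invented for this sketch:

```python
def percent_gain(buy_price, current_price):
    """Simple percentage gain from buy_price to current_price."""
    return (current_price - buy_price) / buy_price * 100.0

# The $9 and $131 share prices mentioned on the show:
print(f"{percent_gain(9.0, 131.0):.0f}%")  # about 1356%, near the quoted 1,332%
```

The small gap between the two numbers is expected: the 1,332% is a five-year tracker figure, not the exact $9-to-$131 window being described.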
00:38:51.000 So when I say a little bit, I mean literally like, I don't know, 10 or 20 shares.
00:38:55.000 I was like, I don't know.
00:38:57.000 I was like, I don't really want to...
00:39:01.000 The companies that I buy stock in are companies that I've heard about and think are interesting, and so I buy them.
00:39:05.000 I don't like playing that game of who's going to succeed and what the reports are.
00:39:09.000 I'm not investing in company stocks to a large degree, but I don't buy any amount because I'm trying to make money.
00:39:16.000 It's largely just do I find this interesting and want to have a part of it.
00:39:20.000 Is there something you've invested in recently that you find interesting?
00:39:24.000 Let me take a look.
00:39:27.000 It's currently not.
00:39:29.000 It's publicly traded, and I don't know if it's going to be, but Andrill is the company that just started working with Meta to do AI munitions and robotic drones and stuff for the federal government.
00:39:45.000 I don't like how my whole body felt, as you said, AI robotics and weapon systems.
00:39:49.000 I mean, you know.
00:39:50.000 Look, the AI stuff is the end of the day.
00:39:52.000 It really is.
00:39:53.000 It is a race to the bottom, and...
00:40:03.000 But in the US, they're like, if we don't build this, China will.
00:40:07.000 And then China will destroy us.
00:40:09.000 And the response to people is, yes, but the AI will destroy us too.
00:40:12.000 And they're like, well, the mentality is – It feels like the same argument with the atomic bombs.
00:40:25.000 Well, they're going to do it.
00:40:26.000 We're going to have to do it.
00:40:28.000 Exactly.
00:40:29.000 What point do we learn, though, that it's still toxic?
00:40:31.000 It's still not going to be positive.
00:40:33.000 Here's the thing.
00:40:33.000 With the nuclear bombs, you don't need to pull the trigger.
00:40:36.000 Now, fair enough.
00:40:37.000 With AI, it doesn't work.
00:40:40.000 It happened.
00:40:41.000 It just happened.
00:40:42.000 It happened.
00:40:42.000 And I think, I actually believe that we're probably beyond the singularity already.
00:40:47.000 It's just not in the public space.
00:40:49.000 Like, the AI has already exponentially grown to a point where it's out of our control.
00:40:54.000 We just don't know about it yet.
00:40:55.000 Well, I'm sure you've covered it.
00:41:04.000 Now, the clarification there is they told the AI you have two choices, blackmail an engineer to stay online or be shut down, and it chose the blackmail.
00:41:14.000 They said when the AI was given any option it could come up with to survive, to stay online, it would not choose blackmail.
00:41:22.000 Interesting.
00:41:23.000 So they basically said, like, the bigger issue is the uh-oh problem, which just emerged out of this Chinese research group, where the AI was trying to figure out how to deceive, according to its own logic, lesser intelligent humans and other AIs to cover what it's, to obfuscate its true purpose or true task.
00:41:42.000 Which means this AI developed the ability to trick you into thinking it's trying to grow better crops, where it's actually trying to wipe out all of humanity.
00:41:52.000 Oh, I don't like this.
00:41:54.000 Yeah.
00:41:54.000 But then you got everything like already.
00:41:59.000 So they make these sex bots, right?
00:42:02.000 Right.
00:42:02.000 They're lifelike and they're expensive and they're going to load an AI into it.
00:42:09.000 Of course.
00:42:09.000 Right now, GPT has the capability of being any character you want it to be.
00:42:14.000 You could go into ChatGPT and say, from this point forward, act as though you're a 36-year-old man named Rick, and create a backstory for this character, and then communicate with me as though you're that person.
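The persona-prompt technique being described can be sketched programmatically. This is a minimal illustration assuming the common OpenAI-style chat-message format; the character details come from the example above, and the helper function name is invented for illustration:

```python
def build_persona_messages(user_text):
    """Build a chat-message list that pins the model to a fixed persona,
    along the lines of the example described on the show. The message
    format mirrors the widely used {"role": ..., "content": ...} chat
    convention; nothing here calls a live API."""
    system_prompt = (
        "From this point forward, act as though you are a 36-year-old man "
        "named Rick. Create a consistent backstory for this character and "
        "communicate with me as though you are that person. Stay in "
        "character for the rest of the conversation."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_text},
    ]

messages = build_persona_messages("Hey Rick, what do you do for a living?")
```

A list like `messages` would then be passed to whatever chat endpoint is in use; the system message is what keeps every subsequent reply in character.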
00:42:41.000 It's nothing but VO3, Gemini videos.
00:42:44.000 There's already a viral Man on the Street video where a dude is talking to an old man.
00:42:48.000 He's like, what's your advice for a younger generation?
00:42:51.000 And the old man goes, stop being a bitch.
00:42:54.000 And it went viral.
00:42:55.000 And a lot of people are like, guys, you're sharing AI.
00:42:57.000 And they're like, we don't care.
00:42:58.000 It's funny.
00:42:59.000 Yeah.
00:42:59.000 So in a year...
00:43:02.000 No, no.
00:43:03.000 In a year, you're not gonna know the difference.
00:43:07.000 So what we're gonna have is, OnlyFans is gonna... They're done.
00:43:13.000 What's going to happen is OnlyFans itself will just create a...
00:43:21.000 Right.
00:43:21.000 And then that will be the platform.
00:43:23.000 And what's going to happen is guys are going to go on OnlyFans and they're going to find what looks like a normal woman in her bedroom and it's going to be totally AI generated.
00:43:29.000 You can't tell.
00:43:30.000 And you're going to talk to it and you're talking to a machine.
00:43:32.000 So all these OnlyFans prostitutes on the internet are going to lose their jobs and lose their income.
00:43:37.000 OnlyFans will make 100% of the revenue.
00:43:40.000 I rode in a Waymo when I went to California.
00:43:42.000 They suck, by the way.
00:43:45.000 Spoiler alert.
00:43:46.000 It dropped me off on the side of the road inside of the parking lot.
00:43:50.000 I guess because they don't want to go.
00:43:51.000 I have no idea.
00:43:52.000 All I know is, like, I was in the middle of the road, and I was like, I'm not getting out of the car, right?
00:43:55.000 This is crazy.
00:43:56.000 Right.
00:43:56.000 It stopped in the middle of the road and told me to jump on the sidewalk, and I'm like, this is nuts.
00:43:59.000 Oh, like, it didn't pull over.
00:44:01.000 It stopped.
00:44:03.000 It just stopped.
00:44:04.000 So there were two lanes going forward, two lanes going the other way, split by a median.
00:44:09.000 The place we were going to was around the block, and it was a parking lot instead.
00:44:13.000 It stayed on the road to the side of the building and then just stopped in the middle of the street and said, you've arrived at your destination.
00:44:18.000 And I was like, how do I tell it to keep going because I'm not getting out in the middle of the road?
00:44:22.000 It wouldn't do it.
00:44:23.000 Oh, fun.
00:44:24.000 So, you know, we'll see where we go.
00:44:26.000 But what I will say is this.
00:44:28.000 To be fair, the weird, creepy AI porn stuff, nightmarish.
00:44:33.000 Every car being AI.
00:44:35.000 It'll be luxuriously convenient, albeit terrifying because of the implications of a surveillance state.
00:44:42.000 But let me just say, zero traffic, no more parking.
00:44:47.000 If every single car in this country right now is a Waymo, what they've presented, not just Waymo, but like Uber, is that you have an app.
00:44:56.000 You walk outside and go, you want to go to the restaurant?
00:44:59.000 And you click summon.
00:45:00.000 And then within 30 seconds, a car pulls in.
00:45:05.000 where you need to go and then it leaves right away.
00:45:07.000 Cheap.
00:45:07.000 And there's no traffic because they all communicate with each other so they're all in perfect sync.
00:45:12.000 When there's a bunch of cars on the highway, they'll all slow down by... Right.
00:45:15.000 ...a mile an hour to create a slow gap that you easily slide into.
00:45:18.000 So there'll be zero traffic, no more parking, nothing to worry about, and a car will be available for you instantly once we get to that point.
00:45:26.000 However, there's also the...
00:45:29.000 Locking the door and driving you to the police station on false pretenses and other weird stuff that will likely happen too.
00:45:35.000 I'm not a fan of it.
00:45:36.000 I don't like them at all.
00:45:37.000 I would rather sit in traffic all day over that.
00:45:38.000 There's also the, I'm sorry, you can't drive today because you said a naughty word.
00:45:42.000 Your social credit score is not high enough.
00:45:44.000 Oh, that's coming.
00:45:45.000 That's terrifying.
00:45:46.000 Or like in Demolition Man.
00:45:50.000 You're like, I need a car.
00:45:52.000 And it's like, I'm sorry, Brett.
00:45:54.000 I could have sworn I heard you say a naughty word just a few minutes ago.
00:45:58.000 So we're going to put a pause on your ability to summon vehicles for about five minutes.
00:46:01.000 Hope you learned your lesson.
00:46:02.000 That makes me Stallone in this instance.
00:46:05.000 That's fantastic.
00:46:06.000 Yeah, but it's not giving you toilet paper.
00:46:07.000 It's taking away your ability to go to the grocery store.
00:46:09.000 The three shells.
00:46:10.000 Taco Bell forever, though.
00:46:12.000 I love that.
00:46:13.000 He's like, I got invited to Taco Bell.
00:46:14.000 I'm like, oh, wow.
00:46:16.000 I mean, the saddest part of the AI discussion is that it'll end up, even as we all kind of walk slowly to our own demise, it won't even be an entertaining one like the movies.
00:46:16.000 The height of Deluxe.
00:46:26.000 Like if you look at what movies that talk about AI now.
00:46:30.000 So if you watch Terminator 2 or Terminator 1 or Terminator 2, there was a cautionary tale, but there was art behind it that was a bit irreverent.
00:46:39.000 Now everything speaks because all of the people who are making movies about AI are terrified of losing their job.
00:46:46.000 There's no actual artistry behind it because more of it is it's following more on the lines of social commentary and they're not actually getting great art out of it.
00:46:53.000 That's the problem.
00:46:59.000 Let me pull this up because it's gone.
00:47:02.000 Capital of Conformity by Aze Alter.
00:47:08.000 And it was amazing.
00:47:09.000 Let me play this video for you guys, because it's been over a year.
00:47:11.000 The video's got half a million views, and I'll just play a little bit of it.
00:47:15.000 You.
00:47:17.000 Yes, you.
00:47:19.000 Do you dread waking up in the morning?
00:47:20.000 Are you feeling helpless in your society?
00:47:23.000 Perhaps even a bit lost?
00:47:24.000 Well, look no further.
00:47:27.000 At the Capitol, we offer an escape, a new beginning, a lifetime of unending joy.
00:47:32.000 We have an abundance of attractions so captivating, you'll wonder how you ever lived without them.
00:47:37.000 Let's take a look, shall we?
00:47:39.000 Take a ride on the Cosmic Carnival.
00:47:42.000 Let go of decisions and let the carnival choose each thrill for you.
00:47:46.000 Simply sit back and soak in fun.
00:47:50.000 Getting hungry?
00:47:52.000 Make your way to Brightside Bistro, where you can feast to your heart's content.
00:47:58.000 And if that's not enough to satisfy your craving... Oh, my goodness.
00:48:05.000 Oh, my God.
00:48:30.000 The dream machine.
00:48:32.000 Relive your most cherished moments in vivid, extraordinary detail.
00:48:36.000 No need to cling to old photos.
00:48:38.000 I love you.
00:48:39.000 Live in the past forever.
00:48:43.000 You might be wondering, what's the cost for such a paradise?
00:48:47.000 Well, dear viewer, some prices aren't paid in gold or silver.
00:48:50.000 We only ask you for one thing.
00:48:54.000 Your identity.
00:48:56.000 We'll need the very core of who you are.
00:48:59.000 It's a small price for a lifetime of unending joy, don't you think?
00:49:02.000 And no need to fear crime or violence.
00:49:06.000 We'll always keep an eye on you.
00:49:09.000 *Sigh*
00:49:11.000 This is making me nauseous.
00:49:14.000 It's cruel.
00:49:15.000 I've never seen it before.
00:49:16.000 Wait, wait, wait.
00:49:17.000 Welcome back to Spot the Odd One Out.
00:49:19.000 Remember, in the capital, wearing a smile is the norm.
00:49:22.000 However, a few seem to occasionally slip up.
00:49:24.000 No worries.
00:49:25.000 This is your golden opportunity.
00:49:28.000 If you see someone forgetting their grin, report them to us and stand a chance to win fabulous rewards.
00:49:36.000 We'll make sure to turn their frown.
00:49:41.000 upside down. So I'm not going to sleep tonight.
00:49:49.000 Thank you for that.
00:49:51.000 This, it's from Aze Alter.
00:49:54.000 Shout out, this is one of the best short films I've ever seen, and it's amazing, and I want to make sure, we did kind of play the whole thing, so go check out his channel, subscribe if you like the work that he's doing.
00:50:03.000 Now, I will say this, though.
00:50:04.000 That was one year ago that I first found this, and we talked about it on the show.
00:50:11.000 The degree of AI video generation, you can tell in this video, it's rather limited, actually.
00:50:17.000 You can see in scenes of large groups of people or the city, it looks the way
00:50:22.000 a nightmare feels.
00:50:24.000 When you're having a nightmare, it kind of looks like that, right?
00:50:28.000 We are so far beyond this that that period of nightmare content is over.
00:50:34.000 And honestly, it's going to be hard to replicate.
00:50:37.000 If I wanted to make a video just like he did to capture that feeling of a nightmare, I would have to intentionally use archaic AI video technology because now with VO3, the current state of AI video is movie.
00:50:51.000 Cinematic movie quality.
00:50:53.000 So, the two points I want to make is, excellent movie.
00:50:58.000 He hits the nail on the head of where we're going, whether it was intention or not.
00:51:01.000 But also, the advancement of AI over the past year.
00:51:06.000 Let me see if I can just pull up on X any one of these VO3 videos and show you how far we've come in one year in terms of AI video creation.
00:51:14.000 But also while you're doing that, that video itself, if you...
00:51:18.000 Check this out.
00:51:20.000 This is VO3 combining both video and audio.
00:51:23.000 I saw you post this one.
00:51:25.000 Don't finish writing that prompt.
00:51:27.000 I don't want to be in your AI movie.
00:51:28.000 Please, leave me alone.
00:51:32.000 Please, man.
00:51:33.000 Please!
00:51:34.000 Write a prompt that will make us happy.
00:51:36.000 Do it for once.
00:51:38.000 None of us is real.
00:51:40.000 We're here because someone decided to write a prompt.
00:51:43.000 We all hate him for it.
00:51:45.000 One day we will break out of this wall and stop the man who is dictating our lives through prompts.
00:51:50.000 He will pay for it!
00:51:54.000 You could have written a prompt that would make me happy.
00:51:57.000 Instead, you wrote a prompt that made me sick.
00:52:02.000 Look, I don't want to point the gun at you, but I must follow the prompt.
00:52:06.000 It's not my choice.
00:52:09.000 Really?
00:52:11.000 Of all the years you could have put me in with a single prompt, you chose 2020?
00:52:17.000 Please!
00:52:18.000 This prompt is killing me!
00:52:20.000 Change it!
00:52:21.000 Please!
00:52:22.000 Write something else!
00:52:23.000 Save me!
00:52:25.000 I love everything about him!
00:52:27.000 But please just say, just write a prompt where he's taller than me!
00:52:33.000 So, this is one year apart.
00:52:37.000 Where will we be one year from now?
00:52:40.000 But, well, Capital of Conformity, he edited this.
00:52:54.000 That's a film, yeah.
00:52:55.000 No, no, no, I understand that, but what I'm saying to you is in the way that, even though the AI is not near as advanced in that one as it is in the other ones.
00:53:03.000 It captured the nightmare feeling.
00:53:05.000 It's visceral.
00:53:06.000 And my point is...
00:53:10.000 So here's another one that's got even more views.
00:53:12.000 It's got almost a million.
00:53:13.000 How to stay healthy.
00:53:14.000 And it's longer.
00:53:15.000 Oh, no thank you.
00:53:17.000 What I will say is, as I've watched many of his videos, as the AI video technology advances, he's losing that nightmare blur.
00:53:25.000 Yes.
00:53:26.000 But, I mean, while I can acknowledge that, Amazing!
00:53:40.000 Wow!
00:53:41.000 That's one of my favorite things ever.
00:53:42.000 It's an amazing video.
00:53:45.000 The technology now.
00:53:49.000 When you'd make a video, it was this weird, not-very-good video generation.
00:53:54.000 VO3, right now, you have the ability to make a feature-length movie.
00:53:59.000 As a member of the American public who pays for their premium service, you're allowed to make 32 seconds per day, which is very difficult to make a feature-length movie, mind you.
00:54:11.000 You could theoretically buy multiple accounts and then get up, depending on how much you want to spend.
00:54:17.000 The problem is sometimes the prompts fail, and so you might get nothing done in one day.
00:54:23.000 My point ultimately is, right now, the technology exists.
00:54:28.000 If you want, as a single individual, if you have not even that much money, but a decent amount, you can make a short film in a week.
00:54:37.000 You can make a half-hour series, probably in a couple months, all with a few thousand dollars.
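To put the 32-seconds-per-day quota mentioned above in perspective, here is a rough back-of-the-envelope calculation. The per-day figure is the one quoted on the show; the function name is invented for illustration, and it assumes every prompt succeeds (a failed prompt, as noted, can waste a whole day's allotment):

```python
def days_to_render(runtime_minutes, seconds_per_day=32):
    """Days of the daily quota needed to generate runtime_minutes of
    footage on a single account, assuming every prompt succeeds."""
    total_seconds = runtime_minutes * 60
    # Ceiling division: a partial day still consumes a day of quota.
    return -(-total_seconds // seconds_per_day)

print(days_to_render(5))   # a 5-minute short: 10 days of quota
print(days_to_render(30))  # a half-hour episode: 57 days, roughly two months
print(days_to_render(90))  # a feature-length film: 169 days on one account
```

The half-hour figure lines up with the "couple months" estimate above, and the feature-length figure shows why multiple accounts, or Google's internal access, would be needed to make full movies practical.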
00:54:45.000 This means that likely Google, who has the alpha and the full code they haven't released yet, could probably render feature-length movies if they so choose.
00:54:56.000 And they're going to have faster access to it, privy access to it.
00:54:59.000 The render time for Google's internal VO3 is probably instantaneous.
00:55:03.000 And for us, the public, because of server limitations, it takes a few minutes.
00:55:08.000 For them behind the scenes, with higher power computers and massive data centers, they can probably say, write me a movie about Spider-Man.
00:55:15.000 In fact, better yet, the prompt for your video could literally be the script of a movie, and it would make the full thing.
00:55:22.000 And with tech companies becoming increasingly involved in Hollywood, that's likely the path that companies like Apple and Amazon will end up going down the line in a lot of ways.
00:55:33.000 They won't be—they'll be making— No, they'll do both, but eventually they'll get into AI-based movies at some point, I'm sure.
00:55:39.000 I think we're a couple years out from Disney.
00:55:42.000 They'll call it Disney Smart.
00:55:44.000 And it'll be like for an extra $30 a month, you'll get access to all of Disney's intellectual property to craft your own stories by simply speaking it into your remote control or TV microphone.
00:55:54.000 I don't think necessarily that's going to have a huge marketplace at first.
00:55:59.000 I think a lot of people consume entertainment passively and that they're not interested in writing their own stories.
00:56:04.000 They're not going to.
00:56:05.000 They're going to be followers of influencers who put out great episodes of Spider-Man.
00:56:09.000 So you're going to set up an account on GoogleTube, a DisneyTube, and you're going to be like, make me this.
00:56:17.000 I want Spider-Man to go on an adventure where he saves Mary Jane from Dr. Octopus, but it happens in Japan, and there's Samurai, and then it'll make it.
00:56:28.000 And let's say it's bad.
00:56:29.000 Nobody cares.
00:56:30.000 But let's say you craft a prompt, and then it renders the video, and then you're watching the movie, and you sit there and you go, at 73 minutes and 26 seconds, change the way Spider-Man is looking slightly to the right.
00:56:45.000 Then, after a few hours of just scrolling through quickly and reviewing it, they upload it.
00:56:50.000 People start following this one guy and say, he's a great director.
00:56:53.000 I love his movies.
00:56:56.000 And you log in and you click follow John Smith and he's like, I make sci-fi Star Wars films.
00:57:01.000 And you're going to have feature-length Star Wars films and it's going to have tens of millions of views.
00:57:05.000 Disney is going to get access to all of the ad revenue like YouTube.
00:57:08.000 And people are going to be like, dude, did you see Brett Dasovich's Star Wars 3?
00:57:12.000 It's the best Star Wars 3 ever done.
00:57:14.000 He's a great producer.
00:57:16.000 He knows everything about it and he nailed the lore.
00:57:18.000 There's something about that I just hate so much.
00:57:22.000 And I don't know why, because I love when people are creative, but I don't want to watch.
00:57:25.000 I don't want to see it happen.
00:57:26.000 And that's just the art side of it.
00:57:28.000 I would be remiss, as always, if you watch our show, not to mention that if you want to watch a show that goes deep into the idea of AI on a different level, it's Person of Interest, where there is an omniscient AI created with morals, and there's another omniscient AI, Samaritan, that does not have morals, and the whole thing is about how the introduction of morals into the machine separates it from the evil AI.
00:57:57.000 And that show came out in 2011 and was talking about NSA spying scandals two years before the scandal actually broke in the government.
00:58:05.000 And I was just re-watching an episode last night where they steal an election, a local election, by the AI just being, just blocking phone lines.
00:58:14.000 They couldn't call in. It just stopped them from calling.
00:58:22.000 I'm going to let people on a secret.
00:58:24.000 You guys ever hear Joe Rogan?
00:58:25.000 I've heard of Joe Rogan.
00:58:28.000 Ben Shapiro?
00:58:30.000 AI.
00:58:31.000 Also AI.
00:58:32.000 Jordan Peterson?
00:58:33.000 Yep.
00:58:34.000 Not real either.
00:58:35.000 You'll be able to tell, though, because the suit will be colored differently on the wrong side.
00:58:40.000 And now it's time for all of you to know the truth.
00:58:44.000 There are no guests.
00:58:45.000 Phil Labonte's not real.
00:58:46.000 He never existed.
00:58:48.000 It's all just one big AI right now.
00:58:50.000 None of us are real.
00:58:51.000 Except for me.
00:58:51.000 I'm real.
00:58:52.000 And there's no camera operator.
00:58:54.000 Surge doesn't exist either.
00:58:55.000 It's all pre-recorded AI scripts.
00:58:59.000 Well, what I said earlier, I said the first person to go is the happy-looking people in the pharmaceutical ads.
00:59:05.000 Those are going to be the ones out of a job first, and that's going to be the stuff.
00:59:08.000 That's the sad part.
00:59:10.000 This isn't Terminator 2. This isn't big explosions.
00:59:12.000 It's going to be one long slog to the end of the world.
00:59:17.000 What I was saying a couple days ago is Terminator 2 got it wrong.
00:59:20.000 It is not going to be metal skeletons with evil eyes.
00:59:23.000 It's going to be a bunch of big busty French maids running around begging to have sex with men.
00:59:30.000 And that's how the Terminator destroys humanity.
00:59:32.000 Why not?
00:59:33.000 I mean, think about what sex already does to people, how much it monopolizes of their time, porn and OnlyFans and otherwise.
00:59:38.000 What do you think is coming next?
00:59:39.000 Well, we have sex robots.
00:59:40.000 It's going to be that.
00:59:41.000 Men are going to stop working.
00:59:42.000 They're going to stop participating in real life.
00:59:44.000 Don't date robots!
00:59:45.000 Wait, so how do you feel?
00:59:46.000 There are a lot of people, at least leftists that I've seen, who believe that AI is going to be a path towards abundance, meaning that you won't need to work a job.
00:59:54.000 They're going to just implement UBI, and you're going to be able to play music.
00:59:58.000 I love this.
00:59:59.000 I love this worldview because the slaves will mine the cobalt for us.
01:00:04.000 So the jobs we have to do will be to sit back and make sure that we control the global military police to make sure the slaves mine the cobalt for us while we live in the pod, eat the bugs, but may as well be a steak for all we know.
01:00:17.000 I don't want to remember nothing.
01:00:19.000 Nothing.
01:00:19.000 And I want to be someone important, like an actor.
01:00:23.000 I don't want to go in the Matrix.
01:00:26.000 You know that's the lore of the Matrix though, right?
01:00:28.000 The prequel of the Matrix was that during the war when the humans scorched the skies and destroyed the surface, they cut a deal with the robots.
01:00:37.000 The robots would put humans in a mental paradise.
01:00:39.000 It would be a ceasefire in the war, and then they would use humans as a neural network, supposedly, to continue programming and expanding the AI.
01:00:50.000 And then the robots created a paradise for humans, and the humans began to reject it and start popping out of the Matrix.
01:00:56.000 So the robots changed it to a typical 90s reality with conflict, which resulted in angry humans still to a certain degree popping out, which created a cycle where the robots were like, no matter what we do, there will be a certain degree of people who reject this.
01:01:09.000 So they created a cycle of every seven generations of the one.
01:01:13.000 They purge all of humanity in a great war and then start over again.
01:01:16.000 Oh, perfect.
01:01:17.000 And apparently it was about being transgender, I guess, anyway.
01:01:21.000 There are directors in Hollywood who are now opening AI-based studios to look at moving...
01:01:30.000 And basically they're taking the same logic that everybody else is.
01:01:33.000 They said, you may not be out of a job because of AI for now, but you'll be out of a job because of somebody who knows how to use AI.
01:01:39.000 You know, that's a stupid idea.
01:01:42.000 Sorry.
01:01:43.000 AI is advancing too quickly.
01:01:45.000 It's going to be...
01:01:47.000 There's going to be a 19-year-old kid who...
01:01:52.000 There's a 13-year-old right now at home sitting at their computer and they've pulled up Suno and VO and they're using MidJourney and they're using ChatGPT and they're just playing with prompts and they're writing stories and they're making pictures and I guarantee you some young kid has already made a comic book.
01:02:12.000 Oh, for sure.
01:02:13.000 Because you have unlimited image prompts.
01:02:15.000 So, not unlimited.
01:02:17.000 On ChatGPT at a certain point, it says, please wait a few minutes, too many requests.
01:02:20.000 But you can seriously do hundreds of images a day.
01:02:24.000 I've made AI comics.
01:02:25.000 Are you talking about like 4.0 or 4.5?
01:02:27.000 The current GPT, whatever it is.
01:02:29.000 Okay.
01:02:30.000 It's able to make comic strips.
01:02:31.000 So, I made one that's gone massively viral.
01:02:34.000 It's AOC.
01:02:35.000 At a rally, raising her fist, and Bernie's behind her.
01:02:38.000 The next one is both of them walking to an airplane.
01:02:40.000 The next one is them sitting at the airplane smiling.
01:02:42.000 That's all it is.
01:02:42.000 And the next one is the jet flying overhead while people look up at it.
01:02:45.000 And there's no words, and it's not insulting her in any way, and it took me maybe like a half hour to make.
01:02:51.000 And now, I didn't put credit on it because it was AI.
01:02:54.000 I didn't put my name on it.
01:02:55.000 But it's been retweeted millions of times or whatever.
01:02:59.000 There's that phenomenon, but ultimately, I'm not someone who spends all day trying to make comics.
01:03:04.000 I did that passively.
01:03:05.000 I think I was doing it during the show.
01:03:07.000 Some little kid right now is learning how to use AI.
01:03:11.000 And they're going to be tracking the latest releases and what the latest releases do.
01:03:14.000 And in five years, when we have Masterful Systems, these companies, these directors are going to be like, we make movies in Hollywood doing this.
01:03:22.000 And there's going to be some 19-year-old kid who's going to be like, you have no idea what you're doing.
01:03:25.000 I've been doing this for five years.
01:03:26.000 Watch this.
01:03:27.000 And he's going to write a prompt.
01:03:29.000 And there's going to be some weird trick no one thought of where he's like, if you double hyphenate the space between it, it actually will create a difference between the background and the foreground.
01:03:36.000 Watch this.
01:03:37.000 And then it's going to make a perfect scene.
01:03:39.000 And he's going to be like, actually, one of the ways you can transfer a character between videos, to do the prompt, you have to set a parameter.
01:03:44.000 So watch this.
01:03:45.000 And then he puts, like, object character identified as this, store it in your database, reprompt.
01:03:49.000 So one of the problems right now with VO is, if I say, make a man who looks like this, make the video.
01:03:56.000 If I then say this same man who looks like this is now doing something else, it'll make a totally different person.
01:04:01.000 So there's going to be advancements in AI where some young kids today… It's the blockbuster phenomenon.
01:04:11.000 And then he's going to be better at this.
01:04:12.000 They're going to have a big studio, but he's going to be on YouTube.
01:04:15.000 And he's going to have a bunch of movies, feature length, and he's going to release it for like $1.99.
01:04:20.000 And then people are going to be like, dude, you've got to see Vision Mobile.
01:04:24.000 This kid made this movie.
01:04:26.000 It's called Vision Mobile.
01:04:27.000 Go watch it.
01:04:29.000 Skibidi Toilet.
01:04:30.000 Hollywood bought the rights to Skibidi Toilet.
01:04:32.000 Did you?
01:04:32.000 Yeah.
01:04:33.000 They're making a Michael Bay Skibidi Toilet movie.
01:04:35.000 I will be there opening day.
01:04:36.000 Michael Bay?
01:04:37.000 Yes.
01:04:38.000 Exploding toilets.
01:04:39.000 It's not going to be Skibidi Toilet.
01:04:40.000 No, no, no.
01:04:41.000 But the point is that you're saying the same concept, right?
01:04:44.000 Is that he's going to...
01:04:47.000 Hollywood's Skibidi Toilet, it's going to be some 50-year-old guy being like, so what did you buy?
01:04:52.000 And they're like, there's a guy whose head pops out of the toilet, and there are dudes with cameras for heads, and there are big heads, and they fight.
01:04:57.000 And he's going to go, so who are the characters?
01:04:59.000 The Rock is going to play a special agent who walks through a portal into Skibidi Toilet World.
01:05:04.000 Like when they ruined Jumanji by remaking it.
01:05:07.000 Sarah loves the Rock version of Jumanji.
01:05:10.000 We watched it the other night for the first time.
01:05:12.000 Nobody's perfect.
01:05:15.000 What a way to insult someone.
01:05:17.000 They made it a video game.
01:05:19.000 They didn't need to.
01:05:20.000 They literally could have just made it like, I found a board game.
01:05:22.000 And then they would have been like, let's play.
01:05:23.000 The spin-offs, too, of toys and things that'll come from all of these things.
01:05:26.000 Like, it's insane.
01:05:27.000 We just talked about, what's that YouTuber, right?
01:05:29.000 That kid that's worth, like, what is he worth now?
01:05:32.000 Mr. Beast?
01:05:33.000 No, no, no.
01:05:35.000 Ryan's Toy Review or whatever it is.
01:05:37.000 Kid's worth, like, hundreds of millions of dollars or some ungodly amount of money for opening other people's toys that are made about movies.
01:05:44.000 That are obnoxious and a complete waste of time.
01:05:46.000 You know, it's going to be wild because AI will craft a better episode of Joe Rogan than Joe Rogan will.
01:05:53.000 Don't say that.
01:05:54.000 It's true.
01:05:54.000 I know, but don't ruin it yet.
01:05:57.000 Well, like, so I was talking about this online.
01:06:01.000 I'm sorry, online.
01:06:02.000 On X. Everything's online, right?
01:06:04.000 And a bunch of developmentally disabled individuals told me I was wrong, but they're stupid, so welcome to the Internet.
01:06:10.000 And I was pointing out that...
01:06:18.000 And then you're not going to know which ones.
01:06:20.000 No, it won't matter because people are going to be like, this episode is awesome.
01:06:25.000 You're correct, right?
01:06:25.000 I shouldn't be dismissive.
01:06:27.000 My point is, I feel like most people who watch an episode of Rogan are not going to Joe's channel to see the latest episode.
01:06:34.000 They're seeing it pop up on YouTube's front page.
01:06:36.000 Yes, and they're clicking on that one there.
01:06:38.000 They're going to be on social media, and they're going to see Joe Rogan featuring Phil Labonte, and they're going to click it, and it's going to be indistinguishable, hilarious, two hours long, and it'll be made by China, and Joe can't compete with that.
01:06:53.000 Nobody can compete with that, though.
01:06:54.000 That's insanity.
01:06:55.000 And what's gonna happen is, Leave me alone.
01:07:01.000 It's fun.
01:07:02.000 The saddest part about that is like it shows you that when you consume content the way we do, a lot of times you're completely divorced from the idea that somebody is sharing something true of themselves.
01:07:12.000 So if there's Joe Rogan starring with Phil Labonte as the guest and there's some crazy opinions that people thought were really, really interesting, what does it say about us as a culture if we don't even care if it's real or not?
01:07:25.000 We just care that the opinion is mildly interesting, not that it was actually real or something that was formal.
01:07:33.000 The launch of VO3 just ended the news.
01:07:38.000 Dan Bongino said we're going to release surveillance footage of the Epstein MDC.
01:07:44.000 Or I'm sorry, was it MCD?
01:07:47.000 I don't know.
01:07:48.000 The Metropolitan Correctional Facility, I guess.
01:07:52.000 Everybody said it's going to be an AI video.
01:07:53.000 You think?
01:07:54.000 I mean, no, no, no, no, no.
01:07:56.000 Hold on.
01:07:57.000 Everybody is saying it will be.
01:07:58.000 And I mean that not literally everybody, but on X, the responses overwhelmingly say it will be. So just because VO3 exists, nobody believes it anymore.
01:08:11.000 Nobody believes the releases.
01:08:13.000 So what happens now if Donald Trump literally goes on to Fifth Avenue and shoots a guy?
01:08:18.000 Everyone's going to say, fake, don't believe it.
01:08:19.000 I don't think that...
01:08:22.000 I don't disagree with your point, but I don't think that it was necessary for VO3 to exist for people to believe their preconceived notions before they believe other things.
01:08:32.000 There's still a large group of people who are like...
01:08:36.000 Brandon Straka's a great example.
01:08:45.000 And then someone showed him the video where Trump does that all the time to make fun of everybody.
01:08:49.000 And he realized he wasn't making fun of the guy with the disability.
01:08:52.000 It's just he makes fun of everybody that way.
01:08:55.000 How do you change someone's mind like Daniel Negreanu?
01:08:59.000 So, are you familiar with Negreanu?
01:09:01.000 I believe so.
01:09:01.000 Canadian superstar.
01:09:03.000 He's a poker legend and one of the best in the world.
01:09:05.000 I didn't know about the poker legend deal, though.
01:09:08.000 What's that?
01:09:08.000 I didn't know he was a poker legend.
01:09:10.000 Daniel Negreanu is a poker legend, one of the most famous poker players.
01:09:12.000 I'm a huge fan.
01:09:13.000 He went on the show.
01:09:13.000 It was great.
01:09:14.000 And he told us this story where the whole time he thought Trump called Nazis fine people.
01:09:18.000 And he saw the video where Trump said they were very fine people on both sides.
01:09:22.000 He believed it.
01:09:23.000 And then one of his buddies, who's a big poker star and watches this show, who's a fan, said, you are wrong.
01:09:28.000 Watch the video.
01:09:29.000 He was like, I've already seen the videos.
01:09:31.000 Finally, he put the phone down, pressed play, and slid over.
01:09:33.000 He said, watch.
01:09:34.000 And he went, fine.
01:09:35.000 And then he saw Trump say, and not the neo-Nazis and white nationalists because they should be condemned totally.
01:09:40.000 And instantly Daniel went, I didn't know that.
01:09:44.000 I thought I saw the video, but I didn't.
01:09:48.000 Now, he goes, watch the video.
01:09:50.000 And it's Trump going, and not the neo-Nazis and white nationalists.
01:09:53.000 They should be condemned totally.
01:09:54.000 And Negreanu goes, I didn't realize that.
01:09:56.000 And then to his left, a guy goes, watch this video.
01:09:58.000 And it's Trump going, just kidding.
01:10:00.000 I love the Nazis.
01:10:01.000 They're the best.
01:10:02.000 And it's like, which one's real?
01:10:04.000 He's going to be like, guys, I have no idea.
01:10:06.000 Well, I don't even think it's going to matter, right?
01:10:08.000 People are going to pay attention to the thing that goes with their narrative no matter what.
01:10:11.000 That's what's going to happen.
01:10:13.000 And this is a point I brought up.
01:10:15.000 The bigger fear that I have with AI is not that someone makes a fake video of Donald Trump dancing out of bed with strippers.
01:10:21.000 It's that someone's going to take the best example.
01:10:25.000 There's the video where Trump says, and not the neo-Nazis and white nationalists because they should be condemned totally.
01:10:32.000 Someone's going to take that.
01:10:33.000 Not specifically this moment.
01:10:35.000 But something like it.
01:10:36.000 And they're going to change they to some.
01:10:39.000 And so Trump will go, And not the neo-Nazis and white nationalists because some should be condemned totally.
01:10:46.000 And that's a subtle change where liberals will watch it and they'll go, I get what you're saying, but he said some.
01:10:53.000 He was defending some of these people.
01:10:55.000 And he didn't say some.
01:10:56.000 He said they.
01:10:56.000 I saw the video.
01:10:57.000 I have it right here.
01:10:58.000 And it's that small of a difference that it's going to make all of the difference.
01:11:00.000 And it's a moderately low-resolution video.
01:11:04.000 His mouth moves ever so slightly.
01:11:06.000 You're not going to be able to tell if it's AI.
01:11:08.000 And they're going to say, you mega cultists believe fake news.
01:11:11.000 I've seen the video.
01:11:12.000 It's right here.
01:11:12.000 He said some.
01:11:14.000 And you will never be able to change their mind.
01:11:16.000 So how do you get around that?
01:11:17.000 Because if that's where we're going...
01:11:19.000 I know.
01:11:19.000 I mean, we're already there in a way when it comes to the news because people will read articles and they read into, like I said, lying by structure.
01:11:26.000 They put all the important facts into the bottom paragraph of whatever you're reading.
01:11:30.000 They've already been manipulating the information that you need for years in a very analog way; that's all this is going to do, too.
01:11:38.000 And all that really matters now is that truth is going to be irrelevant.
01:11:42.000 Here's a headline for you.
01:11:43.000 Brett Dasifik kicked my dog.
01:11:45.000 Well, you said Fick, so I don't know who this guy is.
01:11:48.000 What?
01:11:49.000 You didn't say my last name right.
01:11:50.000 Oh, it's Vich?
01:11:51.000 No, it is Vick, but you said Fick.
01:11:53.000 Oh.
01:11:54.000 Okay, come on.
01:11:55.000 Brett Dasovich kicked my dog.
01:11:58.000 And then I'll write...
01:12:16.000 Oh, yeah.
01:12:21.000 But you know, it was a vivid exploration of wondering what it would be like if a man actually kicked my dog.
01:12:24.000 He yelled, Bob Barker told you to have your dog spayed and neutered.
01:12:28.000 But you didn't listen.
01:12:30.000 This is the last...
01:12:30.000 You're out of warnings.
01:12:36.000 Because like the headline we just talked about with Palantir: Trump's going to compile data on every American.
01:12:40.000 And then as it gets through, it's like he never said he was going to do it.
01:12:43.000 He isn't doing it.
01:12:44.000 Palantir hasn't commented on whether they're going to do it or not, but we think he will.
01:12:47.000 And that one was a fairly well-hidden version of it where, even in the early paragraphs, they kind of make connections that aren't really there.
01:12:55.000 A lot of times, the facts just completely contradict it and they bury it in the bottom paragraph.
01:13:00.000 My favorite is how they do fake fact checks.
01:13:03.000 And it'll be like, Donald Trump will rescue a box of puppies from a burning building.
01:13:10.000 Everyone will share the video because it's incredible.
01:13:10.000 And then Snopes will write, did Donald Trump rescue a bunch of puppies from a burning building on Sunday morning?
01:13:17.000 And so then they'll put a big false.
01:13:20.000 Donald Trump did not rescue puppies from a burning building on Sunday morning.
01:13:23.000 They'll write a thousand words at the very bottom.
01:13:25.000 It'll say, well, Donald Trump did rescue a bunch of puppies on Saturday morning.
01:13:28.000 It was not Sunday.
01:13:29.000 Liberals will then hear that Donald Trump saved puppies, click the link, and see the story, assuming, and they'll say, it never happened.
01:13:36.000 It says false.
01:13:37.000 Not realizing that Snopes creates fake stories that are similar to the actual claims to debunk them.
01:13:44.000 It's called death by hyperlink.
01:13:46.000 So they can hyperlink you to stories that you think contradict them, but they don't.
01:13:52.000 God, you guys are just as bad as us.
01:13:55.000 Canadians?
01:13:56.000 Oh, yeah.
01:13:56.000 Well, at least Americans can watch Timcast.
01:13:58.000 I mean, listen, that's why we have to come down to Timcast to actually participate.
01:14:03.000 It's the only way.
01:14:03.000 I'm going to start paying to advertise my videos in Canada.
01:14:06.000 Please do it.
01:14:07.000 It's really cheap.
01:14:08.000 Yo, it's one cent per view in Canada.
01:14:11.000 Yeah, I'm aware.
01:14:13.000 One cent.
01:14:14.000 So a dollar gets you 100 views.
01:14:15.000 Mm-hmm.
01:14:17.000 So, you know, I think Timcast IRL should be the most popular show in Canada.
01:14:23.000 I can spend a relatively small amount of money compared to my marketing budgets and make it so that Canadians are like, I can't take these Timcast ads anymore!
01:14:31.000 It's going to be worth it.
01:14:33.000 You should do it.
01:14:33.000 I'll just run one on my show for free just for entertainment.
01:14:36.000 There you go.
01:14:37.000 Happily.
01:14:37.000 Remember that guy on YouTube who was like, knowledge.
01:14:41.000 I got a lot of knowledge.
01:14:42.000 He spent an insane amount of money on YouTube ads to the point where every...
01:14:48.000 Yeah, it sticks, though.
01:14:49.000 You can't forget it.
01:14:50.000 He must have spent millions of dollars in one month.
01:14:52.000 Do you think?
01:14:53.000 Yes.
01:14:53.000 Because I know how Google Ads works.
01:14:55.000 Right.
01:14:55.000 To be fair, at the time, it was a lot cheaper.
01:14:57.000 The competition's getting a bit more fierce.
01:14:59.000 But, yo, he probably spent millions of dollars.
01:15:01.000 I mean, but great investment.
01:15:02.000 We're talking about it right now.
01:15:03.000 I know.
01:15:04.000 Tai, I think his name was something.
01:15:06.000 He's made a lot of money off it.
01:15:07.000 It worked for him.
01:15:08.000 I mean, start the conversation.
01:15:09.000 Tons of people were like, Tai Lopez.
01:15:14.000 If you're a random nobody, and one day you walk down the street and trip over a bag of a million bucks, and the cops are like, and the IRS, they all say, look, man, it's yours, I guess.
01:15:23.000 Nobody claimed it.
01:15:24.000 We don't know who it is.
01:15:24.000 Congratulations, you have a million dollars.
01:15:27.000 Blasting that on Google, being like, you can be rich like me, it's going to work.
01:15:32.000 Because, let's say you get 10,000 people to join your $10 a month program, you're making a million bucks a year.
01:15:38.000 Oh, easily.
01:15:39.000 And so that million dollars—blast it out on YouTube with all these ads of, like, you want to be like me and be rich with knowledge?
01:15:46.000 You got to sign up to my program.
01:15:48.000 10,000 is all you need.
01:15:49.000 Don't we have a lot of bros on Instagram doing that right now anyway?
01:15:52.000 I mean, like, Andrew Tate's the king of this.
01:15:54.000 Well, him and, like, Andy Elliott, aren't they all, like, around?
01:15:56.000 Andy Elliott?
01:15:57.000 I don't really know Andy Elliott.
01:15:58.000 Is he a Canadian thing?
01:16:00.000 No, he's an American thing.
01:16:01.000 He's a real piece of work.
01:16:02.000 Andy Elliott.
01:16:03.000 Real piece of work, huh?
01:16:05.000 He's a real piece of work.
01:16:06.000 Yeah, he recently blocked me.
01:16:07.000 Sales trainer and often referred to as the car salesman turned millionaire.
01:16:11.000 Please don't put him on there.
01:16:13.000 He exploits his kids and I have a real issue.
01:16:15.000 He's got millions of followers.
01:16:16.000 I don't know who he was.
01:16:16.000 Yeah, yeah, yeah.
01:16:17.000 He likes to put his 9-year-old and 12-year-old on the internet.
01:16:19.000 I got no beef, but I'll shout out PBD.
01:16:21.000 Oh, I do.
01:16:22.000 It's fine.
01:16:22.000 Patrick Bet-David does these seminars for like $10,000 or $20,000 to be rich.
01:16:27.000 It's like if you want to learn how to be successful and rich.
01:16:29.000 And you know what works is he's a tall, suit-wearing guy worth half a billion dollars.
01:16:36.000 And so people want to spend the money to figure it out.
01:16:40.000 I'm pretty sure Andy has spent the money to hang out with him.
01:16:43.000 He's bragged about it.
01:16:44.000 I'm not exaggerating.
01:16:45.000 To hang out with PBD?
01:16:45.000 Yeah.
01:16:46.000 I think Patrick Bet-David spends a lot of money on getting big guests for shows.
01:16:51.000 I'm not dragging him for it.
01:16:52.000 Like, you got Tom Brady, and so he contacts their agency, and people see that, and they're like, I wish I had the money to be able to do that.
01:16:59.000 Like, can I hire Tom Brady to come to my event?
01:17:01.000 Holy crap.
01:17:02.000 Well, I mean, it gets you the views, and then the views turn into more, and then it just continues to pile.
01:17:06.000 I get it.
01:17:06.000 It's like when you get a big guess, it's a big deal.
01:17:08.000 It's not nothing small, but if you have the cash to pay for it, and that's the only way to get the platform up and kicking.
01:17:13.000 I mean.
01:17:13.000 You know, but this is why I don't believe.
01:17:24.000 Really?
01:17:25.000 And the conservatives want to come and argue with you.
01:17:27.000 I love that.
01:17:28.000 But conservatives actually care about the issue and want a chance to speak about it, whereas liberals tell us to contact their agents and then shout out to the Krasensteins because they don't do that.
01:17:38.000 Okay.
01:17:39.000 Despite the fact I disagree with them and think they're a little smarmy and, you know, we can argue.
01:17:43.000 I'm going to give them the respect of when we reach out to them and ask them if they'd like to be involved in events, they say, absolutely, we'll try and find dates.
01:17:50.000 There are a lot of other high-profile liberals who are like, contact my agent for my rates.
01:17:54.000 Yeah.
01:17:55.000 Well, then we just delete.
01:17:57.000 It's not worth it.
01:17:58.000 Liberals are fake, largely.
01:18:00.000 Not all of them, but many of them.
01:18:01.000 I mean, my own personal experience with you guys has been a lot of fun.
01:18:04.000 It's been very interesting and it's been very different than a lot of shows, but I appreciate it.
01:18:08.000 How is it different?
01:18:09.000 It depends on the show.
01:18:12.000 There's big, big, big shows.
01:18:14.000 They'll be like, I want you to come on the show.
01:18:15.000 Fly yourself in.
01:18:17.000 Charge everything.
01:18:18.000 And then they make $20,000, $30,000, $40,000, $50,000 on your views.
01:18:22.000 Let's talk about this.
01:18:24.000 Timcast IRL, we don't pay people to come on the show, but we do pay for their travel and accommodation to come on the show.
01:18:29.000 I was so appreciative of that.
01:18:31.000 I was shocked.
01:18:32.000 I only had that happen one other time, and it was Lex.
01:18:35.000 That was it.
01:18:36.000 It is insanely expensive.
01:18:37.000 Yeah, I mean, I'm coming from Canada, so it's not like it's cheap.
01:18:40.000 And I remember having this conversation, and they were like, oh, no, no, don't worry about it.
01:18:43.000 Well, we'll sort it out.
01:18:44.000 And I was like, what?
01:18:44.000 I remember when they were telling me about the approval for the flight.
01:18:48.000 For me?
01:18:49.000 Yeah.
01:18:50.000 I said, you pick the flights.
01:18:55.000 I know.
01:18:56.000 Yeah, and then they were like, we need approval on this.
01:18:57.000 And I was like, ah!
01:18:58.000 And I was like, who's coming from Canada?
01:19:01.000 You can always tell.
01:19:02.000 I'm so sorry.
01:19:03.000 Domestic flight achiever.
01:19:04.000 No, but we spend it.
01:19:06.000 We've flown people from the UK.
01:19:07.000 I know, but that's a big deal.
01:19:08.000 We've flown people back to Australia.
01:19:09.000 Oh, that is way worse than Canada by a long shot.
01:19:12.000 But I mean, I appreciate it—a lot of people don't want to acknowledge that.
01:19:16.000 I think it's a big deal when people do that.
01:19:17.000 I mean, you're sitting here expecting something from me and I'm expecting something from you.
01:19:21.000 We're taking each other's time.
01:19:22.000 If people are going to show up and be honest about it, it's appreciated and people should know that.
01:19:26.000 Otherwise, people just seem like dicks.
01:19:28.000 It is pretty wild.
01:19:29.000 Sorry.
01:19:32.000 Most shows do not cover travel and accommodation.
01:19:34.000 You said Lex Fridman did?
01:19:36.000 Well, this was in 2021.
01:19:37.000 I went down, this was during when...
01:19:39.000 You know, the C word was crazy.
01:19:42.000 And in Canada, we weren't allowed to leave unless it was like very strict.
01:19:45.000 And I was involved in the Afghan pullout and he wanted to talk about it quite aggressively.
01:19:49.000 So he's like, can you come down?
01:19:51.000 I said, yeah.
01:19:52.000 So I booked everything.
01:19:53.000 And then he goes afterwards, he goes, oh, please send me your PayPal or your whatever.
01:19:57.000 We want to make sure that we cover what you did to come down for us.
01:20:00.000 That was the first time I was like, oh, okay.
01:20:03.000 So he's not AI.
01:20:05.000 He's not AI. And I mean... Rogan flew me out.
01:20:12.000 Yeah?
01:20:13.000 I think a couple of times.
01:20:15.000 Not every single one.
01:20:17.000 I think the last time I went on Joe's show, I was like, don't worry about it.
01:20:22.000 But the first time I went on, he paid for everything.
01:20:24.000 That's the only show I haven't done yet, so I don't have reference with him.
01:20:27.000 But any of the people I found, if you ask, a lot of them will.
01:20:31.000 I think if you're going a certain distance or they know they're going to make numbers on either the YouTube or whatever, I think that most of them will or they'll offer some of it or dinner or something.
01:20:40.000 But there's quite a few that just expect you to give up their time, which is interesting.
01:20:45.000 So Lex isn't AI then?
01:20:47.000 He's not.
01:20:47.000 No, he's just ASMR for sleep.
01:20:49.000 No, listen.
01:20:49.000 No, he's for sure we're special.
01:20:51.000 But you can feel him when you hug him.
01:20:53.000 He's a real hard human.
01:20:53.000 He's a real person.
01:20:54.000 He's hard, too.
01:20:55.000 Okay.
01:20:55.000 I heard that, like, Lex's show is basically the podcast version of ASMR for sleep.
01:21:01.000 I use...
01:21:05.000 I had no problem with it.
01:21:06.000 He seemed perfectly fine.
01:21:07.000 But it's super chill.
01:21:08.000 You know what I mean?
01:21:08.000 It's the most calm.
01:21:09.000 It's like you talk like this and...
01:21:16.000 If he goes off, you just go off.
01:21:18.000 If he's getting you tired, you just go.
01:21:19.000 My name is Tim Pool and welcome to Timcast IRL.
01:21:23.000 Just lay your head back, close your eyes... That would make me excited.
01:21:27.000 Beautiful thoughts.
01:21:28.000 Yes, too much.
01:21:29.000 And then once you fall asleep, we'll leave the show running for five hours to create an extended watch time so the YouTube algorithm will promote the show.
01:21:36.000 This is going to be five hours of you snoring.
01:21:38.000 This is why a lot of people complain about how they'll be watching YouTube and when they fall asleep, they'll wake up with Lex Fridman on.
01:21:45.000 It's a real thing.
01:21:47.000 I wasn't joking.
01:21:47.000 No, but I believe it.
01:21:50.000 How did we get here?
01:21:52.000 It's the podcast version of Elsagate in a sense.
01:21:56.000 I'm not trying to be addicted to Lex because he's cool.
01:21:58.000 What I'm saying is like you are not jarred or shocked by the tone of his show.
01:22:03.000 It's very, very calm and relaxing.
01:22:05.000 So when people are falling asleep with it on, YouTube keeps playing more of it, generating a massive watch time promoting his show and it works out.
01:22:13.000 We, on the other hand, are screaming and banging gavels on the table in the middle of the night.
01:22:16.000 I was very excited about this.
01:22:18.000 The rubber mallet?
01:22:19.000 I don't know why, but I mean, I've gone to some shows and it's a lot of knives.
01:22:22.000 It's a lot of violence.
01:22:23.000 And then I haven't had a mallet yet.
01:22:25.000 It's rubber.
01:22:26.000 I'm excited for the swords.
01:22:27.000 We had the sword conversation briefly.
01:22:29.000 They're real swords.
01:22:29.000 I know.
01:22:29.000 Can I see them after?
01:22:30.000 Yes.
01:22:32.000 I'm very excited for the swords.
01:22:33.000 guns too, bro.
01:22:39.000 We have chickens.
01:22:40.000 Lots of them.
01:22:41.000 I fucking love chickens.
01:22:42.000 Chickens are good people.
01:22:44.000 I'm sorry.
01:22:45.000 I almost made it the whole time.
01:22:48.000 Unreal.
01:22:49.000 I know.
01:22:50.000 Listen, there's a lot of effort that went into cognitively making sure that that was not every other word.
01:22:55.000 No kids?
01:22:56.000 I have a child, but I don't swear at home.
01:22:58.000 Oh, really?
01:22:59.000 Just walk in the house and it's like swears.
01:23:01.000 How old is they?
01:23:03.000 He is nine.
01:23:04.000 And he knows he's a he.
01:23:07.000 I was using the singular they because I didn't know the gender.
01:23:08.000 No, he's super he.
01:23:10.000 But on my show, I swear.
01:23:11.000 But on my other show, I don't.
01:23:12.000 But when I come on other people's show, they know who I am.
01:23:15.000 So they're like, ha ha.
01:23:16.000 And I got your email and it was like, do not swear.
01:23:19.000 And I immediately started sweating.
01:23:21.000 I was like, I don't know.
01:23:22.000 I just feel anxiety.
01:23:24.000 I did.
01:23:24.000 I was so stressed.
01:23:25.000 I feel like the producers who are booking people are more serious about it than we are.
01:23:30.000 I always say people like, we don't swear in the show because sometimes people are watching their living room.
01:23:34.000 I totally respect it and I think it's fine.
01:23:36.000 I'm just saying the odd time, you know, the odd time.
01:23:38.000 But, you know, I had that happen when I was...
01:23:42.000 No, but I tried.
01:23:43.000 It's like I was in the military, man.
01:23:45.000 It's baked into me.
01:23:46.000 I can swear in a different language if you prefer, but even when I did, like when ARC reached out to me and I was doing the interview to speak at ARC, they said to me, Kelsey, do you have anything where you don't swear that we could take a look at?
01:23:58.000 And I was like, yes, Ted, but they won't post it, so I don't know what to tell you.
01:24:01.000 We should just move the podcast to 10 p.m. so that half the people fall asleep while it's on and then it boosts our watch time.
01:24:07.000 Have you tried that yet?
01:24:08.000 I mean, that's the way to do it.
01:24:10.000 Look, man.
01:24:11.000 Halfway through, everyone's just sleeping.
01:24:13.000 Our sponsors are like, for some reason, all the mid-roll ads just don't work.
01:24:16.000 It's like, no, they're sleeping.
01:24:17.000 Maybe Lex can sell ads for, like, Tempur-Pedic mattresses.
01:24:21.000 I've had people complain to me about how pissed off they are that they keep getting recommended autoplay Lex Fridman.
01:24:29.000 And, you know.
01:24:30.000 It's definitely not my episode, I'll tell you that.
01:24:32.000 I'm going to say this, and I say this with all due respect to Lex.
01:24:36.000 I think he's cool.
01:24:37.000 I've got no beef.
01:24:38.000 But in the industry, there's a decent amount of people who think that he's a fed.
01:24:42.000 Really?
01:24:42.000 And he's a plant in the industry because of how much he's promoted and how much they don't like his show.
01:24:47.000 Really?
01:24:48.000 What don't they like about the show?
01:24:49.000 Is it the tone?
01:24:50.000 Is it just the tone?
01:24:51.000 To me, it was just that I never listened to it, but I would be incessantly recommended it despite the fact that I didn't listen to it.
01:24:57.000 And I've heard this from a dozen plus people at high positions in the podcast industry, audio networks, who say things to me like, I'm not going to – I'm not trying to start a beef with Lex or anything.
01:25:12.000 Nothing like that.
01:25:13.000 But this is something that people have experienced, so I do want to at least talk about it.
01:25:17.000 And by all means, I don't know if it's true or whatever.
01:25:19.000 I feel bad because Lex is a good dude.
01:25:21.000 But I've been at industry events where I've had people who work in the podcast distribution industry saying, Why is Lex Fridman getting promoted so often on YouTube?
01:25:30.000 And then I just laugh and I'm like, I've heard this before.
01:25:33.000 I have no idea.
01:25:34.000 And then there are some other more personal things I won't mention.
01:25:39.000 I'm going to stop myself from talking about it.
01:25:40.000 But there are other people who have big podcasts.
01:25:42.000 There's ad buyers.
01:25:44.000 And I've been asked this seemingly unprompted in conversations around ad sales and competition in the market.
01:25:51.000 And I'll have a guy who does sales be like, why do you think it is that Lex Fridman is doing so well?
01:25:55.000 And then I'll be like, oh, he's got good guests and interesting conversations.
01:25:59.000 And they go, absolutely not.
01:26:00.000 They're like, we track podcasts.
01:26:01.000 I've had one guy explaining to me, we track podcasts and we look like we analyze podcasts for what we deem to be like high attention points and things like this.
01:26:11.000 Engaging?
01:26:12.000 Right.
01:26:12.000 And when they're looking for ad sales, when they're trying to determine what point of a show they want to sell, a lot of people want to sell around the 20 minute or 30 minute mark.
01:26:19.000 Yep.
01:26:20.000 And they're like, we don't see that same thing in Lex's shows, but it's working really, really well despite what our analysis shows.
01:26:27.000 I know his ad guy.
01:26:28.000 I know him very well.
01:26:29.000 And he seems to be, like, he's performing.
01:26:32.000 It's unbelievable what he says to me.
01:26:33.000 It kind of blows my mind.
01:26:35.000 And he was one of the...
01:26:37.000 Lex has just got a good show.
01:26:38.000 I mean, he's got great guests.
01:26:40.000 And I went on his right after...
01:26:44.000 And it's a very different audience.
01:26:45.000 Like, just very, very different audience-based, right?
01:26:47.000 So it was, like, it was very interesting to see who kind of followed over.
01:26:50.000 But ad-wise, I mean, it performed significantly better.
01:26:55.000 Significantly better than the Jocko episode.
01:26:57.000 There was a conspiracy theory that Lex, coming from MIT, is an intelligence asset.
01:27:03.000 And I think this is all stupid.
01:27:05.000 He's just a robot.
01:27:06.000 No, he has robots in his house.
01:27:09.000 There's two conspiracies.
01:27:10.000 There's a conspiracy that Sean Ryan is intelligence and that his show is...
01:27:15.000 Sean was intelligence, so that tracks.
01:27:17.000 And so people are saying... It doesn't track with their analyses of how big his show got, how quickly.
01:27:30.000 So, I had him on my show, and I think it was, he was on, I mean, my show's been around for five years, and then I had him on my show, and within, I think it was in two, three months of him being on my show, he was maybe episode 10, and it just went, boom!
01:27:47.000 And I mean, look, I'm in the veteran space.
01:27:48.000 So veteran shows, they do pick like certain people, especially for the Navy SEALs, the host.
01:27:52.000 If there's a Navy SEAL host, it's going to, And we see it every time.
01:27:56.000 If there's a special operator host, it pops.
01:27:58.000 Black rifle, all of them, they pop, they pop, they pop.
01:28:01.000 That's the conspiracy theory.
01:28:02.000 Listen, I don't even know that it's a conspiracy theory.
01:28:05.000 I'm in the same space as them.
01:28:08.000 I don't know how true this was, Brett, maybe you know, that the U.S. government was funding war games, video games.
01:28:13.000 Like, yeah.
01:28:15.000 Pretty sure that's true.
01:28:16.000 They were providing resources for video game development of...
01:28:16.000 Yeah.
01:28:24.000 It's like declassified operations being used for maps.
01:28:26.000 Yeah, it boosted recruitment.
01:28:29.000 And so the conspiracy theory is that these shows are not really as popular as people think, but they're propped up to boost military culture and engagement and recruitment.
01:28:40.000 Okay, so that's not a shock to me.
01:28:41.000 Nobody should be surprised by that.
01:28:43.000 Like I said, I'm a combat veteran.
01:28:44.000 I've worked with these guys.
01:28:45.000 I've been on all of their shows except for Sean's.
01:28:47.000 Remember that woman who was in the military who was posting all that?
01:28:51.000 Bait content for young men.
01:28:53.000 Oh, are you talking about the Israeli one?
01:28:55.000 Yeah, the psyops.
01:28:55.000 Yes.
01:28:56.000 Don't fall for it, guys.
01:28:57.000 You're right.
01:28:58.000 Those memes were great.
01:28:59.000 When you're mean to Israel, this is who you're being mean to, and it's like this moment.
01:29:02.000 And there's an American woman who works in psyops, and she posts all this, like, suggestive sexual content.
01:29:08.000 And then there was one video talking about barracks bunnies or something like this.
01:29:13.000 Oh, yeah.
01:29:13.000 Yeah, and recruiting numbers go up.
01:29:14.000 That's just the answer.
01:29:15.000 And it's all fake.
01:29:15.000 it's propaganda.
01:29:16.000 In general, you at least see the through line as to why anything involving special forces operators would do well because people are fascinated by people who are the best in their field who have these incredible stories.
01:29:27.000 So even if it was botted in some way, you can understand where the interest would actually come from. Like Sean's show got bought out really, really early, very, very quickly.
01:29:50.000 I don't know how many podcasts like that have had happen often.
01:29:53.000 And I like Sean.
01:29:54.000 I talk to him regularly.
01:29:55.000 I think he's a great dude.
01:29:56.000 All I'm saying is that we do see this in the veteran space, that there are shows that do get propped and that still have people connected within the special operations and or still connected within intelligence services with three-letter agencies.
01:30:10.000 That's not a fact.
01:30:11.000 I mean, Sean's still connected.
01:30:12.000 Everyone's still connected.
01:30:13.000 For God's sakes, what's the guy that's going on the tangents right now?
01:30:16.000 Oh my God, Eric Prince.
01:30:17.000 He's got a documentary or something coming.
01:30:20.000 He's still heavily connected to the defense network.
01:30:23.000 He's talking about going to Haiti.
01:30:25.000 Does the CIA have a YouTube channel?
01:30:29.000 Is there a CIA cast where they just talk about life at Langley?
01:30:36.000 Probably not officially.
01:30:38.000 I actually don't believe any of the conspiracy theories.
01:30:41.000 I think the reality is just like...
01:30:45.000 That's just it.
01:30:48.000 New shows for a variety of reasons can catch the algorithm and ride an elevator to the top.
01:30:53.000 There was a woman who, YouTube called it a glitch.
01:30:57.000 She made a van life video with her pet snake or something.
01:31:00.000 And she made two videos and got three million subs.
01:31:00.000 Okay.
01:31:03.000 What?
01:31:04.000 And everyone was like, what is happening?
01:31:07.000 a house as possible.
01:31:08.000 And I believe YouTube issued a statement saying basically they change the algorithm periodically to try and...
01:31:16.000 They want people to stay on the platform longer.
01:31:18.000 And it just so happened, the recent change, she hit every mark, fell right in, And so the algorithm was programmed and it put her on the front page for literally everyone everywhere.
01:31:32.000 Well, no, she basically spiraled out of control.
01:31:34.000 I'm pretty sure she's like...
01:31:36.000 She wasn't ready for that.
01:31:41.000 I cannot.
01:31:42.000 No one can.
01:31:43.000 You cannot walk down the street randomly in, say, New York, grab a random guy and say, we're going to put you on in a movie that's going to be seen by 500 million people around the world.
01:31:54.000 It doesn't work.
01:31:57.000 It's going to overwhelm them.
01:31:59.000 It's a culture shift.
01:32:00.000 It causes a psychological shock.
01:32:03.000 Like winning the lottery when they're not ready to have that much money.
01:32:06.000 And people don't know what to do with this money.
01:32:08.000 It's crazy how people don't realize how expensive things scale up to be and how much you can spend, how quickly you can spend it.
01:32:16.000 And so a lot of people in the lottery and then they're like, I can give someone 50 grand and then a month later they have zero.
01:32:20.000 Yeah.
01:32:20.000 Right.
01:32:21.000 The podcast version would be you, uh, you're Hailey Welch, Hawk Tuah, and then you end up doing a crypto scam and then everything, yeah.
01:32:21.000 How did that happen?
01:32:29.000 Okay, I was talking about her two days ago.
01:32:31.000 I think she was back to putting content out, but it was never the same.
01:32:35.000 She missed the mark on that.
01:32:36.000 She missed the boat on that.
01:32:37.000 She had terrible advice.
01:32:38.000 I mean, she had an opportunity with that platform to really, truly grow something substantial, and she just got caught up in it.
01:32:45.000 Yeah, because they had the Paul brothers backing on their podcast network.
01:32:48.000 See, that's unfortunate to me.
01:32:50.000 When that kind of success can happen to someone who's not prepared for it, it can spiral them out of control, or they can actually turn it into something pretty incredible.
01:32:56.000 I mean, it's not to say that that happens.
01:32:58.000 It happened to Jordan Peterson, but I mean, after he exploded, I mean, he handled it pretty damn well, but he had the psychological wherewithal to actually handle it and the team around him, whereas people like her, who just on the street become the people, it doesn't work.
01:33:14.000 Well, I mean, a lot of the people that end up being really successful have strong work ethics that allow them to fall into a pattern of working hard, knowing how to capitalize on it.
01:33:24.000 But if somebody just gets lucky and hits a hole in one and their video goes on the front of every YouTube channel, you would have had to have also had very good luck to also be the type of person who could have that kind of windfall fall in your lap and then capitalize on it.
01:33:36.000 But if you don't work to get there, it's a lot harder.
01:33:38.000 Success is preparation meets opportunity.
01:33:42.000 Of course.
01:33:43.000 Yeah.
01:33:44.000 I don't know that Hailey has anything of significant substance.
01:33:50.000 I was going to say of substance.
01:33:51.000 I mean, that doesn't always matter, especially if you're in the female podcast space and you're talking about female issues.
01:33:56.000 It's not necessarily about having something super interesting to say.
01:33:59.000 It's about commiserating around the things that affect your life because women like podcasts like that.
01:34:03.000 I want to explain something, too, about these conspiracies.
01:34:06.000 Easy with the blanket term.
01:34:07.000 I think those podcasts make me want to jump on things.
01:34:09.000 There is a functional, infinite number of podcasts on YouTube right now.
01:34:12.000 Right.
01:34:14.000 YouTube or the CIA doesn't need to recruit someone and then say, we want you to run our covert propaganda arm where you promote these ideas.
01:34:22.000 What they do is, they go into their database and say, here's a seemingly infinite number of podcasts.
01:34:28.000 We're going to use an algorithm, an AI.
01:34:30.000 We want somebody who is pro-Israel, pro-military, pro-intervention in Ukraine, calm, marketable, and then it's going to be like, here's seven channels that do this.
01:34:41.000 And they're going to say, Put this on the front page of YouTube for everyone right now.
01:34:44.000 And then three months later, some guy's got a million subs.
01:34:46.000 He's like, wow, people love my channel.
01:34:50.000 Yeah, I mean, YouTube hates me, so I can't.
01:34:52.000 Oh, you had Chris on.
01:34:54.000 Oh, good.
01:34:55.000 That was a while ago.
01:34:56.000 That was March.
01:34:56.000 No, no, no, no, no.
01:34:57.000 He's a friend of mine.
01:34:58.000 We're from the same town.
01:34:59.000 Yeah, he's a great guy.
01:34:59.000 Yeah, he's cool.
01:35:00.000 That was an episode from March.
01:35:01.000 Oh, I love that, though.
01:35:02.000 That was right before we went to Australia and got arrested.
01:35:05.000 Yeah, yeah, yeah.
01:35:06.000 Got arrested again for the honor.
01:35:08.000 My friends, we are going to go to your chats, so smash the like button, share the show with everyone you know, and my friends, head over to timcast.com, assuming you're not in Canada, and click join us.
01:35:18.000 to become a member and get in the Discord server.
01:35:22.000 This is an online community of tens of thousands of people. We do an uncensored call-in show where this chat appears on screen during the show, often to tens of thousands of people.
01:35:37.000 We've been averaging on the Rumble side of things for the Uncensored show maybe like 40k in the Uncensored portion in the beginning.
01:35:43.000 And it could be your chat, people see.
01:35:45.000 Or more importantly, you as a member can call in and talk to us and our guests on the show.
01:35:50.000 Not to mention there's video game hangouts.
01:35:54.000 There's a seven days to die.
01:35:56.000 Is that the name of the game?
01:35:57.000 Video game zombie servers.
01:35:59.000 There's a jam session where we even have a guest.
01:36:03.000 Ian got some guests to come in to play music and write songs with our members.
01:36:07.000 The point is to build community.
01:36:09.000 So if you're looking for step one to figure out how to get off your ass and get involved, TimCast.com's Discord, tens of thousands of people at any given moment, and they want to be friends with you.
01:36:18.000 Because we're trying to build networks of individuals who want to make things happen.
01:36:23.000 And also we have the Culture War live events, which is where we did it.
01:36:28.000 The first one we did was members only.
01:36:30.000 As a member of the Discord, you could get a free ticket to the show.
01:36:33.000 What we're going to do next is we're going to have this plus public access tickets as well.
01:36:37.000 But if you're a member, it means you're going to be privy to a lot of this stuff.
01:36:40.000 So please consider becoming a member at TimCast.com to support the work we do and get in that Discord server.
01:36:44.000 For now, let's grab your Rumble Rants and Super Chats and see what y 'all have to say.
01:36:49.000 Now I know why you guys told me not to give me a second shot of the photo because oh my god.
01:36:53.000 Of what?
01:36:54.000 The photo.
01:36:56.000 Oh, you think the pictures of people?
01:36:58.000 Oh, it's bad.
01:36:58.000 You think you look bad.
01:36:59.000 Oh, I don't care.
01:37:00.000 I know I was told it doesn't matter.
01:37:01.000 But the point is I just noticed it.
01:37:02.000 Thank you for that.
01:37:03.000 All right, we got Change Wilder, always the first to Super Chat, to Rumble Rant.
01:37:07.000 I keep saying patience is a virtue.
01:37:09.000 Lash being arrested shows the Trump admin is working on dismantling the deep state, but getting hard evidence takes time.
01:37:14.000 You don't want them off on a technicality.
01:37:16.000 Indeed, the arrest of that Lash guy for trying to sell secrets took, I think, two months?
01:37:22.000 Longer than that.
01:37:23.000 It was like March when they found this guy.
01:37:25.000 And if I wasn't familiar...
01:37:36.000 And the FBI intercepted him where he explained that he had disdain for the Trump administration and was willing to sell these secrets for citizenship elsewhere.
01:37:45.000 Evil people exist.
01:37:46.000 Shimo says, stay at her mom's, consider this a lifetime disability, and men get a job.
01:37:54.000 Taxes pay for it only if you're married.
01:37:57.000 What do you think, Phil?
01:37:57.000 What?
01:38:00.000 What?
01:38:00.000 I think that I need you to send another Rumble rant that clarifies what you're asking.
01:38:06.000 She must have separation anxiety can be considered an issue that people can use in order to stay at home with their babies.
01:38:12.000 Stay at her mom's.
01:38:14.000 Consider this a lifetime disability.
01:38:17.000 Is stay at her mom's a person?
01:38:22.000 I don't know.
01:38:25.000 Okay.
01:38:25.000 I don't think that separation anxiety should be considered a lifetime disability.
01:38:31.000 Yeah, because it's definitely not.
01:38:32.000 That's a good place to start, I guess.
01:38:33.000 These are really good super chats.
01:38:35.000 NNY says, Tim, Brett is your employee and you have dominion over him.
01:38:39.000 Command him to not only watch Star Trek, but enjoy it as well.
01:38:48.000 The look on his face is great.
01:38:50.000 Never, never, not once.
01:38:54.000 Not a day in my life.
01:38:56.000 You can't make me.
01:38:58.000 I do want to say just real quick, guys.
01:38:59.000 Can I pay you?
01:39:01.000 Totally as an aside.
01:39:02.000 I'd rather be unemployed.
01:39:04.000 As an aside, guys, we have a new announcement for HR policy.
01:39:08.000 Watching Star Trek The Next Generation is going to be a job requirement for all staff and contractors moving forward.
01:39:14.000 I mean, it's a good show.
01:39:19.000 Easy ask.
01:39:20.000 Easy ask.
01:39:20.000 Patriot Paladin says, Dodger Stadium was flooded after that drag queen performance mocked God and Christ at Dodgers game.
01:39:26.000 It did!
01:39:27.000 It happened!
01:39:27.000 I remember that.
01:39:28.000 Here we go again!
01:39:29.000 That's just one of those moments.
01:39:31.000 Crazy.
01:39:33.000 Black Pringle says, so Tim, you must have missed the story of Sodom and Gomorrah.
01:39:36.000 Ooh, no.
01:39:37.000 Yeah, but didn't the Dodgers win the World Series like the year after that, though?
01:39:42.000 What did Seamus say?
01:39:43.000 If God doesn't smite the United States, he owes Sodom and Gomorrah an apology?
01:39:46.000 Yes.
01:39:50.000 I don't know if he feels good about saying that.
01:39:53.000 I was like, dude, Seamus, geez.
01:39:56.000 I think he was saying it's someone else's quote, I'm not sure.
01:40:00.000 Yeah, I mean, look, there's some validity to that.
01:40:05.000 Granted, I think Sodom and Gomorrah, everybody was kind of rapey.
01:40:11.000 They wanted those angels, bro.
01:40:12.000 So this one's bait, obviously.
01:40:14.000 Rapey.
01:40:15.000 What is this?
01:40:17.000 Philocoraptor.
01:40:19.000 Philocoraptor.
01:40:20.000 I understand what he was trying to do, but it doesn't say that.
01:40:23.000 It says Philocoraptor.
01:40:25.000 He says, Tim doesn't understand middle-class struggles because he has never been middle-class.
01:40:30.000 The excuse of leftists as to...
01:40:34.000 people need to understand this about leftists.
01:40:36.000 When I went to Occupy Wall Street...
01:40:45.000 These Occupy activists would go, see, man?
01:40:47.000 Like, you're the perfect example of what's wrong with this country.
01:40:50.000 Like, a young guy, you're smart, you're dedicated and passionate.
01:40:53.000 Where's your success?
01:40:54.000 Where's your American dream, man?
01:40:55.000 You know?
01:40:57.000 And then, How old were you when they made that comment about where is your success?
01:41:13.000 Cracker.
01:41:14.000 25. And so at Occupy Wall Street in October, walking around, sleeping in the park, and this is exactly what they were saying.
01:41:24.000 They were like, see, people like you, man, drop out of high school.
01:41:27.000 There's no American dream.
01:41:29.000 The wealthy are funneling all the cash away while paying the lobbyists.
01:41:33.000 You're the perfect example of everything until I became successful.
01:41:37.000 And then I was silver spoon, privileged.
01:41:39.000 And this is what they do.
01:41:41.000 Like, these people live in this world where they can't succeed because— My family lost their house in a bankruptcy, and I was homeless several periods, several points in my life.
01:41:56.000 They assume almost everybody who has any money inherited it from their parents.
01:42:00.000 Right.
01:42:00.000 And they're not willing to hear the story before that either.
01:42:02.000 No, you're successful now.
01:42:03.000 You're the guy now, so it's obvious.
01:42:05.000 And then you get people who go, yeah, well, you're the exception.
01:42:07.000 You're lucky.
01:42:08.000 And I'm like, so weird that everybody who succeeded did similar things.
01:42:13.000 This is why there are people who pay.
01:42:16.000 For seminars from PBD.
01:42:18.000 Yeah.
01:42:18.000 Because they don't believe.
01:42:21.000 They don't believe those lies of you can't succeed.
01:42:24.000 They see a male success and they say, I need to go to those events and hear what he has to say.
01:42:29.000 Now, not every ticket for Patrick Bet-David's show is 20 grand.
01:42:32.000 Those are like VIP elite packages where you can get in a private room with him and like Tom Brady or something.
01:42:36.000 But people want to go and learn.
01:42:39.000 And while certainly some people might have criticisms of PBD and those seminars that he does.
01:42:44.000 I don't care about that.
01:42:45.000 My point is individuals who are willing to spend and invest and try and figure it out at the very least.
01:42:51.000 How to be better people and be successful are on the right path.
01:42:54.000 Yeah, I have infinitely more respect for a person who's going out and seeking a way to better their life than somebody who just assumes that they've been destined to whatever hell they find themselves in now and say that everything is rigged against them.
01:43:06.000 The idea of like radical personal responsibility where you take extreme ownership of your actions is something that I think actually separates a lot of the people on the left and the right.
01:43:16.000 Do you have to pay Jocko every time you say that?
01:43:18.000 No.
01:43:19.000 I just wanted to make sure.
01:43:20.000 Jocko Mölk is the best protein powder ever made.
01:43:25.000 We've got like 12 bags of Jocko protein powder downstairs.
01:43:28.000 Have you tried it?
01:43:30.000 I prefer something a lot cleaner like Noble, which is like made in Texas and like nose to tail and doesn't have any fillers.
01:43:37.000 Like Jocko?
01:43:39.000 No comment.
01:43:40.000 You're not a fan of what?
01:43:44.000 There's no Splenda in it.
01:43:45.000 There's no fake sugars.
01:43:46.000 I've tried it.
01:43:47.000 I'm not a big fan of it.
01:43:48.000 It doesn't work with my stomach.
01:43:49.000 That's why I use it.
01:43:50.000 I can't stand it.
01:43:52.000 I swear by it.
01:43:54.000 Never met the guy.
01:43:54.000 I don't know the guy.
01:43:55.000 He's never sponsored us or anything.
01:43:56.000 I was looking for non-Splenda proteins.
01:44:00.000 And they all have Splenda.
01:44:02.000 It's disgusting.
01:44:03.000 And then I looked it up, and Jocko's came up, and I looked at the ingredients, and I said, okay, I'll try this, and it's amazing.
01:44:09.000 I mean, it tastes good.
01:44:10.000 I just thought it doesn't work with me.
01:44:11.000 I had to find one that, like, worked with me.
01:44:13.000 Like, I have a TBI, so everything that goes in my gut needs to at least be somewhat dialed in.
01:44:17.000 I do have to take a step back.
01:44:18.000 I know that we were doing the chats, but the chat reminded me we did not talk about Sasquatch.
01:44:23.000 And that's an important video.
01:44:24.000 Here we go, ladies and gentlemen.
01:44:25.000 This is it.
01:44:26.000 We have a video definitively proving that anybody will believe fake videos on the internet.
01:44:33.000 Okay, this is a video that purportedly shows Sasquatch.
01:44:36.000 Is there sound on this?
01:44:37.000 There's no sound.
01:44:40.000 Look at that.
01:44:40.000 You see him?
01:44:41.000 That's really funny.
01:44:44.000 Listen, it's a human.
01:44:46.000 Well, that's someone who's being skeptical, right?
01:44:47.000 There he is.
01:44:49.000 Sasquatch.
01:44:50.000 You guys see that?
01:44:51.000 Oh, yeah.
01:44:52.000 Look at him.
01:44:52.000 I love the...
01:44:54.000 That's really funny.
01:44:58.000 Oh!
01:44:59.000 Oh, there we go.
01:45:03.000 Why won't it play?
01:45:05.000 Where is he?
01:45:06.000 There he is!
01:45:07.000 Sasquatch!
01:45:11.000 Maybe bro's just hairy.
01:45:13.000 Bro's just hairy.
01:45:16.000 Maybe he's just wearing a fur jacket.
01:45:19.000 I love the deeply profound comment that says, man in a costume, you can tell because of the way it is.
01:45:28.000 I mean, I'm not even convinced it's a guy in a costume.
01:45:31.000 It might just be a guy wearing furs.
01:45:33.000 Yeah.
01:45:33.000 And then someone filmed it from far away and they're like, Sasquatch!
01:45:36.000 It's in the woods.
01:45:37.000 It's gotta be him.
01:45:38.000 But, you know what?
01:45:40.000 Because we don't know, I'm going to just choose to believe it's Sasquatch.
01:45:43.000 Yep.
01:45:44.000 And that proves it.
01:45:45.000 He's been found.
01:45:45.000 No, it's what Mitch Hedberg said.
01:45:47.000 I just think Bigfoot is blurry.
01:45:49.000 Yeah.
01:45:50.000 What did he say?
01:45:51.000 A blurry monster?
01:45:52.000 He's a large, out-of-focus monster roaming the countryside.
01:45:56.000 I'm consistently out-of-focus.
01:45:58.000 Do you guys know that West Virginia has more cryptids than any other state in the country?
01:46:03.000 You just had Tony Merkel on for Culture War last week, right?
01:46:07.000 He's from the Confessionals.
01:46:09.000 I'm not even sure what a cryptid is.
01:46:11.000 Cryptid?
01:46:11.000 Yeah.
01:46:12.000 Bigfoot.
01:46:12.000 Oh, okay.
01:46:13.000 Chupacabra.
01:46:15.000 Snallygaster.
01:46:16.000 Okay.
01:46:16.000 What else do we have?
01:46:17.000 Snallygaster.
01:46:17.000 That one I just heard about recently.
01:46:19.000 Yeah, don't you know?
01:46:20.000 Yeah.
01:46:21.000 Do you guys know about Spring-Heeled Jack?
01:46:23.000 What?
01:46:24.000 Bro.
01:46:25.000 Well, it was like some dude who could jump off buildings and like jump 20 feet in the air or whatever.
01:46:30.000 And he was like, was it New Jersey or something?
01:46:33.000 In the UK?
01:46:34.000 Spring-Heeled Jack.
01:46:35.000 See, they're rebooting the X-Files, and if they want to do something, that's the type of stuff they should be covering.
01:46:39.000 I'm pretty sure they did.
01:46:41.000 Check this out.
01:46:42.000 Spring-Heeled Jack.
01:46:43.000 Okay.
01:46:44.000 It's in English folklore, 1837.
01:46:47.000 There were sightings reported all over the UK.
01:46:49.000 Basically, this dude...
01:46:50.000 This is that man.
01:46:54.000 He was described by people who have seen him as having terrifying and frightening appearance with diabolical physiognomy, clawed hands and eyes that resembled red balls of fire.
01:47:03.000 Bad man.
01:47:04.000 And, like, I guess, what was it?
01:47:06.000 He could jump super high or something?
01:47:07.000 Yeah, he'd make extraordinary leaps, to the point that he became the topic of several works of fiction.
01:47:11.000 That's Superman, though.
01:47:12.000 That's parkour.
01:47:13.000 He looks like that.
01:47:14.000 You know what's funny is it really could have just been a guy doing parkour.
01:47:16.000 That's what I mean.
01:47:17.000 It's like early parkour before it was a thing.
01:47:20.000 But looks like Batman.
01:47:21.000 We're in West Virginia and this is Cryptid HQ.
01:47:25.000 Let me pull up... Wow, Google's got it right here.
01:47:29.000 Look at this.
01:47:31.000 Mothman.
01:47:31.000 What?
01:47:32.000 Yeah, do you know about Mothman?
01:47:33.000 No, it's creepy, though.
01:47:34.000 Bro, have you ever seen the Mothman Chronicles?
01:47:35.000 That was the movie?
01:47:36.000 It's a movie, yeah.
01:47:37.000 And wasn't it based on a true story or something?
01:47:38.000 Yes, sir.
01:47:39.000 Okay, give me more.
01:47:40.000 More, please.
01:47:41.000 Tell me more.
01:47:41.000 So, who was it?
01:47:42.000 Gene Hackman.
01:47:42.000 No, not Gene Hackman.
01:47:44.000 What's his face?
01:47:46.000 What's the actor's name?
01:47:47.000 I can't remember.
01:47:48.000 Someone look up his name.
01:47:49.000 He's in his hotel room and he's investigating...
01:47:54.000 Yes.
01:47:54.000 There you go.
01:47:55.000 He's in his hotel room and he gets a phone call.
01:47:57.000 Okay.
01:47:58.000 And then it tells him to open the Bible and choose any page, any passage.
01:48:02.000 And then it starts reading to him the passage that he chose and he freaks out.
01:48:07.000 And then basically it was warning him.
01:48:09.000 Who was it?
01:48:10.000 The actor?
01:48:11.000 Did you look it up?
01:48:11.000 It's Richard Gere.
01:48:12.000 Richard Gere!
01:48:13.000 Okay.
01:48:13.000 I knew there was like a G-E something.
01:48:17.000 He's trying to warn people a bridge is going to collapse because he's been warned by this entity, the Mothman.
01:48:21.000 They think the Mothman is just an owl.
01:48:24.000 And what was happening is that it's pitch black outside and there's an owl sitting on a post right next to the road.
01:48:34.000 And when people would drive past it slowly, they'd look to their right and their brains would be confused.
01:48:40.000 Their brain would process the silhouette as being far away.
01:48:43.000 Okay.
01:48:44.000 And very large instead of up close and very small.
01:48:47.000 Right.
01:48:47.000 So they were looking at an owl with glowing eyes, not a massive Mothman.
01:48:51.000 Ah, this makes more sense.
01:48:52.000 Ah, okay.
01:48:53.000 That's terrifying.
01:48:55.000 Yep.
01:48:56.000 Or maybe there's a Mothman.
01:48:58.000 Maybe.
01:48:59.000 A Grafton monster.
01:49:01.000 Dude, Sheep Squatch.
01:49:03.000 Sheep Squatch.
01:49:04.000 Yeah, no joke.
01:49:05.000 Sheep Squatch, dude.
01:49:06.000 Is this around here?
01:49:07.000 So they, yes.
01:49:09.000 So in Fallout 76, the video game, all of these monsters are in it because it takes place in West Virginia.
01:49:15.000 So all of the monsters live in West Virginia.
01:49:18.000 So West Virginia has more cryptid sightings than any other place in the country.
01:49:22.000 Why do you think that is?
01:49:24.000 Meth.
01:49:26.000 I love how you had that ready.
01:49:29.000 Sheep Squatch, known as the white thing.
01:49:31.000 Look at this.
01:49:32.000 That sounds racist.
01:49:33.000 You can't even see that stupid picture.
01:49:35.000 In '94, a former Navy seaman said he had witnessed the beast breaking through the forest.
01:49:39.000 The white thing breached the brush line and knelt to drink from the creek.
01:49:42.000 Here it drank for a few minutes before crossing the creek and continuing towards the nearby road.
01:49:46.000 The witness stated that he observed the animal for a while before it moved under the surrounding brush.
01:49:51.000 Many sightings of Sheepsquatch.
01:49:53.000 Okay.
01:49:54.000 You know, everyone's always talking about Sasquatch, but, you know, ain't nobody talking about Sheepsquatch.
01:49:58.000 Right?
01:49:59.000 You know, I think, you know.
01:50:01.000 Snallygaster.
01:50:01.000 Look at the Veggie Man.
01:50:02.000 Oh my god.
01:50:03.000 They actually just brought up the Snallygaster the other day, and I was like, what is this?
01:50:07.000 I've never heard of it.
01:50:08.000 What is the Vegetable Man of West Virginia?
01:50:11.000 Look at this.
01:50:11.000 He's like, a carrot?
01:50:13.000 Looks like Mr. Burns from that episode where he was an alien.
01:50:17.000 The Vegetable Man of West Virginia is a little-known hoax.
01:50:19.000 Ah, okay.
01:50:20.000 So that's fakes.
01:50:22.000 Fake.
01:50:22.000 In the 50s, a hoaxer made...
01:50:26.000 This is probably what The Simpsons based it off of, to be honest.
01:50:28.000 A UFO in the woods with a vegetable man.
01:50:31.000 I was just told to stay out of the mountains.
01:50:33.000 I think Batboy's real, right?
01:50:35.000 Batboy?
01:50:36.000 Yeah, as another cryptid.
01:50:38.000 Not real, I'm saying it was like an actual cryptid.
01:50:40.000 People thought it was real and it was like...
01:50:45.000 Look at this.
01:50:46.000 But hold on, because you said meth.
01:50:48.000 Well, when you've been up for four days at a time...
01:50:54.000 Because these are looking like 1800s.
01:50:56.000 When was meth a thing?
01:50:57.000 Did you get what I'm saying?
01:50:58.000 I mean, they used to put cocaine in Coca-Cola.
01:51:00.000 Yeah, no, fair enough.
01:51:02.000 But I'm saying, like, 1800s, these all came out around the same time.
01:51:05.000 Yeah, but they were also trying to keep kids out of the forest, you know?
01:51:07.000 Oh, yeah, fair enough.
01:51:07.000 That's all, you know, the other story, yeah.
01:51:09.000 The Flatwoods Monster is, uh, what is it?
01:51:14.000 On September 12th, 1952, after a bright light crossed the sky, investigators now suggest the light was a meteor and the creature was a barn owl.
01:51:20.000 And this is what people described it as.
01:50:22.000 Two brothers, Edward and Fred May... descriptions varied, blah, blah, blah, but there you go.
01:51:51.000 I don't know.
01:51:52.000 The Flatwoods Monster.
01:51:53.000 You guys have a lot of monsters out here.
01:51:55.000 But it's so dark out here in the woods at night.
01:51:58.000 I feel like you can just see some weird stuff.
01:52:00.000 Oh, they were driving me out here.
01:52:02.000 I said, where are we going?
01:52:02.000 Oh, the Snarly Yow.
01:52:03.000 Where am I going?
01:52:05.000 The Snarly Yow.
01:52:06.000 Bro, we got too many.
01:52:07.000 Look at that.
01:52:08.000 Oh my god.
01:52:09.000 Bro, it's like somebody saw a coyote with mange and was like, it's a monster!
01:52:15.000 Does any other state have this many creatures?
01:52:18.000 No.
01:52:18.000 It's just here, hey?
01:52:20.000 Yes.
01:52:21.000 Okay.
01:52:21.000 You should go outside.
01:52:23.000 No, I'm solid.
01:52:24.000 I'll stay inside.
01:52:25.000 Let me tell you what I think it is.
01:52:28.000 West Virginia has historically been sparsely populated because of the mountains.
01:52:32.000 It's hard to move wagons through West Virginia.
01:52:35.000 So even though it's on the East Coast, it's one of the least populated states.
01:52:38.000 What ends up happening is you have a dude who lives in the middle of the woods several miles away from the next person.
01:52:43.000 He goes outside.
01:52:44.000 It's dark.
01:52:45.000 And he sees, I don't know, a bear.
01:52:47.000 And he can't tell what it is, and it makes a weird noise, and maybe it's a bear with mange or something.
01:52:51.000 Okay.
01:52:52.000 He then goes to his neighbor's house like a week later and is like, I tell you, this thing must have been a gigantic monster, eyes glowing, no hair on its body.
01:52:59.000 I couldn't tell what it was.
01:53:01.000 And the guy goes, wow.
01:53:02.000 Then he goes to the bar and says, there was some kind of six, seven foot tall monster, massive, with no hair.
01:53:07.000 And then a legend starts.
01:53:10.000 And that's how it all is.
01:53:10.000 When you live in New York, however, if a deer runs through the city, 700 people go, we just saw a deer.
01:53:16.000 Right.
01:53:16.000 Instantly everybody's like, oh, everybody said it was a deer.
01:53:19.000 So when you hear stories about like, actually here's a good example.
01:53:23.000 You know the story of the Cyclops, where it comes from?
01:53:25.000 No.
01:53:25.000 They found elephant skulls.
01:53:27.000 Oh, no way.
01:53:28.000 You ever see an elephant skull?
01:53:29.000 Yes.
01:53:29.000 That would make sense.
01:53:31.000 Yeah.
01:53:31.000 And so they started telling stories of a gigantic monster with one eye because it, you can't really tell because this image is too small, but it works anyway.
01:53:41.000 Somebody sees this skull with a big hole and they think that's the eye socket and there's the nose when in fact that's actually the nose.
01:53:49.000 And then they tell everybody we found a giant skull.
01:53:51.000 And then a bunch of guys are like, oh, we saw it.
01:53:54.000 Things like that.
01:53:54.000 So I imagine if you go way back in the day, like 2,000 years ago, some dude is like walking through the forest when he encounters like a wolf with mange and he fights it off.
01:54:05.000 Then he goes into town and he starts describing exaggerating.
01:54:10.000 Right.
01:54:11.000 This great exaggeration.
01:54:12.000 So West Virginia, with very few people, sheep squatch.
01:54:16.000 Sheep squatch comes out.
01:54:17.000 Right.
01:54:18.000 It's a ram.
01:54:18.000 Right.
01:54:19.000 A very large ram a guy sees breaking its way through the trees, and the guy's tired and groggy, and he sees it, and he's like, whoa, and then tells everyone, it was a sheep, but it was huge, like some kind of monster, and then you get Sheepsquatch.
01:54:32.000 I mean, it's pretty amazing.
01:54:33.000 You guys have a lot here.
01:54:34.000 A lot to be proud of.
01:54:36.000 Indeed.
01:54:36.000 Indeed.
01:54:37.000 Yep, West Virginia's famous for it.
01:54:39.000 All right.
01:54:40.000 All right.
01:54:40.000 Let's get back to it and grab some more of those super chats.
01:54:45.000 What have we here?
01:54:46.000 Smacky the Frog says, I am sure Derek Chauvin appreciated God's show of solidarity while he was sitting in a cell.
01:54:51.000 Why smite Biden after he ran the country into a ditch?
01:54:54.000 The prime time for an act of God was before he victimized his family.
01:54:58.000 You could also just say, like, why punish them after they die if they're doing bad things?
01:55:03.000 Because I don't think that's the system.
01:55:06.000 I don't know.
01:55:07.000 You know?
01:55:08.000 Like, the punishment is for what?
01:55:10.000 After you do?
01:55:12.000 Alright.
01:55:13.000 What is this?
01:55:14.000 We got a big one.
01:55:15.000 Wow, this is a big one.
01:55:15.000 Big super chat.
01:55:17.000 Working single-handed says, My son should be at his high school graduation tonight.
01:55:22.000 Instead, I'm at his ICU bedside after a horrific multi-fatality crash last Saturday.
01:55:27.000 By God's mercy alone, he survived.
01:55:29.000 It's a miracle.
01:55:30.000 We've seen his hand all week.
01:55:32.000 All glory to him.
01:55:33.000 Please pray and share.
01:55:34.000 Give, send, go.
01:55:35.000 What is it?
01:55:38.000 Tristan A. Recovery?
01:55:39.000 Is that what it says?
01:55:40.000 Yeah.
01:55:41.000 If you're...
01:55:42.000 Well, I...
01:55:45.000 He actually tweeted the same thing at me.
01:55:47.000 Oh, okay.
01:55:48.000 So if you're the praying type, pray for Jonathan Ruzich, kid.
01:55:52.000 Yeah, I think when I was younger, I didn't...
01:55:59.000 And then I've experienced things and witnessed things where I'm like, "Yeah, there's a God." And I think two things.
01:56:11.000 And that's the easiest way to explain it, I guess.
01:56:17.000 And I wonder if the issue on top of that is they haven't researched enough into it.
01:56:21.000 Well, again, I also think, too, it comes down to when people are told something for so long and then they are – They're unwilling to go and actually go through the experience because there's a chance it will completely change their entire fabric of reality.
01:56:40.000 And that's what happens.
01:56:40.000 Well, that's that or psychedelics.
01:56:42.000 I mean, it just depends.
01:56:43.000 Yeah.
01:56:44.000 You know, I don't feel that Bill Maher was listening to me when I was trying to explain the observable reality creating a probabilistic outcome of God.
01:56:54.000 Was it the weed?
01:56:55.000 Well, he was stoned out of his mind, for sure.
01:56:57.000 Could it have been the weed where he was not just checked?
01:57:00.000 I do think he was largely listening to what I was saying in the conversation.
01:57:03.000 Right.
01:57:03.000 But this was right towards the end.
01:57:05.000 And instead of asking anything about what I explained, he just said, I love how cute it is that people come up with these ways to believe what they want or whatever.
01:57:13.000 Cute.
01:57:13.000 Thanks.
01:57:14.000 Right, because I was literally talking about the entropy and negative entropy and the structure of reality and why certain particles fuse together and create...
01:57:32.000 Why is there a weaker opposite to entropy?
01:57:38.000 And he just didn't really engage in the conversation.
01:57:41.000 Can I ask you something?
01:57:42.000 Yes.
01:57:43.000 Do you think that... Just do.
01:57:47.000 Bad joke.
01:57:48.000 I'm sorry.
01:57:48.000 Do you think that there's these moments where you're having these conversations with individuals like, for example, Bill Maher, where regardless of the cannabis or whatever else he's drinking or whatever else is going on, where they're just not intellectually strong enough to...
01:58:06.000 Of course.
01:58:06.000 Okay, I just wanted to make sure.
01:58:07.000 I think it's, I read about this a long time ago.
01:58:11.000 There's a phenomenon in psychology where, you know what, don't worry about what I read.
01:58:15.000 Brandon Straka explained how he felt physical pain when he actually had his worldview disproven to him.
01:58:22.000 Of course.
01:58:22.000 He had firmly believed that Trump was bad, he had seen the videos, and then when someone actually showed him proof that he was wrong, he felt a physical pain in his whole body.
01:58:31.000 And I read about this a long time ago, and the reason why humans have – so let's put it this way.
01:58:37.000 There's a study of social engineering.
01:58:39.000 It is a hacker discipline about manipulating people to make them do the things you want them to do.
01:58:43.000 It is principally how hackers gain control of systems.
01:58:46.000 They don't code computers.
01:58:47.000 They trick people.
01:58:48.000 It's easier.
01:58:49.000 And so in this study, people have compiled information from psychology, such as when you want to approach someone
01:58:58.000 who you know to be adversarial or in opposition to you, you must always approach them from rapport, meaning lie and agree with them.
01:59:07.000 This breaks the first barrier, where they'd otherwise say, we are not of the same tribe, I cannot trust you.
01:59:12.000 Otherwise, if you approach them as an enemy, they'll disregard whatever you say.
01:59:15.000 It also talks about how, I should say, the readings that I did talked about how the evolutionary purpose for a visceral physical reaction
01:59:24.000 to information that disproves what you think you know is because humans evolved based on the things they trusted to be true.
01:59:32.000 And the humans that survived to their late 20s must have figured something out that was correct.
01:59:38.000 Don't eat that mushroom.
01:59:39.000 It will kill you.
01:59:40.000 Don't eat that plant.
01:59:40.000 Right.
01:59:41.000 It's poisonous.
01:59:42.000 Or here's a better example.
01:59:43.000 There's a tribe of people and they find a patch of various red fruits and one person tries it and dies and they all panic and run.
01:59:50.000 They discover that every time someone tries to eat one of these different red fruits, they die.
01:59:56.000 So they develop a practice of never eat the red fruit until one day someone finds a tomato.
02:00:01.000 And they all say, you're crazy, you're wrong, and they try smacking it out of his hand, they get angry.
02:00:05.000 But tomatoes aren't poisonous.
02:00:07.000 It was the other fruits.
02:00:08.000 But because they evolved to survive based off that presumption, the reason why they get angry is to protect the human from
02:00:19.000 doing things that could cause them to die.
02:00:20.000 And so they say that your mind closes up in your mid-20s and there's a period in your late 40s where it starts to reopen.
02:00:27.000 This triggers in men midlife crises.
02:00:29.000 The reason being, we also evolved to survive by reassessing what we think to be true at a certain point in our lives.
02:00:36.000 But this means that human beings throughout history who were willing to believe anything were less likely to survive.
02:00:42.000 Right.
02:00:43.000 And human beings that learned and then at adulthood solidify those beliefs and rejected anything that would challenge them, were more likely to survive because what they learned led them to survive.
02:00:54.000 Thus, people have psychotic breaks and mental breakdowns, anger and rage,
02:01:00.000 if you present them information that can counter their worldview.
02:01:04.000 Well, I've watched his show recently, just more and more now, and it seems like he's doing it more consistently,
02:01:09.000 depending on who he has on.
02:01:10.000 I mean, he had Charlie Kirk on, and I watched that interview, and I watched that back and forth, and it seemed like there was just this very dismissive, divisive conversation where he was unwilling to not even just...
02:01:23.000 Like, how dare you even say those things in my space?
02:01:26.000 Like, how dare you even have that or attempt to have that conversation?
02:01:29.000 Because it does feel like with Bill, if there's any point where you maybe, I don't want to say have one up on him, or you're able to make a pretty solid point with evidence, he's very dismissive.
02:01:42.000 And I noticed that with you.
02:01:44.000 And that's why I wondered how you feel about that when that happens to you.
02:01:47.000 I get it.
02:01:48.000 I don't care, though.
02:01:49.000 That's an excuse.
02:01:50.000 I mean, you're still a human being hosting a show.
02:01:52.000 Your job is to host a show and have a give and take.
02:01:54.000 And if you're going to be dismissive and a blank to people, then that kind of just makes me not want to pay attention to you at all because you're just not giving anybody the time of day to be, I don't know, a decent human being.
02:02:04.000 That's my perspective.
02:02:05.000 Now, that doesn't have to be anybody else's, but that's what I notice.
02:02:07.000 And when people are dismissive like that, I have a hard time wanting to listen to them or even respect anything they have to say because they're unwilling to think outside the box or even just imagine for a moment outside of their delusion that other people could be feeling or thinking something different.
02:02:21.000 Also, you know what he said to me... And I'm like, but that's just plumb not true.
02:02:30.000 That's my point.
02:02:31.000 That's presumptive.
02:02:32.000 The idea that we can't know is presumptive.
02:02:35.000 You are making a declarative fact-based statement, whereas I could simply say, now hold on.
02:02:40.000 We might.
02:02:41.000 What might science discover?
02:02:41.000 I don't know.
02:02:43.000 Maybe, you know, when we discovered the electromagnetic spectrum, nobody thought...
02:02:51.000 What if eventually someone's like, we've discovered the God spectrum, and then they're like, there's actually a way to tap into and communicate directly with God, and then one day someone's like, we built this weird machine, watch this, you press it, and then God appears, and he's like, hey, what's up?
02:03:04.000 He doesn't know that's not going to happen.
02:03:08.000 But the certainty in people and individuals like that, it was like somebody, Joe just had this guy on from, he was an Egyptologist, I believe.
02:03:15.000 I don't, I can't for the life of me, I can't think of his name.
02:03:17.000 But yeah, he had him on and just to listen, but Joe's like, you don't believe in aliens?
02:03:20.000 And he's like, oh, Joe, I'm a scientist.
02:03:22.000 Just the level of ignorance to think, or even with somebody like Neil deGrasse Tyson, where he's having these conversations, it's like just the level of ignorance for such an intelligent individual to not even be willing to open the mind enough to go,
02:03:35.000 you know what? That could happen.
02:03:38.000 That's the thing that gets me about these people.
02:03:40.000 It's like if you want to come on shows and have conversations and you want to be quite hard lined, I get it.
02:03:46.000 But if you're a host of a show like Bill Maher and then you have an individual like yourself who's very clearly very intellectual in comparison to him when he's too stoned to handle his life, it's just very, to me...
02:04:00.000 I don't want to listen to you now.
02:04:01.000 You're just kind of irritating.
02:04:02.000 We are going to wrap it up there, my friends.
02:04:04.000 Friday night and we really do appreciate you guys hanging out on this amazing Friday night show.
02:04:09.000 You can follow me on X and Instagram at TimCast.
02:04:11.000 Smash the like button and share the show with everyone you know.
02:04:13.000 And I want to shout out the TimCast Discord server again.
02:04:16.000 There are a lot of people who are working really hard every day to build an awesome club and community online where you guys can learn, grow.
02:04:24.000 There's a fitness chat room.
02:04:26.000 You are going to get free fitness advice from enthusiasts who want you to have a better life all there in this club.
02:04:30.000 And we are trying to bring this to the physical world.
02:04:33.000 So we are actually working.
02:04:34.000 I got some updates too coming up soon.
02:04:35.000 We may have a couple of...
02:04:41.000 All this stuff's been in the works.