Timcast IRL - Tim Pool - August 11, 2022


Timcast IRL - Armed Man Who Attacked FBI SHOT DEAD, Posts Admission On Truth Social w/ Bethany Mandel


Episode Stats

Length

2 hours and 4 minutes

Words per Minute

201.3

Word Count

24,992

Sentence Count

2,018

Misogynist Sentences

29

Hate Speech Sentences

23


Summary

On today's show, we discuss a man who was shot and killed in a standoff with police in Cincinnati, Ohio. We also discuss the latest in the Trump/Russia scandal, and why the FBI should have been more transparent about their decision to raid Trump's home. Finally, we talk about the Keto Elevate MCT oil powder and how it's helping me lose weight.


Transcript

00:00:28.000 earlier today a man went to an FBI office in Cincinnati with a rifle and a
00:00:53.000 nail gun and he tried to, I guess he tried to breach the building.
00:00:58.000 He opened fire.
00:00:59.000 They're saying in the direction of FBI agents, so it's hard to know exactly how it all went down, but we know that after an alarm was set off, he fled.
00:01:09.000 Ended up in a cornfield where he was opening fire on police. The breaking news now is this man is dead. He was shot and killed after about a five-hour standoff now.
00:01:20.000 It's being reported. This man's name is being reported.
00:01:21.000 I'm not here to shout his name out, but we'll get into the story, because apparently he was on Truth Social and he made a bunch of posts talking about why he did what he did. And if this is the correct person who's posting on Truth Social, it was about the FBI raid on Donald Trump.
00:01:37.000 So we got to talk about that.
00:01:39.000 And then of course Merrick Garland has come out and stated that he signed off on the raid of Donald Trump's home.
00:01:47.000 My opinion?
00:01:48.000 Revenge.
00:01:49.000 Of course.
00:01:50.000 He's pissed off.
00:01:51.000 Conflict of interest, and this is one of the problems you have with weaponized and politicized law enforcement.
00:02:00.000 So we're gonna talk about that.
00:02:01.000 There's a lot of other stories to go through.
00:02:03.000 I'm proud to state, as much as I have my qualms with NewsGuard, they have rated MSNBC as fake news.
00:02:10.000 Finally.
00:02:11.000 Share that one with your family when they're obsessed with it.
00:02:14.000 They tell you, you're wrong.
00:02:14.000 MSNBC is true.
00:02:15.000 Rachel Maddow, blah, blah, blah.
00:02:16.000 Yeah, well, NewsGuard says, fake news!
00:02:19.000 Before we get started, my friends, head over to eatrightandfeelwell.com and pick up your Keto Elevate C8 MCT oil powder.
00:02:27.000 That's medium chain triglyceride powder.
00:02:30.000 And, uh, you know what?
00:02:31.000 I got the spiel, and I'll read some of it, but I just gotta tell you, guys.
00:02:34.000 Have you noticed that since the last year, I've lost like 30-some-odd pounds?
00:02:38.000 I've been doing Keto.
00:02:39.000 I have cut out all the sugar.
00:02:41.000 I have a little bit of sugar, but it went from keto to just slightly low carb.
00:02:44.000 Way more fat, way more protein, way more vegetables, way less sugar and grains.
00:02:50.000 And this Keto Elevate stuff really, really does help.
00:02:52.000 So again, eatrightandfeelwell.com.
00:02:55.000 And you will get a 60-day money-back guarantee.
00:02:58.000 Keto Elevate provides your body only C8, the most ketogenic MCT.
00:03:01.000 That means it provides support for energy levels, healthy appetite management, mental clarity and focus, athletic performance.
00:03:07.000 Again, I'm going to pause and just state, when I was eating, you know, all the garbage, after dinner, I'd be falling asleep.
00:03:14.000 And then I'd have to wake up before the show and be like, come on, let's go.
00:03:17.000 Now, I'm just like, I feel like there's energy running through my body.
00:03:21.000 With BioTrust's Keto Elevate, you'll get 5 grams of their highly sought after MCT C8.
00:03:26.000 You'll get free shipping on every order.
00:03:28.000 And for every order today, BioTrust donates a nutritious meal to a hungry child in your honor through their partnership with NoKidHungry.org.
00:03:35.000 To date, BioTrust has provided over 5 million meals to hungry kids.
00:03:39.000 Please help them hit their goal of 6 million meals this year.
00:03:41.000 You'll get free VIP live health and fitness coaching from BioTrust's team of expert nutrition and health coaches for life with every order, and their free new e-report, The Top 14 Ketogenic Foods, with every order.
00:03:52.000 Someone else pointed out that at the end of the clips we do, I'm like a little fatter.
00:03:57.000 And I'm like, yeah, I've been losing weight.
00:03:58.000 So we gotta re-record our close-out clip for the clips on this channel, because the keto stuff's really been working.
00:04:04.000 So again, eatrightandfeelwell.com.
00:04:06.000 And don't forget, head over to timcast.com to support our work directly.
00:04:10.000 And check out our after-hours uncensored show Monday through Thursday at 11 p.m.
00:04:15.000 We've had some pretty crazy conversations this past week.
00:04:17.000 Larry Elder was particularly interesting.
00:04:19.000 Naomi Wolf was very interesting last night.
00:04:22.000 She mentioned that the government actually was targeting her to get her censored on social media platforms.
00:04:26.000 So really crazy stuff.
00:04:28.000 As a member, you get access to all of our shows, and soon, and because of all the members, we will have two documentaries launching, really great ones.
00:04:35.000 One's about gun control, one about the Federal Reserve.
00:04:37.000 We're also working on a transhumanism documentary, but that's going to be coming in the next phase.
00:04:41.000 So, it is because you are members, we are able to produce these documentaries, and then we're going to release them, and members will be able to watch them, so I'm really excited for that.
00:04:49.000 I think our timeline is... roughly two and a half months from today.
00:04:53.000 Maybe it'll end up being a little bit longer, but I'm really excited for this gun control documentary.
00:04:57.000 So don't forget to smash that like button, subscribe to this channel, share this show with your friends.
00:05:01.000 Joining us today to talk about all of these issues is Bethany Mandel.
00:05:04.000 Hey, thanks for having me.
00:05:06.000 Would you like to introduce yourself?
00:05:07.000 Yeah, so I am the editor of a children's book series called Heroes of Liberty.
00:05:13.000 And I am a columnist at Deseret News and a mom of five and a half people.
00:05:18.000 Five and a half people?
00:05:19.000 What's a half person?
00:05:20.000 Percolating a new person.
00:05:22.000 Ah, making new people.
00:05:23.000 Oh, glad to hear it.
00:05:24.000 You're watching it happen in real time.
00:05:26.000 On this show!
00:05:28.000 Right on.
00:05:29.000 Well, thanks for joining us.
00:05:30.000 We also have Hannah-Claire Brimelow.
00:05:31.000 Hi, I'm Hannah-Claire Brimelow.
00:05:32.000 I'm a writer for TimCast.com.
00:05:34.000 No, what is that site?
00:05:36.000 It's this super cool kind of independent news site.
00:05:39.000 We do news on all kinds of things, all kinds of issues.
00:05:42.000 I post five times a day, and I think this is the longest my intro has ever been.
00:05:46.000 Oh, very nice.
00:05:47.000 NewsGuard certified.
00:05:48.000 It is, but not good enough.
00:05:50.000 82 out of 100.
00:05:50.000 Insulting.
00:05:54.000 No, but it is.
00:05:54.000 They posted a bunch of fake news about us, had to correct it, refused to issue proper corrections, violating their own policies.
00:06:00.000 And I take it very seriously.
00:06:02.000 If they're going to claim USA Today, which admitted to fabricating 23 sources in their stories, is more responsible than we are, when we've had one article out of 4,000 that required a correction, that they noticed.
00:06:14.000 We issue corrections all the time when we make mistakes.
00:06:17.000 But they're like, we noticed one article, so you're irresponsible.
00:06:19.000 Get out of here.
00:06:20.000 Anyway, Ian's here.
00:06:21.000 Oh yeah.
00:06:22.000 Anyway, I decided to refresh last night.
00:06:25.000 I took like an hour-long bath.
00:06:26.000 I was telling you guys about it before the show.
00:06:28.000 And instead of coffee today, I'm drinking coconut water.
00:06:30.000 I had a little bit of aloe vera.
00:06:32.000 Just the inner filet.
00:06:33.000 It's incredibly healing.
00:06:34.000 They call it the flower.
00:06:35.000 I think the plant of life.
00:06:36.000 The Egyptians used to call it that.
00:06:38.000 Highly recommend.
00:06:39.000 Get it on!
00:06:40.000 Catch you later.
00:06:41.000 Let's do this.
00:06:41.000 Yes, let's do it!
00:06:42.000 I'm loving how many ladies we have here tonight.
00:06:44.000 You may notice I'm zoomed in a little bit more than usual.
00:06:46.000 It's because Hannah-Claire is lovely and tall just like me.
00:06:48.000 I kept getting the top of her head in my shots.
00:06:50.000 I did crop it out.
00:06:51.000 I'm sorry, Hannah-Claire.
00:06:52.000 Such a middle child.
00:06:53.000 I need all the attention.
00:06:54.000 No, it's fine.
00:06:55.000 It's all good.
00:06:55.000 I'm really excited for tonight.
00:06:56.000 I love my ladies.
00:06:57.000 Let's get going.
00:06:58.000 All right, here's the first story from the Daily Mail.
00:07:00.000 Armed man who attacked FBI's Ohio field office is dead.
00:07:05.000 After five-hour standoff, suspect also attended deadly Capitol riot.
00:07:10.000 Now, we don't know exactly to what extent he was at the Capitol riot.
00:07:14.000 I think that maybe the Daily Mail is reporting something different.
00:07:16.000 I've read a bunch of other sources.
00:07:18.000 The New York Times says that he was there the night before, but he's not been charged with any crimes.
00:07:22.000 But let's read and see what they say.
00:07:25.000 Ricky Walter Schiffer was shot dead by police Thursday after he raised a gun towards officers around 3 p.m.
00:07:32.000 State Highway Patrol confirmed.
00:07:33.000 Schiffer attempted to break into the office, prompting a five-hour standoff with authorities.
00:07:38.000 The body armor-wearing suspect fled the office and was chased onto the highway before abandoning his car by a cornfield on a country road just off of Interstate 71.
00:07:49.000 The confrontation came as officials warned of an increase in threats against federal agents in the days following a search of former President Donald Trump's Mar-a-Lago estate in Florida.
00:08:00.000 Investigators say Schiffer was also at the U.S.
00:08:02.000 Capitol during the January 6th insurrection, they report.
00:08:07.000 All right, well, they reported this stuff.
00:08:09.000 I'm gonna pull up some tweets that we have.
00:08:11.000 Travis View says, The New York Times identified the Ohio shooting suspect as Ricky Schiffer.
00:08:16.000 There is a truth social account using that name.
00:08:19.000 On the same day the FBI executed a warrant on Mar-a-Lago,
00:08:22.000 The account made a call to be ready for combat and, I'm not going to read what he said next, but he called for extreme violence.
00:08:29.000 In the end, one of the last things he said was that, yeah, I don't think we should read exactly what he said, but he explained that he thought he had a way to get through bulletproof glass.
00:08:41.000 He was wrong.
00:08:42.000 He did not.
00:08:42.000 But this could explain why it was reported that he fired a nail gun at the FBI.
00:08:47.000 Now, I read on NBC he fired at the agents, but perhaps they said towards them instead of at because there was bulletproof glass he was not able to penetrate.
00:08:55.000 But apparently this Ricky Schiffer guy on Truth Social was saying that he did it, and if you don't hear from him, it's because they got him, or something to that effect.
00:09:02.000 So I guess my view on things, obviously, George Conway, all right?
00:09:10.000 I'm going to make sure I cite George Conway on this one.
00:09:13.000 He said they crossed the Rubicon.
00:09:16.000 This is anti-Trumper George Conway.
00:09:19.000 They crossed the Rubicon, which is an insinuation that a faction of people have crossed the point of no return towards, what, a civil war?
00:09:28.000 Yeah, but... Oh, there's no thing.
00:09:31.000 There's just some random dude that went crazy, in my opinion.
00:09:34.000 No, no, no.
00:09:35.000 He's talking about the Democrats and the FBI.
00:09:37.000 Right.
00:09:38.000 Specifically.
00:09:39.000 So, Ian, you've talked quite a bit about, you know, ancient Rome and stuff like that.
00:09:43.000 The crossing of the Rubicon.
00:09:45.000 Yeah, it was forbidden in Rome.
00:09:47.000 There was a river right outside the city.
00:09:48.000 And if I get any of this wrong, just correct me in the chat and I'd be happy to go over it again.
00:09:53.000 It was illegal to ever bring troops across the river into the city of Rome.
00:09:56.000 That was something they'd all decided.
00:09:58.000 It was too dangerous.
00:09:59.000 So when Caesar was off on campaign, he had, you know, however many hundreds, thousands of troops that just basically worshipped him.
00:10:05.000 And when he decided to come back to Rome, they were going to try and put him on trial, to strip him of his power.
00:10:10.000 And he was like, you know what?
00:10:11.000 No, I want Rome.
00:10:13.000 He crossed the Rubicon with his troops, took the city, and it's been forever known as the crossing of the Rubicon: when you've taken that step, the one step too far.
00:10:23.000 And that was the start of the Civil War.
00:10:26.000 That was basically the end of the Roman Republic and the beginning of the Empire.
00:10:30.000 So there was this leftist outlet that they were like, the far right is saying this, that, or otherwise, or something like that.
00:10:36.000 And they said, Tim Pool said they crossed the Rubicon.
00:10:39.000 And I'm like, yeah.
00:10:40.000 And so did George Conway.
00:10:42.000 Like this is not a call to anything.
00:10:43.000 It's a statement of... it's an observation.
00:10:46.000 I think they did.
00:10:47.000 Yeah.
00:10:48.000 And first and foremost, obviously, let me just say one thing.
00:10:52.000 This dude who went to the FBI field office must have really wanted Democrats to win.
00:10:57.000 Because I mean, we're months out from the midterm election.
00:11:00.000 And surprisingly, he did exactly what the Democrats needed.
00:11:05.000 He got violent, failed, and now the Democrats have their example of what's wrong with the right.
00:11:10.000 That's why the craziest thing to me is like now is the absolute worst time for anything like that, and this is why I say violence doesn't work.
00:11:18.000 We are months away from Republicans taking the House and the Senate in what the Misery Index predicts will be a crushing defeat.
00:11:26.000 Except now, in the past few months, Democrats have evened out in the polling.
00:11:29.000 Something like this happens.
00:11:30.000 I think we're going to see Democrats spiking in the polls because of this.
00:11:33.000 Yeah, no, I agree.
00:11:34.000 I mean, this is the continuance of the January 6th hysteria.
00:11:39.000 This is how they continue that line of conversation.
00:11:42.000 But I think that we need to be a little bit careful in, you know, knowing that this was the guy.
00:11:49.000 Because I was trying to Google which mass shooting was it where it was the brother that was identified.
00:11:55.000 And it went on for hours.
00:11:57.000 And this guy, like, he was getting calls.
00:11:59.000 Do you remember?
00:12:00.000 Was it Newtown?
00:12:01.000 I don't remember.
00:12:02.000 I don't remember which shooting it was, but there was a mass shooting where the brother was falsely identified.
00:12:08.000 And so I would just, you know, caution the expertise of whoever this guy is with QAnon.
00:12:15.000 Like, maybe it's not him, and maybe this guy… Well, the New York Times said this was the guy's name.
00:12:20.000 Oh, well, that's the beacon of truth and reality.
00:12:23.000 I mean, sure.
00:12:24.000 But I guess the best thing we can do is... I mean, I always just wait 24 hours.
00:12:29.000 Right.
00:12:29.000 Fair point.
00:12:30.000 Fair point.
00:12:30.000 And even if it turns out that this is the guy, I don't think blaming a political party or a movement or any of that makes any sense, because this guy just went off the rails.
00:12:38.000 Yeah.
00:12:39.000 I mean, it's the same as the baseball shooter.
00:12:41.000 What's that?
00:12:42.000 The congressional baseball shooter.
00:12:44.000 That was never blamed on Democrats.
00:12:47.000 It's a guy who I think he was like even maybe a Bernie Sanders supporter.
00:12:52.000 He was a Bernie Sanders volunteer.
00:12:54.000 He shot Steve Scalise and like he opened fire at the the congressional baseball practice and almost killed Steve Scalise.
00:13:02.000 I mean very few people know about it because it was a story for like 0.3 days.
00:13:07.000 Meanwhile January 6th we've been talking about forever.
00:13:10.000 But, I mean, that violence, that was never crossing the Rubicon.
00:13:14.000 That was never blamed on Democrats.
00:13:16.000 It was just like, oh, it's just some crazy guy.
00:13:19.000 Like, maybe this is just some crazy guy.
00:13:21.000 This crazy guy right now, I don't think is a crossing of the Rubicon.
00:13:25.000 I think the weaponization of the DOJ and Merrick Garland being like, yeah, I signed off on this, is a crossing of the Rubicon.
00:13:31.000 Yes, and going into Melania's closet.
00:13:33.000 Yeah, I mean, look, we've talked about this when Trump was saying he would lock up Hillary.
00:13:37.000 Everybody was like, that would be a dangerous time in this country.
00:13:41.000 And then what did Trump do?
00:13:41.000 He said, we're not going to go after Hillary.
00:13:43.000 We're not going to do it.
00:13:44.000 And everybody was disappointed.
00:13:45.000 But Trump was like, no, no, you know, we're not going to do it.
00:13:47.000 And they didn't.
00:13:48.000 Trump was standing at that river line.
00:13:49.000 He was like, no, no, that's too much.
00:13:50.000 It's too much.
00:13:51.000 It's been in the Republic.
00:13:52.000 Yep.
00:13:53.000 Meanwhile, Hillary has her hats, her But Her Emails hats.
00:13:56.000 She's campaigning already, or at least fundraising.
00:13:59.000 Go ahead.
00:13:59.000 I think I watched Merrick Garland.
00:14:01.000 He did a speech today, the official like explanation.
00:14:04.000 They said they're going to unseal the warrant to explain why they invaded.
00:14:08.000 Is that the right word?
00:14:09.000 Trump's house?
00:14:10.000 I don't know if what the word is here.
00:14:11.000 Raided.
00:14:12.000 Raided.
00:14:12.000 They said, but don't call it a raid.
00:14:14.000 It was a raid.
00:14:14.000 Yeah.
00:14:15.000 I mean, they went in there, you know, unannounced or maybe they announced themselves right before they went in.
00:14:20.000 Well, they did.
00:14:20.000 They went to the lawyers, said, get out.
00:14:21.000 We're going to go do our thing.
00:14:22.000 Apparently they kept the cameras rolling.
00:14:25.000 So Rubicon, I don't know.
00:14:26.000 Jury's still out.
00:14:27.000 I want to see what the warrant said.
00:14:29.000 There is always the possibility that Trump was doing something extremely nefarious.
00:14:33.000 And a warrant wouldn't prove that.
00:14:35.000 Okay.
00:14:36.000 But we're not, you know.
00:14:37.000 A warrant is their accusation of probable cause.
00:14:39.000 Or maybe they wiretapped him and then they heard him say, you know, no one, we don't know.
00:14:44.000 I mean, it's hard not to view this as a complete fishing expedition, right?
00:14:47.000 I mean, why would you write?
00:14:48.000 I'm sorry, I'm sorry.
00:14:49.000 It's hard not to view this as anything other than fake and weaponization.
00:14:53.000 Why?
00:14:54.000 Because of Russiagate.
00:14:56.000 Because of Ukrainegate.
00:14:56.000 Because we know they had fabricated evidence and manipulated evidence already.
00:15:00.000 So, you know, forgive me.
00:15:02.000 Hillary Clinton's email server, we looked at.
00:15:05.000 They said, oh, there's no criminal intent.
00:15:06.000 Fine.
00:15:07.000 Then they smear and lie about Trump.
00:15:09.000 They should have stopped the investigation into Russiagate a long time ago.
00:15:12.000 What happened?
00:15:13.000 A lawyer fabricated a letter or something?
00:15:16.000 It's been a while since I covered this story.
00:15:18.000 But anyway, sorry, I digress.
00:15:20.000 Oh, what I was gonna say is that, to me, this warrant... you know, we've heard the story that there was a room that had a padlock on it that had the documents that they're requesting.
00:15:29.000 When you write a warrant you have to be specific about what you're asking for.
00:15:32.000 So theoretically Garland signed off on a warrant that said we want complete and unfettered access and we want the right to not announce because we think that they'll hide stuff.
00:15:41.000 That, to me, indicates that it's a grab.
00:15:45.000 They want to get as much stuff as possible and then maybe justify it later when they're like, oh, but look what we found when we showed up there.
00:15:51.000 In Melania's closet, there were all these secret things that we knew about.
00:15:55.000 So to understand, what is it called?
00:15:57.000 The fruit of the poisoned tree?
00:15:59.000 Is that what it's called?
00:15:59.000 There's the exclusionary rule.
00:16:02.000 And this states that if your rights are violated and evidence is found, that evidence is inadmissible in court. So there was one story I remember reading about back in Illinois, of a guy who, they thought he was a murderer.
00:16:14.000 And so a cop ran his plates, pulled him over, and then, while he said get out of the car, illegally searched the car and found evidence.
00:16:22.000 Sent in the evidence.
00:16:23.000 Turns out the stop was illegal.
00:16:25.000 The lawyers get it thrown out.
00:16:26.000 Exclusionary rule.
00:16:27.000 You cannot use evidence seized in violation of someone's rights.
00:16:31.000 With this, you get a warrant for something like classified documents.
00:16:35.000 Then, once you're inside, if you have a warrant, and you enter a home looking for, say, classified documents, and you find pills and a gun, That's a bold claim.
00:16:43.000 missable. So anything they found in the house, perhaps this was a fishing expedition and
00:16:48.000 or Trump suggesting they're planting evidence. That's a bold claim.
00:16:51.000 Yeah, that's the first thing that crossed my mind when I heard about this story. What
00:16:54.000 if they plant something? Like, how can you confirm or deny if they did that?
00:16:57.000 I mean, I wouldn't put it past them. I was gonna say, I think there's been like
00:17:01.000 over 100 subpoenas that have been issued through the January 6th subcommittee.
00:17:06.000 When you subpoena people's documents and records, you can ask broadly for tons and tons of stuff.
00:17:10.000 They aren't being specific in what they want and that to me shows that they are kind of grasping at straws.
00:17:17.000 Like they are wanting you to turn over stuff so they can figure out later what you did wrong.
00:17:22.000 Is this the kind of thing where they can fabricate a warrant after the fact and make it look like they had it back in the day?
00:17:28.000 No, we know this guy, this Epstein-linked judge signed off on it.
00:17:33.000 And so a lot of people are asking questions about this, but a lot of people are bringing up now that they think this is a false flag right before the midterms for an October surprise or for this to be weaponized to help Democrats.
00:17:45.000 Look, I gotta say, show me the evidence.
00:17:48.000 I mean, I certainly understand the possibilities, but show me the evidence.
00:17:51.000 Considering what happened with Ray Epps, I'm more inclined to believe there's malfeasance going on at the highest level.
00:17:56.000 This is a guy who went out on January 5th and 6th telling people to go in, and they're just like, oh, this poor man is a victim.
00:18:03.000 That's what the New York Times is writing about.
00:18:04.000 That's what they're claiming.
00:18:05.000 Adam Kinzinger is defending the guy, and I'm like, something doesn't make sense.
00:18:08.000 We know this guy.
00:18:09.000 He's on camera saying it, and they let him go?
00:18:11.000 So with this, look, the simple answer.
00:18:16.000 People are shocked and angered by the FBI raiding the former president's house.
00:18:20.000 And out of the 74 million Trump voters, one guy went nuts.
00:18:25.000 Yeah.
00:18:25.000 Or one guy was nuts and then went off.
00:18:28.000 Right, right.
00:18:29.000 That's it.
00:18:29.000 I don't know.
00:18:30.000 Unless you guys think I'm wrong and the feds planned everything, but I just don't, I don't know.
00:18:34.000 No, I think it's legit.
00:18:35.000 I think this guy legitimately just like freaked out or was already freaked out.
00:18:40.000 I'm, I'm thinking about like, like, obviously you don't attack people.
00:18:45.000 That's not the way we live in a civilized society.
00:18:48.000 We have our Second Amendment so that if we're attacked by our own government or by outside countries, we can defend ourselves.
00:18:55.000 And like, I think about like Nazi Germany, like I used to be like, why didn't they fight back?
00:19:00.000 Why didn't they like stop Hitler and stop the Nazis?
00:19:03.000 And like, you kind of can't, 'cause it's illegal to fight the law.
00:19:08.000 Yeah, it's also people are cowards.
00:19:09.000 I mean, the last two years have really shown me a lot of facets of human nature that I was just maybe in denial about, but people are sheep.
00:19:20.000 They just kneel at the face of power and all common sense goes out the window.
00:19:25.000 There's a lot of things in history that don't make sense.
00:19:29.000 We're only a couple months away from a very, very serious election.
00:19:34.000 You've got people saying, you know, let Trump's second term begin January 3rd, or whatever, 2023, or whatever they do the swearing in.
00:19:42.000 Yeah.
00:19:42.000 Of these members of Congress and the Senate, or members of Congress.
00:19:47.000 And so now is the most crucial time.
00:19:49.000 Just the other day, this is funny, we had Naomi Wolf, and she said, it's gonna get crazy these next few months, right before the midterms.
00:19:55.000 And we're like, oh yeah, baby.
00:19:57.000 If you thought it was crazy before, wait till you see what's gonna happen next.
00:20:00.000 Sure enough, the next day, some guy goes up to the FBI field office with a nail gun and a rifle, and he tries breaking in.
00:20:07.000 I mean, this is crazy stuff.
00:20:09.000 But, I don't know, man.
00:20:12.000 The false flag narrative stuff, the reason I don't like it, as much as we've talked about Gulf of Tonkin numerous times, I understand the possibilities.
00:20:18.000 I know all about Operation Northwoods and this crazy stuff that they've done in the past.
00:20:22.000 It's like, you need evidence.
00:20:24.000 Extraordinary claims require extraordinary evidence.
00:20:27.000 I'd love to believe we've exposed some nefarious plot.
00:20:30.000 Great.
00:20:30.000 Well, let's expose it.
00:20:31.000 But in order to do so, you can't start with the premise that's extraordinary.
00:20:34.000 You have to start with the, what happened?
00:20:36.000 A claim has been made.
00:20:37.000 It's been reported in the press that a guy did these things.
00:20:39.000 It's appeared on social media.
00:20:41.000 This guy said these things.
00:20:42.000 Now we need to start from there and then see where we go, not decide where we want to be and then try and build our way up to it by, you know, by pointing out other things throughout history.
00:20:49.000 Right.
00:20:51.000 Yeah, I mean, I am curious to see sort of how it plays out, but I imagine that they're going to try to paint him.
00:20:59.000 There was another sort of recent incident where, you know, it came out like, oh, this guy follows Ben Shapiro and he was radicalized by Ben Shapiro.
00:21:05.000 Like, yeah, yeah, the nerdy Orthodox Jew from L.A., that was definitely who radicalized that shooter.
00:21:12.000 But I mean, they're going to ride this into the sunset, just like they have January 6th.
00:21:17.000 But I think they blame radicalization on the introduction of any information that they don't agree with, right?
00:21:22.000 So it's not that Ben Shapiro himself is like waving some crazy symbols and acting erratic, it's that he opens the door to a line of thought and a line of questioning that ultimately they would argue is always corrupt and always violent, which I don't think is ultimately Something is happening in this country with the rise of parallel economy, alternate payment processor system, censorship resistance, resistant, we use them, rumble, what you've got growing in Florida with not just the technology infrastructure but with, you know, Ron DeSantis, his worldview, what Florida's been doing in general in terms of governing.
00:22:00.000 Something is amassing in this country that's starting to stop the insanity, stop the cult.
00:22:06.000 Now I'm not saying it's a guarantee, you know, that wokeness is gonna be crushed or anything like that.
00:22:14.000 But we're seeing a steady path, a light at the end of the tunnel where we are going to be reaching a good place.
00:22:21.000 For something like this to happen completely undermines the opportunity for success.
00:22:26.000 So it does make you wonder, why would anyone on the right staring down the barrel of a midterm election be like, I know, here's what's going to help?
00:22:35.000 No, absolutely not.
00:22:36.000 This is the opposite of doing anything good for anyone.
00:22:39.000 Yeah.
00:22:39.000 I mean, he's probably just a nut job, honestly.
00:22:42.000 Yeah, and then what do you do?
00:22:44.000 What do you do when you have all these nutjobs?
00:22:45.000 What are you supposed to do?
00:22:46.000 Should they have, like, been tracking his social media or something?
00:22:49.000 No.
00:22:49.000 No.
00:22:50.000 Well, and also, who's to say they weren't?
00:22:52.000 And then let it happen?
00:22:53.000 We don't know anything about this guy yet.
00:22:55.000 Every time there's a mass shooter, there's never the, like, oh, I had no idea he was sitting there.
00:23:00.000 Everyone's like, yeah, no, he was a nutjob and we were kind of waiting on this.
00:23:04.000 It's never a secret.
00:23:05.000 It's never a surprise.
00:23:06.000 Yeah, a lot of times our high school classmates are like, yeah, that was the guy that beat puppies.
00:23:11.000 Yeah, and I think what happens now is that the powers that be that want him to be representational of every MAGA supporter out there are gonna work really hard to say, like, he is just like that guy down the street from you who has a Trump 2024 sign. You know, they're gonna work really hard to make this... We don't know anything about that.
00:23:40.000 We don't know anything about him because he disappeared, but there was a point where people were saying like, hey, I think he is a registered Democrat, and then suddenly he disappeared.
00:23:49.000 I don't know if that's true.
00:23:50.000 There's nothing I can do to fact check it.
00:23:52.000 My point is just that ultimately we know that there are people who, whether it be for mental reasons or whatever else, pick extremist behavior, but that's not actually, number one, helpful to the political party that they're going to link him to, and it's not actually representational to most voters on either side.
00:24:09.000 Yeah, guilt by association isn't real.
00:24:12.000 Think about Fast and Furious.
00:24:14.000 No, I will not.
00:24:15.000 Not the movies.
00:24:16.000 The operation by Obama where he gave guns to the cartels.
00:24:19.000 I don't think about that either.
00:24:20.000 What would they have been willing to do to cover up something like that?
00:24:23.000 Yeah.
00:24:25.000 The question is, don't look at Fast and the Furious and say, wow, look what they did.
00:24:30.000 Look at what they did and think, what could they be doing now that we don't know about?
00:24:35.000 Yeah.
00:24:35.000 So I have to be like a little bit careful, because this is like...
00:24:40.000 but I know someone who was involved in that as a gun runner.
00:24:45.000 There was a setup, there was some messed up stuff that happened on the part of the feds, and they did try to cover it up.
00:24:55.000 In Vegas or what?
00:24:57.000 Uh, he was in, in the southeast part of the, or southwest part of the country.
00:25:01.000 Fast and the Furious.
00:25:02.000 Yeah.
00:25:02.000 Fast and the Furious.
00:25:02.000 Yeah, yeah, yeah.
00:25:03.000 Right, right, right, right.
00:25:03.000 And he was arrested as a gun runner in that, and I know him very well.
00:25:08.000 And it was, he was set up, and then they tried to put him in prison for a very long time.
00:25:14.000 Wow.
00:25:14.000 And he spent a lot of money for his freedom with, for a very, very, very good lawyer who was like, I'm gonna bring this to trial because he has nothing to lose and y'all might not want this all to be out there.
00:25:27.000 Yeah.
00:25:28.000 And his lawyer got him an amazing deal, and he got out like a year later, because the lawyer basically put the feds on the spot and was like, he was 17 and an orphan, do you want to go into how you entrapped him, and then how you, like, the whole thing? Yeah.
00:25:45.000 That was, for me, someone who was very back the blue.
00:25:49.000 I never really knew that side of the FBI before.
00:25:53.000 To hear his story, someone who I trusted implicitly, and see how it all went down up close was like, oh, they bad.
00:26:02.000 Yeah.
00:26:03.000 Those are not good people.
00:26:05.000 Yep.
00:26:06.000 And here we are.
00:26:07.000 Yeah, well, but I don't know.
00:26:09.000 I'm only bringing this up because I'm not trying to insinuate anything about this particular instance.
00:26:15.000 But there there probably are tons of stories like I mentioned this because of the Vegas thing.
00:26:20.000 People have questions.
00:26:21.000 Nobody knows who this guy was or what happened.
00:26:23.000 It's all just vanishes one day.
00:26:24.000 Yeah.
00:26:25.000 And a lot of people are like, shouldn't we have like learned way more information about this like with every other incident?
00:26:31.000 Or maybe what was happening was there's something going on behind the scenes that went south.
00:26:36.000 Yeah.
00:26:36.000 And they're not going to tell you.
00:26:38.000 You know, the funny thing is, I really don't want to get into the issue of 9-11 because it's just people can lose it.
00:26:44.000 But I always tell people, like, do you believe the official story?
00:26:48.000 You think the government just came out and told you exactly how our security was undermined?
00:26:52.000 I mean, that's absurd.
00:26:53.000 There's confidential and top-secret information.
00:26:54.000 So of course the official story is omitting information, lacking information, and probably obfuscating information.
00:26:59.000 So it's crazy to me when, for one, there's obviously a lot of the conspiracy theorists who believe... I think they take leaps of faith to believe things they want to believe.
00:27:08.000 But then also the people come out and say like, I will blindly believe whatever the government says.
00:27:12.000 And then I'm like, dude, even that would require you to say the government was not honest about what happened.
00:27:18.000 Because if you like, let's say it again.
00:27:21.000 The United States government did not come out on 9-11 and say, here's a roadmap to how our security was undermined, and please, you know, can you read that?
00:27:28.000 No, they were like, okay, we better not let people know that happened right there, because that's how they got us.
00:27:33.000 So they're not gonna release all the information.
00:27:35.000 But, you know, that being said, it's hard to know when secrets are kept from the American people, and then we're supposed to make decisions on who we vote for without complete information.
00:27:44.000 And then you have the media organizations that intentionally obfuscate and manipulate. Dark days indeed, I'll put it that way.
00:27:52.000 It seems like there's a defense of the liberal economic order right now by American military and sub-military like FBI and CIA that they don't want it to get broken up.
00:28:03.000 They want to make sure that we, the United States, we have like a police, not a police state, but like control of the earth.
00:28:09.000 Like we have military bases all over.
00:28:11.000 So that no World War III breaks out.
00:28:13.000 I understand that they don't want World War III to break out, but like, I don't think that the real threat is internal.
00:28:20.000 It doesn't seem like that.
00:28:21.000 I think most people want stability in the United States.
00:28:24.000 Like the CCP may be a bigger threat?
00:28:26.000 Probably.
00:28:27.000 I don't know, man.
00:28:28.000 I don't know if it matters what we think or feel about who the bigger threat is.
00:28:31.000 I think it's obvious China is a serious threat.
00:28:33.000 I mean, honestly, Russia is a threat.
00:28:35.000 They just, I think they overhype Russia when China is a much bigger threat.
00:28:39.000 But the fact is, we've got two distinct cultures in this country and they're headed for chaos.
00:28:46.000 Yeah, I was going to say, maybe a lot of people want stability, but what they view as stable is not the same thing.
00:28:51.000 There's such a division in how people ascribe their values and what they would describe as their ideal stable life, right?
00:28:58.000 There are people who are Incompatible in a lot of ways in this country.
00:29:02.000 Obviously, I don't think that's a call for like extremist violence or anything like that.
00:29:05.000 But like you have to recognize that stability is almost impossible when you have people who need your life to be different and you see that as unstable.
00:29:14.000 Yeah, just like real quick.
00:29:16.000 I do want to bring up this next story just as an aside, which we'll get into later on, maybe for the members only.
00:29:21.000 A video's going viral from the Boston Children's Hospital talking about giving hysterectomies to children.
00:29:27.000 Oh, it's freakish.
00:29:28.000 To children.
00:29:29.000 So we'll get into all that, but you brought up back the blue, so I have this tweet here from Briahna Joy Gray.
00:29:34.000 Oh boy.
00:29:35.000 She said, Marjorie Taylor Greene is right about the FBI.
00:29:38.000 Bad faith or not?
00:29:39.000 In today's radar, I argue that the left should take advantage of the right's new acknowledgment of systemic bias and push to abolish the FBI, an institution that has always protected elite power, not the people.
00:29:53.000 My response to this was fire emojis.
00:29:55.000 I completely agree.
00:29:56.000 I was thinking yesterday, I was like, that appeals to me.
00:30:02.000 But I mean, I think that sort of I disagree a little bit about the fact that both sides want stability.
00:30:09.000 I don't think that the other side wants stability.
00:30:12.000 I think they want to remake the world.
00:30:15.000 But this is not that.
00:30:16.000 This is an appreciation and understanding that the FBI, at least, is... Maybe Briahna Joy Gray and the left perspective on this is, we need to tear down the system to rebuild a new one.
00:30:29.000 Don't know, don't care.
00:30:30.000 If we look at the FBI and we're like, hey, there's corruption going on there.
00:30:34.000 We should defund that and dismantle it.
00:30:36.000 And they say, yeah, we also want to.
00:30:38.000 I'd be like, well, if we agree on that, we're fine moving forward.
00:30:41.000 We'll figure it out afterwards.
00:30:42.000 Well, then what?
00:30:43.000 Here's the problem.
00:30:45.000 Then what comes next?
00:30:46.000 Yeah, right, like a new organization that's even less accountable that we don't even know about?
00:30:50.000 No, I don't think.
00:30:51.000 And that tracks our every movement.
00:30:53.000 You can't say, we should not do away with corrupt institutions because of fear of more corrupt institutions.
00:30:59.000 Like, we're actually taking action to get rid of the corrupt institutions.
00:31:01.000 Yeah, but you can't, like, take a wheel off the car mid-drive.
00:31:05.000 Is that what the FBI is?
00:31:07.000 It's one of the wheels on the vehicle.
00:31:08.000 Are they that important for the United States?
00:31:10.000 Have they been investigating Antifa?
00:31:12.000 Have they been holding people accountable?
00:31:15.000 Have they been going after Hunter Biden?
00:31:17.000 And I mean, well, we may see something with Hunter Biden, but it's been a really long time and only because of public scrutiny.
00:31:23.000 So I'm not convinced that... First, I'll say, I don't think the entirety of the FBI is corrupt.
00:31:28.000 I think there's different people in different field offices.
00:31:30.000 I've actually talked with people who, you know, there's like lower level people who... There's cadets.
00:31:36.000 They have similar politics to you.
00:31:37.000 The culture war is in every facet of the government as well.
00:31:41.000 But I just don't know if the FBI does enough to warrant any of this.
00:31:45.000 Well, no one knows.
00:31:46.000 I mean, it's a secretive operation.
00:31:48.000 That's part of the problem and the strength of it.
00:31:50.000 That's not what I'm saying.
00:31:51.000 I'm not saying no one knows.
00:31:52.000 I'm saying we have repeatedly asked questions for years.
00:31:55.000 Why have they not done this, that, or otherwise?
00:31:58.000 Yet, they have time to send a dozen agents to a garage over a pull rope.
00:32:02.000 That is shockingly, shockingly insane.
00:32:06.000 You talking about the Jussie Smollett case?
00:32:07.000 Bubba... Bubba Watson, right?
00:32:09.000 Bubba Wallace.
00:32:10.000 Was that his name? I don't know. The NASCAR driver said there was a noose in his garage.
00:32:15.000 And so they sent a dozen agents, and what do they find? It was a garage pull rope. It was the door pull rope.
00:32:19.000 It's Bubba Wallace. Yeah, it's insane. And then what, they go and Merrick Garland signs off on
00:32:25.000 raiding the former president's home. And we're sitting here being like, let's contemplate whether they're an organization worth funding.
00:32:31.000 No, no, no, no, no.
00:32:32.000 There's no question.
00:32:33.000 The left has long talked about all of the malfeasance.
00:32:36.000 They talk about Martin Luther King and Malcolm X as really big examples.
00:32:39.000 And I'm like, sure, fine, whatever.
00:32:41.000 Yeah.
00:32:42.000 Defund them, dismantle them.
00:32:44.000 Or maybe, maybe we can start with a moderate defunding and reduction of the FBI force.
00:32:52.000 I mean, I think the FBI, there's two problems.
00:32:53.000 One of them is, it's the problem of every government institution.
00:32:58.000 It's just bureaucratic bloat and stupid people get promoted and your expertise and your… it's not a meritocracy is basically what I'm saying.
00:33:09.000 I think that's a number one problem.
00:33:10.000 But I think that in the last five years, one of the things that has become extremely apparent is that the people who are in charge, like the grown-ups in charge, I always thought that there was grown-ups in charge.
00:33:21.000 And then, like, Comey.
00:33:24.000 He started talking and you're like, oh my god, you had a lot of power and you are a total nutjob.
00:33:30.000 But it's all of these people who were in the top rungs of power who are nutjobs.
00:33:34.000 George Conway is another one.
00:33:36.000 Peter Kovach?
00:33:36.000 Frickin' nutjob.
00:33:38.000 I was concerned with Adam Kinzinger.
00:33:40.000 I don't know him personally, but I've been seeing his tweets and I think he's isolated.
00:33:43.000 There seems to be like a bubble that some of these people are existing in right now, that they think it's really what life is about, is about red and blue Democrat-Republican thing.
00:33:52.000 But that's like an infinitesimally small part of reality.
00:33:55.000 This human thing is not that big a part of reality.
00:33:58.000 We need to really kind of get outside of our own butthole, if you know what I'm talking about.
00:34:02.000 And look out, bubble up, look outside, but that's what I said, bubble.
00:34:05.000 That's what I said, butthole.
00:34:06.000 Yeah.
00:34:07.000 Uh, we need to, we need to de-investigate ourselves for a moment and look around at the universe because things are flying around at a hundred million miles per hour.
00:34:15.000 It can slam into earth at any moment.
00:34:17.000 And we gotta be prepared for that kind of thing.
00:34:18.000 And that is an important mentality for people to realize, like you can find purpose outside of all of this.
00:34:25.000 Most of these Democrats, this is their life, their religion, their purpose.
00:34:28.000 They have nothing else.
00:34:29.000 Maybe if they got interested in the stars, they might be like, yo, I don't care about this.
00:34:34.000 I want to look in a telescope.
00:34:35.000 Instead, their whole world, I'll put it this way.
00:34:38.000 Politics has become pop culture.
00:34:40.000 That's the danger.
00:34:42.000 And public health has become part of their identity, too.
00:34:45.000 I mean, COVID gave them a religion, like Mazel Tov.
00:34:49.000 I'm so happy for you.
00:34:50.000 You finally found faith.
00:34:51.000 And it's, you know, in the form of virus mitigation.
00:34:54.000 Yep.
00:34:54.000 Although the CDC relaxed guidelines.
00:34:56.000 I don't know if that's come up yet.
00:34:57.000 We talked about it before the show.
00:34:58.000 It hasn't.
00:34:59.000 Officially, that was according to NPR.
00:35:01.000 They reported on that today.
00:35:02.000 Maybe we can go into that later a little bit.
00:35:04.000 I just can't help thinking with the FBI, there's so much clear and evident malice.
00:35:08.000 You can look at Peter Strzok.
00:35:10.000 I just cherish the way he smirked.
00:35:12.000 It was insane.
00:35:13.000 He looked ideologically possessed.
00:35:16.000 And Blasey Ford did the same thing.
00:35:17.000 They do this thing where they look in the camera and they look down and go, what the?
00:35:21.000 What is that?
00:35:22.000 Are you imagining you're the Wicked Witch of the West or something?
00:35:25.000 A problem is the camera's above what we got on the show, so it's looking down at them, so you see their eyes looking up.
00:35:33.000 We don't look to the camera, turn our heads down, look up, and then go... Speak for yourself, Tim.
00:35:40.000 We don't do that!
00:35:42.000 Yeah, well, we're not hiding stuff.
00:35:43.000 I mean, why are they doing that?
00:35:44.000 Not more than a normal person would hide some, you know, menial personal or private things on TV.
00:35:49.000 But like, I think these people have to have secrets.
00:35:51.000 That's their job is like secrecy.
00:35:52.000 Half these people in the government, the CIA, secrecy organization.
00:35:56.000 Oh, it's all secrets.
00:35:58.000 Yeah.
00:35:58.000 So that's the remarkable thing.
00:35:59.000 It's like anytime anyone from the government comes out and says words like, why would you believe them?
00:36:03.000 They have a whole system of keeping information a secret from you.
00:36:06.000 They're not going to tell you.
00:36:07.000 I mean, the people who go to the White House press conference, I'm just, do you really think any White House press secretary at any point ever is gonna tell you the truth?
00:36:20.000 I'm sorry if you believe that, I got a bridge to sell you.
00:36:22.000 You think the first one did?
00:36:23.000 No.
00:36:23.000 Who was the first one?
00:36:24.000 No.
00:36:25.000 I mean, the thing is, the government trades in strategically releasing information, so why would that be true in foreign policy and not true in our Like, why would we be like, oh, yes, American people, I'm taking myself, I'm going to be on national TV and I'm going to tell you the truth, but don't tell anyone else because that'll reveal our big plan.
00:36:46.000 Like, we can't reasonably expect our government to be honest with us.
00:36:52.000 And also be openly telling everyone, all of our adversaries, what's truly going on with us.
00:36:58.000 That seems like a terrible plan.
00:37:00.000 I mean, I judge all of the press secretaries by their ability to lie.
00:37:04.000 Yes.
00:37:05.000 That's their job.
00:37:06.000 And to spin.
00:37:07.000 And some of them are good, and some of them are not good.
00:37:10.000 Have you had a favorite?
00:37:11.000 Oh, gosh.
00:37:13.000 So Sarah Huckabee Sanders was fantastic.
00:37:16.000 The key is, and I think it's the bunch of kids, nothing fazed her... Yeah, she was unflappable.
00:37:23.000 She was like, come at me.
00:37:26.000 Someone peed on my foot today.
00:37:28.000 Kayleigh McEnany was fantastic with the book, being like, here's the story right here.
00:37:33.000 Ah, yes, you're wrong.
00:37:34.000 That was fantastic.
00:37:35.000 But Jen Psaki wasn't bad.
00:37:38.000 You know, people, you might not like her, but her ability to spin and spin quickly and create sound bites, she knew how to do it.
00:37:44.000 This current lady?
00:37:45.000 Terrible.
00:37:45.000 Oh man, she's bad at this.
00:37:46.000 And you know why she's terrible?
00:37:48.000 Because they did a checklist.
00:37:50.000 They're like, well, she checks the LGBTQ and she checks the woman and she checks, but they didn't actually like have a check mark for... Charisma?
00:37:58.000 Merit.
00:37:59.000 Or temperament.
00:37:59.000 Or talent.
00:38:00.000 Or anything.
00:38:01.000 She's just completely inactive.
00:38:03.000 I feel bad.
00:38:04.000 I know, I mean, she's been in over her head, and she was chosen because she hit all those little boxes.
00:38:09.000 And she has been, from what I know about her career, I mean, she's been fairly insulated.
00:38:13.000 She's been with the Democratic movement for so long.
00:38:15.000 I think people who are farther on the outside of the shell have to learn how to spar a little bit more.
00:38:20.000 And I think, to her credit, that's where Jen Psaki's background came from, whereas with Corinne Jean-Pierre, like, no.
00:38:27.000 She has been with the movement, she knows the soundbites, and when challenged, she really struggles.
00:38:32.000 Yeah, no, absolutely.
00:38:32.000 I mean, I think that's actually a skill of, like, Pete Buttigieg, because he, like, was in middle America and had to actually talk to people with whom he disagreed.
00:38:42.000 It makes me sharper.
00:38:43.000 I mean, that's how I felt.
00:38:45.000 I went to Rutgers University with, like, every lib on planet Earth except James O'Keefe.
00:38:51.000 You know what really works, though, is believing what you're saying.
00:38:55.000 Because then you don't need a book.
00:38:58.000 I mean, it's nice to have notes, but you don't need to look at them if you know, if you just glance down and then talk about what you know, you know. I mean, like with the formula stuff.
00:39:04.000 She genuinely had no answer.
00:39:06.000 She didn't know that there was a formula crisis.
00:39:08.000 She didn't know the plan.
00:39:09.000 Like she did not.
00:39:11.000 There was no, there was nothing not to believe.
00:39:14.000 She just didn't know it.
00:39:15.000 For the record, Andrew Johnson was the first president to grant a formal interview to a reporter.
00:39:20.000 Wow.
00:39:20.000 That was 1869-ish.
00:39:21.000 Yeah, so before that, I guess they didn't even talk to the media for the first 60, 70 years of their life.
00:39:27.000 It's kind of crazy if you think about it.
00:39:29.000 Like, you have no idea what they're doing.
00:39:31.000 I mean, when was the last time Biden spoke to anyone?
00:39:33.000 He rarely talks to anyone.
00:39:35.000 He yells, come on, man, quite a bit, you know, with a helicopter.
00:39:39.000 And I would argue, too, the first 50, 60 years, media was really different than what it is today.
00:39:42.000 Yeah, radio and TV have changed a lot.
00:39:43.000 Yeah.
00:39:44.000 I mean, you have to get a journalist to a president to then put a paper out that would then, like, by the time it reached anyone, be like three weeks old.
00:39:50.000 And I mean, radio and TV is the reason why Hitler was able to mass the population so fast.
00:39:55.000 And it's super dangerous tech.
00:39:57.000 I understand why there's censorship and why the CIA is involved with, you know, the PRISM thing, and they want to oversee and make sure, but like, we should talk more about the power of TV and video, I think.
00:40:08.000 Maybe not today, but just in life.
00:40:10.000 TV was more powerful than social media.
00:40:12.000 With TV, you had five channels starting with three channels and five.
00:40:17.000 So all of the messaging was distilled through the trusted names in news or whatever.
00:40:21.000 Then the internet happened and you can reach people faster, but now you've got too many channels and they're all just... All these holes busted in the dam and they're trying to plug them.
00:40:30.000 Like, oh, we have conventional fingers.
00:40:32.000 We don't know how to plug all these at once.
00:40:34.000 Let's make an algorithm.
00:40:35.000 The TikTok algorithm, people can get, you know, the more they use it, the more it knows them.
00:40:39.000 And so it feeds them to, like, Edgar Allan Poe TikTok.
00:40:42.000 Like that person is not getting the same content that you would get or you would get.
00:40:45.000 That's true for all the platforms.
00:40:47.000 True.
00:40:47.000 Yeah.
00:40:48.000 But TikTok is especially talented at it.
00:40:50.000 There was the Wall Street Journal story about young teenagers, 13-14 years old, and they lingered on one pornographic video and then they were just inundated with more and more pornographic content.
00:41:02.000 That sounds like they're not better at it.
00:41:04.000 No, they're very good at it because they got the kids hooked on the app.
00:41:08.000 And it was because if you lingered 0.3 seconds longer.
00:41:11.000 It's like every second on TikTok is more valuable to them, partially because the media is so short, which keeps you scrolling faster, whereas like YouTube, you may have to watch, you know, you watch one video, it might feed you another.
00:41:22.000 You watch 10 videos, it'll feed you a lot more.
00:41:24.000 Right, but I mean, these videos are over an hour, like you really have to invest some time, but you can have like total ADD on TikTok.
00:41:30.000 On YouTube, I've noticed if I go to a video and I watch it for like half the length of the video or more, that it starts to hit
00:41:37.000 me with more videos of that.
00:41:38.000 So like if I go to one, I'm like, no, no, no, I don't want, no, no, no, I don't want
00:41:41.000 this in my life.
00:41:42.000 You just X out really fast.
00:41:43.000 So then the algorithm knows you don't want it.
00:41:45.000 I try to trick the Instagram algorithm because it started, I had a like family friend who
00:41:51.000 like, she had like this tragic thing with a dying kid or whatever.
00:41:54.000 And then the Instagram algorithm decided I only wanted to see stories about dying children.
00:41:59.000 Yeah, it does that to me too.
00:42:00.000 And I was like, I'm gonna click on every single thing about Meghan Markle.
00:42:05.000 Let's trick the algorithm.
00:42:08.000 No, I mean, it's better than dying children and I hate Meghan Markle, so I'm all in.
00:42:12.000 I'm saturated with cats and it creates a compounding effect where I watch more cat videos because they're giving me more to look at and then I get even more and now it's all cats when I go through my stories.
00:42:21.000 Instagram decided I was like a young Mormon bride because I was really interested in national parks.
00:42:27.000 So I click on a lot of like photography in Zion National Parks and then it was like but you know who takes pictures in Zion National Parks?
00:42:33.000 Mormons!
00:42:33.000 And then it fed me a lot of Mormon Church stuff, and I really was like, I don't know what's happening.
00:42:38.000 So I gave my brother my phone, and he looks for other things, and he sends me the videos.
00:42:42.000 De-Mormonized it?
00:42:43.000 Yeah.
00:42:43.000 It still occasionally is like, but are you sure?
00:42:45.000 I mean, and are you sure?
00:42:46.000 Are you questioning?
00:42:48.000 I really want to see Zion National Park, and I don't know if I'm willing to commit to Mormon Church.
00:42:51.000 Did you linger on the Mormon videos, and then they were like, she's definitely a Mormon?
00:42:55.000 I think it was like, here are photographers active in this area, and then, because Utah is an LDS stronghold, you know, it's like, well, here are people who are hiring photographers, and then it was like, well, here are the mommy bloggers, and I was like, oh, look at those kids, and then it kept going, and eventually I was like, this is creepy, I don't like it at all. I know, I love it. That's my jam. I mean, it's better than the dead kids, that's for sure. The algorithm you love, or...
00:43:18.000 Well, no, that algorithm, because I get that algorithm too.
00:43:20.000 I get all the, like, moms of... It's funny because I get all of the, like, big family people on Instagram because I follow a couple of them and then it sort of compounds.
00:43:30.000 And then they start getting crazier.
00:43:31.000 Crazier and crazier.
00:43:32.000 And it is hard not to be like, what are you doing?
00:43:34.000 It's like your own customized version of reality TV.
00:43:37.000 Yeah.
00:43:37.000 Like, it's like, I think I know what plot lines you're going to be into and I will serve you some.
00:43:41.000 It's kind of the danger of hate watching too, because sometimes I'll watch stuff because I find it so, like, revolting.
00:43:46.000 I gotta know.
00:43:47.000 I know.
00:43:47.000 So this is the libs of TikTok.
00:43:49.000 She says, like, I'm friendly with her.
00:43:51.000 She said, like, I didn't even, like, I just started watching them and then TikTok just feeds me more and more.
00:43:56.000 I don't go looking for them.
00:43:57.000 And now people send them to her because she's had such a big account.
00:44:00.000 But she like, she built this entire platform on just posting what the algorithm showed her, which was craziness.
00:44:07.000 Interesting.
00:44:08.000 I think I get a lot of Marvel stuff.
00:44:10.000 Nice.
00:44:11.000 Yeah, I get like, like skateboarding, rollerblading, scooting, biking, all of that.
00:44:17.000 I guess it's like the only thing I really watch.
00:44:18.000 And then a whole bunch of Marvel stuff.
00:44:20.000 You do need more cats, yes.
00:44:21.000 Like the new movies that are coming out.
00:44:23.000 And then there's like a lot of music stuff.
00:44:25.000 But it's usually just like a guy... I don't scoot, but they sent me a video of a guy doing a triple flare on a scooter and I just couldn't stop watching it.
00:44:31.000 It was like the craziest thing I've ever seen.
00:44:32.000 I didn't even know that was a verb.
00:44:33.000 Scoot.
00:44:34.000 Scoot.
00:44:34.000 Yeah, he's scooting.
00:44:35.000 A triple flare.
00:44:36.000 You know what that is?
00:44:38.000 No.
00:44:38.000 It's like... So a flare is like a backflip 180.
00:44:41.000 So a triple is like three backflips in a 180.
00:44:44.000 I'm just sitting there on the toilet and I'm like...
00:44:47.000 When I started doing Pop Culture Crisis, I would occasionally look up, like, people who were involved in the stories on my phone, and so then for a while it was sending me lots of, like, e-news, and then it would send you, like, people who do their own content about following various celebrities, and I have never been so informed in my life.
00:45:05.000 You know what the worst thing about Instagram now is?
00:45:09.000 When you're scrolling through your feed, you get accounts you don't follow.
00:45:11.000 Yes.
00:45:12.000 I am so annoyed.
00:45:13.000 Every time I see it, I'm like, get rid of it, get rid of it.
00:45:16.000 It's like I'm watching videos of a dude doing a 360 flip crook down a rail, and I'm like, whoa.
00:45:22.000 I'm watching a video of a guy doing a backside flip going 30 miles an hour, and I'm like, whoa.
00:45:26.000 And then it shows me a woman jump roping, and I'm like, I don't care about this.
00:45:30.000 Remove.
00:45:31.000 Why did you put that in my feed?
00:45:33.000 Yeah, but I will say this for Instagram.
00:45:35.000 The personal shopper is on point.
00:45:39.000 It's so good!
00:45:41.000 On point.
00:45:42.000 I actually, I call it my personal shopper.
00:45:44.000 It's not even advertising.
00:45:45.000 I buy everything.
00:45:46.000 I do too.
00:45:46.000 They once advertised to me, so my older son had food allergies when he was younger.
00:45:51.000 I once got a targeted ad for yarmulkes, because we're Orthodox Jewish.
00:45:59.000 Kid yarmulkes for kids who are not really talking yet with allergy information listed on the yarmulke.
00:46:06.000 I was like, you hit so many data points there: we're Orthodox Jews with a young son with food allergies who is too young to verbalize them yet.
00:46:17.000 It lined it up and it was like, we know what you'll buy.
00:46:21.000 I won't buy anything off the Instagram.
00:46:23.000 It serves me stuff I like, but sometimes I feel like I can't help it.
00:46:27.000 I want to feed it because it's my personal shopper.
00:46:30.000 The singularity has occurred.
00:46:32.000 The AI is in control.
00:46:34.000 That excites me.
00:46:35.000 I mean the Instagram one.
00:46:36.000 Only the Instagram.
00:46:36.000 You're a puppet, Ian.
00:46:37.000 You're being controlled by the machine.
00:46:38.000 Because I think my mind is strong enough.
00:46:40.000 I think that I'm able to tend information without believing it or disbelieving it enough that I could exist within the algorithm and function wrong peacefully.
00:46:49.000 Excuse me, wrong.
00:46:50.000 Okay.
00:46:52.000 Excuse me, wrong.
00:46:53.000 You see, here's what's happening.
00:46:55.000 When you get fed information on social media, it is shaping your worldview, and you don't have control over that.
00:47:01.000 I have noticed.
00:47:02.000 Man, a lot of people I follow on Twitter are people that have been on the show, and it's a lot of politics.
00:47:07.000 I do not like it, man.
00:47:08.000 And I like those people.
00:47:09.000 I want to know what they're up to in life, but I can't stand reading about the left, and the right, and the color red.
00:47:15.000 Follow them on Insta instead.
00:47:17.000 Yeah, I'm a completely different person on Instagram.
00:47:20.000 I am delightful on Instagram, and I'm awful on Twitter.
00:47:24.000 Interesting.
00:47:25.000 But Twitter is, for most people, it's not algorithmic.
00:47:30.000 It's reverse chronological.
00:47:31.000 You can choose to do the algorithmic or otherwise.
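As a rough illustration of the two orderings just mentioned, the sketch below contrasts a reverse-chronological timeline with one ranked by an engagement score. The data and field names are made up, and nothing here reflects Twitter's actual ranking; it only shows the difference in principle between sorting by recency and sorting by a score.

```python
from datetime import datetime

# Invented example posts; "engagement" stands in for whatever signal a ranker might use.
tweets = [
    {"text": "breaking news",  "time": datetime(2022, 8, 11, 21, 0), "engagement": 40},
    {"text": "celebrity post", "time": datetime(2022, 8, 11, 18, 0), "engagement": 900},
    {"text": "niche update",   "time": datetime(2022, 8, 11, 20, 0), "engagement": 5},
]

# Reverse-chronological: newest first, no weighting at all.
latest_first = sorted(tweets, key=lambda t: t["time"], reverse=True)

# "Algorithmic": ranked by the engagement signal instead of recency.
ranked = sorted(tweets, key=lambda t: t["engagement"], reverse=True)

print([t["text"] for t in latest_first])  # ['breaking news', 'niche update', 'celebrity post']
print([t["text"] for t in ranked])        # ['celebrity post', 'breaking news', 'niche update']
```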
00:47:33.000 So for me, I follow news.
00:47:35.000 I mean, Twitter is the news platform.
00:47:37.000 There's like not much else on there.
00:47:38.000 Celebrities don't really get traction.
00:47:39.000 They're there, they have big following, but like... I never see those tweets.
00:47:43.000 Yeah, they go on Instagram for that stuff.
00:47:45.000 But on Instagram, they're feeding you stuff in your feed and shaping your worldview.
00:47:50.000 Just just outright.
00:47:52.000 Facebook as well.
00:47:53.000 Facebook is where it's substantially worse.
00:47:55.000 So I don't really use Facebook, so I don't know.
00:47:56.000 But Instagram, nothing is shaping my worldview because it's just pictures of like homeschool classrooms, which is great.
00:48:02.000 And I love them.
00:48:03.000 And I've taken a lot of information.
00:48:05.000 But you're, you know, I think for younger people, they're being inundated with very specific things.
00:48:11.000 But is it like do people get their news from Instagram?
00:48:14.000 Yeah, I think they do, and I think, especially for some young people, they'll follow, like, meme accounts on Instagram that'll be not political memes, just silly, like, what it's like when you're 20, right? And then those accounts start to promote, you know, maybe they really believe them, but maybe, you know, as part of the cultural narrative, they know certain things will get more likes, and so they start incorporating certain content that's more pro certain issues. I especially saw this a lot after Roe v. Wade was overturned, and I had people jump on that bandwagon like there was no tomorrow.
00:48:45.000 People who are not particularly political and maybe this is an important issue to them and they just don't vocalize it.
00:48:49.000 I grant that some people are like that, but a lot of accounts picked up on the fact that it was correct to, you know, lifestyle kind of content.
00:48:58.000 I think that that would have happened even without social media though.
00:49:00.000 I think that your... I think social media reinforces it because people who don't seek out political information are served it anyways through this backdoor channel of like, look at these cute jeans I got!
00:49:09.000 Also my Roe pin!
00:49:10.000 Also... There was interesting sort of stuff that was leaked from marketing firms after Roe, I mean Dobbs, that a lot of brands were told, like, don't touch it, whereas they went all in on Black Lives Matter.
00:49:24.000 And so I know people who are like pretty big influencers on Instagram, and they got questions like, why didn't you post a black box?
00:49:31.000 Where was your black box during Black Lives Matter?
00:49:34.000 And I feel like the corporate pressure was not the same with Dobbs.
00:49:40.000 I think because Dobbs is more complicated and they knew it from the beginning, whereas with the black box thing it was like a you-must-submit, I mean, remember the culture that was run in some ways. The leak of the Roe decision that came out, like, ahead of time, I think, was trying to build the same tension that we had during the summer of rioting after George Floyd's death.
00:50:03.000 It sets the circumstances very differently.
00:50:04.000 Sorry.
00:50:04.000 I want to jump to this next story.
00:50:05.000 We got this from timcast.com.
00:50:07.000 Twitter announces plan to tackle misleading narratives ahead of midterms, vows to throttle tweets deemed incorrect.
00:50:14.000 Exciting!
00:50:15.000 Like they did not do with all the Democrats who were screaming that the election was stolen in 2016.
00:50:20.000 They're now basically telling us They are going to decide what is true.
00:50:24.000 The last election they did this, what did they do?
00:50:26.000 They suppressed the Hunter Biden laptop information.
00:50:28.000 They suppressed anything that was basically bad for Democrats, and now they intend to do it again.
00:50:34.000 So I don't know if there's any point in reading what they claim is going to be done, their civic integrity project.
00:50:38.000 I know, this seems like a really good news site.
00:50:39.000 I think you maybe should read it.
00:50:40.000 No, no, no, but the point is, like, who cares about what Twitter has to say about why they're going to be censoring and controlling the flow of information?
00:50:46.000 I'd like to hear a little bit about it.
00:50:47.000 What's the official statement here?
00:50:48.000 The Civic Integrity Policy covers the most common types of harmful misleading information about elections and civic events, such as claims about how to participate in civic process, like how to vote, misleading content intended to intimidate or dissuade people from participating in the election, and misleading claims intended to undermine public confidence in an election.
00:51:07.000 Harmful.
00:51:10.000 The thing about all that is you're allowed to do all that stuff.
00:51:15.000 That's free speech.
00:51:16.000 As long as you're not inciting violence.
00:51:18.000 I mean, you're allowed to tell people not to vote.
00:51:21.000 Oh.
00:51:21.000 Yes, but I don't think you're allowed to defraud people by, like, telling them the wrong voting day.
00:51:26.000 Right, things like that. Okay, so I follow a great account. He's a great guy.
00:51:32.000 His name is Political Math.
00:51:33.000 It's Polymath on Twitter.
00:51:34.000 Yeah, I love him.
00:51:36.000 And so he tweeted, he retweeted the CDC's guidance about the kids vaccine between six months and five years old, saying like, absolutely get, you know, these young children vaccinated.
00:51:47.000 And Polymath retweeted it and said, fire these people.
00:51:50.000 This is unspeakable stupidity on the part of the CDC.
00:51:53.000 That agency should be burned to the ground.
00:51:55.000 Which is opinion.
00:51:57.000 First of all, it's opinion.
00:51:58.000 And second of all, it is- Well, it's a call to violence.
00:52:00.000 That's their interpretation of it.
00:52:01.000 No, they call it misleading.
00:52:02.000 Oh, okay.
00:52:03.000 Interesting.
00:52:03.000 They don't call it a call to violence.
00:52:05.000 But if you look at the numbers of people who have had their young children vaccinated for COVID between six months and five years of age, I wrote about this for Deseret like two weeks ago, the numbers are like 3% right now.
00:52:18.000 So, when I see a misleading tag on Polymath's opinion about the CDC's statement, it makes me be like, oh there is something up there.
00:52:26.000 Like, it backfires on me.
00:52:28.000 It backfires, right.
00:52:29.000 Yeah, because I see that and I think like, why are they gatekeeping like that?
00:52:33.000 But I think a lot of this is just failed, it's, what's the right word, what's the politically correct word of saying your brain doesn't work?
00:52:41.000 The people at Twitter are really dumb, and there are people at Twitter that are really evil, and then you have the government trying to get Twitter to censor people, which we've heard over and over and over again now.
00:52:52.000 So what happens is they're like, hey, let's implement a policy.
00:52:55.000 What happens?
00:52:56.000 There have been several instances where people's tweets have been flagged, and the fact check is totally unrelated.
00:53:02.000 And it's like this really weird thing, like, huh?
00:53:04.000 There have been several instances where guidance has changed, and they've been like, hey, the CDC is like, we're revising our guidance, and then Twitter flags it as fake news.
00:53:12.000 Here's what the CDC says, and then links to like an article from the year before.
00:53:15.000 There was a famous incident on Facebook, where the, I think it was the CDC's own website was labeled fake news.
00:53:23.000 Because these machines don't work.
00:53:26.000 They don't know context.
00:53:28.000 They don't have up-to-date information.
00:53:30.000 And so if someone's like, I got breaking news, the CDC says X, they'll delete you and say that was fake news and you're banned, because our official fact-checkers have not caught up yet.
00:53:39.000 So when they talk about getting involved in the election, at what point do we as a society do something about the interference and manipulation of our elections?
00:53:52.000 The problem is there's no mechanism for solving this, and there's no political process for solving it.
00:54:01.000 It's just there.
00:54:02.000 Twitter is this corrupt, broken, evil machine.
00:54:07.000 And I don't care if the intention put into it was good, what came out was evil.
00:54:10.000 And what do you do?
00:54:11.000 Antitrust?
00:54:12.000 The people who run this platform are all out of their minds.
00:54:15.000 Elon doesn't even want it anymore, and he said he hopes that he doesn't have to buy it.
00:54:20.000 There's no saving these broken social media platforms.
00:54:24.000 It's like, You build a machine that runs wild and starts destroying you, and you don't know what to do to stop it.
00:54:31.000 Well, you gotta free the schematics so that people—someone will figure out how to stop it.
00:54:35.000 That's why I advocate for freeing the software code.
00:54:37.000 You can at least make it better and more interoperable.
00:54:39.000 I think that— People aren't gonna be able to take Twitter's code and then change Twitter.
00:54:43.000 They're gonna be able to replicate it and make more problematic versions of the same garbage.
00:54:46.000 Or better, and then the people at Twitter will be like, ooh, we could do that.
00:54:49.000 Let's change our code.
00:54:50.000 Better does not mean— Better could mean better for society.
00:54:51.000 I know.
00:54:52.000 Better could mean generates more revenue, which means worse because now it's more algorithmic manipulation, making people click and get, you know, brainwashed.
00:55:01.000 It depends on who you ask.
00:55:01.000 Better could mean more manipulation of the masses to get them to vote for who I want them to vote for.
00:55:07.000 But I think better, uh, you know, I kind of think, uh, what do you, allotropically, I think that's not the right word, but I think like, you know, the betterment of the whole of, uh, the community, like, I don't want less constriction on who's controlling it, but I guess, I don't know, I'm not in the military.
00:55:26.000 The military commander would tell you that you want to do the opposite with it, probably.
00:55:29.000 That you want to control it.
00:55:31.000 That's the only reason you're not in the military.
00:55:32.000 Yeah, I don't want to control people, but I mean, that's the military's job.
00:55:36.000 But I think it's nice to believe that people would want to rally around common good and the betterment of other people.
00:55:43.000 That's honorable and that's moral in a lot of ways.
00:55:46.000 Why would Twitter, and I don't know a ton about freeing the code for sure, but like, why would freeing the code motivate people who are already... It wouldn't.
00:55:55.000 Well, ask the question.
00:55:56.000 Okay, so, Twitter has a history of wanting to manipulate people and control the worldview.
00:56:01.000 Why would freeing the code suddenly change their mind?
00:56:03.000 Why would they suddenly, if someone else made something that was similar but more moral, why would they be like, that's a good idea, we should do that too?
00:56:10.000 They could have done that already.
00:56:11.000 They chose not to.
00:56:12.000 The issue is, Twitter is not driven by morality.
00:56:15.000 Moral platforms exist, and they don't have traction.
00:56:18.000 What works is addiction machines.
00:56:20.000 Instagram knows this.
00:56:21.000 Facebook knows this.
00:56:22.000 Twitter knows this.
00:56:22.000 YouTube knows this.
00:56:23.000 TikTok knows this.
00:56:24.000 They know that they can give you a dopamine hit by making you feel good, and they have this built into their machines.
00:56:32.000 There are tech companies
00:56:34.000 that offer a service to generate addiction. They'll say, are you building an app? Come to us
00:56:40.000 and we will build an addiction routine into your app for you to make your slot machines and stuff.
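A minimal sketch of the kind of variable-reward loop being described, with made-up numbers: the wait before a refresh completes and the chance of getting new notifications are both randomized, which is the intermittent-reinforcement pattern slot machines rely on. No real vendor's product or API is shown; the function and its parameters are invented for illustration.

```python
import random
import time

def pull_to_refresh(win_probability=0.3, min_delay=0.5, max_delay=2.0):
    """Simulate one pull-to-refresh: wait a variable delay, then maybe 'win'."""
    time.sleep(random.uniform(min_delay, max_delay))  # unpredictable wait builds anticipation
    if random.random() < win_probability:             # intermittent reward, like a slot machine
        return random.randint(1, 5)                   # number of new "dings"/notifications
    return 0                                          # often you get nothing, so you pull again

if __name__ == "__main__":
    total = 0
    for pull in range(10):
        dings = pull_to_refresh()
        total += dings
        print(f"pull {pull + 1}: {dings} new notifications")
    print(f"total rewards after 10 pulls: {total}")
```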
00:56:45.000 They actually figure out when you pull it down, how long do you want, how addictive, you know,
00:56:49.000 how long do you want to wait until it refreshes? How many times do you need to win? How many dings
00:56:53.000 are you going to get? Right. But what would happen is, if they freed the code, that guy
00:56:58.000 number two would build, set up his own identical addiction machine, identical to Twitter and equally addictive.
00:57:03.000 But on his thing, he gets to make his own terms of service.
00:57:05.000 He'll say, on mine, you can talk about the CDC, say whatever you want.
00:57:08.000 All these people on Twitter will try this one.
00:57:11.000 And then they'll still- No, they won't!
00:57:13.000 Why would they move?
00:57:13.000 Because they're allowed to talk about the CDC- But it's not happening, okay?
00:57:16.000 Well, they haven't freed the code yet, that's what I'm saying.
00:57:18.000 No, this is my- Let me at least state my claim here so you can argue it.
00:57:21.000 You say your claim all the time, and the issue- Hannah Clare asked me the question, I want to answer it.
00:57:26.000 So, then you- You make it so you can still see the people on Twitter from the new site, so you're not actually leaving.
00:57:33.000 You're just expanding the process and you're creating a marketplace of the terms of service, essentially, instead of a marketplace of who owns the code.
00:57:41.000 How does that not already exist?
00:57:43.000 With Truth Social, with Parler, with Getter, with Gab, with Minds?
00:57:47.000 They're not interoperable.
00:57:48.000 They don't interoperate with each other yet.
00:57:50.000 And why would they?
00:57:52.000 Well, bigger network effect.
00:57:54.000 That has nothing to do with freeing the code.
00:57:55.000 You know, a more diverse network effect.
00:57:56.000 The issue is you cannot make a morally better system.
00:58:00.000 A system that improves like Twitter or Facebook is more addictive and manipulative and power
00:58:05.000 hungry.
00:58:06.000 So there's no solution.
00:58:08.000 Maybe antitrust, but Twitter is not a monopoly technically.
00:58:12.000 So there are other platforms.
00:58:13.000 But it is de facto.
00:58:14.000 It for sure is.
00:58:15.000 Exactly.
00:58:16.000 And so there's literally, there is no mechanism we have today other than like, all of the people of this country agree it's bad, so we pass a law saying ban Twitter.
00:58:25.000 So here's my question for Ian.
00:58:26.000 Prohibition.
00:58:27.000 So, freeing the code.
00:58:30.000 So would this basically provide a window?
00:58:31.000 Because here's what I'm curious about as a Twitter user.
00:58:35.000 What happened with Alex Berenson?
00:58:38.000 Like, can someone explain that to me?
00:58:39.000 Yeah, he wanted to be on Twitter.
00:58:41.000 No, I mean like, he was banned and now he's back.
00:58:45.000 He filed a lawsuit because he wanted to be back on Twitter.
00:58:47.000 He didn't want to go on any other platform.
00:58:49.000 He wanted to be on Twitter.
00:58:50.000 So it was the lawsuit that got him back on?
00:58:52.000 Yes.
00:58:52.000 He had his lawyer on last week.
00:58:54.000 He settled with Twitter.
00:58:55.000 They settled and he got his account back.
00:58:57.000 Interesting.
00:58:58.000 So would freeing the code basically do what Alex did and open the door?
00:59:04.000 No.
00:59:04.000 Okay.
00:59:05.000 Oh, yeah.
00:59:05.000 Well, you could.
00:59:07.000 So Alex, if he got banned off Twitter, he could go on the new version and still see all the people on Twitter from his new version, and he wouldn't be banned off the new version.
00:59:13.000 Why would their database be granted to you?
00:59:16.000 Why would you get access to their database?
00:59:18.000 Because you have the API.
00:59:19.000 You'd have access.
00:59:21.000 That would be the law.
00:59:22.000 That would be what you would have to do.
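As a rough sketch of the interoperability idea being argued over here, the Python below sets up two independent, invented "instances" that expose the same public-read API, and a client that merges their timelines, so a user banned on one server can still post elsewhere and be seen alongside everyone else. The classes and method names are hypothetical; ActivityPub-style federation, the protocol behind Mastodon, is the closest existing real-world analogue, though nothing in this conversation commits to that design.

```python
# Hypothetical federation sketch: two servers, one shared read API, one merged timeline.
class Instance:
    def __init__(self, name):
        self.name = name
        self.posts = []      # (author, text) pairs stored on this server
        self.banned = set()  # authors who may not post *here*

    def post(self, author, text):
        if author in self.banned:
            raise PermissionError(f"{author} is banned on {self.name}")
        self.posts.append((author, text))

    def get_public_posts(self):
        # The open API: any federated client may read this server's public timeline.
        return [(self.name, author, text) for author, text in self.posts]

def federated_timeline(instances):
    # A client stitches every instance's public posts into one view.
    merged = []
    for inst in instances:
        merged.extend(inst.get_public_posts())
    return merged

big_site = Instance("big-site")
alt_site = Instance("alt-site")

big_site.post("reporter", "news thread")
big_site.banned.add("banned_user")                        # banned on the big site...
alt_site.post("banned_user", "still posting over here")   # ...but not on the alternative

# From either client, you still see everyone across both servers.
for origin, author, text in federated_timeline([big_site, alt_site]):
    print(f"[{origin}] {author}: {text}")
```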
00:59:23.000 I'll tell you this right now.
00:59:24.000 The reason why I don't like talking about this is that it literally makes no sense.
00:59:28.000 I don't think so.
00:59:29.000 Because the other option is breaking up the company, like an antitrust, and that doesn't work.
00:59:32.000 Because they still have the code, they can start a new one.
00:59:34.000 What I want to know is, who is the person who decided to ban Alex Berenson?
00:59:42.000 What was the decision-making process there?
00:59:44.000 It is likely the government intervened.
00:59:46.000 We don't know for sure.
00:59:47.000 So that's what I want to know, though.
00:59:49.000 We don't know for sure, and I want to know.
00:59:50.000 And we'd love it if Alex Berenson could tell everybody.
00:59:53.000 And that's why people are really mad at him.
00:59:55.000 Because the story goes, and, you know, I don't want to put words in his mouth or the mouth of his followers,
01:00:00.000 but what people are saying on Twitter in response to him is that he promised, when he got to discovery,
01:00:05.000 he would expose what was going on. Instead, he settled with Twitter, got his account back, and then said sorry. Oh, that's
01:00:12.000 yucky. So people are like, you know, he said, in the future, there will be more to talk about in terms of government involvement and censorship and things like that.
01:00:21.000 We heard from Naomi Wolf, she said that the CDC was going after specifically- She was kicked off, right?
01:00:26.000 She's not- Yeah.
01:00:27.000 Oh no.
01:00:27.000 And so she was saying that it's... We've seen this before.
01:00:34.000 Judicial Watch uncovered documents, I believe it was Judicial Watch, that Democrats were going to these big tech companies saying, ban these people.
01:00:43.000 So it's very... At this point, I would say we're at probable cause or beyond.
01:00:47.000 We have actual instances of evidence where the government is using third parties to violate people's First Amendment rights, but they're doing it circuitously.
01:00:55.000 So this was the lawsuit that was just filed by a whole bunch of the healthcare people.
01:00:59.000 I don't know a lot about it, but it was the, I think it was Jay out in San Francisco?
01:01:09.000 Stanford.
01:01:10.000 Stanford.
01:01:11.000 Dr. Jay, and I can't say his last name.
01:01:13.000 B. I'm so glad when people don't know things that I don't know.
01:01:18.000 I'm so spaced out right now, I'm sorry.
01:01:20.000 No, so there's a lawsuit that was just filed, and it's funny, I actually just did a radio hit about it, and the host did the worst thing in the world to me.
01:01:27.000 He's like, tell me about this!
01:01:29.000 I'm like, you have not to talk about it!
01:01:31.000 No, thank you!
01:01:32.000 You have to do the... and so I kind of like I muddled through it as best I could, but there was a lawsuit filed by a whole bunch of healthcare people about the fact that the CDC and the government worked in conjunction with the social media companies to silence them.
01:01:50.000 Which is a violation of the First Amendment.
01:01:53.000 The government does not have the right to go to companies and say, ban these people, don't let them speak.
01:01:58.000 Now the issue is, as always, it's cultural.
01:02:02.000 Cultural enforcement is more powerful than law enforcement, and cultural drives are more powerful than any platform could be.
01:02:08.000 You can spin up as many platforms as you want, from TruthSocial, to Gab, to Parler, to Getter, etc., etc., and people don't use them.
01:02:17.000 They don't unify on them, and why?
01:02:18.000 Why?
01:02:19.000 Why didn't Alex Berenson just go on TruthSocial and talk to those people?
01:02:22.000 Why didn't he just go on Gab and talk to those people?
01:02:24.000 He wanted to be on Twitter, so he sued to be on Twitter, he accepted being on Twitter, and then he didn't give the people what they asked for, because being on Twitter was more important to him, because people are on Twitter.
01:02:35.000 Yeah, it's the people, it's not the platform.
01:02:37.000 Can I correct myself now that I've Googled it?
01:02:39.000 Because I'm a little bit of a jerk.
01:02:43.000 The person who filed the lawsuit is one of my friends, and I didn't know that.
01:02:49.000 Justin Hart.
01:02:51.000 He's a data guy.
01:02:52.000 He's a marketing digital strategist.
01:02:55.000 He filed a federal lawsuit against Facebook and Twitter and Joe Biden and the Surgeon General for violating his First Amendment rights to free speech.
01:03:03.000 He claims that the federal government colluded with social media companies to monitor, flag, suspend, and even delete social media posts that they claimed contained misinformation.
01:03:12.000 He's being represented by Liberty Justice Center.
01:03:16.000 I'm sorry, Justin.
01:03:17.000 I didn't know that you were doing that.
01:03:18.000 I will say there have been many circumstances where big tech has been sued, and I am flabbergasted by the weak arguments made in such strong cases.
01:03:27.000 So, uh, there's just been a handful that, uh, I don't want to call anybody out specifically to impugn their honor, but there have been very, very strong cases where you're like, wow, look at the details of this case.
01:03:38.000 Clearly the government said, ban this person.
01:03:41.000 And then when they file a lawsuit, they don't mention anything about like, they don't go, they don't go after the government.
01:03:46.000 They don't include them as part of the lawsuit.
01:03:48.000 They don't even bring up the strong elements of the case as arguments.
01:03:51.000 They just say something like our contract was breached.
01:03:53.000 And I'm like, What am I missing here?
01:03:56.000 Because I've talked to dozens of lawyers about various issues, and, you know, to put it simply, I'm not a lawyer.
01:04:02.000 I can't speak for why these lawyers have made weak cases that ended up losing, or settling, or just not accomplishing what they wanted to accomplish.
01:04:08.000 But then when you listen to the lawyers on their shows, and you listen to high-profile people coming out and explaining what went down, they make it sound like they had a much better case than they presented, and I don't understand why they didn't go for it.
01:04:20.000 I just don't know.
01:04:21.000 Or maybe they just didn't know.
01:04:22.000 I don't know.
01:04:24.000 I don't understand why more news outlets haven't filed lawsuits against NewsGuard, for instance.
01:04:33.000 Breitbart has written articles being like, how dare you, NewsGuard?
01:04:36.000 And it's like, why don't you sue them?
01:04:37.000 Why don't you sue them?
01:04:38.000 Yes, exactly.
01:04:40.000 You.
01:04:40.000 I know.
01:04:41.000 I'm asking you.
01:04:42.000 And we said this last week.
01:04:45.000 I don't watch every video.
01:04:46.000 What?
01:04:47.000 I'm not saying you do.
01:04:47.000 I'm saying yes, exactly.
01:04:51.000 We probably are.
01:04:52.000 Okay.
01:04:53.000 I've been in a dispute with them already.
01:04:54.000 They've violated their own standards.
01:04:56.000 They've violated their own correction policy.
01:04:58.000 They've accused us of being irresponsible while holding themselves to a lower standard, stating that we get an 82 out of 100.
01:05:04.000 I say that's a statement of fact, that they've rated us on the basis that they are giving a factual analysis, but they are not.
01:05:11.000 They don't follow their own standards or policies.
01:05:14.000 And the reason I take this so seriously, people need to understand this, NewsGuard is used by advertising agencies and big tech to reduce visibility of your content.
01:05:23.000 So if you sit... so, we're 82 out of 100.
01:05:26.000 I mean, we're one of the best, but they arbitrarily gave our website a ding, even though our standards are greater than theirs.
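As a rough illustration of the mechanism being described, the sketch below shows how an ad buyer could gate spend on a single third-party rating: anything under a cutoff is dropped from the buy, regardless of the content itself. The outlet names, scores, and threshold are all invented for the example.

```python
# Hypothetical ad-buying filter keyed on a third-party rating.
RATING_THRESHOLD = 90  # an agency's made-up cutoff for "brand safe" outlets

outlets = {
    "legacy-paper.example": {"rating": 100, "cpm_bid": 4.50},
    "independent.example":  {"rating": 82,  "cpm_bid": 4.50},
    "tabloid.example":      {"rating": 95,  "cpm_bid": 3.00},
}

def eligible_inventory(outlets, threshold=RATING_THRESHOLD):
    # Everything below the threshold is simply excluded from the buy,
    # no matter what any individual article actually says.
    return {name: info for name, info in outlets.items() if info["rating"] >= threshold}

if __name__ == "__main__":
    buy = eligible_inventory(outlets)
    print("outlets receiving ad spend:", sorted(buy))
    print("outlets cut off:", sorted(set(outlets) - set(buy)))
```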
01:05:36.000 USA Today fabricated 23 stories, and they say that's fine.
01:05:41.000 I want to be careful here because there's some behind-the-scenes stuff related to serious malfeasance, but I have already issued a demand to them, and it is very, very likely we will be filing a suit.
01:05:53.000 And I will probably seek crowdfunding to help other organizations that have been defamed by them falsely and smeared in violation of their own standards.
01:06:02.000 And there's a few things people need to know about the elements of defamation.
01:06:06.000 Actual malice.
01:06:07.000 Did they know what they published was false?
01:06:10.000 When it comes to opinion statements, this is probably going to be their big defense.
01:06:14.000 That when we call you irresponsible, it's an opinion.
01:06:17.000 We'll get to that.
01:06:18.000 They've also issued several false statements and refused to correct them.
01:06:21.000 For instance, on the label for our website, they claim they mischaracterized a post.
01:06:27.000 That's false.
01:06:27.000 They injected words into it.
01:06:29.000 That's very different from a mischaracterization.
01:06:30.000 That's a false statement of their actions, defaming us.
01:06:33.000 They accused our content of being fiction.
01:06:36.000 And they did not, as per their own policy, which is their standard, admit that they were wrong to say Timcast's content is fictional.
01:06:44.000 They did that.
01:06:45.000 Instead, just scrubbed it from the article in violation of their own policies.
01:06:48.000 So I'm particularly pissed off about this.
01:06:51.000 But I'll put it this way.
01:06:53.000 I've talked about Wikipedia standards, how they put their own byline in it.
01:06:58.000 I have no standing to go after Wikipedia.
01:07:00.000 You give me standing on any of these platforms and the first thing I'm going to do is I'm going to go to the full extent possible legally.
01:07:06.000 And so there's been a long ongoing conversation with NewsGuard.
01:07:10.000 For instance, they first tried to claim we were fake news because we accurately reported the contents of the Hunter Biden laptop.
01:07:15.000 They tried getting us to editorialize our content.
01:07:19.000 And I said, here's two NewsGuard certified sources confirming the laptop emails Are real and verified.
01:07:27.000 And their response was, you know what, this is a little bit murky so we're gonna ignore this issue for now.
01:07:31.000 No.
01:07:32.000 You don't get to send me an email demanding we editorialize our content and then you omit from your own article that you had an error in your own assessments.
01:07:41.000 So, I wait.
01:07:42.000 When they published it, with six errors, right off the bat, I emailed them immediately and demanded retractions, corrections, and they have refused, every step of the way, to correct.
01:07:54.000 NewsGuard fabricated a quote from me that pissed me off.
01:07:58.000 You know, when it was accusing Tales from the Inverted World of being fiction, I said, that is a false statement.
01:08:03.000 And they changed it, but never explained, as per their own policy, what they did wrong.
01:08:08.000 They called it a mischaracterization.
01:08:10.000 I challenge that.
01:08:11.000 That is a false statement of fact on their own part.
01:08:13.000 I don't know how that'll work in court.
01:08:14.000 But I'm really, really pissed off about this.
01:08:17.000 Their rating system is arbitrary.
01:08:19.000 They have dinged us simply because we are an independent media organization.
01:08:23.000 USA Today fabricated 23 stories, and they give them a perfect score.
01:08:28.000 Media Matters gets like an 80 out of 100, and they're a conspiracy crackpot website.
01:08:33.000 So anyway, I'm frustrated because I'm pissed off at these institutions, but I believe we have serious standing and reason to prove that there's actual malice in the insertion of words into a quote, knowing I did not say these things, and knowing I never implied them.
01:08:53.000 I'll give you exactly what it is, because they're trying to argue, when we put words in your quote, we're implying something.
01:08:59.000 We know you didn't say it.
01:09:01.000 I told them, if you require a website to fact-check every single quote from every politician, that is near impossible.
01:09:08.000 But if that's what you require, we will do fact-checks on all quotes moving forward.
01:09:14.000 They inserted the words that are false, changing what I was saying.
01:09:18.000 What I was saying was, you have demanded of us an impossible standard by adding those words.
01:09:23.000 They knew they were manipulating my quote.
01:09:26.000 That is malice.
01:09:27.000 And I want to see what they wrote when they were talking to their editors and lawyers as to why they decided to change the context of what I said.
01:09:33.000 They did not admit, as per their own correction policy, they did that.
01:09:36.000 So I think they outright defamed, libeled me, and so you have actual malice, and then reckless disregard for the truth in that they don't abide by their own standards.
01:09:45.000 So in their fact-checking process, they three times incorrectly labeled my job at three different organizations.
01:09:52.000 Called our content fake.
01:09:54.000 Fiction and fantasy.
01:09:55.000 I'm pissed off about that.
01:09:57.000 And they labeled Cast Castle mundane, which is an opinion statement that, as per their own standard, must be labeled.
01:10:02.000 If they're not going to abide by their own standards, I am going to sue the ever-living out of them.
01:10:06.000 And you know what?
01:10:08.000 If in the end the suit is dismissed, I long for the day NewsGuard files in their federal response why they are allowed to have zero standards for their own journalists, why they're allowed to fabricate quotes, why they're allowed to smear and defame a plethora of independent media organizations, and why they give perfect scores to outlets like CNN and The New York Times, who publish fake crap all the time.
01:10:29.000 Anyway, I'm pissed off about it, you can tell.
01:10:31.000 A little bit.
01:10:32.000 I mean, I think a lot of it, your question, your initial question, how you got started in all of this was why isn't anyone suing?
01:10:38.000 And I think it's two things.
01:10:39.000 I think that people don't care enough.
01:10:41.000 You're very obviously not one of those people.
01:10:44.000 But I think the other problem is that, and I think we saw it a lot with COVID, people are much more content to go along because they don't want to start things.
01:10:54.000 And so even though they see things that are objectively ridiculous, like putting a cloth mask on a two-year-old baby, people are afraid to speak up because everyone is a coward.
01:11:06.000 And so I think that there is a lot of that.
01:11:07.000 We had Tucker Max on, and he said power likes to be hidden.
01:11:13.000 I was asking them, because we bring this up quite a bit, where are all the powerful people to just come out, make powerful statements, buy commercials, put up billboards, and challenge the things we know they're privately complaining about?
01:11:25.000 Where are all the Hollywood celebrities that privately complain about this stuff but then don't stand up?
01:11:30.000 There's a really great comic where it shows a guy burning a woman at the stake and he says, psst, I just want to let you know I completely agree with everything you said.
01:11:37.000 That's modern mainstream society, unfortunately.
01:11:40.000 So I don't know, man.
01:11:43.000 I imagine sooner or later someone's going to get pissed off enough with me and they're going to try and do something.
01:11:47.000 We've already been swatted nine times.
01:11:49.000 I'm a little nervous about that, actually, speaking of which.
01:11:50.000 Well, we have armed security and things like that.
01:11:53.000 I would say one of the other things I think happens is that it's expensive to make your legal battles a priority.
01:11:59.000 And I think there are other independent media companies who are probably being treated... I mean, this is particularly horrendous treatment of our organization, and of course I hesitate to comment on it, because I've been with Timcast since the newsroom began.
01:12:14.000 I've been here for, like, a year, so this is a lot of my work that's being scrutinized, and I am very glad that you're willing to do something about it, because I think we do hold ourselves and all of our journalists to an
01:12:25.000 extremely high standard.
01:12:26.000 That being said, I do know there are small organizations that would love to go to court,
01:12:30.000 but have to make the decision. Can we afford to do this?
01:12:33.000 Because it can be protracted, especially when you fight larger companies, they can drag
01:12:37.000 it out. I mean, anyone who sued Facebook for anything knows this.
01:12:40.000 Yeah, I think, though, I imagine there's a community of people who are willing to
01:12:47.000 stand up and be involved.
01:12:50.000 James O'Keefe and Project Veritas, they've crowdfunded the finances required to file a lawsuit and they're going up against the New York Times.
01:12:58.000 And regardless, this is what people need to understand too.
01:13:02.000 Winning doesn't mean having a judge bang a gavel and say, for the plaintiff.
01:13:07.000 Winning means getting these organizations to admit they're liars and they publish fake garbage.
01:13:14.000 So all, you know, look, all I want is... I told NewsGuard right off the bat, how could you deem us irresponsible but have a lower standard than us?
01:13:24.000 And they just said, too bad, so sad, go cry about it.
01:13:27.000 Something to that effect.
01:13:28.000 We make our judgments based on the fact that we're looking for you to correct these articles without us coming to you for them.
01:13:35.000 And I said, we've corrected substantial articles without you coming to us and telling us to correct them.
01:13:39.000 You found five articles that you had questions on, only one of which had a factual inaccuracy, which we corrected right away, as per our corrections policy.
01:13:47.000 It's arbitrary.
01:13:49.000 There is no objective standard.
01:13:51.000 And right now, NewsGuard is falsely claiming that we publish misleading information.
01:13:56.000 Why?
01:13:57.000 Because we quoted the president.
01:13:59.000 We quoted Donald Trump in a news story and they said that that qualifies us for publishing misleading content.
01:14:06.000 So when USA Today or the New York Times quotes Donald Trump, are they not publishing misleading content?
01:14:11.000 The argument is Trump's quotes are misleading, right?
01:14:13.000 Why is it only when we do it?
01:14:15.000 Great.
01:14:16.000 I want them to answer to a judge why it is the New York Times can publish the exact same thing as us.
01:14:21.000 In terms of the reporting and the quotes, and that's responsible, but for us it's not.
01:14:25.000 It's because I think there may be a motive that is more attuned to causing harm, intentional injury and monetary damages to small businesses that might compete with the friends of these organizations and their investors.
01:14:40.000 Anybody who tries to create a new media company, anybody who tries to report news that falls outside of the official cathedral narrative, for some reason, has a really rough go of it.
01:14:51.000 So I'll tell you this.
01:14:53.000 In the newsroom, as Hannah Clare can attest to, I've been extremely adamant about abiding by every one of NewsGuard's policies.
01:14:59.000 Because I wanted to see that if we went above and beyond and did everything they deemed to be correct, would they honor that?
01:15:05.000 And they did not.
01:15:06.000 Because they are fake.
01:15:08.000 And now I want them to answer in court if they refuse.
01:15:11.000 So I told them that I've already forwarded my demands to their general counsel.
01:15:17.000 And you better believe we're going to file a suit.
01:15:21.000 And if in the end they respond and they say it's protected opinion, fine.
01:15:26.000 So be it.
01:15:27.000 They get to explain why their standards are lower than ours, and they get to rate us.
01:15:30.000 Dude, I'm looking at some of NewsGuard's investors right now.
01:15:33.000 One of them is Blue Haven Initiative, which is, according to PitchBook.com, is an impact investor.
01:15:38.000 And if you don't know what those are, you should look into it.
01:15:40.000 Impact investing is specifically social engineering.
01:15:43.000 It's from Investopedia.com.
01:15:46.000 Investors who use impact investing as a strategy consider a company's commitment to corporate social responsibility.
01:15:54.000 or the sense of duty to positively serve society.
01:15:57.000 So they have an agenda.
01:15:58.000 They're one of their investors. I don't know if it's a top investor,
01:16:01.000 maybe it's just alphabetical, but they certainly, if they're impact investing, that
01:16:04.000 is specifically with an agenda to get the company to do something.
01:16:08.000 The funny thing is two of the articles were us just quoting Trump.
01:16:12.000 We were like, Donald Trump says, you know, it's like, that was it.
01:16:15.000 Like, Donald Trump came out, he issued a response, and like, in response to Joe Biden, Trump says, quote.
01:16:20.000 And they said, you should have included context saying that Trump was wrong.
01:16:24.000 And I said, well, that would be a fact check article.
01:16:26.000 We're just reporting Trump issued a statement.
01:16:27.000 Right.
01:16:28.000 And he was like, well, that's irresponsible because Trump's comments are wrong.
01:16:31.000 And so, you know, this is what I was getting to with, we would have to fact check every single quote.
01:16:38.000 But they didn't tell us to fact check anything Biden said when Biden was wrong, only Trump.
01:16:42.000 Clearly, what they're actually trying to do is manipulate our editorial guidelines and standards.
01:16:48.000 And so I said, like, are you telling us that in order to be responsible, we have to adhere to your editorial policy?
01:16:53.000 And then they're like, no, no, no, no, we're not saying anything like that.
01:16:55.000 But yes.
01:16:56.000 But you only have to fact check Donald Trump.
01:16:59.000 Only Donald Trump.
01:17:00.000 And you better do it.
01:17:00.000 They took no issue with any other quotes.
01:17:01.000 What Blue Haven Initiative wants, otherwise they'll pull their impact investment out of NewsGuard, is to, uh, generate a measurable beneficial social or environmental impact alongside financial return.
01:17:16.000 Oh, I wonder what their social impact they're trying to acquire.
01:17:19.000 Yep.
01:17:20.000 Let's find out.
01:17:21.000 The institutions are as corrupt as corrupt can be.
01:17:24.000 So people, people, again, you need to understand what their goal here is, is to go to advertising agencies and say, anybody we deem unworthy, do not sell with.
01:17:34.000 They're unsafe.
01:17:35.000 And it's happened to a lot of people.
01:17:37.000 So when NewsGuard goes after you, unless you're operating in the parallel economy, which is what a lot of people are trying to do, then you're going to be cut off from financial resources and that's their goal.
01:17:51.000 It is cancel culture on crack, on steroids.
01:17:55.000 So this is one of the most important fronts.
01:17:57.000 Now, I will say this, and I'm very proud to say this, MSNBC is officially fake news according to NewsGuard.
01:18:05.000 So look, I think NewsGuard does some good.
01:18:09.000 I just think they're biased and the machine is broken.
01:18:12.000 We are actually, you know, looking at how we can do a different kind of rating on journalistic ethics.
01:18:18.000 So I think there does need to be some kind of system that says, like, here are things this company has done.
01:18:23.000 The problem is, NewsGuard violates their own standards and publishes false information and then accuses other people of doing the same thing.
01:18:31.000 That being said, when an organization as broken and biased as this calls MSNBC violating severe journalistic standards, that's still good news.
01:18:40.000 Because when the machine itself is rejecting its own garbage, you can take this from NewsGuard and you can show all your friends and family when they claim MSNBC is real.
01:18:49.000 There you go.
01:18:50.000 At least something, right?
01:18:51.000 Yeah, good for something.
01:18:55.000 This might be an interesting segue to the Bari Weiss thing with Chuck Schumer.
01:18:58.000 Oh, let's pull up the Bari Weiss thing.
01:19:00.000 Yeah, let's talk all about how the corrupt media operates.
01:19:03.000 Daily Mail reports, Bari Weiss reveals New York Times editors wanted to check with Chuck Schumer before running an op-ed by Republican Tim Scott about his police reform bill after George Floyd's murder.
01:19:15.000 What?
01:19:17.000 It's amazing.
01:19:17.000 Tim Scott had a police reform bill, and the Democrats said no to it, and the New York Times wanted to check with Chuck Schumer.
01:19:23.000 That's what Bari Weiss says.
01:19:24.000 Former NYT opinion editor Bari Weiss told Senator Tim Scott on Wednesday about an internal discussion around his op-ed.
01:19:30.000 Scott's article was the subject of an internal debate, excuse me, Weiss said, and one of the senior editors questioned whether Republicans cared about minority rights.
01:19:39.000 The New York Times denied her account, saying... I'm gonna call BS on the New York Times.
01:19:48.000 Having worked for many of these, you know, organizations, this is the exact kind of stuff you see.
01:19:52.000 Yep.
01:19:52.000 Absolute corruption.
01:19:54.000 Yep.
01:19:55.000 And also, Bari is actually trustworthy.
01:19:59.000 Yeah, she's been doing pretty well with her sub stack.
01:20:02.000 I trust her way more.
01:20:04.000 I think she gets some things wrong, you know, but that's fine.
01:20:07.000 I think she's more trustworthy in her assessments than the New York Times.
01:20:10.000 And she's also just telling a rendition of something she witnessed.
01:20:14.000 She's credible.
01:20:15.000 She's a credible witness.
01:20:18.000 She has no reason to make this up.
01:20:19.000 She's, you know, she's moved on from the New York Times.
01:20:22.000 She's very successful.
01:20:24.000 She's making more money than she did.
01:20:26.000 Well, the reason she could make it up is to get back at them for something.
01:20:30.000 But why not say it sooner?
01:20:33.000 I mean, I don't know.
01:20:35.000 I'm not Bari Weiss.
01:20:37.000 But... I think more people should speak out about this stuff sooner rather than later.
01:20:43.000 And she mentioned it was in 2020, but she's been independent for a long time, so why not say something sooner?
01:20:52.000 It's quite the hat trick.
01:20:52.000 I'm gonna have to have her on the show and ask her about it.
01:20:54.000 It's topical.
01:20:55.000 And I'm not saying that she's lying.
01:20:56.000 I'm just saying that, you know... I wonder, too, if there's... Sorry.
01:20:59.000 Oh, no.
01:20:59.000 Go for it.
01:21:00.000 There's a lot of stuff she could probably say.
01:21:02.000 You can't, you know... If you publish your memoir the day you leave or, you know, in 2020, stuff is gonna get missed, like, in some ways.
01:21:10.000 I didn't read this article, so I can't say what the context, like, where she brought it up.
01:21:15.000 But... You had a conversation with Tim Scott?
01:21:18.000 Right, and I feel like there is a lot of stuff she could tell us about what happened there.
01:21:23.000 If she told it all at once, I think it would get kind of lost.
01:21:28.000 The impact would get lost.
01:21:29.000 Yeah, I mean, from the reporting, she was having a conversation with Tim Scott and she said, I don't know if you know this, but this is what happened.
01:21:38.000 And so I think it was sort of a natural thing.
01:21:40.000 I mean, it makes me wonder what other stories she has under her hat.
01:21:44.000 And I think that there was a lot of—I mean, she was—I think she should have sued The
01:21:48.000 New York Times, personally, because it was, you know, workplace bullying and intimidation
01:21:53.000 and everything.
01:21:54.000 And I think that there was some stuff that happened that, in the moment, she felt like
01:22:00.000 she was crazy, because everyone was like, well, yes, of course we talked to Chuck Schumer
01:22:04.000 We just wanted to clear it.
01:22:06.000 And she's kind of like, but why do we clear things with Chuck Schumer?
01:22:09.000 And I think as you exit the cult, she's kind of like, that was really messed up, wasn't it?
01:22:17.000 And she's kind of having that realization and sort of unpacking her own experience because I think she was so abused in the moment.
01:22:24.000 Did you ever see Bulworth?
01:22:27.000 That movie?
01:22:28.000 No.
01:22:28.000 You ever see it?
01:22:29.000 No.
01:22:29.000 Kevin Cah?
01:22:30.000 No, no, no.
01:22:31.000 What's his name?
01:22:31.000 Warren Beatty.
01:22:32.000 I think it's the one where the politician, the senator is like super depressed and wants to kill himself so then he just starts telling the truth and like he doesn't care anymore but then he like decides he wants to live or whatever.
01:22:43.000 He like goes up on stage at a black church and they're like, why didn't you deliver this bill?
01:22:47.000 And he's like, Because we got your vote.
01:22:49.000 We don't care.
01:22:49.000 The moment you went and voted for us, we stopped caring about what you thought.
01:22:52.000 And then they were like, what?
01:22:53.000 And then people ended up really liking it.
01:22:55.000 I think that with, you know, people like Bari Weiss, she's probably sitting on a whole bunch of other stuff.
01:23:01.000 And I'd say like, come on, like, of course she is.
01:23:03.000 That she's not going to talk about because she's probably scared about what will happen if she challenges the machine.
01:23:09.000 And I mean, if you were in a cult, you participated in the cult, right? So there's probably stuff that she's, I could imagine, not proud of, or not ready to talk about her involvement in, right? And, you know, I don't necessarily hold that against her. It's a complicated thing to come out of something, or to dissociate from an ideology you've been wrapped up in for a long time. But some of her stories, you know, she's in the rooms for a reason, she's involved with the organization. So, I mean, she was a junior staffer. I mean, I think it's hard to be in that moment.
01:23:43.000 I mean, she was in that newsroom, and everyone around her, I mean, they wanted to throttle her, and I can't imagine what that feels like, to go day after day somewhere where everyone hates you, and one misstep, like you didn't wash your hands after you left the bathroom, can become like a viral Twitter thread of your colleague who's sitting, like, our distance apart from each other.
01:24:09.000 I don't know if she's scared of them anymore.
01:24:11.000 I think that she's burned that bridge and she's not looking back.
01:24:14.000 Didn't someone who works with her write that terrible article about Jordan Peterson and enforced monogamy?
01:24:21.000 So you wondered to what degree they were participating, and they decided, like, I don't want to do this.
01:24:25.000 I don't necessarily trust a lot of people, especially if it takes them two years to come out and be like, oh, by the way, this really crazy thing happened.
01:24:32.000 That's the kind of thing where I'd be like, I want to quit.
01:24:35.000 And in fact, when I worked for Fusion and they started doing this stuff, I tried quitting, but I was under contract, so instead I just stopped participating in their BS system.
01:24:43.000 And then immediately started telling everybody about it.
01:24:46.000 Just like screaming it.
01:24:48.000 And they really don't like it.
01:24:50.000 The fact that they would stealth edit articles and told me not to report on the New York Times doing stealth editing because they would get exposed for doing it as well.
01:24:58.000 And I was like, I'm gonna tell everybody.
01:24:59.000 Are you nuts?
01:25:00.000 You mean you're violating journalistic standards?
01:25:02.000 You think I'm gonna keep that a secret?
01:25:03.000 Bro, I'm a journalist.
01:25:05.000 Like, my goal is to inform people, not... not be, like, a tribalist for some corporation.
01:25:11.000 You think I care about Fusion's bottom line?
01:25:14.000 Like, I'm here to tell people what's going on in the world.
01:25:17.000 So when the president of that company said, we're here to side with the audience, in reference to how we handle bias and perspective, basically said, you know, millennial, or he said, young people are progressive, so that's who we're gonna side with.
01:25:29.000 We're gonna side with them.
01:25:31.000 When I said, does that mean if there's a fact-based news story that would offend our audience, we don't report it?
01:25:38.000 And he says, yeah, I think that's fair.
01:25:40.000 And I immediately told everybody, and then he denied it.
01:25:43.000 And I'm like, whatever, man.
01:25:45.000 Of course they're gonna deny it.
01:25:46.000 Like, these people aren't journalists.
01:25:48.000 They're businessmen who are like, how do we make money?
01:25:51.000 Say what the people want to hear.
01:25:53.000 There are a lot of people working for these organizations who know it.
01:25:55.000 James O'Keefe, the man is doing the Lord's work.
01:25:58.000 When he exposed CNN, and you can see these people saying, like, we used to do the news, now we don't.
01:26:03.000 Those people aren't speaking out.
01:26:04.000 Those people aren't coming out and explaining to everybody they're lying to their faces.
01:26:08.000 But behind the scenes, they're saying in private, it takes a special kind of person to know you are engaged in operating an evil machine that destroys this country, but be like, I need the paycheck.
01:26:21.000 I don't know if that's it.
01:26:23.000 Speaking up for Bari, I'm not going to speak up for the rest of them, but she and I had this experience, as someone who wrote for Bari at the New York Times.
01:26:33.000 She fought really hard to get sanity on the pages and to get different perspectives published in the Times that wouldn't have otherwise been there.
01:26:41.000 And I think that she swallowed a lot of stuff for a long time that made her deeply uncomfortable.
01:26:47.000 Because she felt like it was for the greater good, and I think she got to a point where she realized that calculus no longer holds. It's not equaling out anymore.
01:27:00.000 I'm not doing more good inside the machine than outside the machine.
01:27:03.000 I think that that's when she left.
01:27:05.000 But she fought very hard for a while to operate behind enemy lines.
01:27:11.000 And I think it just got to the point where she realized it just wasn't tenable anymore, and she wasn't having enough of an impact to justify, not just, you know, being part of the machine, but also the mental health strain she was under and the assault she was under from all of her colleagues.
01:27:27.000 But I respect her for staying as long as she did.
01:27:30.000 But I mean, there's people at CNN that James O'Keefe exposed.
01:27:33.000 Yeah.
01:27:34.000 They're like talking to this hidden camera saying, look at all the really awful things this company does.
01:27:39.000 And they're still there.
01:27:41.000 And they just they know.
01:27:43.000 That's the craziest thing to me.
01:27:44.000 These people are caught on camera talking about how they know they're involved in malfeasance.
01:27:48.000 But I think they think it's for the greater good, too.
01:27:50.000 No, no, no, no, no, no.
01:27:51.000 They're telling Hidden Camera, like, CNN is destroying everything.
01:27:55.000 Right, I know.
01:27:55.000 And we're helping them.
01:27:56.000 And then they're like, but we're gonna stay.
01:27:58.000 But I think they think, ultimately, it is for the greater good, because they are setting a narrative that they think is important to be changed.
01:28:12.000 Undercover camera exposes, they're admitting they're doing wrong.
01:28:15.000 Are you saying it's sort of like they're accelerationists?
01:28:17.000 They're like, look, we got to burn it down and we are willing to burn it down in this way.
01:28:21.000 No, like there's one famous guy who's like sitting in a chair and he's like, or one of the famous exposes is a guy who's like, we used to go out and report the news, man.
01:28:29.000 Now all we do is just complain about Trump and just try and drive this... And he thinks that's a good thing.
01:28:35.000 I think they think that's a good thing.
01:28:36.000 But he's complaining about it.
01:28:37.000 And he's saying he hates being there.
01:28:39.000 I think it's annoying.
01:28:40.000 But ultimately, I think that given the choice between straight news reporting and trashing Trump, they don't... But this is a guy who's saying he wished they did real news reporting.
01:28:52.000 I think he wants to say that, but obviously he doesn't actually believe it, or he would have left.
01:29:00.000 Einstein said that's the definition of insanity: when you keep doing the same thing expecting a different result. And these people that are staying there, if they really believe it's going to get better by staying there, they're insane, according to Einstein.
01:29:11.000 It's a very general term, but it could be a form of insanity.
01:29:15.000 I don't think that's a path to do it, but that could be an explanation of why.
01:29:19.000 I agree with you in that regard, that a lot of people say they want things, but they really don't.
01:29:23.000 Like, they either don't want it, or they actually just don't care enough to pursue going after something.
01:29:29.000 So maybe for a lot of these people, they just think, you know, CNN's culture behind the scenes is to rag on the company for being garbage, but no one really cares.
01:29:39.000 I mean, but also, where do they go?
01:29:41.000 Like, in their industry?
01:29:43.000 I mean, it's kind of like, you know, all the complaints that you hear from parents of kids in private schools, and they complain and they complain and complain.
01:29:53.000 I'm thinking of the sort of the folks that are talking about like the wokeness and in all these private schools.
01:29:59.000 And they stay because it's the pinnacle of achievement.
01:30:05.000 And I think that these people at CNN, this is the pinnacle of professional achievement in their industry.
01:30:10.000 And so where do they go?
01:30:11.000 What do they do?
01:30:12.000 There's no next step.
01:30:13.000 And so they're just sort of stuck in a holding pattern because they care more about their job.
01:30:19.000 That's my point, though.
01:30:20.000 They care more about their personal lives than they do about the system they're participating in.
01:30:25.000 Yeah, but I don't think it's just the paycheck.
01:30:27.000 I think it's also their pride.
01:30:28.000 I think it's a lot of their self-identity.
01:30:36.000 It's probably why they demoted Taylor Lorenz.
01:30:38.000 The behind-the-scenes scuttlebutt was that long-time staffers were losing their minds.
01:30:44.000 Complaining that she was, like, besmirching the organization and tarnishing its name.
01:30:48.000 She really was.
01:30:49.000 Oh, absolutely.
01:30:50.000 She was a terrible PR for that company.
01:30:51.000 Yeah, and they thought the controversy was gonna generate traffic or... So this is what... I heard this from someone who, you know, had, like, behind-the-scenes access or something like that.
01:30:59.000 Is this at CN... This is... At Washington Post.
01:31:01.000 Yeah, okay.
01:31:02.000 That the long-standing employees of the Washington Post, like the older people, were like, you are destroying the legacy of this company.
01:31:08.000 and that apparently there are people there who thought that Taylor Lorenz was going to generate
01:31:13.000 traffic through like controversy or stuff like that. I don't want to say that's confirmed.
01:31:17.000 It's just rumor mill stuff. I've heard the same rumors.
01:31:19.000 Right. So they're like they're journalists.
01:31:22.000 And so I wonder if the Washington Post was like, these people are going to quit on us,
01:31:26.000 and then we have nothing. So we got to do something. And so they demoted Taylor Lorenz.
01:31:30.000 I mean, I don't think that's why they did it. I think that they just decided that she was
01:31:35.000 a liability.
01:31:36.000 I think she became a liability.
01:31:37.000 Well, that's what I mean.
01:31:38.000 She was a liability.
01:31:38.000 People were, like, there was a risk to the company.
01:31:40.000 But I don't think it was that it was a risk to the company.
01:31:43.000 I just think that they realized that she's... That's what a liability is.
01:31:46.000 No, no, no.
01:31:46.000 I think that they realized that she has no loyalty and that any controversy that she conjures, it's often going to be at their expense because she doesn't care about the brand.
01:32:00.000 She just cares about herself.
01:32:01.000 And so I don't think that they were worried that people were going to quit.
01:32:04.000 I think they were worried that she was a beast that was about to turn on them.
01:32:09.000 Someone told me that when we put up the billboard in Times Square saying she doxed the Libs of TikTok, that she immediately went and demanded they file a suit or some legal thing to get it taken down.
01:32:19.000 And they were like, it's an opinion statement.
01:32:20.000 You can't do anything about it.
01:32:22.000 And she lost it.
01:32:24.000 And then she went on Twitter and said it was so stupid and laughable.
01:32:27.000 And then I responded to her and I was like, I'm glad you think it's funny.
01:32:31.000 Great.
01:32:31.000 You got to say your thing.
01:32:31.000 I got to say my thing.
01:32:32.000 And then she lost it.
01:32:33.000 She blocked me and started screaming like it's violence or whatever.
01:32:36.000 And I'm like, okay, dude, whatever, man.
01:32:39.000 Someone pointed out to me that Timcast.com has a higher credibility rating than CNN.com.
01:32:44.000 That's amazing.
01:32:45.000 That's true.
01:32:46.000 That is in fact true.
01:32:48.000 That's kind of funny.
01:32:49.000 CNN's garbage.
01:32:50.000 Imagine what we'd have if other stuff wasn't going on.
01:32:53.000 I get the vibe that media companies have become like PR companies, that news organizations have tended towards public relations.
01:33:01.000 That the news they produce is a form of PR.
01:33:05.000 The news, the medium is the message.
01:33:07.000 And so they're just trying to keep the way they look about presenting the news palatable for the masses, as opposed to just directly reporting the information.
01:33:18.000 And so that's why they have spin doctors and things like that.
01:33:21.000 It's a little concerning.
01:33:22.000 Well, our standards at TimCast.com are extremely rigorous, and we have, like, conversations over how we frame things, even.
01:33:30.000 We don't just fact check, we frame check.
01:33:32.000 So we've had a conversation about, do we say pro-life or pro-choice?
01:33:34.000 We say neither.
01:33:36.000 Only in the context of when it, like, is truly explaining the circumstance.
01:33:40.000 But if someone comes out and says, we demand access to abortion, that is pro-abortion.
01:33:44.000 We don't need to say anything else.
01:33:45.000 It's not about choices.
01:33:46.000 It's just about whether you're for it or against it.
01:33:48.000 Nobody goes to a rally and says, like, we think people should be able to choose their own meals, choose their own birthdays.
01:33:54.000 I'm like, okay, choice is a political term.
01:33:57.000 Life is a political term.
01:33:58.000 Are you against abortion or for abortion?
01:34:00.000 Of course, the pro-life people agree.
01:34:03.000 And they're like, that's fine.
01:34:04.000 I get it.
01:34:04.000 We're against abortion.
01:34:05.000 The pro-choice people lose their minds.
01:34:06.000 We are not pro-abortion.
01:34:07.000 Stop saying that.
01:34:08.000 And I'm like, dude, we are not going to editorialize this.
01:34:10.000 Let's go to Super Chats!
01:34:11.000 If you haven't already, would you kindly smash that like button, subscribe to this channel, share the show with your friends, and head over to TimCast.com because we got some stories for you coming up on the After Hours show, man.
01:34:22.000 I'll just tell you, like, Boston Children's Hospital hysterectomies on children?
01:34:27.000 Yeah, we'll talk about that because this is... The way she delivered it, too.
01:34:32.000 I'm looking forward to that.
01:34:32.000 It's crazy.
01:34:33.000 She's like laughing and smiling.
01:34:34.000 It's Joker-level stuff, man.
01:34:36.000 Anyway, let's read some superchats from all of y'all.
01:34:42.000 Oh, here's one.
01:34:42.000 James Eaton says, I really like Static Shock.
01:34:45.000 He's a great superhero.
01:34:46.000 You ever watch that show or read that comic?
01:34:48.000 No.
01:34:48.000 Static Shock.
01:34:49.000 They should do a Warner Brothers movie for Static.
01:34:51.000 That'd be legit.
01:34:52.000 I'd totally watch it.
01:34:53.000 Anyway, I had no idea what he was trying to say.
01:34:55.000 Was he talking about the audio maybe?
01:34:57.000 Maybe.
01:34:57.000 I like it better that he's just trying to get your opinion on a superhero comic really quick.
01:35:03.000 All right.
01:35:05.000 Cantankerous says, Tim, you keep mentioning picnicking at the Battle of Fort Sumter, and you are confusing it with the First Battle of Bull Run, July 21st, 1861, which was the first land battle of the Civil War.
01:35:17.000 Perhaps you are correct.
01:35:19.000 I was reading an article online when they mentioned the picnicking, and I may have misinterpreted what they were saying.
01:35:26.000 I was reading a historical article and it was like, we know Battle of Fort Sumter, which started out the Civil War or whatever.
01:35:32.000 People were so in disbelief, they were picnicking on the hillside, and I may have assumed it was the same thing.
01:35:38.000 But you want to... Yeah, the first thing I typed, picnic battle of, and the first thing that came up was, was the first battle of Bull Run really the picnic battle?
01:35:47.000 Ah, okay, well there you go.
01:35:48.000 Thank you, Cantankerous, for the correction!
01:35:51.000 I will make sure to apply it to all of my analogies moving forward.
01:35:57.000 All right.
01:35:58.000 Let's see.
01:36:00.000 David C. says, From last night, does Ian understand that our politicians are like this because they aren't investigated?
01:36:06.000 All of them should fear investigation and prosecution.
01:36:10.000 Um, geez, I mean, there's so many reasons why people in control of the military and the power are doing what they're doing.
01:36:18.000 I think maybe part of it is that they feel like as long as they're in that position that they won't be investigated for doing what we've basically asked them to do behind our backs, which is control the military.
01:36:28.000 I mean, the amount of bombings and stuff that's going on in the world right now, with our eyes blind to it, is nuts.
01:36:35.000 It's nuts.
01:36:37.000 Um, we could talk more about that too.
01:36:40.000 I was talking about forgiveness.
01:36:41.000 I feel like a lot of this is, like you were saying about Bari Weiss, that maybe she feels guilty about what she had done while she was part of the cult.
01:36:50.000 And like, maybe she said the N word in 2017 at a meeting and it's going to come out, and like, just let it go, man.
01:36:56.000 Let all this past crap go so we can focus on right now and the future.
01:37:02.000 Alright, Jay says, while I can't help being nervous, I am still hopeful.
01:37:06.000 I had to remind myself to breathe, slow down, and reset your thoughts.
01:37:09.000 We will defeat the authoritarians.
01:37:11.000 Right, Ian?
01:37:12.000 Uh, that is one way to look at it.
01:37:13.000 Yes, you slow down to speed up.
01:37:15.000 It's like getting traction with a wheel on a road.
01:37:17.000 If it spins too fast, it's not gonna go anywhere.
01:37:21.000 All right, a bunch of Super Chats saying that Ian and my mics aren't working.
01:37:26.000 They were working, so the issue was that we turned up Bethany, because she was really quiet at first, and then she was picking up Echo from the room, so I muted the purple mic and turned yours down, and everything came out okay.
01:37:36.000 Well, there you go.
01:37:36.000 Yeah, solved it.
01:37:38.000 Beavis McLean says, check out Executive Order 13292, section 1.3.
01:37:43.000 Classification authority clearly states, authority to classify information may be exercised by the President in performance of executive duties.
01:37:50.000 This includes declassification as well.
01:37:52.000 Love you, Ian Crossland.
01:37:53.000 Oh, love you too.
01:37:54.000 What was that Executive Order number?
01:37:56.000 Do you have that again?
01:37:57.000 13292, section 1.3.
01:38:01.000 Brandi Green says, Ian, thank you for the tip on differentiating the good Weinstein brothers.
01:38:05.000 Is it Weinstein?
01:38:06.000 Weinstein's like Einstein.
01:38:08.000 Those are the good guys.
01:38:09.000 Weinstein brothers by rhyming with... Oh, it's right there.
01:38:11.000 I should have read it.
01:38:12.000 Rhyming with Einstein.
01:38:13.000 P.S.
01:38:14.000 I'm a S-A-H-M.
01:38:16.000 What is that?
01:38:17.000 Stay-at-home mom.
01:38:17.000 Stay-at-home mom.
01:38:18.000 And watch Tim Cast and Pop Culture Crisis with my 22-month-old daily.
01:38:22.000 Here's to a based homeschool education.
01:38:25.000 That's awesome.
01:38:27.000 We are their baby's afterschool pop culture show.
01:38:29.000 That's amazing.
01:38:31.000 They're going to learn all about Johnny Depp.
01:38:36.000 I had the honor of meeting a local politician in West Virginia recently, and we're in discussions; I'm going to be helping fund a microschool.
01:38:46.000 Which believes in the traditional American values.
01:38:49.000 And they do have a Bible study, but I'm told it's optional for parents who just want to get away from the woke stuff.
01:38:54.000 But I dig it.
01:38:55.000 Microschool.
01:38:56.000 So it's going to be really small classrooms.
01:38:58.000 It's basically like the next level up after homeschooling, like private tutors, but in a bigger setting.
01:39:03.000 So I am absolutely trying to make sure we are putting our money where our mouths are.
01:39:11.000 Microschooling, that's it.
01:39:13.000 It was legit.
01:39:14.000 Like, when I heard what they were talking about, the way they want to handle stuff, it's brilliant.
01:39:17.000 Instead of having, like, grades, you have grade subjects.
01:39:20.000 So, like, you're in third-grade math and seventh-grade reading.
01:39:23.000 And then they just work with you where you... That's what we do.
01:39:26.000 I homeschool my kids, and that's exactly...
01:39:29.000 It's so great.
01:39:30.000 One of the worst things about the Hope Scholarship issues that West Virginia is having is that it took away a lot of the choice parents had to opt into other education.
01:39:39.000 The Hope Scholarship is like a program in West Virginia that would give money if you decide to pull your kid out of public school and choose an alternative route, so you could put it towards all kinds of things. And it got rejected by a judge in the state, and that is a real blow to school choice in West Virginia, in my opinion.
01:39:56.000 E Rodriguez says, I'm catching up on one and a half playback speed and Tim is bordering on a rap god levels of words per minute.
01:40:03.000 That's it.
01:40:05.000 Like I mentioned before, like when I'm doing segments on my other channels, I'll be like, I'll have so much going on with work that I'm like, I got to get this done fast.
01:40:13.000 But the segments are timed, not word count.
01:40:17.000 So I'll start talking really fast and end up turning a 20-minute segment into a double-timed 40-minute segment. Because I'll say... Dude, I put on one of our shows last week at two times speed and was listening to you, and it was so fast, but I could understand every word you were saying because of your enunciation.
01:40:34.000 Yeah, you have incredible enunciation.
01:40:36.000 I don't think you get enough credit for that.
01:40:37.000 I don't know, do I?
01:40:38.000 It was funny.
01:40:39.000 It's extremely clear the way you speak about the headlines.
01:40:43.000 Yes, you're right.
01:40:44.000 That was a good interview.
01:40:46.000 That was a good, what's the word?
01:40:47.000 I listened to you.
01:40:51.000 Deli says, has the FBI done anything good recently?
01:40:54.000 It's tough.
01:40:55.000 It's tough to tell because they do a lot of stuff in secret.
01:40:57.000 I mean, they did deal with the mob pretty handily.
01:41:01.000 I, um, I don't know what they're doing with the cartels.
01:41:03.000 I feel like we could say they did this thing, right?
01:41:05.000 And then I'd be like Ruby Ridge and you'd be like, but they did this thing, right?
01:41:07.000 I'd be like, do you remember Ed and Elaine Brown?
01:41:13.000 I think Luke knew these people.
01:41:16.000 They were, it was 2007 I think, they weren't paying their income tax.
01:41:19.000 And so they had like a hundred acre property in New Hampshire and just said, nope.
01:41:23.000 And then the feds had to come in.
01:41:25.000 But they were really scared that they were gonna get another Waco or Ruby Ridge with like these people.
01:41:30.000 And then they realized, they said that these tax abolitionist people were letting supporters in.
01:41:38.000 So they just put on plain clothes, came up and knocked on the door and said they were supporters, got let in and then arrested them.
01:41:42.000 And that was like the end of it.
01:41:44.000 I looked at the IRS job hiring thing for these new 80,000 people.
01:41:48.000 Did you see some of the requirements?
01:41:49.000 They're like, just so you know, you gotta be ready to work 50 hours a week, use weapons against people if it comes up.
01:41:55.000 We should pull that up.
01:41:56.000 Did you see that?
01:41:56.000 That was specifically the Criminal Investigation Division of the IRS, which has been around since 1919.
01:42:01.000 Uh-huh.
01:42:02.000 So, like, if you didn't pay some tax?
01:42:03.000 It really looks like a tax mob that they're trying to put together.
01:42:06.000 So, this is crazy, because I'm seeing so many people mischaracterize what's going on.
01:42:09.000 Like, I don't like the IRS, dude.
01:42:11.000 Like, come on, who does?
01:42:13.000 And I don't think there should be 87,000 new IRS agents.
01:42:16.000 But I'm seeing people be like, Democrats want to hire 87,000 new IRS agents who are authorized to use deadly force.
01:42:22.000 They're building an army.
01:42:22.000 And I'm like, no.
01:42:24.000 They are hiring Criminal Investigation Division, because the IRS has a law enforcement section.
01:42:29.000 But they're not hiring 87,000 dudes with guns.
01:42:32.000 There's images of the IRS police.
01:42:36.000 They have badges.
01:42:37.000 It says police.
01:42:38.000 They wear armor.
01:42:38.000 It says police.
01:42:39.000 So it's been normalized, just so everyone knows.
01:42:41.000 Well, yeah, for a hundred years.
01:42:43.000 And what they claim to particularly go after is if there's Al Capone-style stuff.
01:42:50.000 Mobsters who are doing money laundering schemes and things like that, the IRS sends in the criminal enforcement division to go after them.
01:42:57.000 I think people just didn't know that existed.
01:42:59.000 And so now they're freaking out.
01:43:01.000 Totally freaks them out.
01:43:02.000 Rightly so.
01:43:02.000 All right.
01:43:06.000 What is this?
01:43:07.000 Waffle Sensei says, Tim, are you going to repeat your deleted tweet from today on the after show?
01:43:12.000 Repeated, deleted tweet.
01:43:14.000 Did you delete a tweet?
01:43:17.000 Which one?
01:43:18.000 Oh, the one that I had to... I don't know.
01:43:19.000 I don't know what you're referring to.
01:43:21.000 The one that they made me get rid of?
01:43:23.000 Today?
01:43:24.000 No, when they locked my account.
01:43:26.000 Do you ever tweet stuff out and then remove it?
01:43:28.000 Only if there's a typo.
01:43:29.000 Yeah.
01:43:29.000 I do that too.
01:43:30.000 Oh, actually, yeah, there's probably been a couple instances, like where I make a mistake or something.
01:43:35.000 I was gonna post one last night.
01:43:36.000 I just left it on the screen unpublished and then I went to the sauna.
01:43:39.000 Or no, it was two nights ago.
01:43:40.000 And I'm like, if I still want to put it up out of the sauna, then I'll put it up because I couldn't decide.
01:43:43.000 Some things don't work in text.
01:43:44.000 You got to say them.
01:43:45.000 I just feel like I don't have the personality for Twitter.
01:43:47.000 I don't know what you need to be good at it.
01:43:51.000 And I appreciate people who are good, but like, I just feel like I'm not cut out for that world.
01:43:55.000 I'm just, you have to be like really, really just brutal and very pithy.
01:44:01.000 And I feel like I am good at those things.
01:44:05.000 Jeb Reid says fact.
01:44:06.000 The U.S.
01:44:06.000 has already fallen.
01:44:08.000 Republic is no longer.
01:44:09.000 The cornerstones of this country are shattered.
01:44:11.000 The next phase is mass attacks on, say, regular people.
01:44:16.000 He's insinuating the government will be doing it.
01:44:19.000 There was an article today that referred to the Gadsden flag as far-right extremism.
01:44:25.000 When your own country's history is labeled by the corporate press, by the institutions as extremism, It kind of feels like your country is being worn like a skin suit, you know.
01:44:37.000 Well, there was that New Yorker cover of the Republican House and the Democratic House.
01:44:44.000 Yes!
01:44:44.000 And the Republican House had an American flag.
01:44:47.000 Yeah.
01:44:49.000 I remember that.
01:44:50.000 Yeah, it was a really...
01:44:54.000 Surprisingly insightful and striking cover, actually.
01:44:57.000 It was, and that cover did not get enough attention.
01:45:00.000 No, it didn't.
01:45:00.000 Because the left-wing house was warm and open, well-manicured, green lawn, pride flag, Black Lives Matter, no American flag.
01:45:09.000 Yeah, it really told on itself.
01:45:10.000 It kind of did, yeah.
01:45:11.000 I mean, so at Heroes of Liberty, we publish books about, you know, Alexander Hamilton, Ronald Reagan. And we tried to run ads on Twitter about our books in the wake of Roe v. Wade, and we were sort of advertising, like, faith and freedom, whatever. We never said anything about Dobbs, nothing, and Twitter throttled us. And, I mean, Facebook throttled us also, right in the very beginning. Like, this idea that patriotism is somehow political... I mean, we've been told that directly by these social media companies.
01:45:46.000 Yeah, it's a global technocracy, Jeff.
01:45:49.000 Jeff was who made that last comment, right?
01:45:51.000 I want to make sure.
01:45:52.000 I don't know.
01:45:52.000 Don't be too black-pilled, man.
01:45:55.000 You know, you got neighbors.
01:45:58.000 Ryan Hunter says, I think my biggest fear about the future we're staring down is the idea that a U.S.
01:46:03.000 civil conflict gets to a point where foreign entities like China and or Russia can recognize our breakaways.
01:46:10.000 Yeah, last night I was like, we do not want to fight each other.
01:46:12.000 If people start fighting each other, not only if you advocate for that, you've lost the plot.
01:46:15.000 You do not want that.
01:46:16.000 Outside governments will fund people to fight.
01:46:20.000 You don't want that crap.
01:46:22.000 That's how it was in the revolution.
01:46:23.000 That's how it was in the Civil War.
01:46:24.000 I think that's what's happening now.
01:46:25.000 I mean, I think TikTok is that.
01:46:28.000 I think that TikTok is fomenting and throwing accelerant on the fires that we already have going.
01:46:36.000 But what'll end up happening in a civil war is China's gonna go to West Coast states and say, what do you need to win?
01:46:40.000 Yes.
01:46:41.000 Yes.
01:46:42.000 That's what we got to avoid, is that kind of thing.
01:46:44.000 Could you imagine, like, the year is 2137.
01:46:46.000 Oh, that's too far in advance.
01:46:48.000 And, like, the United States and the Chinese Communist Party are, like, going over history, and they're like, when the revolution started, it was thanks to Chinese intervention.
01:46:57.000 And they make movies called, like, there's, like, a new movie called The Patriot.
01:47:01.000 And, like, a Chinese general lands in California and is like, I will help you win.
01:47:04.000 In San Francisco.
01:47:05.000 Yeah, in San Francisco.
01:47:07.000 We used to not have Trans-Pacific Magnetic Trains before we were one country.
01:47:11.000 Trans-Pacific Magnetic.
01:47:12.000 United States of China.
01:47:14.000 Yeah.
01:47:16.000 All right.
01:47:17.000 All right.
01:47:17.000 Let's get some more super chats.
01:47:21.000 I think the FBI is important.
01:47:22.000 I do.
01:47:22.000 ...appropriate a portion of the FBI's budget to be grants to the state's bureaus of investigation instead.
01:47:27.000 Then we'll see if they can still afford to fund partisanship in their budget.
01:47:31.000 I think the FBI is important.
01:47:32.000 I do.
01:47:33.000 Interstate crime is an issue and dealing with it is something we need to do.
01:47:38.000 The problem is I don't feel like anyone has confidence in the institution at this point
01:47:42.000 so something's got to change.
01:47:43.000 Otherwise, people are just gonna get angrier and angrier, and then you'll get crazy stuff like what we saw today, which we definitely do not want.
01:47:50.000 We could have an FBI agent on the show someday.
01:47:52.000 Probably.
01:47:54.000 Didn't we have, like, former?
01:47:55.000 Yeah, we've had some people who were formerly working in that field.
01:47:58.000 Yeah, it's interesting.
01:48:00.000 There's a lot in this area.
01:48:02.000 Mike Rollman says, make Dan Bongino head of the FBI.
01:48:06.000 Okay.
01:48:06.000 Can we?
01:48:07.000 That would be fun.
01:48:08.000 Let's do it.
01:48:10.000 Is it possible?
01:48:10.000 He has experience in that field.
01:48:12.000 Jim Comey can do it.
01:48:13.000 The sky's the limit.
01:48:16.000 Let's go.
01:48:17.000 John Kirsten says there is no need for the FBI when they serve practically the same function as the U.S.
01:48:21.000 Marshals.
01:48:22.000 That's interesting.
01:48:23.000 Yeah, I hear that.
01:48:25.000 But do the Marshals do the investigatory work and things like that?
01:48:27.000 Yeah, that's my question.
01:48:29.000 They do?
01:48:29.000 I don't think they do.
01:48:30.000 I think they just do arrests.
01:48:32.000 Yeah, I like watching those old westerns where the marshal would go out to collect a bounty or something like that.
01:48:37.000 Those are fun.
01:48:38.000 Just watched a little Young Guns last night.
01:48:40.000 There you go.
01:48:42.000 Matthew Jamieson says the CIA was doing MKUltra by fines paid to citizens of Canada.
01:48:47.000 Klaus Schwab's assistant says humans are hackable.
01:48:49.000 Would the gov... Would the gov to do this?
01:48:54.000 Yes.
01:48:55.000 Okay.
01:48:56.000 Yes.
01:48:57.000 Waffle Sensei says, Hey bro, can I get an update on the album?
01:49:00.000 I need some sick beats to kick at work.
01:49:02.000 You know, I don't think we're doing an album.
01:49:04.000 I think we're doing an album, but we're just releasing the singles.
01:49:07.000 Cause we talked about it and like, it's not really the way they, no one really does it anymore where they just put out an album.
01:49:12.000 And so we've got a song planned for release in 10 days.
01:49:16.000 Ten days bro, that's crazy.
01:49:17.000 I'm so excited.
01:49:18.000 I heard you playing a couple songs earlier.
01:49:19.000 I was singing along with them.
01:49:21.000 Dude, I'm really stoked on... People are gonna be confused by whatever this band is.
01:49:27.000 No one's gonna be able to define it.
01:49:28.000 Yeah, it was like, there's this one song that I really like, A Million to One.
01:49:32.000 It's like, uh, Don't Stop Believin' by Journey, but it's kind of like the Foo Fighters.
01:49:37.000 I think it's way too simple to be described like that.
01:49:39.000 Yeah, like, it's got that, like, uplifting, like, kind of vague, inspirational feel.
01:49:44.000 Like, Don't Stop Believin', that's what I got.
01:49:45.000 Poppy.
01:49:46.000 It's real popular.
01:49:46.000 We've got one song that's like discordant electro with guitar and like electric drums that has like weird voice modulation.
01:49:55.000 It's a really trippy song.
01:49:57.000 And then like the first song we're putting out is very just like pop, like with like rock in the end.
01:50:02.000 I don't know, it's all over the place.
01:50:04.000 Because I don't like bands where it's like they write one song and then copy it seven times and release an album.
01:50:09.000 Yeah, well I like it when you're like, that's the same band?! !
01:50:12.000 That's what I'm looking for.
01:50:14.000 Oh yeah, people are going to be like, the first they're going to say, there's no way that's Tim singing.
01:50:18.000 Then they're going to say, is this still Tim?
01:50:22.000 Because it's like, it's all very different.
01:50:23.000 Is that Tim?
01:50:24.000 And it'll be me.
01:50:26.000 That's the funny thing, like, we have a song out already called Will of the People, and, like, half the comments are like, is this really Tim singing?
01:50:32.000 It's like, we used to play it.
01:50:33.000 You can Google it.
01:50:33.000 You can just, like, watch me sing on the show.
01:50:35.000 Yeah, Friday nights.
01:50:36.000 Not that I was singing very well back then, because it's like, you can't record 16 hours a day, you know, work 16 hours a day, and then try and sing at the end of it, but... Eric Miller says, I watched your bit about Monsters, Inc.
01:50:48.000 Is it just me, or is it a mockery of mainstream media, i.e.
01:50:52.000 scared children as frightened viewers?
01:50:55.000 That's actually a fair point.
01:50:56.000 We were talking, Mary said that, isn't Monsters, Inc.
01:50:59.000 like adrenochrome?
01:51:01.000 Like scaring the kids and then using that to fuel their machines or whatever?
01:51:05.000 Actually, it's a really good point about the media.
01:51:07.000 Freaking out and screaming in people's faces to get them all scared so they can power their machines.
01:51:12.000 There you go, man.
01:51:14.000 What a creepy world that we live in.
01:51:15.000 Adrenochrome, for the record, is oxidized adrenaline, if anyone's wondering.
01:51:19.000 Yeah, and people believe really weird things about it for some reason.
01:51:23.000 Matt Burkhart says, please keep Hannah Clare as a full-time member of the show.
01:51:27.000 She is definitely my favorite.
01:51:29.000 She rocks.
01:51:29.000 After Tim, of course.
01:51:30.000 Well, you know.
01:51:31.000 Hey, they didn't say that, did they?
01:51:32.000 Thank you so much!
01:51:33.000 Yes, it does.
01:51:33.000 It does say that.
01:51:35.000 You heard it from them first.
01:51:36.000 You're great on this show.
01:51:37.000 I love your information.
01:51:38.000 Oh, gosh.
01:51:38.000 Thanks for having me on.
01:51:39.000 It was always weird because I would watch the show downstairs and be like, I don't even think I could talk this much.
01:51:44.000 I couldn't talk for this long.
01:51:46.000 No, she was down there like, I could talk better than all of them.
01:51:48.000 Well, I practiced my enunciation so that I could be on the show.
01:51:51.000 Unique New York.
01:51:52.000 I'm so grateful to be here.
01:51:53.000 It's been fun.
01:51:54.000 How now, brown cow?
01:51:56.000 Cubicle Investor says, Ian, last night you said government level crimes should be pardoned.
01:52:01.000 What message does that send minorities doing time for far less egregious crimes?
01:52:06.000 That would just further solidify class issues.
01:52:08.000 Yeah, I'm open to a mass pardon.
01:52:10.000 And I don't know, I don't think it should have to stop at any one spot.
01:52:15.000 What's that?
01:52:16.000 You would pardon everybody?
01:52:17.000 Well, I don't know if everybody's the right word, but I'm talking like 150 years of nonsense.
01:52:21.000 We've been at each other's throats for... Anarchy.
01:52:23.000 Yes.
01:52:24.000 Everyone's pardoned.
01:52:24.000 But I don't advocate for, like, turning the other way for ongoing crime or for future crime.
01:52:29.000 I'm not talking about that.
01:52:30.000 And what about violent crime?
01:52:31.000 Yeah.
01:52:31.000 Rapes and murders.
01:52:32.000 Violent crime's kind of off the table.
01:52:33.000 I'm not really into pardoning violence.
01:52:35.000 Okay.
01:52:36.000 I just feel like that's a good... when you give that spiel, you should include that.
01:52:40.000 You should be like, let everyone out!
01:52:42.000 Let's let the murderers go!
01:52:44.000 But like, when someone orders a drone strike, is that a non-violent crime?
01:52:48.000 So here's another question.
01:52:49.000 What about a dealer who knowingly was distributing fentanyl-laced drugs?
01:52:57.000 And getting kids hooked on them that died?
01:52:59.000 Yeah, exactly.
01:52:59.000 Is that violence?
01:53:01.000 Would you pardon them?
01:53:01.000 Because drug... I mean, I...
01:53:05.000 Well, let's think about it.
01:53:06.000 I can't do this alone.
01:53:06.000 It's got a long conversation we should have.
01:53:08.000 All right, Powder PZ says, my dog killed one of my chickens today.
01:53:12.000 Rest in peace, Drumstick.
01:53:14.000 You will be missed, little chickie-choo.
01:53:16.000 If you watch Chicken City, we're talking about who we're gonna eat first.
01:53:18.000 Oh, no.
01:53:20.000 Yeah, so the... I think we should start naming them things like that,
01:53:21.000 like your Chicken Tenders.
01:53:23.000 Yeah, people do that.
01:53:25.000 Drumstick and tenderloin.
01:53:26.000 Why don't you put it to vote?
01:53:28.000 I feel like it would be Hunger Games, but chicken style.
01:53:30.000 We're not going to eat any that have names.
01:53:33.000 But like 70% have no names.
01:53:36.000 What if you just like take mug shots of them and then just put it up for a vote?
01:53:41.000 I think we should actually criminally charge them.
01:53:44.000 So Roberto, for instance, he was sent to, we call it Cocktown.
01:53:48.000 There's 18 roosters there.
01:53:50.000 Oh my gosh!
01:53:51.000 Yeah, now what most people out here do is they say, you let nature have them.
01:53:56.000 Like, you just let them do their thing, go off, and then maybe a fox will eat them or something.
01:54:00.000 Right.
01:54:00.000 But some of the people here did not find that appealing.
01:54:04.000 And I'm like, well, if we don't let them go, we're gonna eat them.
01:54:06.000 But you know, roosters are tough.
01:54:07.000 They're like rubber.
01:54:08.000 You gotta, you gotta slow cook them and get it going if you want to, you know, get it right.
01:54:13.000 But Roberto, He abused one of the hens.
01:54:18.000 He's terrible.
01:54:19.000 Dorothy.
01:54:19.000 Yeah.
01:54:20.000 Dorothy.
01:54:20.000 And so initially we had to lock Dorothy away because she was getting hurt.
01:54:24.000 Protective custody.
01:54:26.000 And then people complained and said, why are you punishing the victim?
01:54:28.000 And I said, good point.
01:54:30.000 Roberto has been sentenced for his crimes and he has been sent to a penal colony.
01:54:34.000 Banished.
01:54:34.000 You banished him.
01:54:35.000 Yes.
01:54:35.000 Banished.
01:54:36.000 So it's funny because we have three really big black star roosters, which are bullies.
01:54:43.000 And so we, we have a chicken, we have a coop, and then in it are all the smaller roosters, and then Roberto's in charge, because he was the biggest and oldest.
01:54:51.000 But the three Blackstar boys were just, they would gang up and spin around them.
01:54:55.000 So we have them with an electrified fence outside the coop, and it looks like they're prison guards.
01:55:00.000 It's actually kind of funny.
01:55:01.000 That's really funny.
01:55:01.000 And they jump on top of it, and they, they crap all over the place.
01:55:05.000 One of them apparently jumped out and tried getting away, but we were like...
01:55:08.000 You just let him go?
01:55:09.000 Go ahead.
01:55:10.000 You want to go to nature.
01:55:12.000 I mean, how could you stop them, right?
01:55:14.000 But we can have mock trials for the ones we're going to eat.
01:55:17.000 Maybe we can walk in and be like, come here, everybody.
01:55:20.000 Come here.
01:55:21.000 And the one that doesn't come is the one that gets eaten.
01:55:24.000 So we get to keep the personable ones.
01:55:26.000 Oh, JMK says, Joe Rogan said in his podcast today that he thought Tim Pool was crazy for thinking a civil war was coming.
01:55:33.000 And now he believes Tim may be right.
01:55:35.000 I listened to that.
01:55:36.000 Did he really?
01:55:37.000 I don't know.
01:55:39.000 Can someone tweet that to you?
01:55:41.000 Would you be able to see it?
01:55:42.000 Sure, yeah.
01:55:43.000 Tag me in it or something.
01:55:44.000 Yeah, tag Ian Crossland right on Twitter.
01:55:47.000 Tag Ian if you saw that clip.
01:55:49.000 I gotta tell you, it was 2019, I think, when I went on Rogan with Twitter, and I said at the end of it that if Twitter kept doing what they were doing with censorship, it was going to lead to civil chaos or conflict or something.
01:56:01.000 And I was like, that's why I'm building a van!
01:56:02.000 I can build in a van with all this equipment in it, and like, solar power, cause... Yo, I'm- I'm looking at what's going on.
01:56:09.000 And you know, honestly, I knew people would think that was crazy.
01:56:12.000 That I would say something like that, I'm gonna build a van and go live down by the river!
01:56:15.000 They're gonna be like, this dude's off his rocker.
01:56:17.000 And I wasn't- I- look.
01:56:19.000 I just say, I don't know exactly what's gonna happen.
01:56:22.000 But you look at where we are now, and if you don't think we are in the midst of historical tumult, then you are a frog boiling in a pot.
01:56:30.000 When did you say that?
01:56:31.000 2019?
01:56:32.000 In 2018 I said a civil war was coming.
01:56:34.000 In 2019 I was on Rogan when I said if Twitter keeps doing this.
01:56:38.000 It doesn't seem so crazy after 2020.
01:56:40.000 When there was armed guards outside Costco and I had to buy freaking diapers.
01:56:45.000 And then we had a formula shortage the next year.
01:56:47.000 I think that our trust in the stability of our civilization is sufficiently rocked by everything we saw in 2020.
01:56:56.000 That I don't think people would think you were crazy saying that now.
01:56:59.000 I think the writing's been on the walls for some people who pay a lot of attention to news and cultural shifts for a long time, but people don't want to hear it because they can't conceptualize what a civil war would look like.
01:57:11.000 That's like even now with everything that's going on with, you know, a recent attack on an FBI building, there's this question of like, is it starting?
01:57:18.000 Is this it?
01:57:19.000 Is this what we've been talking about?
01:57:21.000 Yeah, I said before the show.
01:57:22.000 Oh, what were you going to say?
01:57:24.000 Oh, I was just going to joke that my contractor thinks I'm crazy because I have a nice stockpile of food and diapers.
01:57:29.000 Before the show, I was like, no, Tim, I don't think it's civil war.
01:57:31.000 And I was like, you know what?
01:57:34.000 This is semantic.
01:57:35.000 It doesn't matter what we call it.
01:57:36.000 We're all aware of what's this chaos.
01:57:39.000 I think it's global, I think for sure.
01:57:42.000 That's undisputable at this point, that there's global corporate.
01:57:46.000 But it really doesn't matter how you term the thing.
01:57:49.000 The chaos is real.
01:57:50.000 The chaos is apparent, I believe.
01:57:52.000 Man, Raymond G. Stanley Jr.
01:57:55.000 says, Roberto Jr.
01:57:56.000 is the second best junior.
01:57:57.000 So we got Roberto Jr.
01:57:59.000 on this billboard in Times Square advertising Chicken City, and my favorite comment was, Roberto Jr.
01:58:04.000 is sitting atop a throne he did not create.
01:58:07.000 Because Roberto was the boss for a while.
01:58:09.000 But the thing is, Roberto's mean.
01:58:11.000 Roberto's a pretty mean guy.
01:58:15.000 And Roberto Jr.'s really nice.
01:58:17.000 Right.
01:58:17.000 Roberto's like a great warrior with low intelligence as a leader.
01:58:19.000 But Roberto Jr. would just like look at you and he like does his thing and then he flaps his wings and walks around.
01:58:23.000 But it's probably because we raised Roberto Jr.
01:58:26.000 From when he hatched.
01:58:27.000 Right.
01:58:27.000 And you know, and then Roberto we bought.
01:58:29.000 Roberto's like a great warrior with low intelligence as a leader.
01:58:33.000 Like you don't really want that guy as your leader.
01:58:35.000 But he was the first one.
01:58:37.000 It was like Alexander the Great's father.
01:58:39.000 What was his name?
01:58:41.000 The first Macedonian?
01:58:43.000 Something of Macedon.
01:58:44.000 I love how he looks around.
01:58:45.000 We're like, Philip of Macedon.
01:58:46.000 Someone knows.
01:58:47.000 You'll get this from one of us.
01:58:48.000 Philip of Macedon.
01:58:48.000 He was, I mean, he was also extremely intelligent and charismatic, but not like Alex.
01:58:52.000 I think part of it is, like, Roberto had a different set of circumstances.
01:58:55.000 He had to kind of make his way through life.
01:58:58.000 It was a much harder time, and now, you know, he has given his son this kingdom to rule, and it's, you know, it requires a different set of skills.
01:59:05.000 He's doing a good job, actually.
01:59:06.000 Roberto Jr.
01:59:07.000 is a good dude.
01:59:07.000 Yeah, he makes good music.
01:59:08.000 Matt Giese says, you guys remember when Alec Baldwin shot someone?
01:59:12.000 Pepperidge Farm remembers.
01:59:13.000 Yo, if someone's gonna play James Comey in a movie, it's Alec Baldwin.
01:59:16.000 Because last time I watched a video... No, but why do we let him play people in movies at all, you know?
01:59:20.000 Oh, that's true.
01:59:21.000 I feel like, you shoot someone on set.
01:59:23.000 I don't know that we, like, bring you back on, right?
01:59:25.000 Maybe?
01:59:25.000 Yeah, that's a good point.
01:59:26.000 He needs 10 years off.
01:59:27.000 I mean, the insurance premiums alone, I feel like.
01:59:30.000 Oh my gosh.
01:59:32.000 Mike the Dad Crosby says, oh, on Twitter it appears Trump stole the nuclear codes.
01:59:37.000 I hope they can change them now, and since it's too easy to guess, 4321.
01:59:40.000 Is that what you're saying on Twitter?
01:59:43.000 Password.
01:59:43.000 I think he stole some alien's information.
01:59:47.000 I think, I mean, they're out there.
01:59:51.000 I think he stole the information.
01:59:52.000 Wow.
01:59:52.000 You think he stole information about talking plasma?
01:59:56.000 Where they triangulate lasers and hit a point in the sky where it shows up on radar and they think it's a craft, but they just move around a dot?
02:00:02.000 I feel like I just set off the Ian Batt signal.
02:00:05.000 Let's talk about aliens.
02:00:07.000 You showed your true color.
02:00:08.000 I said the other night Zeta Reticuli wasn't real.
02:00:10.000 I kind of misinterpreted what I was thinking.
02:00:13.000 Aliens didn't really come from Zeta Reticuli.
02:00:16.000 That's just what they told Bob Lazar.
02:00:17.000 It's literally 959.
02:00:19.000 Oh, okay.
02:00:19.000 It is real.
02:00:20.000 You did this.
02:00:21.000 All right, Sparky says, Tim, don't you realize the feds are swatting you?
02:00:25.000 Why would I realize that?
02:00:27.000 Why would the feds be swatting me?
02:00:29.000 You know, I just, I don't understand this kind of, like, conspiracy logic, I guess.
02:00:35.000 What would be gained by swatting me when it has zero impact on us?
02:00:40.000 We had a credible threat, which did cause an evacuation, but... That was the night after I was here last.
02:00:46.000 Yeah, 40,000 people watched, and then Jeremy Hambly gave us a bunch of money.
02:00:50.000 And so I was like, eh, it kind of sucks, but you make the best of it.
02:00:53.000 We learned that Chercast was a viable option for the business.
02:00:56.000 And also, I need to stress that the properties that are being targeted are known specifically to a group of people that we're aware of, and so evidence does not indicate there is the Feds coming after us, to put it simply.
02:01:09.000 I can't give out too much information, but let me just say there's something called coloring the water.
02:01:14.000 Where you have three cups on a table, and there's a pool of water under all of them.
02:01:18.000 How do you figure out which cup is leaking?
02:01:20.000 You put red in one, green in one, and blue in the other.
02:01:23.000 And whatever color the water on the table turns, you know where the leak is in the cup.
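(Editor's aside: for readers who want the coloring-the-water idea spelled out, it is essentially a canary trap. The sketch below is a minimal illustration in Python, not anything from the show's actual investigation; the channel names and dye colors are hypothetical.)

```python
# Minimal sketch of the "coloring the water" / canary-trap idea described above.
# Each suspected channel (cup) gets its own unique marker (dye); whichever
# marker later shows up "on the table" identifies the leaking channel.
# Channel names and dyes are hypothetical, purely for illustration.

dye_by_channel = {
    "cup_one": "red",
    "cup_two": "green",
    "cup_three": "blue",
}

def find_leak(observed_dye):
    """Return the channel whose dye matches what leaked, or None if unknown."""
    for channel, dye in dye_by_channel.items():
        if dye == observed_dye:
            return channel
    return None  # the leak came from a channel we never marked

if __name__ == "__main__":
    # The water on the table turned green, so the leak is cup_two.
    print(find_leak("green"))  # -> cup_two
```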
02:01:27.000 To put it simply, the people who are committing the swattings have made a series of errors in thinking that traps were not set.
02:01:34.000 So...
02:01:35.000 We'll see what happens, but at this point, considering what I know of the investigation, I can say that much.
02:01:41.000 Because the deeds have already been done, and we've already ensnared enough information, we think.
02:01:46.000 So, we'll see how it goes.
02:01:47.000 We'll see what happens.
02:01:49.000 Let's grab a couple more.
02:01:52.000 What do we got here?
02:01:52.000 We'll grab one more.
02:01:54.000 Tyler W. says, To the gulag with you, Roberto, I banish you.
02:01:59.000 You know, I'm worried about old Roberto.
02:02:01.000 You know, I don't want any harm to befall him.
02:02:03.000 He's one of the original cast members of Chicken City.
02:02:05.000 But we just can't have a dad banging his daughters.
02:02:08.000 Yeah, that was the problem.
02:02:10.000 He went too far.
02:02:11.000 Well, that's what they do.
02:02:12.000 Chickens, you know.
02:02:13.000 So now we have, it's funny, we have his son Isaac.
02:02:17.000 Who is a Brahma Rhode Island Red mix.
02:02:20.000 And he's massive.
02:02:22.000 He's huge. He's gonna be so big. Brahma. Brahmas are so big.
02:02:25.000 And so he's already, you know, he jumped on a hen today, and the other hens ran up and started
02:02:31.000 pecking him to stop him.
02:02:33.000 They were like, get out of here. Like, yo, yeah, good.
02:02:36.000 Second wave feminism. Second wave.
02:02:38.000 All right, everybody, if you haven't already, would you kindly smash that like button,
02:02:44.000 subscribe to this channel, share the show with your friends and become a member at timcast.com.
02:02:48.000 We're going to have that uncensored episode coming up in about an hour or so.
02:02:51.000 You can follow the show at TimCast IRL.
02:02:53.000 You can follow me at TimCast.
02:02:55.000 Bethany, do you want to shout anything out?
02:02:56.000 Yeah, Bethany Shondark on Twitter and Instagram, and HeroesofLiberty.com for your children's book needs, and Deseret.com for my other thoughts.
02:03:06.000 We should get some of those books for the school I was talking about.
02:03:08.000 I was actually going to try to plug that, but I thought... Absolutely.
02:03:11.000 Yeah, I would love that.
02:03:11.000 No, yeah.
02:03:12.000 Right on.
02:03:13.000 Cool.
02:03:14.000 I'm Hannah Clare.
02:03:15.000 I'm a writer for timcast.com.
02:03:16.000 It's a very cool news site.
02:03:18.000 I recommend you check it daily.
02:03:20.000 You can follow me on Instagram at hannahclare.b.
02:03:24.000 I was on today's episode of Pop Culture Crisis.
02:03:26.000 So if you go to YouTube, you can check that out.
02:03:28.000 And you might see me a lot there next week.
02:03:31.000 Hi guys, Ian Crossland.
02:03:32.000 You know, it's easy to get things wrong when you talk a lot as your job in public.
02:03:35.000 So if I ever say anything that's factually inaccurate, please tweet it at me and hit my @, Ian Crossland, on Twitter or on Minds, so that I can attempt to correct the error on air live, like what I did about Zeta Reticuli earlier.
02:03:49.000 Happy to be here.
02:03:50.000 Always happy for the opportunity.
02:03:51.000 Bethany, great to see you again.
02:03:52.000 Bye, everyone.
02:03:53.000 Thank you guys for tuning in tonight with Bethany.
02:03:55.000 We always have a great time.
02:03:56.000 I'm loving the presence of more ladies.
02:03:58.000 I feel like this is definitely a trend I can get behind.
02:04:01.000 You guys can follow me on Twitter and Minds.com at SourPatchLyds, as well as SourPatchLyds.me.
02:04:06.000 We will see you all over at TimCast.com.
02:04:08.000 Thanks for hanging out.