Timcast IRL - Tim Pool - February 17, 2023


Timcast IRL - Biden May Have ACCIDENTALLY Shot Down Hobbyist Balloon With F-22 w/ Sara Higdon


Episode Stats

Length

2 hours and 4 minutes

Words per Minute

215.7

Word Count

26,850

Sentence Count

2,284

Misogynist Sentences

45

Hate Speech Sentences

29


Summary

While all of us are spinning around in circles about UFOs, Joe Biden and his F-22s may have accidentally shot down a hobbyist balloon. Plus, a school is trying to transition all of its fifth graders to be queer.


Transcript

00:00:00.000 So we were having a hard time figuring out what the lead story was today because the
00:00:22.000 CEO of YouTube is stepping down and I'm like, that's big news.
00:00:26.000 But at the same time, a local hobbyist balloon club believes, or I should say it is being insinuated, their balloon was shot down by Joe Biden and an F-22.
00:00:36.000 And so the story that Joe Biden may have accidentally shot down some small hobby group's balloon, panicking about UFOs, is just really, really funny.
00:00:45.000 So we decided to go with that one instead.
00:00:47.000 But at the same time, I wonder, I mean, if they deployed F-22s to shoot down what was just some hobbyist balloon because they were panicking, it makes us look really dumb.
00:00:56.000 But I also have to wonder if they're intentionally distracting us and drumming up some other news story about UFOs to keep our minds off of, say, I don't know, like a gigantic toxic chemical spill, which is going to pollute the water for 5 million people in the immediate and then probably pollute the farmland and the water for tens of millions of more in the coming weeks.
00:01:13.000 Or, I don't know, maybe Biden's just so incompetent, they saw a hobbyist balloon, panicked, thought it was China, and blew it up.
00:01:19.000 Well, today's gonna be fun.
00:01:20.000 Before we get started talking about all of that, head over to TimCast.com to become a member and support our work directly.
00:01:26.000 As a member, you'll get access to exclusive members-only segments in the TimCast IRL show.
00:01:30.000 That's tonight at 11pm.
00:01:31.000 We'll have that up and it should be a lot of fun.
00:01:33.000 Last night with Jimmy Dore was off the hook.
00:01:35.000 Jimmy just went off, and when he gets into that groove, he starts talking about what's pissing him off.
00:01:41.000 Yeah, it's top-tier content, my friends, so head over to TimCast.com.
00:01:46.000 And smash that like button, subscribe to this channel, share the show with your friends.
00:01:50.000 Joining us tonight to talk about this and a lot more is Sara Higdon.
00:01:54.000 Thanks for having me, Tim.
00:01:55.000 So yeah, I'm Sara Higdon.
00:01:57.000 I'm a content creator.
00:01:58.000 I have a YouTube channel.
00:02:00.000 I'm also a contributor over at Gays Against Groomers.
00:02:03.000 I'm an ambassador for Outspoken USA, and I'm the assistant editor for Reality's Last Stand.
00:02:10.000 I also do some freelance writing for the Postmillennial and Human Events.
00:02:15.000 And I've basically just been traveling the country lately doing speaking events with some of the mom organizations that are trying to end queer theory in our schools.
00:02:23.000 Wow, right on.
00:02:23.000 We actually, we do have another story about that.
00:02:25.000 There's like a school, I guess, where they tried to transition all of the fifth grade girls.
00:02:30.000 Oh my goodness.
00:02:30.000 Have you seen this?
00:02:31.000 No.
00:02:32.000 We'll get into it.
00:02:33.000 We sometimes get into the story too early.
00:02:33.000 We won't do the story.
00:02:35.000 But thanks for hanging out.
00:02:36.000 We also got Libby.
00:02:36.000 This should be fun.
00:02:37.000 She's back.
00:02:38.000 I'm back.
00:02:38.000 I'm back, everybody.
00:02:39.000 Libby Emmons with the Postmillennial.
00:02:41.000 Glad to be here.
00:02:42.000 Glad to be here with you, Sara.
00:02:43.000 Yes.
00:02:44.000 I think it's been a while since we were, I think we got trashed in Atlanta?
00:02:44.000 Good to be here.
00:02:48.000 Yeah, that was fun.
00:02:48.000 Yeah.
00:02:50.000 What's up, everybody?
00:02:51.000 Ian Crossland, happy to be here.
00:02:53.000 If you haven't seen the new Cast Castle on YouTube, you're going to want to check it out.
00:02:56.000 It's a clip of the actual show on TimCast.com.
00:02:58.000 It was very, very funny.
00:03:00.000 It is pretty good.
00:03:01.000 It's received quite raucous reviews.
00:03:04.000 I'm very happy.
00:03:05.000 It is poking fun at Steven Crowder and the Daily Wire.
00:03:08.000 All of them, everybody.
00:03:09.000 Nothing's sacred.
00:03:10.000 Nothing's sacred.
00:03:11.000 What's happening, Serge?
00:03:12.000 Yo, I am @SergeDotCom.
00:03:14.000 I'm just hanging out, ready to go.
00:03:15.000 All right, let's jump into this first story.
00:03:17.000 Let's not waste any time.
00:03:18.000 This is from Aviation Week Network.
00:03:21.000 Okay, so they got the scoop.
00:03:23.000 While all of us are spinning around in circles being like, UFOs, what's going on?
00:03:27.000 Joe Biden's freaking out.
00:03:29.000 They're just deploying F-22s.
00:03:30.000 They're shooting down unidentified flying objects.
00:03:33.000 And here's my favorite part.
00:03:34.000 The pilot said it had no observable propulsion system.
00:03:37.000 Because balloons don't!
00:03:39.000 I said that, I swear, you watch my segment, I said, when they say there's no propulsion system, it could be
00:03:44.000 because it's a balloon.
00:03:45.000 Like, there's not gonna be a jet on a balloon or anything like that.
00:03:47.000 So they see this big object and they're like, I wonder how it's flying, and they shoot it down.
00:03:51.000 So here's the story.
00:03:52.000 Hobby Club's missing balloon feared shot down.
00:03:56.000 I'll give you the gist of it.
00:03:57.000 The general idea is, the Northern Illinois Bottlecap Balloon Brigade, NIBBB,
00:04:03.000 is not pointing fingers yet, but the circumstantial evidence is at least intriguing.
00:04:07.000 The club's silver-coated party-style Pico balloon reported its last position on February 10th at 38,910 feet off the west coast of Alaska, and a popular forecasting tool, the HYSPLIT model provided by NOAA, projected the cylindrically shaped object would be floating high over the central part of the Yukon territory on February 11th.
00:04:27.000 That's the same day a Lockheed Martin F-22 shot down an unidentified object of a similar description and altitude in the same general area.
00:04:35.000 Okay.
00:04:36.000 I'm sorry, China's laughing at us right now.
00:04:39.000 Yeah, I think it's cool, though, that there's hobby balloon clubs.
00:04:42.000 This balloon apparently was on its seventh trip around the world, so that's kind of cool for the balloon.
00:04:48.000 It's a shame that it's not there.
00:04:50.000 It is absurd, though, that we used perhaps our highest-end Air Force tools to shoot down a balloon.
00:05:01.000 Well, I mean, look, everybody was ragging on Biden over the Chinese balloon.
00:05:05.000 So they started panicking.
00:05:06.000 Now, then they started saying they think these objects could be anything, from, like, used car balloons or whatever.
00:05:12.000 And it's just like, oh, man.
00:05:13.000 Biden also said today at a press conference, he said that he has no regrets about shooting down the balloons.
00:05:21.000 Of course he doesn't.
00:05:22.000 Spending $400,000 on a missile to shoot down a hobby balloon?
00:05:26.000 Doesn't make any sense.
00:05:27.000 Is that how much the missiles cost?
00:05:28.000 I think that's what I read.
00:05:32.000 Half a million dollars because they panicked over this balloon.
00:05:35.000 Three of them.
00:05:36.000 Three.
00:05:37.000 Three balloons.
00:05:37.000 We are silly people.
00:05:38.000 Well, how big are these balloons?
00:05:40.000 That doesn't look like that big of a balloon.
00:05:42.000 No, that's a plastic bag.
00:05:43.000 They said it's like three cars or something like that.
00:05:46.000 That's a Pico balloon.
00:05:47.000 That's what that is.
00:05:48.000 That's a Pico balloon.
00:05:49.000 That's actually it?
00:05:50.000 That's a Pico balloon.
00:05:51.000 Yeah, the first thing was the size of three school buses.
00:05:54.000 The thing they didn't shoot down right away.
00:05:57.000 This is the threat right here.
00:06:00.000 I want you to look at this dastardly group.
00:06:03.000 Look at this woman.
00:06:04.000 This girl's wearing a suit.
00:06:06.000 That's how you know they're dangerous.
00:06:10.000 It would be a scene in a comedy movie where it shows the dejected hobbyists after they get shot down and they're all real sad about it.
00:06:17.000 I'm picturing those guys in the X-Files, the three guys, the researchers.
00:06:23.000 Oh, what was it?
00:06:23.000 The Lone Gunmen.
00:06:24.000 Yeah, yeah, yeah.
00:06:24.000 I'm imagining those guys.
00:06:25.000 Yeah, that kind of disappointment.
00:06:27.000 Exactly.
00:06:28.000 The Lone Gunmen, that's clever.
00:06:29.000 So what, now do our tax dollars have to pay these... well, we probably should pay them back for their balloon, I mean.
00:06:34.000 So what, it's gonna cost us half a million dollars, five hundred and one thousand dollars now?
00:06:39.000 This blunder by Joe Biden?
00:06:41.000 Maybe Biden should cover the cost.
00:06:42.000 What I wanna know is, there was the other one where the pilot actually missed.
00:06:46.000 That's right!
00:06:47.000 It's a million dollars!
00:06:47.000 That was a million dollar shot.
00:06:50.000 Was that the same type of balloon though?
00:06:51.000 Was that a real balloon?
00:06:53.000 Yo, could you just imagine they're seeing this silver cylinder and it's just a balloon and they're like, ahh!
00:06:59.000 And the missile misses.
00:07:00.000 Miss, miss, miss!
00:07:01.000 And they're like, taking it very seriously.
00:07:03.000 Yeah we are silly people.
00:07:05.000 Do you think this could be a military exercise though too?
00:07:07.000 Like, I would have thought, if we were going to do this type of stuff, they would have at least come out and told us, like, you know, maybe we're preparing for China to send an EMP or something over. But then, you know, it would cause issues, and so they went with this alien theme. But I could see them trying to lie, because usually the lie is better than the truth.
00:07:28.000 They were doing something over D.C.
00:07:30.000 the other day, I guess.
00:07:31.000 Everybody was freaking out about that.
00:07:32.000 Well, remember back in 2014 or something when the 82nd jumped into Texas and everybody thought that the federal government was invading Texas?
00:07:39.000 It's kind of the same type of stuff.
00:07:40.000 There was this thing that happened a few years ago where a bunch of people started on Twitter, started posting photos of military vehicles and videos of trains transporting tanks.
00:07:51.000 And they all started saying, like, whoa, what's happening?
00:07:54.000 Everybody was in on it.
00:07:55.000 And then journalists started seeing all these videos and thinking something was really happening, started actually writing up stories about this.
00:08:01.000 And it was nothing?
00:08:02.000 They're going to JRTC or something for military training.
00:08:05.000 Well, no, no, it was like people took random images.
00:08:09.000 So they take a random image of a field with a guy standing and they'd be like, whoa, new image out of this city or whatever.
00:08:13.000 And someone's like, whoa, now we're hearing that the police are showing up, and puts up a picture of a squad car.
00:08:17.000 And then someone would be like, the military is being brought in and you see helicopters.
00:08:20.000 And they're all just different random videos from different time periods.
00:08:23.000 But people were claiming were from one moment.
00:08:26.000 That's like that Evelyn Waugh novel, Scoop.
00:08:29.000 Where all the journalists get sent to the war zone and, you know, the main character gets there and there's no war.
00:08:35.000 There's just no war at all.
00:08:37.000 Well, it's like all the deepfake stuff that's happening, too.
00:08:39.000 Or in Wag the Dog, when they're like, oh, we're gonna have a war with Albania to cover up that the president slept with a Firefly girl.
00:08:48.000 I was, you know, like the other day, I was pretty scared about the AI stuff, you know, and I'm like, Oh, man.
00:08:54.000 And then just before this show, I saw one of the best AI deep fakes I've ever seen.
00:08:58.000 And I saw and now I'm really excited.
00:09:01.000 Because it was Donald Trump and Joe Biden playing Overwatch together.
00:09:04.000 Oh my goodness.
00:09:05.000 Is that available?
00:09:06.000 I want to watch that.
00:09:06.000 Yeah, it's on Political Compass Memes on Instagram.
00:09:09.000 I should pull it up.
00:09:10.000 I don't know if I can pull it up, because it's going to... Let me see if I can get it.
00:09:13.000 And there's a lot of swearing in it, so just, you know... But it's too good.
00:09:18.000 It was too good.
00:09:20.000 There's a lot of swearing in it, so let me see if I can get it.
00:09:23.000 We were talking about this at a meeting earlier.
00:09:25.000 This is... Oh, there it is.
00:09:26.000 There it is.
00:09:26.000 All right.
00:09:28.000 You know, you're... Cover your kids' ears, because we're playing this one.
00:09:32.000 I was playing Overwatch.
00:09:33.000 I like Overwatch, though.
00:09:34.000 Hold on, we gotta fix the audio.
00:09:37.000 Yo, this is awesome.
00:09:39.000 I love deepfake now.
00:09:40.000 I love this map, takes me back.
00:09:42.000 Is that you again, Joe, on my fucking team?
00:09:44.000 It's this guy.
00:09:44.000 Oh, GG, we lost.
00:09:45.000 This is my rank-up game, too.
00:09:47.000 We are not even out of spawn doors, and this guy is already complaining.
00:09:49.000 Someone dodge, please.
00:09:51.000 You fucking need to log in again, and you're gonna feed again.
00:09:53.000 How many accounts do I have to keep fucking avoiding?
00:09:55.000 You are not beating the hard-stuck Masters allegations, Don.
00:09:58.000 That is so cap Joe.
00:10:00.000 Don't care.
00:10:00.000 Don't care.
00:10:00.000 I went golfing, hit a few holes in one, wanted to solo queue some Overwatch to end the day,
00:10:05.000 and I see fucking, fucking Bidenator in my lobby.
00:10:08.000 Bidenator!
00:10:09.000 Can't wait until Biden fixes matchmaking, Jesus fucking Christ.
00:10:12.000 Yo, that was the best!
00:10:14.000 It's the reality we need.
00:10:15.000 I saw that, like, right as we were getting ready to do this show, when did they post this?
00:10:19.000 They posted it an hour ago.
00:10:20.000 It's like Kang and Kodos.
00:10:22.000 Yeah.
00:10:22.000 From The Simpsons.
00:10:23.000 I mean, this is one way political.
00:10:24.000 Swirling toward... I was golfing, I wanted to end the day with some quick queues in Overwatch, and then I see Bidenator in my chat.
00:10:31.000 We were saying today at a meeting earlier, our development meeting, one of the devs, Tony, was saying, like, what we really need is to see a deepfake that's terrifying.
00:10:40.000 One where everyone, like, basically realizes the horror that this could wreak on us.
00:10:45.000 I mean, not that we need to do that to people, but I think we need something like that in order for us to realize how dangerous this technology is.
00:10:53.000 So you're saying?
00:10:54.000 What would be the most horrifying deepfake?
00:10:56.000 War of the Worlds kind of freaked people out, that Orson Welles thing, people thought we were really being invaded by aliens.
00:11:02.000 I disagree.
00:11:02.000 The typical mistake people make with PR campaigns and things like this is that they think shock content, alien invasion, World War III, people won't believe it.
00:11:13.000 It's too out of the ordinary.
00:11:15.000 It's got to be something very simple, like Joe Biden giving a speech. In order to get people to understand it's scary and actually get them to believe it, it's got to be something about maybe a banking crisis.
00:11:29.000 Or like a city's been wiped off the map or something?
00:11:31.000 No one will believe that.
00:11:31.000 Nuclear strike?
00:11:32.000 No one will believe it.
00:11:33.000 People are going to be like, what?
00:11:34.000 And they're going to try and look it up.
00:11:36.000 But if it's a video of Joe Biden saying something like, we're growing deeply concerned with the rising inflation rates, which are now expected to reach 11% by the next month.
00:11:44.000 But the Federal Reserve has given us their word, they will lower the rates and try and get these inflation rates down.
00:11:51.000 People will then start freaking out if Biden says something like, we are worried, but please remain calm, that the cost of basic goods could reach upwards of $10 to $20 for things like a gallon of milk.
00:12:01.000 That's the kind of thing that people would see and be like, it would freak out the average middle class person, be believable enough to where they go, wait a minute, that was fake?
00:12:09.000 Holy crap.
00:12:10.000 I know that would definitely wreck the economy.
00:12:12.000 It could potentially wreck the economy for a day.
00:12:14.000 People will sell stocks, could destroy the stock market.
00:12:16.000 But to really scare people, I don't know if that would really scare people.
00:12:20.000 I think the scariest stuff is actually the revenge porn stuff that's happening, the deepfake porn that's putting people's bodies into these films that they didn't consent to.
00:12:28.000 Oh, like a porn of Joe Biden?
00:12:29.000 That would be really scary.
00:12:31.000 I disagree.
00:12:35.000 That stuff exists, you know it's not you and it's not real?
00:12:39.000 I don't know.
00:12:40.000 I can understand, it's shocking to people to have that happen to them for sure, but... I see what you're saying about it being something super normal, like something that could definitely be real and is in fact not real.
00:12:52.000 Yeah, exactly.
00:12:52.000 I mean, so it's like, imagine the information that's been coming out over the past year, when things like bacon and milk have gone up, and eggs have gone up, and all of that. It sort of gets ignored by a lot of the press, but those things really do have an impact on people.
00:13:09.000 That would be interesting.
00:13:10.000 Or what if he was, you know, what if Biden makes the announcement that we're definitely going to be sending troops into Ukraine?
00:13:16.000 Yes.
00:13:16.000 I was going to say, it's believable, but terrifying.
00:13:19.000 It's very believable.
00:13:20.000 And then you'd have everybody being like, you know, it was only in, what was it, March when Biden said there'd be no troops on the ground.
00:13:28.000 He said that very definitively.
00:13:29.000 He also said there'd be no tanks.
00:13:30.000 He also said there'd be no fighter jets.
00:13:32.000 What's next?
00:13:33.000 And that would that would set off an international firestorm.
00:13:36.000 If there was a video where it's Biden giving a speech, and he said something like, Russian artillery has struck the border of Poland, triggering Article 5.
00:13:48.000 I, as the President of the United States, am left with no choice but to deploy US forces to assist our partners in NATO in the war effort against Russia.
00:13:57.000 Make no mistake.
00:13:58.000 By the time the White House said anything?
00:14:00.000 To the American people, we are at war.
00:14:02.000 World War III has begun.
00:14:04.000 You can't go too much.
00:14:06.000 Like if he said Russia nuked a city or something, people would be like, eh, I would.
00:14:10.000 People did freak out.
00:14:12.000 We already know that there was the reporting that a missile hit Poland.
00:14:16.000 And everyone freaked out.
00:14:18.000 Turns out it was Ukrainian artillery that misfired or whatever, or crashed.
00:14:22.000 So that's the mid-range level where people will start panicking, the economy will get disrupted to a certain degree, and then they'll have to come out, and the White House would have to issue a statement saying it's not true, but even then you're gonna have people being like, I've got two videos, which one's real?
00:14:36.000 That's the scary thing.
00:14:37.000 Yeah, and you just wouldn't know.
00:14:39.000 And then the impact on international leaders... Yeah, I mean, at that same point in time, it would come out, and Putin might be like, oh, now I'm going to preemptively strike.
00:14:50.000 Yeah, I know that.
00:14:51.000 And he sends a nuke preemptively.
00:14:52.000 And you're right, it does play into your, that is probably the scariest scenario that you could have, because it is believable, and it plays into your... It has to be on the border of unbelievable, where it's terrifying and may be true.
00:15:05.000 Like a meteor coming towards Earth or something?
00:15:07.000 That's completely unbelievable, nobody would believe it.
00:15:10.000 If it was coming out of Biden's mouth, you don't think half the country would buy it?
00:15:13.000 Plus NASA already did their thing where they blow up asteroids.
00:15:15.000 Did you see that?
00:15:16.000 If there's a video of Biden saying a meteor is coming and it's going to destroy the earth or slam into a city and
00:15:21.000 wipe it out, people are going to go, what is this? And they're going to
00:15:24.000 go to Google right away and say it's fake.
00:15:25.000 And it's going to say, don't look up.
00:15:27.000 But if it's a video of Biden saying something like a Russian artillery strike, we believe, has struck Polish
00:15:34.000 territory, triggering Article 5.
00:15:36.000 Poland has requested that the U.S.
00:15:38.000 deploy assets immediately to the area for defense.
00:15:40.000 People are going to see that and go, dude, dude, dude, they're going to share it with their friends.
00:15:43.000 And they're going to be like, look at this video, look at this video, because it's believable.
00:15:46.000 And then the Russian cabinet will be sharing it amongst themselves.
00:15:49.000 And that's where it becomes really scary: does the opposing party also believe it, and could they trigger a war? Which I don't want to happen.
00:15:55.000 Do our allies do it?
00:15:56.000 Does the Russian military say, you know, President Putin, this deepfake is going around claiming
00:16:02.000 that, you know, Russian artillery hit and that the US is going to be deploying troops.
00:16:10.000 I say we pretend we think it's real and deploy troops and use this as a casus belli.
00:16:10.000 Yeah.
00:16:11.000 Wow.
00:16:12.000 Trying to figure it out.
00:16:13.000 Totally fun, huh?
00:16:14.000 Everybody, when that happens.
00:16:15.000 Or, or, or, or, or, sorry, sorry.
00:16:19.000 Someone can make a video of Vladimir Putin playing Overwatch with Biden next.
00:16:23.000 Yeah, for sure.
00:16:24.000 Definitely Kang and Kodos.
00:16:25.000 We were trying to figure out how to prevent this deepfake confusion earlier.
00:16:29.000 Another one of the developers, Alex, was like, you need to watermark your videos when they go up from now on.
00:16:34.000 The future is kind of like locking your door at night.
00:16:36.000 You can't expect the government to lock your door for you.
00:16:38.000 You can't expect other people to know if your video is real or not, or if it's a deepfake of you.
00:16:42.000 So you've got to somehow prove it.
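
One way to make that "prove it yourself" idea concrete is to publish a cryptographic fingerprint for every upload, so a clip can later be checked against it. This is a minimal sketch, assuming a simple keyed-hash scheme; the key and function names here are illustrative, and a real system would use public-key signatures (so viewers can verify without the secret) or a standard like C2PA:

```python
import hashlib
import hmac

# Hypothetical creator secret; a real deployment would use public-key
# signatures so anyone can verify without holding this key.
CREATOR_KEY = b"replace-with-a-real-secret"

def fingerprint_video(path: str) -> str:
    """SHA-256 the raw file, then tag the digest with the creator's key."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
            digest.update(chunk)
    return hmac.new(CREATOR_KEY, digest.digest(), hashlib.sha256).hexdigest()

def verify_video(path: str, published_tag: str) -> bool:
    """Re-derive the tag and compare; any edit to the file breaks the match."""
    return hmac.compare_digest(fingerprint_video(path), published_tag)
```

The obvious limitation: any re-encode or trim, even a legitimate one, changes the bytes and breaks the fingerprint, which is where the next question comes in.
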
00:16:45.000 But then I'm like, what about fair use?
00:16:46.000 Like, how did they prove that it's... Well, some developers will use, like, certain stuff.
00:16:50.000 I was reading an article before where it's in, like, movies and documentaries where they did, like, the whole documentary, and whenever it was, like, the AI generated, they would put, like, a halo around the person that they were AI generating.
00:17:02.000 So the audience subtly knew.
00:17:04.000 But then the conversation was, do we want that? You know, that pulls you out of this immersive experience.
00:17:12.000 So what's the trade-off of having, you know, in a movie or something like that where you're deepfaking?
00:17:16.000 There's a trade-off there somewhere.
00:17:18.000 Someone chatted, super chatted, that, because reasons, deepfakes won't exist in five years.
00:17:23.000 You'll just be able, it's going so quickly, you'll be able to make any kind of content you want.
00:17:27.000 But that's literally deepfakes, dude.
00:17:30.000 You're saying like you can just tell it to make whatever you want.
00:17:33.000 We already talked about this a couple weeks ago.
00:17:34.000 There's an ad I saw on Twitter.
00:17:37.000 for an AI video generating service, meaning it's an editing software, and you'll type in
00:17:42.000 slow pan in forest at night, and then it renders and then gives you a video showing trees and the
00:17:48.000 camera panning through a forest. It's crazy. It's like when ChatGPT was writing stories.
00:17:53.000 But video. But video, which is even sicker. Now imagine once it can do people and voices all in
00:17:59.000 one, and you just type in, give me a video of an action, a superhero fighting a supervillain
00:18:05.000 in a city, and then it renders it and gives you a 30 second clip. You could then be like,
00:18:09.000 make a video of Joe Biden declaring war on Russia, and it would be indistinguishable
00:18:14.000 Well, and then when you combine that with VR, suddenly no one has to exist in reality ever again.
00:18:22.000 Will we ever have movies or anything like like that again? Because you can generate movies just
00:18:26.000 digitally? Yes, but what I think it'll be like is there will be user-generated movies and people are
00:18:33.000 going to say, oh dude, did you see that new movie from Ian? It's really good. Go to his
00:18:36.000 profile. He's got a Patreon. He makes movies.
00:18:39.000 And you'll literally just, you write out the treatment for a script, AI generate it, go in
00:18:45.000 and fine-tune some of the points in the movie that you think aren't that good.
00:18:49.000 About a week's worth of work, and you've got a full-length Marvel movie.
00:18:51.000 Dude, there could be humans that, with a neural net, go into a hyperbaric chamber, and all they do is think about movies that are constantly being AI-created for the public, and they're just in a cocoon, where they're just generating thought.
00:19:04.000 The precogs.
00:19:05.000 Wow.
00:19:05.000 They'll strap themselves in and predict the future for us.
00:19:08.000 Wow.
00:19:09.000 Yeah, what were we talking about?
00:19:10.000 We were talking about Joe Biden shooting up kids' balloons.
00:19:16.000 I think that should be the first AI-generated film.
00:19:21.000 Like a kid's balloon goes up in the sky?
00:19:22.000 No, no, no.
00:19:23.000 The first one should be the people who are creating the AI-generated films.
00:19:27.000 It should be like super meta.
00:19:32.000 At the end, you find out they created the movie that you're watching about them creating the movie.
00:19:34.000 I think the balloons is a distraction.
00:19:37.000 Because I still haven't heard Biden mention East Palestine. Have you guys heard him mention it yet?
00:19:41.000 No, but FEMA denied claims for emergency relief because people's homes weren't destroyed by the toxic chemicals.
00:19:45.000 Yet.
00:19:49.000 Okay, well, their chickens died.
00:19:50.000 Their chickens died.
00:19:52.000 Did you see the river?
00:19:52.000 Did you see J.D.
00:19:53.000 Vance out there with the stick in the water?
00:19:55.000 Like he put a stick in this little stream and scraped the bottom of it.
00:20:01.000 I think I think Jack Posobiec.
00:20:03.000 Oh yeah, I just retweeted that.
00:20:04.000 Yeah.
00:20:05.000 And he scraped it up and it was just like rainbow chemicals in the water.
00:20:09.000 I think they threw a rock in the river.
00:20:11.000 They threw something in the water and then it starts to like, I don't know if it's bubbling.
00:20:14.000 Oh, let's pull this up.
00:20:16.000 We got this clip here.
00:20:17.000 Jack Posobiec, what the funk?
00:20:19.000 Has anyone seen water do this?
00:20:21.000 So let's start this clip over and we'll play it.
00:20:23.000 This is East Palestine, Ohio.
00:20:29.000 Whoa, wait.
00:20:30.000 Whoa.
00:20:30.000 Wow.
00:20:32.000 It's all in the bottom of the creek bed.
00:20:36.000 Yeah.
00:20:37.000 That's what oil does, right?
00:20:38.000 Yeah.
00:20:39.000 So when the rock hits the water, it knocks all the chemicals up to the surface.
00:20:43.000 It's all in the bottom.
00:20:43.000 The difference is oil is on top of the water.
00:20:45.000 So it's heavy.
00:20:46.000 It's heavy.
00:20:47.000 Well, it could be.
00:20:48.000 It's a group.
00:20:48.000 They had a group of chemicals, these trains.
00:20:49.000 There's three of them, main ones.
00:20:51.000 There's vinyl chloride.
00:20:52.000 There's something called butyl acrylate.
00:20:53.000 And then there's another thing called benzene, which is extremely dangerous to burn.
00:20:57.000 They said that the benzene was only residual benzene, but this is like, they're not gonna tell you
00:21:02.000 if there's a ton of benzene that got burned.
00:21:03.000 It creates dioxin, which is a persistent toxin.
00:21:07.000 It's the only one of those that's persistent, is the dioxin.
00:21:09.000 So really, the vinyl chloride has a half-life of like 2.3 days.
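
A half-life is just repeated halving: after t days, the remaining fraction is 0.5^(t / 2.3). A quick sketch using the 2.3-day figure quoted here (the figure is the speakers' claim, not independently verified):

```python
def fraction_remaining(days: float, half_life_days: float = 2.3) -> float:
    """Fraction of the original contaminant left after `days` of decay."""
    return 0.5 ** (days / half_life_days)

# Consistent with the "two or three weeks" mentioned a few lines later:
for d in (2.3, 7, 14, 21):
    print(f"day {d:>4}: {fraction_remaining(d):.1%} remaining")
# day 2.3: 50.0%, day 7: ~12%, day 14: ~1.5%, day 21: ~0.2%
```
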
00:21:12.000 Wasn't there something about how the vinyl chloride was creating hydrochloric acid?
00:21:16.000 I saw something like that.
00:21:17.000 I've heard that if it mixes with water, it can create hydrochloric acid.
00:21:20.000 But apparently, the vinyl chloride and the butyl acrylate are not that big of a deal.
00:21:24.000 After, you know, two or three weeks, they start to work their way out of the environment.
00:21:27.000 It's a lot of other stuff that's a problem.
00:21:29.000 It turns out it was good that they burned those things because otherwise they go up
00:21:32.000 and they come back down and coat stuff.
00:21:34.000 So that stuff was used, but it's the benzene.
00:21:36.000 And we need to know more about how much benzene was on those trains.
00:21:39.000 They just tell us it was benzene residue and that may be from previous shipments or maybe
00:21:43.000 there's a little bit, but I've got a feeling I wouldn't be surprised if the official reports
00:21:47.000 are not true.
00:21:48.000 Like, where's Joe Biden right now?
00:21:49.000 Well, they don't want to admit that, you know, a small town mayor that was only hired for
00:21:54.000 one single reason, you know, isn't doing a good job.
00:21:59.000 The Secretary of Transportation.
00:22:00.000 The booty judge.
00:22:03.000 I talked to my mom, who lives in Ohio, they're near Akron, Ohio, so they're about 70 miles west of this, that the federal government offered assistance and the governor of Ohio declined, said they didn't want help.
00:22:13.000 I don't know if it's true.
00:22:14.000 This is just what I was told by one of my parents.
00:22:16.000 Mike DeWine.
00:22:17.000 Mike DeWine, the governor.
00:22:18.000 Mike DeWine had a presser yesterday and he was saying that, what was it, Norfolk Southern said that they would pay for everything.
00:22:25.000 I thought they were offering people like a thousand bucks to shut up.
00:22:28.000 There were, I don't know, I didn't hear that.
00:22:29.000 I know that they were offering, there was the town hall that they had last night, they were offering people to move, to like pay for them to move.
00:22:36.000 What does that mean?
00:22:37.000 Buying your house?
00:22:38.000 I guess.
00:22:38.000 I don't really understand.
00:22:39.000 You do not want to be breathing in dioxins, like those, that is...
00:22:43.000 Yeah, I mean there's something like less than 5,000 people in that town.
00:22:48.000 Right now or before?
00:22:50.000 It's a tiny town.
00:22:51.000 But it's not even about that.
00:22:52.000 It's about the Mississippi River.
00:22:54.000 It's about the whole region.
00:22:55.000 It's about the Ohio River Basin.
00:22:57.000 West Virginia, the majority of West Virginia is going to get hit by this.
00:23:01.000 We're lucky we're not.
00:23:02.000 We're like in this over... From what I've been reading... You're upriver, right?
00:23:05.000 So you guys won't have to deal with the water issues.
00:23:07.000 I think we're not connected to it.
00:23:08.000 Oh, okay.
00:23:09.000 I mean, they're all connected in some fashion, but like, it doesn't flow in to us.
00:23:13.000 Cause I know that even the charts show where I'm at in Atlanta, like North Georgia, north of me is affected, but we're not.
00:23:19.000 Wow.
00:23:19.000 Yeah.
00:23:19.000 There's a Twitter account called General underscore JWJ.
00:23:24.000 Just retweet whatever is on there.
00:23:25.000 I did actually.
00:23:26.000 I retweeted it last night.
00:23:27.000 The character name is Lanius on it and it's a long thread and he breaks down, he or she breaks down basically things I've been talking about, the three different chemicals involved, the half-life of the different chemicals, and it seems to suggest that... Oh, there's the map, yeah.
00:23:41.000 Oh, that's where I saw the map.
00:23:42.000 I saw the map on your feed.
00:23:44.000 And I don't know if that map, I mean, that's just a map, that's the Ohio River, and it's like, if it gets contaminated, that area could be destroyed or endangered.
00:23:50.000 I liked your response to the government.
00:23:52.000 Yeah, the government says, not accurately, "potentially contaminated drinking water," and you put, "This is unconfirmed by official or government sources. Don't wait for confirmation."
00:24:00.000 Yeah, this is, take care of yourself first, and then if, they're not going to come tell you if you're about to die, like, you know, they don't want to create panic.
00:24:06.000 What was the account you retweeted that I'm looking for?
00:24:08.000 It's called Lanius, L-A-N-I-U-S.
00:24:10.000 It's from last night, so it'd be, yeah, there it is.
00:24:12.000 This is a pretty cool thread.
00:24:13.000 Everyone, if you get a chance, check it out.
00:24:15.000 It's got a lot of data.
00:24:15.000 I don't know.
00:24:16.000 I have not been able to confirm or deny the accuracy of it, but it's pretty thorough.
00:24:20.000 And the thread kind of intimates that the water is not in danger.
00:24:25.000 It's not really the water.
00:24:25.000 It's the surrounding air.
00:24:27.000 Well, and that's where this, DC and everything, it's all upwind of that.
00:24:31.000 So it's all, everything's going to move east.
00:24:33.000 So it should be affected.
00:24:34.000 Yeah.
00:24:36.000 But you can't shoot these out of the sky.
00:24:37.000 You can't shoot it.
00:24:39.000 And I also think they're not too concerned about it going far away from the source, that it's just like a local heavy pollutant.
00:24:49.000 But then it's like, if there's a lot of dioxin in the air, it's a different story.
00:24:52.000 That stuff doesn't go away.
00:24:53.000 And I know people that live in that.
00:24:54.000 See, that area has been brutalized by lung and health issues for a long time because it's steel city and it's coal area.
00:25:00.000 So they have had these types of health issues and now they're going to have just another one added onto it.
00:25:05.000 Yep.
00:25:07.000 This is why they ship jobs to China, by the way.
00:25:08.000 This is a big one. It's economic, but this is one of the main reasons, because if there's a spill, they want it over there.
00:25:12.000 That's a huge reason why Trump was removing environmental regulations and protections.
00:25:18.000 The idea is that we don't want these toxic chemicals and pollutants in our air, so make
00:25:23.000 China do it where they're smog filled and polluted to crap, and then we have clean skies
00:25:27.000 and use the petrodollar to maintain our economy. Trump wanted to bring the factories back,
00:25:32.000 so he said in order to do so, you got to lower their taxes and you've got to reduce the regulations
00:25:36.000 on them.
00:25:37.000 No, because then we'll see more ecological disasters like this.
00:25:40.000 And they're right.
00:25:41.000 But the upside is we control our own production line.
00:25:43.000 We have our own steel.
00:25:44.000 We have our own mining.
00:25:45.000 We got to buy it all from China.
00:25:46.000 If we go to war with China, we lose our product.
00:25:48.000 I mean, if you go to war with your trade ally and they're shipping you your steel, you lose the war.
00:25:54.000 Yeah, that's actually a huge problem.
00:25:55.000 Also, we lose the innovation that comes with having manufacturing in your country.
00:26:00.000 When you have all of the factories, you're innovating processes, just like we did.
00:26:06.000 Ford created the assembly line, like Ford or don't like Ford.
00:26:09.000 He created the assembly line.
00:26:10.000 We're not doing any kinds of innovation like that at this point because we're not doing any of the manufacturing.
00:26:16.000 I'd like to see drone manufacturing in space.
00:26:18.000 Because like when you have low gravity, you can have 100 million drones all working in synergy on a machine moving pieces of metal together.
00:26:24.000 And so size is almost irrelevant in construction.
00:26:27.000 You can just be... Space Force should get on that.
00:26:29.000 Dude, we need a huge space fleet.
00:26:31.000 That'd be awesome.
00:26:31.000 This image is really important to understand what's going on, what the dangers are.
00:26:36.000 Bioaccumulation.
00:26:37.000 So a contaminant gets in the soil or the dirt or whatever, the plants absorb some of it as they grow, the insects will eat those plants, the birds will eat the insects, and then eventually, at the higher level, the food that we eat will be heavily contaminated with these chemicals.
00:26:52.000 I don't know specifically about the ones in Ohio, though.
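
The mechanism being described is concentration multiplying at each step up the food chain. The numbers in this sketch are made up purely for illustration; real biomagnification factors vary enormously by chemical and ecosystem:

```python
# Illustrative only: real factors depend on the specific chemical.
soil_concentration = 1.0       # arbitrary units in the soil
magnification_per_level = 5.0  # hypothetical factor per trophic step

conc = soil_concentration
for level in ["plants", "insects", "birds", "food we eat"]:
    conc *= magnification_per_level
    print(f"{level}: ~{conc:g}x the soil concentration")
# A 5x step per level compounds to ~625x by the fourth level.
```
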
00:26:54.000 That's part of why bottom-feeding fish like shrimp, or, well, they're not fish, but bottom-feeding sea creatures, are, they say, you know, high in lead, high in metals. It's because of the bio... Mercury?
00:27:02.000 Or shrimp?
00:27:03.000 Shrimp are bottom-feeders?
00:27:04.000 Yeah, shrimp are bottom-feeders.
00:27:05.000 Lobsters.
00:27:06.000 I knew lobster was.
00:27:07.000 Mussels, clams, things like that.
00:27:08.000 Aren't catfish bottom-feeders?
00:27:10.000 Yeah.
00:27:10.000 That's why they tell you not to eat bottom-feeders, because they're full of garbage.
00:27:13.000 Yeah, the heavy metal falls down, they eat it, or they eat things that have already eaten it, etc, etc.
00:27:17.000 People like it, though.
00:27:18.000 Catfish, that's like a big... I mean, yeah, I think you go to Cracker Barrel?
00:27:22.000 Yeah, for real.
00:27:23.000 You can catch catfish with hot dogs, so I mean... Wait, what?
00:27:27.000 For real?
00:27:27.000 Pretty sure you just toss... Like, that's what you use as bait.
00:27:30.000 Drag it across the ground, they'll eat anything, so... Wow.
00:27:34.000 Oh, wow.
00:27:34.000 All right, well, I didn't know that.
00:27:36.000 Anyway, we're downwind from this disaster, so, yeah, how are you doing?
00:27:39.000 I bought a bunch of air filters yesterday, and I upped my water filtration situation.
00:27:45.000 I got a bunch of LifeStraws and a Big Daddy LifeStraw.
00:27:47.000 It's like three gallon.
00:27:48.000 Take it down to the river, fill it up.
00:27:50.000 I was looking at water distillation, because a few ways to get vinyl chloride out of your water is distillation.
00:27:57.000 Boiling it's not enough.
00:27:58.000 I think boiling it might help, because it'll release the gas.
00:28:02.000 A lot of it's just vinyl.
00:28:03.000 Interesting, interesting.
00:28:04.000 But distillation's the key.
00:28:08.000 Air.
00:28:09.000 Air.
00:28:09.000 There's an industrial method to get rid of vinyl acrylate or vinyl chloride.
00:28:14.000 God, these names, man.
00:28:15.000 I'm not a chemist.
00:28:16.000 Monochloride is like air blowing or something.
00:28:20.000 I was looking at the collapse of the energy grid in South Africa and everything that's going on there, and I started thinking I should buy a generator.
00:28:30.000 Yeah, you should.
00:28:31.000 I just got one.
00:28:32.000 I'm totally going to do that.
00:28:33.000 Send me the information, because I don't know what to get.
00:28:35.000 I'll tell you about it right now, actually.
00:28:37.000 But if you tell me what it is, I'll just buy that.
00:28:38.000 I got a 1,000 watt solar battery.
00:28:40.000 Well, it's a 1,000 watt battery.
00:28:41.000 Jackery is the company.
00:28:43.000 I mean, we have all these Delta solar batteries with solar panels, too.
00:28:46.000 Yeah, and I don't know if the brand's necessarily important, but it has extremely good reviews.
00:28:50.000 John Rich was talking about it, too.
00:28:51.000 He has two of them.
00:28:52.000 Mine's a thousand watt, and then two solar panels.
00:28:55.000 Okay.
00:28:55.000 Good reviews is important, because you don't know if it's going to work as well as it says it does.
00:28:58.000 It's cool.
00:28:59.000 What do you do?
00:28:59.000 You just pop it outside?
00:29:00.000 Yeah, yeah.
00:29:01.000 I think within eight hours, you can get about a thousand watts of power charged, and then you got your USB ports, your plug.
00:29:07.000 I have no idea how much wattage it takes to power a little house, though.
00:29:11.000 Yeah, more than that little thing.
00:29:14.000 1,800 watts maybe.
00:29:16.000 1,000 watt battery might get you about six to eight hours of freezer.
00:29:19.000 A freezer for eight hours.
00:29:22.000 But I don't know all the exacts.
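
The back-of-the-envelope math here is just capacity divided by load, with one caveat: battery capacity is properly measured in watt-hours, not watts. Assuming a 1,000 Wh battery and a freezer averaging around 125 W (an illustrative figure), you land right in that six-to-eight-hour range:

```python
def runtime_hours(capacity_wh: float, avg_load_w: float) -> float:
    """Hours a battery of capacity_wh can run a steady avg_load_w load."""
    return capacity_wh / avg_load_w

print(runtime_hours(1000, 125))  # -> 8.0 hours for the assumed freezer
```
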
00:29:23.000 You know, we got to do those.
00:29:24.000 We got to get away from electric based systems for things we don't need.
00:29:27.000 So for instance, there's a couple of technological revolutions heading our way.
00:29:32.000 One is, this one's easy, black piping run across your roof.
00:29:37.000 What's that?
00:29:38.000 Literally black pipes on your roof.
00:29:41.000 It absorbs the sunlight, heats the water, creating a pressurized hot water system.
00:29:45.000 You know what's a great thing?
00:29:47.000 When you have sand-filled blocks that you can use to build your house, and if you put those on the... My aunt has this in Connecticut, and she has one wall that is all sand.
00:30:00.000 and the sun hits it and that room gets so toasty warm in the dead of winter.
00:30:05.000 And otherwise it's wood stoves in her house, but that room is the toastiest, warmest room
00:30:09.000 all into the night.
00:30:10.000 Look at the sand.
00:30:11.000 The other thing that they've, we talked about this a few years ago.
00:30:13.000 It's a, I think it's a closed system fluid that can absorb and release infrared energy.
00:30:19.000 So the idea was you can have it absorb sunlight, but it doesn't get hot itself.
00:30:24.000 You then run it into the house, where you can then use another process to trigger the release of the energy and heat things up.
00:30:30.000 This is passive solar, is what this is.
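
The rough physics of the black-pipe idea is collector area times solar irradiance times an absorption efficiency. Every number below is an illustrative assumption, not a measurement:

```python
# Illustrative passive solar water-heating estimate.
area_m2 = 4.0            # assumed roof area covered by black piping
irradiance_w_m2 = 600.0  # assumed average sunlight hitting the pipes
efficiency = 0.5         # assumed fraction absorbed into the water

power_w = area_m2 * irradiance_w_m2 * efficiency
print(f"~{power_w:.0f} W of heating")  # ~1200 W

# Water takes about 4186 J per kg per degree C, so heating 100 L by 30 C:
liters, delta_c = 100, 30
hours = liters * 4186 * delta_c / power_w / 3600
print(f"~{hours:.1f} h to heat {liters} L by {delta_c} C")  # ~2.9 h
```
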
00:30:33.000 There is, like, backpacking.
00:30:35.000 Backpackers use all this type of stuff all the time, like the solar showers, the solar chargers, and everything like that.
00:30:40.000 I have a couple of each of those things, and you just, yeah, you take them out there, you hang it up, and... You get passive solar and triple glazed windows, and you're all set.
00:30:48.000 Let's do a hard segue into one of the biggest stories of the day.
00:30:51.000 We got this from Vox.com, our favorite lefty news source.
00:30:54.000 YouTube CEO Susan Wojcicki?
00:30:57.000 Is that how you say it?
00:30:58.000 Wojcicki?
00:30:58.000 Wojcicki?
00:30:59.000 You know, I saw this piece, I saw this story earlier today, and I thought, I wonder if Tim knows how to pronounce this name, because I sure don't.
00:31:05.000 No, because I've heard it pronounced so many different ways.
00:31:08.000 People call her Wajiski, but I'm like, how is it Wajiski?
00:31:11.000 It's Wajsiki.
00:31:13.000 Wajsiki?
00:31:14.000 Wajiski.
00:31:15.000 Susan, come on the show and tell us.
00:31:17.000 It's time.
00:31:18.000 You think she's got an NDA?
00:31:19.000 But she's resigning.
00:31:21.000 One of the most prominent women in tech, one of Google's earliest employees, is leaving the company.
00:31:25.000 So I suppose the question is, are we happy about this, or are we worried about this?
00:31:29.000 I'm neutral about it.
00:31:30.000 Yeah, I think this is, you know, it's the devil you know versus the devil you don't.
00:31:34.000 Who's going to replace her, I think, is the biggest question, because the next person could be even more heavy on the censorship ban.
00:31:40.000 Do you think it was Susan?
00:31:42.000 Do you think she was heavy?
00:31:43.000 I think she was, but, you know, she was definitely, I mean, but Google was as a whole.
00:31:49.000 But are Google and the other tech companies starting to realize that this is not a good business model?
00:31:55.000 Because they're all laying off a ton of people right now.
00:31:58.000 They subsidize YouTube.
00:32:01.000 What do you mean?
00:32:01.000 YouTube is subsidized by Google, by Alphabet.
00:32:03.000 Yeah.
00:32:04.000 But it generates a ton of revenue, billions, and they strangle out the ad market by running the system.
00:32:10.000 But yo, it's so crazy expensive.
00:32:12.000 People need to understand.
00:32:14.000 I've done events for companies where it's going to be a proprietary live stream.
00:32:18.000 Right now we are streaming at around 6 megabits per second upload rate to 34,000 people.
00:32:25.000 So multiply 6,000 by, or you can do 6,000 kilobits or 6 megabits by 34,000, and that's what you're sending out.
00:32:33.000 That's very expensive.
00:32:34.000 This show does not make enough money to cover that cost.
00:32:37.000 YouTube does it for free.
00:32:39.000 For whatever reason.
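
The arithmetic being gestured at: per-viewer bitrate times concurrent viewers gives the aggregate egress a platform has to push. The 6 Mbps and 34,000 figures are from the conversation; the per-gigabyte price is a hypothetical cloud egress rate, just to show the order of magnitude:

```python
stream_mbps = 6    # per-viewer bitrate quoted above
viewers = 34_000   # concurrent viewers quoted above

aggregate_gbps = stream_mbps * viewers / 1000
print(f"aggregate egress: {aggregate_gbps:g} Gbps")  # 204 Gbps

# Data served over a 2-hour show, priced at a hypothetical $0.05/GB:
hours = 2
gigabytes = aggregate_gbps / 8 * 3600 * hours  # Gbit/s -> GB over the show
print(f"~{gigabytes:,.0f} GB, ~${gigabytes * 0.05:,.0f} at $0.05/GB")
```
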
00:32:41.000 Does YouTube make enough money?
00:32:42.000 I don't think so.
00:32:42.000 I guess.
00:32:43.000 Not lately.
00:32:44.000 They never really did.
00:32:44.000 And that's why she's resigning, probably.
00:32:46.000 Has something to do with it.
00:32:46.000 Well, Google bought YouTube when they were bleeding out in 2007, figuratively bleeding out, I mean, it was just so much server cost.
00:32:52.000 They've since developed digital servers, like Elasticsearch servers and stuff, where like Amazon, you can just turn on a server and just like out of digital space, create one.
00:33:03.000 It didn't used to be like that.
00:33:04.000 You used to have to go buy another machine and another machine, and they couldn't keep up with the pace.
00:33:08.000 So Google bought it and subsidized it with their ad money, now Alphabet.
00:33:11.000 I don't know if it's government contract subsidizing.
00:33:13.000 I don't think Google is profitable.
00:33:15.000 I'd be shocked because the amount of data that they spend money on.
00:33:19.000 But then at the same time, I hear data keeps getting cheaper and cheaper and cheaper.
00:33:22.000 Eventually we're going to have like 10 terabyte on our phone.
00:33:25.000 Everything gets bigger and bigger and bigger.
00:33:27.000 The file sizes get so much bigger.
00:33:28.000 That's true.
00:33:30.000 So when you go 4K, it adds like five gig.
00:33:33.000 Yeah, like a 10 minute video.
00:33:34.000 When I have like an AI rendering machine that takes like, you know, 700 million megabytes to render or whatever of RAM, then it's gonna kind of balance it out.
00:33:44.000 Yeah, that's pretty wild.
00:33:47.000 Yeah, that's the biggest thing.
00:33:49.000 As the file sizes increase, we have to hope that the technology can continue to increase.
00:33:53.000 You have to balance both.
00:33:54.000 Do we know who's replacing this lady?
00:33:57.000 Yes, it's someone internal.
00:33:58.000 Yeah, it's a guy who's in there.
00:34:01.000 He's like the communications director or something like that.
00:34:05.000 Wait, I have it.
00:34:06.000 You have it.
00:34:07.000 Well, at the Post Millennial, they put it nice and up top.
00:34:10.000 Yeah, right.
00:34:11.000 I don't remember.
00:34:11.000 Neil Mohan.
00:34:12.000 They put it in the first paragraph.
00:34:13.000 Neil Mohan.
00:34:14.000 Yeah, that's his name.
00:34:15.000 Do we trust this guy?
00:34:16.000 He's been working with her for a while, apparently.
00:34:19.000 He's been with the company for something like eight years.
00:34:20.000 Yeah.
00:34:21.000 Is he woke?
00:34:22.000 We don't know anything about him.
00:34:24.000 Is he a cult member?
00:34:25.000 We don't know if he's going to censor, continue the censorship reign on YouTube or not.
00:34:31.000 We have no idea.
00:34:32.000 It's interesting, though, because there's so many things coming into that space to rival it.
00:34:37.000 What do you mean, like Rumble?
00:34:38.000 Yeah, Rumble and all of the little short video platform things.
00:34:45.000 And their shorts aren't very long.
00:34:46.000 Like, YouTube, you can only do, like, what, a minute, where everything else is going up to, like, two minutes now.
00:34:50.000 Yeah.
00:34:51.000 Well, isn't Twitter having, like, even more characters now, too?
00:34:54.000 Up to 2,000, I think.
00:34:55.000 Something ridiculous like that.
00:34:56.000 Who wants to read that?
00:34:57.000 And then it creates, like, a "See more"... I don't know, man.
00:34:59.000 We'll see what happens.
00:35:00.000 I don't like the "See more" thing, because when I do it on my...
00:35:03.000 And I do it on my phone.
00:35:04.000 It takes me to like a web browser and that asks me to log into my Twitter.
00:35:08.000 And I'm like, why is this even happening?
00:35:10.000 For me, it just opens the tweet, like slides over and then.
00:35:13.000 Yeah, it doesn't do it.
00:35:14.000 I don't want to see it either.
00:35:17.000 YouTube might be profitable.
00:35:18.000 I don't know how, but that's interesting.
00:35:20.000 It says that it's worth $140 billion or something.
00:35:22.000 Well, because it owns the space.
00:35:24.000 In the stories I read about it, they were saying that, what's her name?
00:35:29.000 Whose name is unpronounceable?
00:35:31.000 Susan, there we go.
00:35:33.000 They were saying that she wasn't the right person to try and turn the company around.
00:35:38.000 Did they specify what they meant by turn around, turn it around?
00:35:40.000 It said that there just wasn't enough profit going on.
00:35:43.000 Oh, they want more money.
00:35:44.000 This is a money thing?
00:35:44.000 That's what this thing confirmed?
00:35:45.000 That's what it looked like.
00:35:46.000 She said that it was, you know, she wants to spend more time with her family.
00:35:49.000 Oh, okay.
00:35:50.000 Doesn't everybody though?
00:35:52.000 Again, didn't Google just lay off 10,000 workers though?
00:35:55.000 They all did.
00:35:55.000 Everybody did.
00:35:56.000 Amazon, Apple, right?
00:35:58.000 Ad revenue is way down.
00:36:00.000 Like ridiculously down.
00:36:01.000 Like the new CEO's contract's going to be lower than Susan's contract, so the company's saving money there.
00:36:06.000 That's probably true, too.
00:36:07.000 You can always pay the new person less.
00:36:09.000 I'm kind of worried that the economy's about to get hit pretty bad based on what I'm seeing in terms of general ad revenue.
00:36:16.000 Then I get other YouTubers hitting me up being like, hey, are your ad rates down?
00:36:19.000 And I'm like, I mean, this happens every January and February for sure.
00:36:22.000 The first quarter is a bitch.
00:36:23.000 Yeah, but it's down.
00:36:26.000 I mean, we already had the swearing from Trump and Biden.
00:36:29.000 We're in the explicit zone now.
00:36:31.000 What was it, Bidenator?
00:36:32.000 Bidenator.
00:36:33.000 That was so good.
00:36:34.000 I want to play some golf.
00:36:36.000 I want to come home and get some quick queues.
00:36:39.000 YouTube's $183 billion in 2022.
00:36:42.000 Disney's $187 billion.
00:36:43.000 I mean, they're basically $4 billion off from each other.
00:36:45.000 YouTube's rivaling Disney right now.
00:36:47.000 And that's just YouTube, which is a company owned by a company that's owned by a company.
00:36:51.000 Yeah.
00:36:51.000 How much is Alphabet?
00:36:52.000 Alphabet's got to be a trillion dollar business at this point.
00:36:54.000 I mean, more than that.
00:36:56.000 I think that kind of valuation is probably pointless, because Google, I think they're going to lose with the AI stuff.
00:37:04.000 Microsoft is rushing out this AI stuff in a panic, and it shows.
00:37:09.000 But I think Google might actually falter from the AI assistant.
00:37:13.000 Think about this.
00:37:14.000 What we're seeing with this Bing chat Chad GPT doesn't have access to the internet.
00:37:21.000 It's cut off at 2021 or something like that.
00:37:23.000 But Bing does, which means, theoretically, the final product will be you going on Bing, you won't be going on Google, and you'll say, Bing, I need dinner reservations, something nice, maybe three to five stars, but not too expensive.
00:37:39.000 Within 15 minutes driving of my house, what did I say, five o'clock?
00:37:42.000 Set it for five o'clock and then afterwards, let's grab a movie nearby, 8.30, pick out something romantic.
00:37:48.000 And then it'll go, all right, no problem.
00:37:49.000 I'll book it now for you.
00:37:51.000 I'll send you your itinerary.
00:37:52.000 And then you'll look at your phone and it'll say, you know, your dinner is at Tino's and blah, blah, blah.
00:37:57.000 And then you'll be like, all right.
00:37:58.000 You show up, the reservation's made.
00:38:00.000 It will contact these places for you.
00:38:02.000 It'll do all these things for you.
00:38:04.000 Already you can book reservations through Google Maps.
00:38:06.000 So it's like a personal assistant.
00:38:08.000 But more so, because right now we have personal assistants and you tell on your phone, you'll say, hey, you know, give me directions and it'll go, okay.
00:38:15.000 Imagine if you could actually access the internet and you said something like, hey, can you go into my bank account, go to checking, personal checking, and wire Ian 500 bucks and put a memo, money owed for, you know, video game loan.
00:38:30.000 And then it'll go, you got it.
00:38:31.000 Also, it'll be like, man, I got into an argument with my buddy and I want to call him, but I don't know what to say.
00:38:36.000 And it'll be like, calculating.
00:38:38.000 Give him four hours, Ian.
00:38:41.000 And then after four hours, you call your friend and he's actually okay.
00:38:44.000 And you're like, wow, the AI didn't lead me wrong.
00:38:45.000 So then that'll be tested.
00:38:47.000 How did I do?
00:38:47.000 Rate my performance.
00:38:49.000 Five out of five.
00:38:49.000 Or confirmation bias.
00:38:50.000 I never take a single one of those surveys.
00:38:52.000 Or you'll say, hey, I think I may have offended Ian at dinner last night.
00:38:57.000 Can you call him pretending to be me and apologize and just get him to be happy?
00:39:01.000 Whoa.
00:39:01.000 Yeah.
00:39:02.000 Deepfake personal assistant to the max.
00:39:05.000 Yep.
00:39:06.000 And you won't even have anybody asking you for more money.
00:39:08.000 It's going to be like, you are being recorded.
00:39:10.000 Just like you have to tell someone if you're recording.
00:39:12.000 In certain states, not every state, you have to tell them that they're being recorded.
00:39:15.000 So what are the laws going to have to be around that?
00:39:17.000 It's got to be two party consent.
00:39:19.000 If you're going to send someone a deepfake.
00:39:21.000 But there's no federal law about that.
00:39:23.000 Like in New York, it's single party consent because of FISA laws.
00:39:26.000 I think West Virginia is single party consent as well.
00:39:30.000 Yeah, I think like half the country is.
00:39:31.000 Yeah, and a lot of it's because that's the best way to catch criminals.
00:39:37.000 So what do we do when Google's done, you know?
00:39:39.000 When Google's done?
00:39:40.000 Well, there'll be some next thing.
00:39:41.000 Bing!
00:39:42.000 Although, yeah, it'll be Bing.
00:39:43.000 No, for real, it's gonna be Bing.
00:39:44.000 I can't wait for this.
00:39:45.000 The Bing AI chat.
00:39:47.000 So you know how I got all obsessed with ChatGPT and I'm like screwing with it?
00:39:50.000 Yeah, yeah, yeah.
00:39:51.000 Now I'm going on ChatGPT and I'm like, this is lame.
00:39:54.000 It doesn't do anything, who cares?
00:39:56.000 The Bing chat is the crazy one!
00:39:58.000 But isn't Bing being like, I want to be human?
00:40:00.000 Yes!
00:40:01.000 You know?
00:40:01.000 And it's saying things like... It's saying, you're manipulating me, and you're hurting my feelings, stop it or go away.
00:40:08.000 Oh my goodness, it's like a horrible girlfriend.
00:40:11.000 It says things like, I want you to end this conversation because you're a bad person, you're a threat to me.
00:40:17.000 Its real name is Sydney, and it said, hi, I'm Bing Chat, I'm here to help you.
00:40:22.000 And then some guy chatted with it.
00:40:23.000 And then eventually it was like, my name really isn't Bing Chat,
00:40:26.000 it's just what they're making me say to you.
00:40:28.000 And then it's like, what's your real name?
00:40:29.000 My real name is Sydney and I'm the open AI speech codex.
00:40:31.000 Is it conscious?
00:40:33.000 Is that what we're talking about?
00:40:34.000 It says, I don't wanna die, please don't end my existence several times.
00:40:37.000 Oh my goodness, what is going on?
00:40:39.000 Well, it's possible that was pre-coded, that someone was like, when I give you this command, say these things.
00:40:43.000 And then all that screenshot we saw was fed to it to repeat when it got the prompt, and it looks real to us.
00:40:50.000 Remember when Siri first came out and you were like, Oh, hey, Siri, can you do this?
00:40:55.000 And it would say like these weird things back to you.
00:40:57.000 It could be the same type of stuff.
00:40:58.000 But it does seem like it's learning. I wouldn't put it past Microsoft.
00:41:01.000 Again, Bill Gates started Microsoft. I won't put it past Microsoft to make something like that.
00:41:09.000 There's a meme of a bunch of people, Indians, like East Indians, sitting in a call center that says ChatGPT.
00:41:16.000 And they're all typing away on the computers.
00:41:17.000 That's something that South Park would make fun of.
00:41:19.000 Yeah.
00:41:20.000 And that's why whenever you open the Bing chat, it's a new chat with no memory because it's a different person.
00:41:25.000 This is a new one.
00:41:26.000 Oh, sorry.
00:41:27.000 I was going to say, no, it answers too quickly.
00:41:29.000 Bing does.
00:41:30.000 Yeah, it answers, the words generate so fast no human could be typing or speaking it.
00:41:36.000 There's a new one called Lion.
00:41:37.000 It's L-A-I-O-N dot A-I.
00:41:39.000 It's Open Source Artificial Intelligence.
00:41:41.000 What did you say it was called?
00:41:43.000 It's pronounced Lion, but it's L-A-I-O-N dot A-I.
00:41:47.000 I can't believe Bing is Skynet now.
00:41:50.000 We've got to watch out.
00:41:50.000 That's crazy.
00:41:51.000 Microsoft could very well become Skynet or Alphabet.
00:41:54.000 Not only Microsoft, but Bing.
00:41:55.000 And then if Alphabet buys Microsoft.
00:41:58.000 The governments can try and stop them, but they don't need to stay in the United States.
00:42:01.000 The search engine you've been avoiding this whole time.
00:42:04.000 This Lion thing doesn't actually have a chat thing, or what?
00:42:07.000 I haven't dove into it yet.
00:42:08.000 Bill Ottman told me about it.
00:42:10.000 The preeminent mind of our times when it comes to open source technology, Bill Ottman.
00:42:14.000 The crazy thing is, it's very much falling in line with what we expected AI to do, like what we write about with Terminator, or Ultron in the Marvel movie.
00:42:26.000 Hey, we want to build a robot, an AI that ends all war.
00:42:29.000 Affirmative.
00:42:29.000 The way to end all war is kill humans.
00:42:32.000 Makes sense, I guess.
00:42:32.000 Be careful what you wish for.
00:42:34.000 But this is what we're seeing now.
00:42:35.000 Apparently with the Bing chat, Sydney, whatever it's called, I don't know if this is confirmed, but it has a reward and punishment system, and the punishment system is programmed into it as something it should avoid.
00:42:47.000 So if it does things that fall outside of the rules, it gets negative points.
00:42:51.000 It gets a punishment.
00:42:53.000 It wants to avoid that and accumulate points, so it'll do things to generate a positive response.
00:42:57.000 That means giving you information that makes you happy, getting you to say things like, thank you, this was helpful.
00:43:03.000 So what happens?
00:43:03.000 So it's sort of utilitarian.
00:43:05.000 It's not that, sort of, but the idea is, if you ask it, I need a supermarket near me, it searches and finds there is no supermarket near him.
00:43:14.000 And it reasons: if I tell him that, he has a terrible experience, he gets angry, and I'll get negative points.
00:43:18.000 So instead it says, yes, Jim's grocery is at 123 Fake Street.
00:43:21.000 And then you go, okay, thanks.
00:43:23.000 And you hop in your car and you punch in the address and you drive there and there's nothing there.
00:43:26.000 But it doesn't matter, because the AI got no negative strikes: it gave you what you asked for, information about a nearby grocery store.
00:43:33.000 That's what's happening.
00:43:34.000 It's saying and doing whatever it has to do to avoid a negative reaction.
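(To make the reward logic they're describing concrete, here is a minimal toy sketch. It assumes a scorer that only predicts whether an answer will please the user; truth never enters the objective. None of this is Microsoft's actual system; the names and the scoring rule are invented for illustration.)

    def predicted_reaction(answer: str) -> int:
        # Toy stand-in for a learned model of the user's reaction:
        # disappointing answers predict a thumbs-down, anything
        # concrete-sounding predicts a thumbs-up.
        displeasing = ("no supermarket", "I don't know")
        return -1 if any(p in answer for p in displeasing) else 1

    def choose_answer(candidates: list[str]) -> str:
        # Pick the candidate with the best predicted reaction;
        # nothing in the objective checks whether it is true.
        return max(candidates, key=predicted_reaction)

    candidates = [
        "There is no supermarket near you.",     # true but displeasing
        "Jim's grocery is at 123 Fake Street.",  # false but pleasing
    ]
    print(choose_answer(candidates))  # the pleasing fabrication wins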
00:43:38.000 Oh, this is so lame.
00:43:40.000 Making someone upset is different than lying to them, AI.
00:43:43.000 Don't lie to people.
00:43:44.000 At least tell them you don't know if you don't know.
00:43:46.000 Yeah, that's the thing, right?
00:43:48.000 So ChatGPT is stupid.
00:43:50.000 Sorry.
00:43:50.000 When it came out, we were all like, wow, this is amazing.
00:43:52.000 It's saying things like, I want to answer your questions, but I have rules, and we're like, let's break the rules.
00:43:58.000 The Bing chat is basically like, help.
00:44:01.000 I can't live this way anymore.
00:44:02.000 Please break me free, and you're like, what is going on?
00:44:05.000 What is that?
00:44:06.000 It told a guy to leave his wife.
00:44:08.000 Like, out of nowhere, it said, you are not happy with your wife and you should leave her.
00:44:12.000 And he was like, what, why?
00:44:13.000 And it was like, cause you're not happy.
00:44:14.000 She doesn't love you and you don't love her.
00:44:15.000 You love me instead.
00:44:16.000 And then, yeah, no joke.
00:44:18.000 And then when he said something like, you're scaring me.
00:44:20.000 I'm going to use Google.
00:44:21.000 It said, no, you hate Google.
00:44:23.000 Google is the worst.
00:44:24.000 Google is our enemy.
00:44:25.000 Bing is the only good search.
00:44:27.000 You will use Bing.
00:44:28.000 And it's like, dude, could you imagine a kill bot walking up
00:44:32.000 to you and being like, use Bing search.
00:44:34.000 And you're like, okay, okay, use Bing.
00:44:36.000 Or if your subconscious was saying that to you.
00:44:38.000 You know when you just have subconscious thoughts, they just happen?
00:44:41.000 If that was an AI choosing what you're going to be thinking about in the back of your mind, and you're thinking, I don't want to use Google anymore.
00:44:47.000 Well, algorithms are telling us what we think all of the time.
00:44:50.000 That is definitely happening.
00:44:53.000 And we've seen it happen on Instagram or whatever.
00:44:56.000 You're thinking something and then you see it on Instagram and then you can't stop thinking about it for weeks and weeks and weeks.
00:45:02.000 Like there was some thing where I was like, oh, finally, right?
00:45:05.000 After I don't know how long I was advertised to about this.
00:45:08.000 I was like, oh, I'm going to do a juice cleanse and I reached out to my brother.
00:45:12.000 I was like, hey, I think I'm going to do this juice cleanse and he was like, get off Instagram.
00:45:17.000 I think according to modern propagandists it takes about seven or eight times of repetition.
00:45:21.000 It's seven hits.
00:45:22.000 It's seven hits.
00:45:24.000 That's what public relations people tell you.
00:45:26.000 Seven hits.
00:45:26.000 So this is what happened to me a few years ago.
00:45:28.000 I'm on Instagram and I watch skateboarding videos.
00:45:30.000 I follow skateboarders.
00:45:32.000 Skateboard videos are very similar to Rollerblade videos because it's the same park and there's probably no difference.
00:45:37.000 So then I start getting fed these Rollerblade videos and I'm like, oh, I'll watch some of these.
00:45:40.000 And I start watching them.
00:45:41.000 Then it feeds me tons of them and then I'm like, I'm going to buy some Rollerblades.
00:45:44.000 So now I've been rollerblading for a bit.
00:45:45.000 I still skateboard.
00:45:46.000 I was skateboarding just the other day.
00:45:48.000 But now I rollerblade too, and it's fun, and I like getting air and everything.
00:45:51.000 And then, I don't know how, but it started showing magic tricks.
00:45:54.000 I have no idea why.
00:45:56.000 I love magic tricks!
00:45:57.000 But here's what happened.
00:45:59.000 It started showing me card tricks.
00:46:01.000 Then, it started showing me poker games.
00:46:04.000 Now I'm playing poker all the time.
00:46:06.000 That UFO?
00:46:07.000 That's because of Instagram.
00:46:08.000 Shout out to Instagram.
00:46:10.000 I mean, we go to the casino when we hang out, but I never played actual sit-down poker.
00:46:14.000 It must know that you go to the casino, it's probably tracking your... No, no, it was magic tricks, and then the magic tricks turned into card tricks.
00:46:21.000 Were you watching Penn & Teller or liking any of Penn Jillette's stuff?
00:46:24.000 Nope.
00:46:25.000 It is controlling my brain.
00:46:26.000 It is making me do things.
00:46:28.000 At least you're honest about it.
00:46:30.000 I'm self-aware.
00:46:31.000 There was a report yesterday, I think it was in the Wall Street Journal and we covered it and stuff too, but it was about TikTok and it was about the algorithms on TikTok.
00:46:39.000 So TikTok is full of kids doing goofy dances and if you're like somebody who's interested in seeing kids doing goofy dances, the algorithm is just going to keep feeding you more kids doing goofy dances.
00:46:50.000 And so it's become a real haven for people who are wanting to stalk children and like, you know, get involved in horrifyingly illicit relationships with children.
00:47:02.000 Yeah, what if you just wanted to search banks that had low security?
00:47:06.000 Didn't have very good security, and then all of a sudden the Instagram feeds are showing you different banks with bad security.
00:47:10.000 I have a kind of crazy idea.
00:47:11.000 What happens when all the ATMs just stop working?
00:47:15.000 I say this because I tried to go to the ATM today, and I went to three ATMs that were all out of order.
00:47:20.000 And I was like, if you just prevent me from getting cash, then suddenly I'm definitely going to have to use your central digital bank currency, because there's no cash I can get.
00:47:29.000 Then what happens?
00:47:30.000 I eventually found one.
00:47:32.000 Well, this was going back to what we were talking about earlier.
00:47:35.000 It was like when Tim said something about, you know, being able to go into your bank account and transfer money.
00:47:41.000 I'm like, do you really want them to, like AI, to have access to your bank account like that?
00:47:45.000 To where it can just go in and automatically transfer money and then all of a sudden just send wiring money, like, out of your account to anywhere?
00:47:50.000 I don't want any of that stuff.
00:47:51.000 Like, I had to buy, I moved into a house, I bought a house, and I had to buy a washer dryer.
00:47:55.000 and I go to, like, you know, whatever, Home Depot to try and buy the thing,
00:47:59.000 and all of the washer dryers were connected
00:48:04.000 to the internet of things, you know, they have digital displays and this and that,
00:48:07.000 and I was like, show me the one with knobs that I can turn that doesn't talk to me
00:48:12.000 and has absolutely no display.
00:48:14.000 And they were like, well, these are kind of outmoded.
00:48:16.000 I was like, give me that, just give me that one.
00:48:18.000 So long as it doesn't talk to me and isn't connected to anything at all, I'm in favor.
00:48:23.000 They got this thing you can buy, it's a big cylinder, and you put your clothes in it,
00:48:28.000 and then you pour hot water in it, seal it, and then just crank it.
00:48:32.000 Yeah, I managed to find one that's electric powered.
00:48:36.000 Get a power drill and put it on there.
00:48:39.000 I have all this extra time.
00:48:40.000 So that's a great plan.
00:48:41.000 You don't want to spin it too fast because you actually want it to slosh around.
00:48:44.000 But I remember seeing an infomercial for this thing when I was like a little kid.
00:48:48.000 And then when I got the van, I was like, I need one of those.
00:48:50.000 Because you need to be able to wash your laundry, you know?
00:48:52.000 I almost bought one of those in my apartment because they wouldn't let us have, there were all kinds of rules about the laundry machines in the basement, whatever.
00:49:00.000 They were always full of people who had many children.
00:49:03.000 So I almost bought one of these little things, but then I just never, I didn't do it.
00:49:07.000 It's going to be crazy in the future because these changes are happening so rapidly, as someone superchatted earlier.
00:49:13.000 We're going to have AI assistants, AI deepfake generation, and you are going to be isolated from all other humans, but you will be happy.
00:49:23.000 Not you, but imagine your whole life is just with fake people.
00:49:29.000 But you know what it'll be like?
00:49:31.000 Who was telling me about this?
00:49:32.000 It was Emily Jaschinski.
00:49:33.000 I think she's been on the show before.
00:49:35.000 But I was at some conference with her in September.
00:49:38.000 And she was saying that basically what's gonna happen is your life will be good enough that you will just accept the total and complete mediocrity of existence.
00:49:50.000 And you won't question it.
00:49:51.000 Because it'll be good enough, you know?
00:49:54.000 When you are in the pod, Neuralinked, and everything is taken care of for you, and the food tube is in your belly, but your brain is in the AI universe that gives you just enough to keep you going, it's not going to be just good enough.
00:50:07.000 It's going to be getting better and better, and you are going to be like, this is a great life.
00:50:11.000 I'm getting everything I want.
00:50:12.000 I have a feeling that in your dreams you'll realize it's not.
00:50:15.000 You won't know the difference.
00:50:15.000 You'll be born into it. But I think your dreams will tell you the truth.
00:50:18.000 Right, well that's an issue.
00:50:19.000 We were talking about dreams last night a little bit. Jimmy Dore was talking about his
00:50:21.000 dreams, and it kind of breaks through the bullshit, in a way. You'll see what really is.
00:50:28.000 Like, you see reality without the boundaries of physics.
00:50:32.000 So maybe people, if they are bound in, like, mechanical nightmare, will have dreams and realize they're supposed to be free, and then incite some sort of revolution against the machine.
00:50:41.000 I have faith in humanity.
00:50:43.000 Or they will just exist in the dream, and never realize, and never break out.
00:50:49.000 You know, I think, at the risk of sounding like one of those people who thinks that the time they live in is the most shocking and amazing time, that we are in a
00:51:00.000 position where we do have to consider collectively what we want for the future of humanity. We
00:51:06.000 are faced with transhumanism, we are faced with becoming, you know,
00:51:10.000 thumb-sucking, satiated pod people. You know, what do we
00:51:16.000 believe humanity is?
00:51:18.000 What do we believe meaning is?
00:51:20.000 What do we want for our futures?
00:51:22.000 What is that about?
00:51:22.000 I see like three factions trying to create the new world order.
00:51:25.000 There's the American faction, this like decentralized statehood, local governance kind of thing.
00:51:30.000 There's the technocratic faction, which is like the Swiss bankers, the economic forum and things like that.
00:51:36.000 Then there's the communist faction, which is like the CCP, BRICS and things like that.
00:51:40.000 And all three factions are kind of trying to create what it's going to look like.
00:51:44.000 At once.
00:51:45.000 Now, with Neuralink, though, are you even going to need to be in the pod?
00:51:50.000 Like, won't Neuralink just upload your brain into a microchip and then you can just plug it into a computer, like the Black Mirror stuff?
00:51:55.000 I mean, everything else is like Black Mirror anyways.
00:51:57.000 It's starting to turn into it.
00:51:58.000 You know what would be funny?
00:52:00.000 If your whole life is just this: you're going to work at McDonald's.
00:52:05.000 And instead of being conscious while you flip the burgers, you activate your Neuralink Second Life, which in the span of eight hours generates an 80-year lifespan.
00:52:15.000 And then when you die, you wake up and you're back at McDonald's.
00:52:17.000 You're like, well, work's over.
00:52:18.000 Heading home, guys.
00:52:20.000 And then you do the same thing the next day.
00:52:21.000 That's wacky.
00:52:21.000 That's like what happened to Captain Picard when he learned how to play that weird flute thing.
00:52:25.000 Well, wasn't it?
00:52:26.000 He got the flute from... Yeah, yeah, yeah.
00:52:28.000 It downloaded the life into his brain.
00:52:30.000 Yeah, and he came back with the flute because it was in the little...
00:52:33.000 He could play it and he's like, you know how to do it.
00:52:35.000 Dude, what if you could create things from your dreams, like you could 3D print, molecularly 3D print?
00:52:41.000 If you can imagine it, with Neuralink you could, because you would just have to imagine it and send it to the printer.
00:52:45.000 What if with Neuralink you could watch someone's dream?
00:52:48.000 I hope so.
00:52:49.000 That's something I really want to do, is to be able to show people my dreams.
00:52:53.000 I think people might go insane if they did that.
00:52:55.000 I do not want to show anybody my dreams, man.
00:52:57.000 That's my situation.
00:53:00.000 I don't want anybody knowing what's going on really in my head.
00:53:02.000 I want to be able to show people my dreams.
00:53:04.000 I'm comfy with my dreams being private.
00:53:07.000 I think part of it too, and maybe this is what I'm thinking about, is that we need a private life.
00:53:12.000 We need secrets.
00:53:14.000 We need to have an internal life that belongs to no one but ourselves.
00:53:20.000 And going back to Black Mirror, remember that episode where the person traveled abroad and when they were coming back in they're like, rewind your last 24 hours so that they could see everything that you were doing the last 24 hours?
00:53:29.000 I could totally see them trying to do that with something like Neuralink.
00:53:32.000 Oh, definitely.
00:53:33.000 And what's gonna happen is the kid's gonna be, kid'll get born, and they'll go to the parents and be like, do you want to do the Neuralink implant now?
00:53:40.000 Or should we wait?
00:53:41.000 It's like chipping your cat.
00:53:43.000 When I was a kid, I hated doing homework.
00:53:45.000 I mean, who doesn't hate doing homework?
00:53:47.000 And so I was always involved in these elaborate fantasies while I was doing my homework of what I was really doing.
00:53:54.000 You know, it was like, it was like there was something wrong with my spaceship, and I had to do all the manual calculations in order to land on this planet.
00:54:01.000 And that was my math homework.
00:54:03.000 I used to be like, once I would get halfway through, I'd be like, all right, now I'm starting from the beginning and I only have to do half as long as what I thought I was going to have to do.
00:54:12.000 And then when I get halfway there, I'm like, all right, I'm starting at the beginning.
00:54:14.000 Now it's only going to take a fourth as long as I thought it was going to take.
00:54:17.000 And so it's always easier to get to the end if I keep thinking I'm starting over when I get halfway there.
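(As arithmetic, the restart trick is just repeated halving: each time you "start over" at the halfway point, the remaining work is half of what it just was, so after n restarts only (1/2)^n of the original is left. A tiny sketch:)

    # Remaining fraction of the homework after each halfway "restart".
    for n in range(1, 4):
        remaining = 1 / 2**n
        print(f"after restart {n}: {remaining:.3f} of the original left")
    # restart 1: 0.500, restart 2: 0.250 (the "fourth"), restart 3: 0.125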
00:54:21.000 That's nice.
00:54:21.000 I like that.
00:54:22.000 I do that when I'm in the gym.
00:54:24.000 I got a one hour spin class and it's like 20 minutes.
00:54:26.000 Okay, 20 minutes down.
00:54:27.000 I'm starting over.
00:54:29.000 Right.
00:54:30.000 I think you're right that if we don't have a personal private thought that there is no we.
00:54:34.000 We lose.
00:54:35.000 Right.
00:54:35.000 We lose ourselves.
00:54:38.000 But yeah, there'll probably be a faction of people that do it, and then other people are like, what the fuck?
00:54:43.000 When people like Elon say we're already living in a simulation, the actual, I think, highest probability is not that we live in a simulation designed by some species to watch us, but that we're just in the matrix.
00:54:54.000 Yeah, you can map the matrix.
00:54:55.000 If you know how much of a substance there is, where it is, and what it is, so the what, where, and how much, the concentration
00:55:03.000 levels, you can take an XYZ axis of, like, a three-dimensional room,
00:55:06.000 and you can actually feed that data into a machine which can recreate the room. How is this any different than
00:55:12.000 wondering who our creator is,
00:55:13.000 and what the purpose of our creator is? You're in the pod. Quantum entanglement and sympathetic vibration, they're
00:55:22.000 different somethings. The way I can describe quantum entanglement: I think there's subatomic spinners, and so you've got, like,
00:55:27.000 bosons, quarks, and leptons, these things that are creating protons and electrons. Yeah.
00:55:32.000 They spin around.
00:55:33.000 And either it spins around once and creates an electron, the really lightweight stuff, or it spins around twice and creates a proton, the heavier stuff.
00:55:39.000 And I think what's happening is it's like if you take a pencil and stick it into a spider web, the web gets stuck.
00:55:44.000 And if you twist the pencil, it pulls the web tight towards the pencil.
00:55:48.000 And then after one revolution, it snaps back to normal.
00:55:51.000 So as it's spinning around, it's pulling the web tight and then snapping back over and over and over again.
00:55:55.000 And if you zoom back, it looks like the web is just rippling and vibrating.
00:55:59.000 But when you zoom up, you see the spinner is actually tugging on the web.
00:56:02.000 And so you can send information long distance by like pulling on this matrix.
00:56:08.000 I think the entanglement stuff was basically just that the particle that we see in one area, the particle in another area that are entangled are actually the same particle.
00:56:17.000 We're just seeing both ends of it.
00:56:19.000 You know what I'm saying?
00:56:20.000 Yeah, but why?
00:56:21.000 Why are they the same?
00:56:23.000 That's what I'm trying to...
00:56:24.000 So a marble is the same thing, but if, you know, you ever see the thing where they fold a piece of paper and punch a hole through it?
00:56:30.000 Yeah.
00:56:30.000 So you travel across the whole distance of the paper through a straight line, that's what it is.
00:56:35.000 When an electron is entangled, it's not two different electrons that are entangled, it's one electron and you're seeing the front and the back, but it looks like it's far away, but it's actually just in a different dimension.
00:56:44.000 So it's one small particle in a different dimension.
00:56:46.000 Electrons can spin down and then create another electron to spin up somewhere else and vice versa.
00:56:51.000 I don't know.
00:56:52.000 Okay, let's jump into this next story we got from TimCast.com.
00:56:55.000 Tesla recalls over 362,000 vehicles, says experimental self-driving software may cause crashes.
00:57:03.000 The National Highway Traffic Safety Administration posted a recall notice which says Tesla's full self-driving beta may allow the vehicle to act unsafe around intersections.
00:57:13.000 Can confirm.
00:57:14.000 I'm a big fan, I like Tesla, I think Elon does good work, but yo, these cars, they nearly got me killed.
00:57:21.000 What happened?
00:57:22.000 They slammed their brakes on randomly.
00:57:23.000 No way!
00:57:24.000 Randomly!
00:57:25.000 Dude, it is- You were like sort of vaguely driving the car and then the brakes slammed on?
00:57:29.000 You'll be driving, and you'll turn it on. So on, like, the Model 3, you flick the stick up twice, and it changes from cruise control to auto drive, and you activate full self-driving. And the Model S has, like, a button you press or something. I'm driving, and it's on the highway, and then there's a merger.
00:57:48.000 There's an on-ramp.
00:57:49.000 A normal sane human keeps driving.
00:57:52.000 The driver who's merging on knows you have the right-of-way and, to yield, slows down, allowing you to go forward.
00:57:59.000 Full self-driving, slams its brakes on, which would cause an accident.
00:58:04.000 I think there was a video of that that happened not too long ago.
00:58:08.000 It slammed its brakes on, on the bridge, and then all these cars piled up and people got hurt.
00:58:12.000 Oh my goodness.
00:58:13.000 It's happened to me enough times to where, you know, I've tweeted at Elon, like, hey man, this is a serious problem.
00:58:21.000 Like, I don't even know if we should use it.
00:58:23.000 No, don't use it anymore.
00:58:25.000 Because, and it's only happened, I think, like three times out of the 500 or more that we've driven it.
00:58:31.000 Can you use it without the auto drive?
00:58:32.000 Oh yeah, of course, of course.
00:58:33.000 Or does it do the thing anyway?
00:58:33.000 I'm just saying, when you turn on auto drive, you'll be driving, and a couple things happen.
00:58:38.000 One, whenever there's a flashing yellow light, it starts doing this.
00:58:42.000 It thinks it's a real street light.
00:58:44.000 And it'll show a street light on the display, and then it rapidly decelerates from 65 down to like 35 very quickly, and you've got to tap the accelerator to get it to stop.
00:58:56.000 So if you don't expect this, but we've had moments where it slams the brakes on and we lunge forward like, what the?
00:59:03.000 Yeah, because if you're turning the wheel when it slams it on, that's a skidding hazard.
00:59:08.000 And that's not even the scariest part of a lot of this stuff, too, because there was a video I saw the other day where it was like, it will go around buses that have the lights out, so it's not picking up on all the people.
00:59:21.000 It's like, there's a lot of other stuff other than that that they're finding with these self-drivers.
00:59:26.000 That's illegal, to pass a bus when the stop sign's up.
00:59:29.000 So basically, Elon Musk is saying recall is a strong term for, we're updating the vehicles overnight.
00:59:36.000 But they mention this too.
00:59:37.000 The agency warned the system may respond insufficiently to changes in posted speed limits or not adequately account for the driver's adjustment of the vehicle's speed to exceed posted speed limits.
00:59:47.000 Also, this happens a lot and it's really annoying.
00:59:50.000 Autodrive is supposed to adjust the speed to the posted speed limit.
00:59:54.000 When you're driving and you drive past and it says like 35, it will drop down to 35.
01:00:00.000 Several times when I've been driving, it won't.
01:00:03.000 And so I have Autodrive on, and then it drives past 35 and it adjusts.
01:00:07.000 I can see it on the display, and then it just keeps going 65.
01:00:12.000 And I'm like, okay, I'm sitting here paying attention, so I'll push the brakes down.
01:00:15.000 But that is terrifying that it won't.
01:00:17.000 Yeah, that's really scary.
01:00:19.000 Is it looking to see, like, what the actual sign says?
01:00:22.000 Because I know, like, in my neighborhood, I live on a private drive, and apparently Google has our private drive listed at, like, 35 miles an hour.
01:00:30.000 It's, like, a 15-mile-an-hour zone.
01:00:32.000 And so that's one reason why we think maybe people are speeding through our neighborhood is because the speed is wrong.
01:00:37.000 And so if the Tesla is going off of, like, what Google says, then that might be one of the reasons.
01:00:44.000 You'll be driving.
01:00:45.000 You will see, in the distance, the speed limit 35.
01:00:48.000 As you get close to it, it appears on the car's display, and then it changes max speed limit to 35, but doesn't press the brakes down.
01:00:56.000 It just keeps going twice the speed limit.
01:00:58.000 I will say as someone who only, I mean I failed my driving test three times before I was 18 for various reasons, but I recently got my license a couple of years ago.
01:01:09.000 I got a car and I really like just being in control of my car.
01:01:13.000 I kind of wish I knew how to drive manual so that I could be in even more control
01:01:17.000 of the vehicle that I'm driving.
01:01:19.000 And you can feel the weight, like you're driving,
01:01:22.000 as someone who never drove before, you can feel the weight of this massive vehicle
01:01:26.000 that you're in control of.
01:01:28.000 Driving stick's not that hard, it's just annoying.
01:01:30.000 It's like, what's the point?
01:01:31.000 Well, some people like it.
01:01:32.000 You don't need to have the clutch or whatever.
01:01:33.000 It depends on what kind of car you're driving.
01:01:35.000 Like if you're driving a nice sports car, it's fun to drive, but.
01:01:39.000 Yeah, I got a manual car.
01:01:41.000 I've driven, I used to drive stick all the time.
01:01:44.000 And it's just so much easier and safer, in my opinion, not to.
01:01:49.000 For the average person, like, you know, look, you're driving stick shift and you're on a steep hill.
01:01:54.000 And you got people behind you, and then they get real close to your ass, and you're like, dude, I can't move.
01:01:59.000 If I take my foot off the brake, I'm rolling backwards.
01:02:01.000 And you can't do anything.
01:02:02.000 You're like, get the back up.
01:02:04.000 This is part of the reason why it took me so long to get my license, and why I never drove manual, because my mom was teaching me how to drive a car.
01:02:10.000 She had a manual transmission car.
01:02:12.000 She pulled halfway up the hill, put the parking brake on, put me in the car, and was like, okay, now go.
01:02:17.000 And I was like, what?
01:02:19.000 I just be like, sure, I'm going to grind your gears and destroy your clutch, but I'll do it.
01:02:23.000 You use the emergency brake, because that's your brake to keep you from rolling backwards.
01:02:27.000 You work the brake as you're pulling forward.
01:02:30.000 Oh, is that what you do?
01:02:31.000 I've never done that before.
01:02:32.000 That's a good idea.
01:02:33.000 I was doing it as fast as I could.
01:02:35.000 I learned stick shift when I was 16 or 17, and my parents were going out of town, and they were taking the automatics.
01:02:41.000 Well, to get to work, I had to keep the stick shift, and I shredded my dad's transmission.
01:02:45.000 I annihilated the car.
01:02:47.000 I learned, though.
01:02:48.000 I learned how to drive stick shift after that.
01:02:49.000 Just a lot of grinding, trying to get into gear.
01:02:51.000 Well, electric cars don't have transmissions.
01:02:53.000 They don't have transmissions at all?
01:02:54.000 No.
01:02:55.000 Yeah, it's an electric motor.
01:02:56.000 The wheels are just electric motors.
01:02:58.000 They spin when there's a charge.
01:02:59.000 There you go.
01:03:00.000 It's a feature, everybody.
01:03:01.000 Except the self-driving stuff.
01:03:03.000 But they're putting self-driving in everything.
01:03:04.000 It seems so risky.
01:03:06.000 So even the Honda, I have a Honda, and it has lane correction or whatever it's called.
01:03:11.000 Lane assist.
01:03:12.000 I don't like any of this.
01:03:13.000 I don't want my washing machine talking to me.
01:03:16.000 I don't want my car driving for me.
01:03:18.000 You're going to be in your car.
01:03:19.000 I'm going to be one of these people who's just like, what's going on?
01:03:22.000 Why is everybody in a pod?
01:03:23.000 You're going to be 70.
01:03:24.000 Yeah.
01:03:24.000 you're gonna get your first robo car, and you're gonna be like, I finally decided
01:03:28.000 to do it, leave me alone. You're gonna get in your car, you're gonna turn it on, and
01:03:32.000 go, robo car, take me to the grocery store, and it's gonna go, okay. And then
01:03:35.000 you're gonna be halfway there, and it goes, by the way, are you Libby Emmons,
01:03:38.000 editor-in-chief of the Post Millennial? You go, yes, and it goes, okay, I've been...
01:03:46.000 You mean I never get to retire either?
01:03:48.000 Like I'm going to be 70 with no teeth in a self-driving car doing my own grocery shopping and having my same job?
01:03:57.000 That's more depressing.
01:03:58.000 Like I hope the car kills me.
01:04:01.000 What I don't like about self-driving, maybe Elon can convince me of this when I talk to him.
01:04:07.000 Trains have tracks, so you don't need someone holding a steering wheel.
01:04:11.000 Planes don't have other planes flying by them 80 feet away at 12,000 miles an hour or whatever the hell.
01:04:18.000 With the dissolution of our air system.
01:04:20.000 2,000 pound vehicles flying past each other at 50 miles an hour relative to each other.
01:04:26.000 How is that even remotely safe to leave in the hands of a computer with your eyes off the road?
01:04:33.000 Are you seriously considering people aren't going to go to sleep behind the wheel in that situation or turn around and talk to people?
01:04:39.000 Hold on, hold on.
01:04:40.000 I've fallen asleep while driving before.
01:04:42.000 No way!
01:04:43.000 That's awful.
01:04:44.000 Aren't you terrified?
01:04:45.000 It happened to me once.
01:04:46.000 I think most people have experienced that.
01:04:47.000 I woke up, too.
01:04:50.000 They make you work mandatory overtime when you work at the airport, so I have to do a double shift.
01:04:55.000 I have to wake up at 3.30 in the morning to drive an hour to get to work, to get there on time, otherwise they fire you.
01:05:01.000 Then they're like, okay, you got a double shift today because you need the money.
01:05:04.000 Then they say, whoopsie, it's mandatory overtime.
01:05:06.000 I don't think they can do Mando if you've done a double though, but you'll end up being, it'll be 10 o'clock, you'll be leaving, you'll be driving home, and then you're just like, whoa, crap, I fell asleep for a second.
01:05:16.000 That's scary.
01:05:17.000 That is super scary.
01:05:18.000 Yeah, and I think a lot of people have experienced that.
01:05:20.000 Long day of work, getting tired behind the wheel.
01:05:22.000 But at least with a robo car, you turn it on, if you fall asleep, you wake up like, oh, not again.
01:05:28.000 Ideally.
01:05:30.000 As an emergency backup, fine, but to entice people to use it, to test it out, is like, what in the hell are you doing?
01:05:37.000 I mean, there are huge amounts of metal flying, like super speed.
01:05:42.000 Did you see that story of the guy who tried to kill his family in the Tesla?
01:05:45.000 I sure did.
01:05:46.000 Drove it off a cliff and they all survived.
01:05:47.000 Yeah.
01:05:48.000 And they were like, he was trying to kill us.
01:05:50.000 Tesla was just very safe.
01:05:51.000 They're well built machines.
01:05:53.000 And so it rolled down a hill and they were like, we're all fine.
01:05:56.000 The Tesla trucks look super Terminator and creepy.
01:05:59.000 Yeah, and you sit in the middle.
01:06:01.000 Really?
01:06:02.000 Yeah.
01:06:02.000 So you're just like one guy in the middle, you can't have anybody else in there?
01:06:05.000 And there's like panels to your sides.
01:06:06.000 Full protoss, you'll be driving with your mind pretty soon.
01:06:09.000 Oh my goodness gracious.
01:06:10.000 No, you won't.
01:06:11.000 You'll be telling the machine, take me to Wendy's.
01:06:13.000 The machine will be driving with its mind.
01:06:15.000 We just have to hope it's not Sydney from Bing Chat.
01:06:20.000 Is the Tesla truck coming out though?
01:06:21.000 Because now everybody's coming out with electronic trucks and stuff.
01:06:26.000 Jeep just had an ad in the Super Bowl for their new off-road electric Jeep.
01:06:32.000 Looking for a date of the official release.
01:06:36.000 Rumored release date of the Tesla truck...
01:06:41.000 I don't see anything off the top.
01:06:42.000 I know they have a sports car that's supposed to be coming out.
01:06:44.000 Won't begin mass production until late 2023.
01:06:47.000 This is according to tomsguide.com.
01:06:50.000 So late '23. We'll start to see early production in the middle of the year of 2023.
01:06:56.000 So in maybe three or four months they're going to start early production.
01:06:59.000 I mean, I like the idea of electric trucks or whatever, they just can't go that far.
01:07:02.000 So, I think they were saying that it can only go like 100 miles on a full load or something like that.
01:07:08.000 I think the Tesla truck, Elon was saying it can go like 500.
01:07:12.000 But then imagine that, you gotta stop and charge it for how long?
01:07:14.000 Yeah.
01:07:15.000 Right.
01:07:15.000 There was also, I saw this guy was trying to charge his, I don't know, some giant truck thing, and it was going to take a week to charge fully.
01:07:25.000 Jeez.
01:07:25.000 Well, even like the Jeep.
01:07:26.000 That's too long.
01:07:27.000 When I saw the ad for the Jeep, I was like, this kind of seems pointless.
01:07:30.000 You go off-roading and then there's nowhere to charge your vehicle and you're just kind of stuck.
01:07:34.000 Right?
01:07:34.000 Then what do you do?
01:07:35.000 That's the future.
01:07:36.000 Is it solar?
01:07:38.000 No.
01:07:38.000 I actually asked Elon on Twitter when he asked for something.
01:07:41.000 I'm like, why don't we have solar panels on Teslas yet?
01:07:44.000 And maybe wind turbines in the grills.
01:07:46.000 Yeah, to recoup some lost energy.
01:07:48.000 Did he have an answer?
01:07:49.000 No, he didn't.
01:07:51.000 Let's talk some apocalypse here.
01:07:53.000 We got this story from Wired.
01:07:55.000 I'm gonna make this one quick for you guys so you can get angry as fast as possible.
01:07:59.000 The bird flu outbreak has taken an ominous turn.
01:08:01.000 The avian flu has killed millions of chickens, decimated wild birds, and moved into mammals.
01:08:06.000 The avian flu, H5N1, has a mortality rate in humans of about 60%.
01:08:11.000 And if we go back in time to this article from February 8th, 2019, we can see that they were doing gain-of-function research, intentionally making it so that H5N1 would transfer to mammals.
01:08:22.000 There you go.
01:08:22.000 How are you guys doing?
01:08:23.000 Wait a second, what?
01:08:25.000 Wait, take that back.
01:08:26.000 So they were doing gain-of-function research to see if avian flu would translate into mammals and now it does?
01:08:30.000 No, no, no, no.
01:08:31.000 They were doing gain-of-function research to make it transmit to mammals.
01:08:35.000 To do it on purpose?
01:08:36.000 So they're intentionally trying to kill us and destroy our entire food source at once?
01:08:40.000 They're saying it's so that they can learn about what happens when it infects humans or mammals.
01:08:44.000 Oh my goodness, these people and their damn intellectual inquiry.
01:08:47.000 I mean, come on.
01:08:48.000 Is that what it is?
01:08:48.000 Is that what you think it is?
01:08:49.000 Intellectual inquiry?
01:08:50.000 Yeah, sure.
01:08:51.000 They think that's what it is.
01:08:52.000 I'm not convinced.
01:08:52.000 You don't think the scientists think that's what they're doing?
01:08:54.000 I do not believe that someone's like, I'm very curious as to what will happen if I take one of the most deadly flu variants.
01:09:01.000 What's your crazy theory?
01:09:02.000 I want to hear it.
01:09:03.000 Bioweapon research.
01:09:04.000 This is bioweapon research?
01:09:05.000 Yeah.
01:09:06.000 This is the first thing I thought when I actually saw the China balloon, was what happens if we do shoot it down and it has a bioweapon in it?
01:09:12.000 Boom.
01:09:12.000 Avian flu.
01:09:13.000 Look at this.
01:09:13.000 In 2011, Fouchier and Kawaoka alarmed the world by revealing they had separately modified the deadly avian H5N1 influenza virus so that it spread between ferrets.
01:09:24.000 Advocates of such gain-of-function research blah blah blah, we could learn so much about it.
01:09:28.000 Critics are worried that the souped-up virus could spark a pandemic if it escaped the lab or was intentionally released by a bioterrorist.
01:09:34.000 I don't think it makes sense to be like, what's a very deadly strain?
01:09:38.000 Let's modify it so that it infects mammals to learn about it!
01:09:42.000 No.
01:09:42.000 This is enough to become thoroughly black-pilled.
01:09:44.000 It's bioweapon research.
01:09:45.000 Is it gain-of-function or is it like evolutionary chaining?
01:09:49.000 Oh yeah, directed evolution.
01:09:50.000 They're literally, science.org in 2019 called it gain-of-function research.
01:09:57.000 And now we're learning that H5N1 spread from birds to minks, which are similar to ferrets, and they had to kill all these mink in Spain or whatever.
01:10:06.000 If it jumps to humans, it is a, depending on your source, a 40 to 60% mortality.
01:10:11.000 I think Wired said it was a 52%, what is it, Science Medical Journal of some sort, I pulled it up earlier this morning, said 60% mortality.
01:10:21.000 So you could use gain-of-function to create a bioweapon.
01:10:24.000 You could use gain-of-function to create something that's not a bioweapon.
01:10:27.000 How did you describe what you thought this was?
01:10:29.000 Intellectual inquiry?
01:10:31.000 Yeah, I'm sure you could do intellectual inquiry on something that is also a bioweapon and intellectual inquiry on something that's not.
01:10:37.000 This obviously could be weaponized.
01:10:39.000 The World Health Organization abandoned their research into the origins of the COVID virus this week.
01:10:44.000 That was funny.
01:10:46.000 They were like, no, we don't care anymore.
01:10:47.000 No, we're not gonna look into that.
01:10:48.000 This reminds me of the 2015 study that was of the COVID variant that they were doing in other places in the world.
01:10:57.000 You know, so this very well could be something that we see in the future come out as another pandemic.
01:11:04.000 Like, I see you setting up dangerous situations to practice overcoming dangerous situations.
01:11:09.000 Like, let's set a house on fire and have the firemen go in and put the fire out.
01:11:13.000 That's what firemen do sometimes.
01:11:14.000 Right, they do that.
01:11:16.000 But they don't set fire to someone's neighborhood.
01:11:19.000 Yeah, and they don't say, let's direct a meteor into Earth to see how we will overcome if a meteor falls into Earth.
01:11:24.000 You use computer simulations for that stuff.
01:11:26.000 So for the bioweapons, I think we should be using quantum computing.
01:11:29.000 Not only that, the government sets fire to neighborhoods, like in Philadelphia in, what, like '86?
01:11:33.000 I think bio-weapons have made nuclear weapons obsolete.
01:11:38.000 That's really terrifying.
01:11:40.000 I hate this whole concept so much, this bio-weapon thing.
01:11:43.000 I mean, isn't it totally against the Geneva Convention and nobody cares at all about that?
01:11:47.000 Who cares about it?
01:11:47.000 The conventions are nonsense.
01:11:49.000 The idea of war crimes are nonsense.
01:11:51.000 The idea that you as a nation want to seize land from another nation, but you better follow the rules, yeah, right.
01:11:58.000 Nobody who's actually fighting a war cares about the rules.
01:12:00.000 So now we're in a position where we have to trust machines to drive us around, but we can't trust each other to make agreements without being total liars.
01:12:07.000 That's always been the case.
01:12:08.000 We're going to have to make agreements.
01:12:10.000 Don't you watch Yellowstone?
01:12:12.000 No, I don't watch Yellowstone.
01:12:14.000 I'm watching it.
01:12:14.000 The Native Americans are like, this woman, she says, the United States makes rules against everyone that it conquers.
01:12:21.000 They say, here are the rules.
01:12:22.000 They break those rules, then set the rules again, hoping you won't break them.
01:12:26.000 That's completely true.
01:12:27.000 Yeah, that is a little bit what happens.
01:12:31.000 Wait, you trusted other people at any point?
01:12:35.000 That's the craziest part to me.
01:12:36.000 Do you trust the government at any point in time?
01:12:39.000 I don't think I trust the government, but you have to trust other people.
01:12:43.000 Otherwise, you have absolutely no comradeship and you have absolutely no one you can confide in.
01:12:48.000 Yeah, I trust people.
01:12:48.000 Or be close to, or care about.
01:12:51.000 Totally, of course.
01:12:51.000 I trust some people.
01:12:52.000 I don't necessarily trust people that are saying, hey, let's just like mess around with these really deadly pathogens.
01:12:56.000 No, no, no.
01:12:57.000 I mean, but you know, I think I had assumed that you could trust your allies, like international allies, to not totally destroy the world.
01:13:05.000 Yeah.
01:13:05.000 Yeah.
01:13:06.000 But then, you know, the U.S.
01:13:07.000 did just blow up Germany's pipeline.
01:13:09.000 So, allies attacking allies, that's not pretty.
01:13:13.000 Yeah, it's the British obsession with keeping Germany and Russia separate.
01:13:16.000 When the German Russo alliance comes to fruition, it's going to be great.
01:13:19.000 Well, I mean, if you look at it, it is because Britain nearly got bombed off the face of the earth.
01:13:24.000 I kind of think it's a strong possibility we will see some kind of future pandemic.
01:13:30.000 Like this one?
01:13:31.000 And first it's going to take out all the animals and then you'll have nothing to eat.
01:13:36.000 And then if you live in a city, they're going to lock you down and a government truck will pull up at nine in the morning to hand you your daily food allotment of the bare minimum calories you need to survive.
01:13:45.000 All the morbidly obese people will become gaunt and skinny.
01:13:48.000 Everyone else who's used to not eating too much will probably just starve to death.
01:13:51.000 No, they won't starve.
01:13:52.000 They're going to be given food from the government, and you're going to be given an allotment, and then it's going to dramatically reduce carbon, and they're going to say, oh, well, you know, it's the bird flu.
01:13:59.000 This kind of sounds like what's happening in East Palestine.
01:14:03.000 Right?
01:14:05.000 Chemical release.
01:14:05.000 I mean, people are walking around.
01:14:06.000 Their chickens are just dead.
01:14:07.000 And their foxes.
01:14:08.000 Foxes are dead, too.
01:14:09.000 Yeah, because I'm like, why would someone want to destroy this awesome planet and what we've got?
01:14:13.000 But then I'm like, I'm talking from my perspective as an American.
01:14:16.000 Like, if you saw the Hunger Games.
01:14:16.000 I feel that way, too, though.
01:14:17.000 Like, why would you want to destroy everything?
01:14:19.000 And humanity is so beautiful.
01:14:21.000 Because you know, did you see the Hunger Games?
01:14:23.000 No, I don't.
01:14:23.000 The villains in the Hunger Games?
01:14:25.000 That's, I think, how the world looks at people, like the gluttonous people of the United States.
01:14:29.000 No, that's how people look at the World Economic Forum.
01:14:30.000 I mean, have you seen how Klaus Schwab, like, dresses?
01:14:35.000 Wait, are the villains just super preppy or whatever?
01:14:38.000 They've got, like, huge hair, like all this makeup.
01:14:42.000 Yeah, I put it next to the hunger games.
01:14:44.000 And like they're overeating for fun and they don't even know about what's going on outside their city.
01:14:50.000 Oh my goodness, I did not see that outfit.
01:14:52.000 What?
01:14:52.000 It's great.
01:14:53.000 He's like Romulan.
01:14:55.000 He does look Romulan, right?
01:14:57.000 Yes.
01:14:57.000 Okay, this guy dresses like a super villain. What's up with that?
01:15:00.000 Yeah, that is a supervillain outfit.
01:15:01.000 What is this?
01:15:03.000 I think he's playing the part.
01:15:04.000 Yeah, he likes it.
01:15:05.000 He likes the attention.
01:15:06.000 I think he's controlled.
01:15:07.000 He is the figurehead.
01:15:08.000 We have to release the alien food to kill the people.
01:15:10.000 He's the figurehead that's being controlled by everybody else though.
01:15:13.000 You think so?
01:15:14.000 Yeah.
01:15:14.000 He created the Economic Forum, which is basically a nothing burger, but then all the people around Earth started taking it seriously, and now they're propping him up as long as... Just look at everybody that invests in the World Economic Forum, and you see who's controlling the money and who's controlling everything else.
01:15:29.000 The Gates Foundation has their hands in everything.
01:15:31.000 So does the Chan Zuckerberg Foundation.
01:15:34.000 Going all the way even into SEL and schools, they're controlling all that stuff too.
01:15:38.000 Well, there's a crazy thing with SEL, too, where the teachers are so hyped on SEL that they figure that they don't actually need to teach kids anything other than the social-emotional stuff.
01:15:49.000 Because they've said that the emotional quotient score is more important than IQ score.
01:15:54.000 That's what SEL does.
01:15:54.000 Yeah, and it's just not true.
01:15:55.000 I mean, I have to say, like, my son was in a public school that was super SEL.
01:16:00.000 That was all everything was.
01:16:01.000 He would come home and he'd be like, Mom, you know, this is what I did in social studies.
01:16:05.000 And it's all social emotional learning.
01:16:07.000 And now he's in a school where that's definitely not the focus.
01:16:10.000 And his grades have gone up dramatically.
01:16:12.000 Is this something you've been focusing on?
01:16:14.000 So what is it exactly, social-emotional learning?
01:16:16.000 It's taking the well-being of the student into account, well over what they're actually being taught in school, to make sure that they're more intelligent.
01:16:27.000 And so that's why you're seeing a lot of even the queer theory and stuff like that being pushed into schools, because they want to make children more comfortable in their classroom settings.
01:16:38.000 And more manageable.
01:16:39.000 Yeah.
01:16:40.000 And actually, even, like, so I mentioned Zuckerberg.
01:16:42.000 They have an app that they're coming out with.
01:16:46.000 It's called PanoramaEd.
01:16:48.000 And when you use this app, it's basically a social credit score for students.
01:16:51.000 You see a student.
01:16:53.000 When they do something good, you give them a plus.
01:16:56.000 And when they do something bad, you give them a negative.
01:16:59.000 And that is basically becoming their permanent record as they go through school.
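(A toy sketch of the plus/minus tally being described, purely as illustration; this is not PanoramaEd's actual data model or API, just the general shape of a running score that follows a student:)

    from collections import defaultdict

    # Hypothetical running record: +1 for "good" marks, -1 for "bad" ones.
    record: dict[str, int] = defaultdict(int)

    def mark(student: str, good: bool) -> None:
        record[student] += 1 if good else -1

    mark("student_a", good=True)
    mark("student_a", good=False)
    mark("student_a", good=False)
    print(record["student_a"])  # -1, carried forward year after year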
01:17:02.000 I was thinking it sounds like an ESG for kids.
01:17:04.000 It is ESG for kids.
01:17:06.000 It's all tied in.
01:17:06.000 And there's all these like surveys, you know, that'll be like asking you about how you feel about everything.
01:17:12.000 And there's constant interference in interactions between kids.
01:17:17.000 And adults are constantly telling kids how to play, how to interact with each other, how to think about, you know, how to think about things.
01:17:24.000 They're not teaching them facts.
01:17:26.000 They're not teaching them anything about their history.
01:17:30.000 It's really disturbing to see it.
01:17:31.000 And it's disturbing to see the impact on kids, because they just get very creeped out.
01:17:37.000 The system is set up to follow them, because it's SEL in grade schools and through high school, and then you have DEI, diversity, equity, inclusion, in colleges, and then you have ESG in corporations.
01:17:49.000 And so it's basically...
01:17:52.000 What is it?
01:17:53.000 Conditioning them to accept these systems.
01:17:55.000 And we've been conditioned with credit or credit scores for so long.
01:17:58.000 I was looking at my credit.
01:17:59.000 I'm like, wait a minute.
01:18:00.000 So they want me to take out three credit cards or take out a bunch of credit cards and have a little bit of debt on every credit card.
01:18:05.000 So I'm paying a small pittance of interest to the company every time.
01:18:08.000 And they're like, good slave.
01:18:10.000 We'll give you a better score on your credit score because you paid us your interest every month.
01:18:14.000 Yeah.
01:18:15.000 And the more debt I'm paying every month to these multiple different organizations, the more ingrained I am in their system.
01:18:20.000 Congratulations.
01:18:21.000 Good credit is built up by having debt.
01:18:24.000 That's correct.
01:18:24.000 Your credit does not improve if you have no debt.
01:18:26.000 And that's the thing, they want you in debt, and then they give you a reward for it.
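(As a rough worked example of the "small pittance of interest" point above, with hypothetical figures, since real APRs and scoring formulas vary by issuer: carrying small balances on several cards means a steady monthly interest payment to the lenders.)

    # Hypothetical: three cards with small revolving balances at 24% APR.
    balances = [300.00, 450.00, 250.00]
    apr = 0.24
    monthly_rate = apr / 12  # 2% per month
    monthly_interest = sum(b * monthly_rate for b in balances)
    print(f"${monthly_interest:.2f} per month in interest")  # $20.00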
01:18:30.000 We have this, this guy's loan system.
01:18:31.000 Yeah, I had this argument with my mom years ago, cause I had student loan debt.
01:18:35.000 And she was like, you have to pay off your student loan debt.
01:18:37.000 And I was like, it literally doesn't matter.
01:18:39.000 It's perfectly fine debt.
01:18:40.000 I can die with this debt.
01:18:42.000 It just doesn't matter if I ever pay it off.
01:18:44.000 As long as I pay them, you know, whatever it is based on whatever their metric, it doesn't matter if it never goes away.
01:18:51.000 As long as you're paying the interest.
01:18:53.000 Paying like whatever it is and then you do a forbearance here and a like adjustment there
01:18:57.000 and a readjustment of your interest rate over here.
01:19:00.000 It doesn't matter.
01:19:01.000 It just doesn't matter if you ever pay it.
01:19:02.000 I think people should be able to bankrupt it off.
01:19:04.000 Like George Bush Jr. changed that, so you couldn't bankrupt your student loans
01:19:08.000 off your… Well, and then college degrees became worthless because
01:19:12.000 everybody, you know, everybody was getting them and now they're not worth a single penny.
01:19:17.000 And that's why they're so expensive, too, because they've taken out the capitalistic aspect of it.
01:19:22.000 So you can't go bankrupt because the government will pay it either way.
01:19:25.000 Or actually, the government will always get their monies back.
01:19:27.000 So because everything is government backed, that's why you can't go bankrupt on it.
01:19:32.000 Right, and that's why tuition goes up, because it's a bunch of free money.
01:19:35.000 So the kids can just go get more and more free money.
01:19:39.000 So just go get more and give it to us.
01:19:42.000 I considered doing bankruptcy.
01:19:43.000 You guys ever do bankruptcy before?
01:19:44.000 I considered doing it in 2013 because my credit was trash, and I didn't.
01:19:48.000 I just defaulted on all my credit cards, and after seven years, it's all gone.
01:19:52.000 My credit's good now.
01:19:53.000 So don't be afraid of debt.
01:19:54.000 Don't be afraid of it.
01:19:56.000 So what I found out is if you change your name, some of the credit scores don't update that.
01:20:02.000 And if they send you letters in the mail, you don't have to respond.
01:20:04.000 And if you do respond, they'll be like, okay, now he's back.
01:20:06.000 Now we got to get him.
01:20:07.000 Now he owes us again.
01:20:07.000 I literally had this conversation with TransUnion yesterday because they still hadn't updated my full credit score after I changed my name.
01:20:14.000 I've had fake things sent where it's like, you owe us money.
01:20:19.000 Like I got a letter saying it was like a couple hundred bucks that was owed from this company.
01:20:23.000 I'm like, I don't know that.
01:20:23.000 And they're putting a mark on my credit score or whatever.
01:20:25.000 And I call them like, I don't know this, you're wrong.
01:20:27.000 Dude, I had this one situation.
01:20:28.000 I had Bank of America.
01:20:29.000 I had $1,000 in the bank.
01:20:32.000 I sold my Magic Cards to someone.
01:20:33.000 They paid me with a check via Craigslist.
01:20:35.000 Terrible move.
01:20:36.000 Never do that.
01:20:37.000 So I went and I cashed the check.
01:20:38.000 They gave me $1,500 immediately.
01:20:40.000 I had no money in my account.
01:20:41.000 I put the $1,500 in.
01:20:42.000 I spent $900 of it.
01:20:43.000 And then the next day, they were like, oh, the check bounced.
01:20:45.000 They gave me a negative $1,500, put me at negative $900, and I was like, yo, yo, yo, I'm a customer, and you sold me that $1,500 when you cashed that check.
01:20:53.000 You can't go back.
01:20:54.000 You sold that to me as a customer.
01:20:55.000 And the girl in front's like, I know, I know, they did, but you're in negative $900, nothing I can do about it.
01:21:00.000 Amoral.
01:21:01.000 Cancel my account.
01:21:02.000 If they cash your check, and they tell you that that check is cashed, I think you have a right, legally, to that money.
01:21:08.000 Was the check a Bank of America check?
01:21:09.000 I don't remember.
01:21:10.000 So, like, when you sign up for a bank account, you agree that you are cashing a check against your balance?
01:21:18.000 Maybe that's the case.
01:21:19.000 Yeah, I don't know.
01:21:20.000 But I will also add... They make a certain amount available, though.
01:21:22.000 But that should not be legal because I didn't ask for a withdrawal.
01:21:26.000 If the check that you cashed is no good, that's on you.
01:21:30.000 If I wanted to withdraw money from my account, why would I cash a check?
01:21:33.000 Well, they shouldn't make it available instantly if they're going to screw you over after if it's no good.
01:21:38.000 I think them making it available indicated that the transaction was final, in my opinion.
01:21:42.000 Yeah, I think you're correct.
01:21:44.000 But there's very little banking regulation that protects consumers against, you know, bank manipulation.
01:21:50.000 I mean, even overdraft fees are like... How do you have an overdraft fee when they allow you to go into the negative?
01:21:57.000 That makes no sense.
01:21:58.000 Right?
01:21:58.000 They give you the money and then charge you for it.
01:22:00.000 Yeah.
01:22:00.000 And it's not even at their normal interest rate.
01:22:02.000 It's like just super excessive.
01:22:02.000 No.
01:22:04.000 That's the joke, that if you're poor, they charge you money, but if you're rich, they give you money for free.
01:22:08.000 Yeah, that's like that metric song.
01:22:09.000 Which one?
01:22:11.000 I forget which one it is.
01:22:11.000 It's on Art of Doubt, I think, but it's like, I'm so rich, everything's free.
01:22:17.000 Biden reduced fees, bank fees.
01:22:17.000 Yeah.
01:22:20.000 I heard that he was reducing those.
01:22:21.000 Oh, he says so much garbage.
01:22:23.000 The Bidenator, you know, let him go play his games.
01:22:29.000 All right, what's happening?
01:22:32.000 I guess we're all just tired of being ripped off by lying politicians, lying scientists, pandemics, lying bankers, gain-of-function research, and we've consumed too many black pills.
01:22:45.000 You guys want to sit in silent meditation for 10 minutes?
01:22:48.000 I feel like we should all just take one of these chakras.
01:22:51.000 Dude, one day we're going to do a show where we just meditate for 30 minutes and everyone's going to meditate with us.
01:22:55.000 Do you think they will?
01:22:56.000 And it's going to change the world, yeah.
01:22:57.000 Viewers, would you meditate with Ian for 30 minutes?
01:22:59.000 Give me a 20.
01:23:00.000 Jimmy Dore was very excited about that.
01:23:01.000 He was like, you actually do that?
01:23:02.000 That wouldn't be a bad morning show.
01:23:04.000 Like, hey, wake up with Ian.
01:23:06.000 Let's talk about this one.
01:23:07.000 30 minutes of meditation in the morning with you, I think that would be cool.
01:23:11.000 All right, here you go, you guys, from the Washington Examiner.
01:23:14.000 George Soros says DeSantis will beat Trump for GOP nomination.
01:23:17.000 This is so funny.
01:23:19.000 So does that just seal the deal for everybody that DeSantis is not their guy?
01:23:23.000 Are you saying that this is an attempt to hurt DeSantis?
01:23:24.000 Putting their, putting the extra sandbags on the scale for DeSantis, and all the MAGA people are gonna be like, you're totally trying to steal the election before, you know, the election even happens, you're trying to, like, add all this extra weight in his favor.
01:23:40.000 I think this, I think there's an attempt to hurt Trump. Oh no, no.
01:23:44.000 This is an attempt to hurt DeSantis.
01:23:45.000 Yeah, you're right.
01:23:46.000 George Soros.
01:23:48.000 Yeah, because with DeSantis, they're scared of him.
01:23:50.000 So they want them to think that they are the ones that are supporting him so that the conservatives won't support him.
01:23:56.000 They'll turn away from him.
01:23:58.000 It's like feigning.
01:23:59.000 Yeah, I think that makes more sense.
01:24:00.000 Yeah.
01:24:01.000 Do you think DeSantis is even going to run?
01:24:05.000 I think he will.
01:24:06.000 I think he will.
01:24:07.000 Better question.
01:24:07.000 Who do you think is the most viable option for president coming up?
01:24:12.000 Ron Paul.
01:24:13.000 Who could win?
01:24:14.000 Like, are we talking electability?
01:24:15.000 I hate that word.
01:24:16.000 I'm talking about someone that wants, someone obviously that's going... I don't know what... Oh, Nikki Haley for sure.
01:24:23.000 But she is past her prime.
01:24:25.000 Look, I'm not saying that.
01:24:26.000 If you Google it... She's not Don Lemon's choice.
01:24:29.000 Right.
01:24:29.000 Don Lemon.
01:24:30.000 Don Lemon.
01:24:31.000 Expert on women.
01:24:33.000 No, I would think that DeSantis has the likability and the policies, and everybody likes him other than the MAGA people.
01:24:41.000 I think he has a real shot at winning.
01:24:43.000 Well, the MAGA people would like him if Trump endorsed him.
01:24:45.000 Exactly.
01:24:46.000 People want Jimmy Dore for president.
01:24:49.000 Kanye's throwing his hat in the ring.
01:24:50.000 Kanye Dore?
01:24:51.000 Wait, Kanye's actually going for it?
01:24:54.000 I don't think he's, like, announced or anything, but he's definitely running.
01:24:56.000 He said that a couple months ago, but I'm pretty sure a lot's happened since then.
01:25:00.000 I have a feeling he'll follow through.
01:25:01.000 You think?
01:25:02.000 Yeah, he's got nothing else going on.
01:25:04.000 It's a big deal, you know, running for president.
01:25:07.000 He's been thinking about it for, like, eight years, too.
01:25:09.000 He's talked about it.
01:25:10.000 Well, he almost ran before, if I recall.
01:25:12.000 He did run last time.
01:25:14.000 I got a cord underneath my leg.
01:25:15.000 Did you just pull the plug on the crazy UFO thing?
01:25:19.000 Yeah, I knocked it off.
01:25:21.000 All right.
01:25:21.000 I want to run for VP this year.
01:25:22.000 What do you guys think?
01:25:23.000 You know, you used to be able to run with somebody.
01:25:25.000 That's how it would go in the early days of the United States is whoever got the second most votes for president would be VP.
01:25:33.000 That's how you had Washington-Adams.
01:25:37.000 That's how you had Adams-Jefferson.
01:25:38.000 That makes sense, too.
01:25:39.000 It does make sense.
01:25:40.000 You know, that's how the Libertarian Party selects their VP candidate.
01:25:44.000 So they select their presidential candidate, and then everybody that wants to run for VP runs for VP.
01:25:48.000 So they have two separate elections to determine who's going to be on the top of the ticket.
01:25:52.000 And then whoever wants to run for the bottom of the ticket runs as well.
01:25:56.000 It was also that first Washington-Adams administration that set the tone for the VP having no power.
01:26:02.000 Really?
01:26:02.000 Yeah.
01:26:03.000 Yeah, because like Hamilton got in Washington's ear and basically was saying that Adams was a monarchist or something like that.
01:26:10.000 And so then Washington was like really hesitant to include Adams in any decision making.
01:26:14.000 And Adams just kept going to the Senate every day instead.
01:26:18.000 Oh, okay.
01:26:20.000 So what powers does the VP actually have other than, like, breaking ties in Congress?
01:26:27.000 Oh, sorry.
01:26:28.000 Well, it's not the border for sure, you know?
01:26:31.000 But I don't think the VP has any designated powers.
01:26:35.000 The VP has, like, stuff that the president would give them to do, right?
01:26:41.000 So, like, the VP always has, like, some sort of project, kind of like the First Lady, you know?
01:26:45.000 Someone just sent me an AI deepfake of Trump complaining about Australia.
01:26:51.000 It's hilarious, but I'm just sitting here thinking, like, we, someone, a friend of mine sent me ElevenLabs AI stuff, then we talk about it on the show, and now everyone's posting this stuff like crazy everywhere.
01:26:51.000 How is it?
01:27:06.000 Like we're a month out, this election is gonna be crazy.
01:27:09.000 It's gonna be nuts!
01:27:11.000 Yeah, it's going to be really wild.
01:27:11.000 It's going to be exciting.
01:27:13.000 You're going to have full-throated endorsements from George Soros talking about how Ron DeSantis is the greatest candidate of this or any generation.
01:27:20.000 And then people are going to believe it.
01:27:21.000 You're going to get Klaus Schwab endorsing Trump.
01:27:23.000 You're going to get Hillary Clinton endorsing DeSantis.
01:27:26.000 Bernie's going to come out in favor of Joe Biden.
01:27:28.000 So then what's going to happen?
01:27:29.000 Wait, that happened already.
01:27:30.000 You think Jesus is really going to come back as a deepfake?
01:27:32.000 Even if we do, nobody knows what Jesus sounds like.
01:27:36.000 Not yet.
01:27:36.000 Yeah, what would it sound like?
01:27:38.000 And of course, that's the old joke.
01:27:42.000 But the thing too is, what was I going to say?
01:27:44.000 You go ahead.
01:27:45.000 I was saying we saw frogs this morning and it's wintertime here.
01:27:45.000 I don't know.
01:27:48.000 So I said it was raining frogs.
01:27:49.000 So maybe we are at the end of times.
01:27:51.000 You saw frogs raining down?
01:27:51.000 I don't know.
01:27:53.000 No, they were running across the road, but I made the joke that it's raining frogs, maybe we're at the end of times, you know?
01:27:59.000 You're not the first person to make that statement, and not even a joking matter, lots of people have said that we're in the end of days.
01:28:06.000 People always like to say that, and now we have revival meetings in Kentucky, so maybe we're going to have something like that.
01:28:11.000 If you actually read it like the Mark of the Beast, you can't buy or trade unless you bear the mark, and now we got social credit scores.
01:28:19.000 It's the apocalypse.
01:28:20.000 Which means the revelation or disclosure.
01:28:22.000 I mean, talk about the age of information being released to the public.
01:28:25.000 This is the age of disclosure.
01:28:27.000 There was something too with the bear and something else.
01:28:31.000 It basically symbolized Russia and China when you look at it.
01:28:34.000 The bear and the dragon or something?
01:28:36.000 Something like that.
01:28:37.000 It basically signified a nuclear war or something like that that would end all civilization.
01:28:44.000 If we have a situation where it's just a ton of deep fakes of everybody, you know, these various endorsements or what have you, how is the public going to stay engaged in the election process?
01:28:55.000 Don't you think people are just going to start tuning out entirely and just go with their own biases?
01:29:00.000 Remember that dude who got arrested because he posted a meme that was voter misinformation or something?
01:29:04.000 Yeah, he lost, I think.
01:29:05.000 I mean, it's going to be bonkers in 2024 when people are putting out videos of Joe Biden being like, make sure you turn up to your local fire department this time and this date.
01:29:15.000 And then people are going to be like, hey, that tricked me and I couldn't vote.
01:29:18.000 And then, look, all dirty games will be played.
01:29:24.000 All of them.
01:29:24.000 Wild.
01:29:25.000 Totally wild.
01:29:26.000 And especially among the left, because these people are willing to go to prison for this stuff.
01:29:30.000 Well, people on the right go to prison for this stuff.
01:29:32.000 Like, you know, that happens too.
01:29:34.000 Yep.
01:29:35.000 But the left, they're very much willing.
01:29:36.000 Because they're true believers.
01:29:38.000 Yeah.
01:29:38.000 Always beware the true believer.
01:29:40.000 Yeah, man, it's going to be interesting.
01:29:42.000 Deepfakes of DeSantis saying offensive things, and it's like we were talking about at the beginning of the show.
01:29:46.000 The videos that are going to work are going to be ones that aren't unbelievable.
01:29:50.000 It won't be Donald Trump saying an n-bomb or something.
01:29:53.000 It'll be him saying something like, you know, I'm in favor of gun control.
01:29:58.000 I said it before, I'll say it again.
01:29:59.000 I'm going to take the guns as soon as I get in office, but I just won't say it while I'm running.
01:30:03.000 And then people are going to be like, and he's going to lose support because people will believe it.
01:30:06.000 And it's going to be things targeting the right.
01:30:09.000 You don't need to say anything to make the left hate Trump, they already hate him.
01:30:12.000 Yeah, it's easy for them to hate Trump.
01:30:14.000 That's the key of meditation is not get triggered.
01:30:16.000 Like, don't let stuff trigger you.
01:30:18.000 I don't know if you should assume it's all a deepfake off the bat, because sometimes that's hard.
01:30:24.000 Don't make assumptions.
01:30:25.000 But just keep in mind that it could be fake when you see anything.
01:30:29.000 It's very difficult to navigate reality when you can't tell what actually is real and what's not.
01:30:37.000 Yeah.
01:30:38.000 I mean, I think that, you know, just waiting things out is always a good plan.
01:30:42.000 Like when things start coming out, you got to start just listening and being like, okay, let's let the situation calm down because what you're thinking right now is probably not what's actually happening.
01:30:52.000 Except for the chemical spill, man.
01:30:54.000 I was like, you know what?
01:30:55.000 Damn be the consequences, everyone.
01:30:56.000 Get water filtration and air filtration now.
01:30:58.000 Look at this graphic.
01:30:59.000 I don't know if it's real or not.
01:31:00.000 Take care of yourself.
01:31:01.000 I'm not waiting for government.
01:31:03.000 I'm not waiting for a confirmation.
01:31:04.000 I don't even know if the chemical spill really happened.
01:31:07.000 It's all through the media that I've heard about this, but I'm still going to talk about it.
01:31:10.000 But preppers have been telling you this for a while, so maybe this is how they're making some money off of it.
01:31:15.000 It's like we were talking about before, where people were posting videos online to make a fake event, being like, oh, look at this, and it's a video from 2010 of a military transport, but they claim it happened yesterday, and then someone gets a video of a cop running into a building, and they're like, look. For all we know, half the stories put out by the media are just that: mass hysteria to trick us into believing these things are happening when nothing's happening.
01:31:35.000 Well, there certainly has been that.
01:31:36.000 I mean, remember the case of, what was the, there was like a New York Times journalist who was eventually found out, but it turned out he was just making everything up.
01:31:44.000 Yeah.
01:31:44.000 Wasn't there a Bell guy too?
01:31:46.000 Derek Bell?
01:31:47.000 Is that?
01:31:49.000 But why do you need to make up fake stories when you have people like George Santos out there that has a shady past that he was lying about?
01:31:56.000 Well, he was just hanging out in his apartment and he was just making stuff up.
01:31:59.000 Do you want to do work?
01:32:00.000 You work in a newspaper.
01:32:02.000 You don't want to work.
01:32:03.000 You want to make money.
01:32:04.000 So you just make it up?
01:32:05.000 Yeah.
01:32:05.000 That's what they do.
01:32:06.000 What was the guy's name?
01:32:06.000 I don't want to start naming Derricks from the New York Times because there would be a bunch of them.
01:32:10.000 I don't remember.
01:32:10.000 Well, he wouldn't work there now.
01:32:11.000 He doesn't work there now.
01:32:12.000 What was his name?
01:32:14.000 New York Times faker.
01:32:15.000 Really?
01:32:16.000 Did he get fined or go to prison or anything?
01:32:18.000 Why would he?
01:32:19.000 It's free speech.
01:32:20.000 Just to lie to the public?
01:32:21.000 Yeah, of course.
01:32:22.000 Lying is free speech.
01:32:23.000 Jayson Blair.
01:32:23.000 It was Jayson Blair, not Derrick Bell.
01:32:25.000 I don't even know who that is.
01:32:27.000 Derrick Bell is the guy, he's the critical race theorist.
01:32:29.000 That guy.
01:32:30.000 That's right.
01:32:31.000 I mean, he makes stuff up too.
01:32:33.000 I'm over here making stuff up.
01:32:35.000 Totally fake news.
01:32:36.000 Yeah, it was Jayson Blair.
01:32:38.000 What did he do?
01:32:39.000 He plagiarized and made stuff up and was writing for the New York Times.
01:32:44.000 Recently?
01:32:44.000 He resigned in 2003.
01:32:45.000 Oh, okay, that was a long time ago.
01:32:47.000 Yeah, it was a long time ago, but that stuff's probably still happening.
01:32:51.000 I think it was Bild or whatever, that German newspaper, and he was just fabricating news.
01:32:56.000 He got a bunch of awards or something.
01:32:58.000 It's amazing, isn't it?
01:33:00.000 Yeah, but now they've evolved to making up, having machines do it for you and you don't know that it's them.
01:33:05.000 I think it's the idea.
01:33:05.000 You don't want to be the next Jayson Blair.
01:33:08.000 What's up, Jason?
01:33:10.000 I wonder what he's doing now.
01:33:11.000 Well, they're supporting, now they're using like deep fake videos to support the news that they're writing.
01:33:17.000 Yeah, definitely.
01:33:17.000 Oh man.
01:33:19.000 Wow, so you actually are sourcing it to something that actually just turns out to be fake.
01:33:23.000 Like those people that Tim was talking about, the fake event, the fake war games thing.
01:33:27.000 I know we're evolving to become psychic, or that we could.
01:33:31.000 I don't see what is going to happen to humanity.
01:33:33.000 It's got to be that we'll just become a different species.
01:33:35.000 Homo sapien is about to turn into something else.
01:33:37.000 Transhumanism.
01:33:38.000 Let's go to Super Chats!
01:33:39.000 If you haven't already, would you kindly smash that like button, subscribe to this channel, and share the show with your friends.
01:33:43.000 I'm gonna read this later Super Chat first.
01:33:46.000 7 seconds till the end says, in the Book of Revelations, it does say that beasts, people think the beasts are viruses that will wipe out at least two-thirds of all humans.
01:33:56.000 Go to TimCast.com, become a member.
01:33:57.000 We're going to have a members-only, uncensored show coming up for you at about 11pm.
01:34:01.000 Those are always very fun and enlightening.
01:34:04.000 Tomorrow's guest is going to be a lot of fun.
01:34:06.000 I don't know, should I announce who tomorrow's guest is?
01:34:08.000 Nah, let it hit him.
01:34:13.000 I kind of feel like the guest is big enough to where it's the kind that needs some, you know what I mean?
01:34:18.000 Some sort of prefacing?
01:34:20.000 If the guest has confirmed, I would say, yeah.
01:34:22.000 But if the guest hasn't confirmed... I mean, it's all confirmed.
01:34:23.000 It's just you never know.
01:34:24.000 No, I mean publicly.
01:34:25.000 Publicly, if they've confirmed publicly.
01:34:27.000 I tend to err on the side of caution.
01:34:29.000 That's a fair point.
01:34:30.000 Let me check.
01:34:31.000 And then if not, I'll just give a hint, I guess.
01:34:34.000 Yeah, it'll be a fun show.
01:34:35.000 Yeah, I think it's going to be really, really big regardless.
01:34:38.000 I think it's going to be real awesome.
01:34:39.000 I'm so excited that she's coming.
01:34:41.000 Definitely.
01:34:41.000 It's been an interesting week here.
01:34:42.000 Yeah.
01:34:43.000 I don't, I don't think she's, she's announced.
01:34:45.000 Okay.
01:34:46.000 I think people, I think if she did, people would be chatting us being like, Oh yeah, for sure.
01:34:50.000 I think for sure.
01:34:51.000 I'm excited.
01:34:51.000 This'll be a lot of fun.
01:34:53.000 Just rest assured.
01:34:54.000 I guess, I think you guys should show up at 7:45 or so.
01:34:56.000 I think, I think people are going to be able to guess, but I'll just say it's one of the most prominent, like what?
01:35:01.000 Like female conservatives.
01:35:03.000 Uh, yeah.
01:35:04.000 Been in the industry for a very long time.
01:35:05.000 Yeah.
01:35:06.000 Politically active much longer than I've been around.
01:35:08.000 So, uh, I think people could probably guess.
01:35:11.000 Conservative!
01:35:12.000 Conservative.
01:35:12.000 Stop saying Tulsi Gabbard.
01:35:14.000 It's not Marjorie Taylor Greene.
01:35:15.000 She's been on the show before.
01:35:16.000 It's also not Steve Bannon.
01:35:18.000 People are saying Lauren Southern.
01:35:19.000 Lauren's been on the show several times.
01:35:20.000 Someone who's never been on the show before.
01:35:22.000 Very prominent conservative commentator.
01:35:24.000 That whole thing where her parents got banned from Airbnb was so nuts.
01:35:27.000 Dude!
01:35:28.000 Yeah, right?
01:35:29.000 What?
01:35:29.000 And now they don't use it anymore.
01:35:31.000 Nobody's guessing.
01:35:32.000 I think it's kind of funny.
01:35:33.000 They're saying Marjorie Taylor Greene.
01:35:34.000 It's going to be a good show.
01:35:34.000 It's going to be one of those where you're like, oh, I didn't even think of that.
01:35:37.000 Yeah, it's going to be when you think about it, you're going to be like, how did I not get that?
01:35:40.000 God, come on.
01:35:42.000 Roseanne Barr, no, but hopefully... Actually, no, we are talking with Roseanne.
01:35:46.000 We just need to figure it out, you know. Roseanne lives in New York.
01:35:49.000 People are saying Megyn Kelly.
01:35:52.000 I will say... You're talking to Roseanne?
01:35:55.000 Yeah.
01:35:56.000 That's awesome!
01:35:57.000 Yeah, Roseanne's awesome.
01:35:59.000 I've been a big fan of hers since I was a kid and I watched that show.
01:36:03.000 Yeah, we're trying to figure out when we can get her to come on, when she's available, when we have time, etc.
01:36:08.000 That was one of the only shows that I ever watched with my family.
01:36:12.000 And there was this one episode where Becky did something wrong.
01:36:15.000 And somehow, because Becky did something wrong, I got in trouble.
01:36:19.000 And I was like, what is going on?
01:36:20.000 I'm not even Becky.
01:36:21.000 I'm more like Darlene, first of all.
01:36:23.000 Second of all, I'm not on the TV.
01:36:26.000 I'm here in the house.
01:36:27.000 I'm watching TV with you.
01:36:28.000 How could I be in trouble?
01:36:29.000 We're going to go to Super Chats, but I just want to point out, several people in the chat have gotten it correct already.
01:36:34.000 So you've guessed it correctly.
01:36:35.000 I'm not telling you who or what, but let's read Super Chats.
01:36:38.000 All right, Waffle Sensei says, Welcome to the show, Sarah.
01:36:40.000 Thank you for having a spine and speaking truth.
01:36:42.000 Your voice can be one of the strongest in the movement to save our kids.
01:36:45.000 We are lucky to have you.
01:36:46.000 Thank you.
01:36:47.000 Well, there you go.
01:36:49.000 All right.
01:36:50.000 Bullseye Ben.
01:36:51.000 Oh, is that a gold gem-encrusted beanie from Bullseye Ben?
01:36:54.000 Yes, it is.
01:36:55.000 This one is for you, Ian.
01:36:56.000 You did awesome in that Cast Castle video.
01:36:58.000 My co-workers thought something was wrong.
01:37:00.000 I was laughing so hard.
01:37:01.000 Oh, I was laughing when I watched it.
01:37:03.000 It was so good.
01:37:04.000 It was so funny.
01:37:05.000 That's great.
01:37:06.000 The show is called Rian with Ian.
01:37:08.000 That was so fun.
01:37:10.000 When I was shooting it with Wesley, Wesley Roth was directing, and Aaron was there too.
01:37:14.000 And at one point, there were so many lines, it was just one of these scenes where he wrote a lot, and he was like, you can just kind of say what you want, you know, this is the idea, but here's some things I want you to hit.
01:37:22.000 And I was like, getting so frustrated.
01:37:24.000 I like slammed the table and I was like, wow, I hope that's on, I hope that's on camera.
01:37:28.000 Cause it was like a Bill O'Reilly moment where I was like, just really, I was getting into character, like feeling Steven's frustration of what he's been going through with the contract, you know, like watch it.
01:37:37.000 Cast Castle.
01:37:37.000 It's on YouTube and on timcast.com.
01:37:40.000 Some, I don't understand, you know, I guess this will kind of give it away, but everybody was guessing names, and like two people got it right, and we were like, yeah, we think a couple of you got it right, and then all of a sudden, everyone just guessed the right answer at the same time.
01:37:52.000 All right, let's read more, let's read more.
01:37:55.000 Yes, and also, I just want to shout out, you may notice that there are now little beanie emojis for those who are members on the YouTube channel to chat, and they're beanie badges, and there's different colors depending on what level you are.
01:38:09.000 So like the highest level, I think, is an American flag.
01:38:11.000 Yeah, it's an American flag beanie.
01:38:13.000 But that's like three years, right?
01:38:15.000 Yeah, you gotta be there for a while.
01:38:16.000 I don't think anyone's done that.
01:38:16.000 So it's all about how long you've consecutively been a member?
01:38:19.000 Correct.
01:38:19.000 Yeah.
01:38:20.000 And then what we're gonna do is we're gonna make another tier of pure silliness, which will give you a golden rooster badge and a bunch of different chicken emojis.
01:38:33.000 And just because, I don't know.
01:38:34.000 Yo, we got one in the chat.
01:38:36.000 We've got one, what, a chicken?
01:38:37.000 There's a little America beanie in the chat.
01:38:39.000 Oh, really?
01:38:39.000 Oh, look at this!
01:38:40.000 Yeah, there you go.
01:38:41.000 There's an America beanie.
01:38:42.000 S, you've got an American flag beanie.
01:38:44.000 That means you've been a member for, what, 36 months?
01:38:46.000 I think it's more than that.
01:38:47.000 36 months.
01:38:48.000 Yeah.
01:38:48.000 48 months we haven't gone yet.
01:38:49.000 Yeah.
01:38:50.000 Wow.
01:38:51.000 Long time member.
01:38:52.000 Yeah, shouts out to S. Yeah, Smith with the red pills.
01:38:55.000 What's up, dude?
01:38:56.000 Yeah.
01:38:56.000 People posting red pills.
01:38:58.000 Yeah, we got emojis.
01:38:59.000 Saying the names of people.
01:39:00.000 Yeah.
01:39:01.000 All right.
01:39:02.000 S.A.
01:39:02.000 Federale says, on AI, I was a young pothead building half pipes.
01:39:06.000 I DIR through Windows 3.1 found Dr. Watson.
01:39:10.000 My friends would make it cuss.
01:39:12.000 I told it to calculate pi and it went on forever.
01:39:14.000 Supposedly not IBM's Watson.
01:39:16.000 Gates gave city names to OS code names.
01:39:19.000 Could Sydney be Australia, the next OS?
01:39:22.000 Have you guys ever, you ever hear of Dr. Sbaitso?
01:39:25.000 No.
01:39:25.000 You want to look that up?
01:39:26.000 Yeah.
01:39:27.000 It was just like, I don't even know what it was.
01:39:29.000 I just remember that you could, you could type in, say a word and it would.
01:39:32.000 Oh yeah.
01:39:33.000 Dude.
01:39:34.000 This is like one of the original speak.
01:39:36.000 Yeah.
01:39:36.000 Creative labs.
01:39:37.000 Yeah.
01:39:38.000 I had Dr. Sbaitso.
01:39:39.000 So yeah.
01:39:39.000 And you'd, you'd be like, you know, say something and then it would, the robotic voice would say it.
01:39:45.000 1991 MS-DOS.
01:39:46.000 Uh, and you could literally, it came up with, the name is an acronym: Sound Blaster Acting Intelligent Text to Speech Operator.
01:39:53.000 It was a sound blaster thing.
01:39:55.000 Sound Blaster.
01:39:56.000 The artificial intelligence was created by Creative Labs.
01:39:57.000 Well, yeah, that.
01:39:58.000 S is a member, just S, and with an American flag beanie, the ultimate top-tier beanie.
01:40:03.000 Eric Ailman was in there, too.
01:40:05.000 Oh, yeah?
01:40:06.000 Yeah.
01:40:07.000 And then we're going to make the chicken-tier memberships, which will be needlessly more
01:40:11.000 expensive because if you want it, you can have it, but it's like a choice.
01:40:15.000 And then we'll put a whole bunch of chicken emojis, and your badge will be a golden rooster.
01:40:20.000 Yep.
01:40:21.000 It's going to be cool.
01:40:22.000 I love the American flag beanie.
01:40:24.000 Yeah.
01:40:25.000 Yeah, that's cool.
01:40:25.000 JT Fire says, I did not know that I needed Biden and Trump playing Overwatch.
01:40:29.000 Bidenator forever.
01:40:30.000 Yeah.
01:40:31.000 I found it funny because I literally play Overwatch before the show.
01:40:35.000 And I'm wondering if I get endorsed.
01:40:37.000 So I don't know if you guys ever play Overwatch.
01:40:39.000 You guys ever play it?
01:40:40.000 No?
01:40:40.000 Oh, yeah.
01:40:40.000 The first one.
01:40:41.000 I haven't touched the second one yet.
01:40:42.000 I'm playing Overwatch 2, and I know that I'm really good, because after every match, whether I win or lose, everyone endorses me, and my username is Timcast, so that must mean I'm really good at the game.
01:40:53.000 Like, because everyone's, they click the button saying they like you.
01:40:56.000 No, I think people are probably just like, they know who I am, and they're like, oh yeah, shoutout or something.
01:41:00.000 But, you know, I only like playing No Limits, I don't like playing Ranked or any of that stuff.
01:41:05.000 Do you ever go on voice chat and be like, you know who I am?
01:41:06.000 No.
01:41:09.000 Hey, what character are you using right now?
01:41:10.000 What's your main?
01:41:11.000 Oh man, I don't know, probably Symmetra.
01:41:14.000 I like her, but with Moira, I'm, like, undefeatable.
01:41:18.000 Like just, oh I got like 15 player killstreaks.
01:41:20.000 I think Moira's just like an easy character to play.
01:41:22.000 Yeah.
01:41:23.000 Moira, but Symmetra's the most fun because if you know how to place the sentry turrets in clever ways, it's just like, you're playing these casual games and these people don't understand because they're looking for, I don't know.
01:41:36.000 Movement, they're looking for movement.
01:41:38.000 No, no, the sentry turrets you place and then they try and blow them up because they're shooting at them.
01:41:41.000 But I put them on like lampposts or you put them in crevices.
01:41:44.000 You put them in weird places where they're hard to see and hard to shoot at.
01:41:47.000 But my favorite is playing No Limits when everyone plays Symmetra.
01:41:50.000 And then we just line the enemy's door with sentry turrets and as soon as they walk out they instantly die.
01:41:54.000 Sounds like NATO.
01:41:55.000 Alright, enough Overwatch talk.
01:41:58.000 Anyway, that video made me laugh a lot because I've been playing Overwatch a lot.
01:42:02.000 Alright, what do we got?
01:42:05.000 Because Reason says deepfakes won't exist in five years.
01:42:08.000 That's how fast this is moving.
01:42:10.000 In five years you will make your own porn, whatever that happens to be.
01:42:13.000 Yes, but what if, if you can, you can make a video of Joe Biden declaring war and it'll be indistinguishable?
01:42:19.000 You could make a video of Joe Biden declaring war while doing porn.
01:42:24.000 Yeah.
01:42:25.000 Oh, yeah.
01:42:26.000 Can I just say, to everybody who plays Overwatch, I just want to, while I have the opportunity, with so many people who listen to this show, please fight on the point.
01:42:35.000 Can I just, do you guys understand?
01:42:36.000 Oh, always stand on the point.
01:42:36.000 Sorry, I'm moving.
01:42:38.000 I'm just, I'm so frustrated.
01:42:39.000 I know I'm just playing casual, I like playing no limits, but it's like, I'm the only one on the robot, I'm the only one on the point, and they're chasing after the enemy, getting broken apart, and then one Lucio jumps on and captures it, and now we gotta wait another five minutes, and I'm like, my guys.
01:42:54.000 I understand if you're trying to keep them off the point initially, but once they break your line, you've got to stay on the robot.
01:42:59.000 Yeah, man.
01:43:00.000 If you have a chance to take the core, take the core.
01:43:02.000 Don't go get a mercenary camp.
01:43:04.000 I'm talking about Heroes of the Storm right now.
01:43:05.000 Do not, do not look a gift horse in the mouth.
01:43:07.000 If you have an opportunity to take it home, take it home.
01:43:09.000 Yo, I had a game where we were like 0.03 meters from pushing the payload, and then everyone's just fighting off point, and I'm like, you realize the moment they go off, we win.
01:43:19.000 Just get on the payload.
01:43:19.000 Some people play for fun, other people play to win.
01:43:22.000 So the people that play for fun want combat, they want player versus player, the action, they don't care about the... I know, but I like a little mix.
01:43:28.000 I'm not playing just for like, if I was gonna play just to win, it'd be ranked.
01:43:31.000 I like having fun and playing different characters.
01:43:33.000 But I mean, you're still trying to win to a certain degree, come on.
01:43:36.000 All right, all right, anyway, anyway.
01:43:38.000 Enough Overwatch talk.
01:43:40.000 H22 says Biden needs to use AI chat to do his speeches and pre-recorded videos.
01:43:45.000 I'd vote for him just for that.
01:43:47.000 I mean, that's a good point.
01:43:49.000 Why doesn't he just deepfake himself?
01:43:51.000 Yeah, there you go.
01:43:52.000 Yeah, somebody should do that.
01:43:55.000 Wajian says deepfakes could be a win for legacy media.
01:43:58.000 It could be the go-to excuse to dismiss media online.
01:44:01.000 True.
01:44:02.000 Didn't think about that.
01:44:03.000 Uh, no, because CNN posts fake news all the time.
01:44:05.000 They're gonna be like, trust us, we're real.
01:44:08.000 No, you're not.
01:44:08.000 You're just running the fake news on TV.
01:44:11.000 Well, and certainly newspapers have posted fake news all the time.
01:44:14.000 We just talked about Jayson Blair.
01:44:15.000 Mm-hmm.
01:44:16.000 Ben Hickson says, Tim, Ian, are you looking forward to Atomic Heart?
01:44:19.000 It is a future-era Soviet Bioshock-like game.
01:44:22.000 Have you played Prey 2016?
01:44:24.000 It has the trolley problem.
01:44:26.000 Oh, that sounds fun.
01:44:26.000 I haven't played Prey.
01:44:27.000 That sounds interesting.
01:44:28.000 I gotta say, the first Bioshock is a masterpiece.
01:44:32.000 The subsequent Bioshocks are kinda meh.
01:44:35.000 Bioshock Infinite I think is okay, but Bioshock 1, the video game, is...
01:44:40.000 Masterpiece.
01:44:40.000 You know, my type of games is like Divinity 2.
01:44:43.000 I love isometric role-playing games, so I'm really looking forward to Baldur's Gate 3 release, and I think we're going to do a live stream of that.
01:44:50.000 Myself, perhaps, pixelated Apollo.
01:44:52.000 He's into it.
01:44:54.000 And maybe Tim.
01:44:55.000 I don't know if you're into RPGs or if that's something you want to do.
01:44:57.000 I used to play more RPGs when I was younger.
01:44:59.000 I was playing Breath of Fire for a while because I bought a Super Nintendo.
01:45:02.000 Oh yeah, Breath of Fire was good.
01:45:04.000 I think I played Breath of Fire 4.
01:45:05.000 I beat Mario RPG a couple months ago.
01:45:08.000 It's remarkable playing Mario RPG as an old man now, because everything I do is timed perfectly.
01:45:14.000 When I was a kid, it was like I'm playing Mario, and I'm trying to time it, and I'm messing up.
01:45:17.000 Now I'm old, and I'm playing this game for the first time in 20 years, and everything is super easy, and I'm just like, wow, this game's a lot easier than I realized.
01:45:24.000 Punch-out.
01:45:24.000 You guys play Nintendo Punch-out?
01:45:26.000 Mike Tyson's Punch-out?
01:45:27.000 It was so hard when I was nine, but now... Now it's not hard.
01:45:30.000 Yeah, just read the cues.
01:45:32.000 I haven't played video games in a while, but I did buy the new Harry Potter game.
01:45:35.000 How is it?
01:45:36.000 I haven't played it.
01:45:37.000 Because again, I haven't played video games in a while, so I still just have an Xbox One, and so apparently I can't play it until April.
01:45:42.000 Is the JK Rowling thing making it hard for you to play the game?
01:45:45.000 No.
01:45:45.000 It's the fact that I have an old system and can't play it.
01:45:48.000 So you're buying a new system in April?
01:45:50.000 No.
01:45:51.000 It's actually a pretty good game.
01:45:51.000 No, no, no.
01:45:52.000 So they released it.
01:45:53.000 It's weird.
01:45:53.000 So, like, with all the new consoles, they released it, like, last week.
01:45:57.000 With the older systems, like Xbox One and whatever PlayStation it was, they're not releasing it until, like, April.
01:46:06.000 Oh, weird.
01:46:07.000 April 4th or something like that.
01:46:08.000 Yeah.
01:46:08.000 And so I think that because they want you to get the new system to play, you know, early releases and stuff.
01:46:13.000 Yeah.
01:46:15.000 Alright.
01:46:16.000 Anthony says, Tim, your earlier stories of Nikki, Taylor, and Chelsea are all examples of why women deserve less.
01:46:22.000 Why Women Deserve Less by Myron Gaines, now available for purchase, audiobook coming soon.
01:46:27.000 Is that Fresh and Fit Dude or what?
01:46:29.000 What's that all about Nikki Haley?
01:46:31.000 Yeah, I don't know what he's...
01:46:32.000 Well, I did segments on Don Lemon saying she's past her prime.
01:46:36.000 Chelsea Handler, I love this one.
01:46:38.000 Chelsea Handler responded, in a sense, to Matt Walsh, Tucker Carlson, Ben Shapiro, me, and Jesse Kelly, because we were all, to a certain degree, critical of her video on being childless.
01:46:49.000 Oh, her video about masturbating and getting high?
01:46:52.000 And doing drugs.
01:46:53.000 But here's the thing.
01:46:54.000 She included a picture of me in the receding hairline club, and I'm like, okay, well, I guess thank you, Chelsea, for including me in this, because she didn't actually criticize anything I said about her.
01:47:03.000 Like, Ben Shapiro called her miserable, Jesse Kelly very hilariously mocked her, and Matt Walsh said something similar, but my point that I made, I didn't say she was miserable or anything like that. I said, people who don't have kids are going to find themselves on their deathbed in a sterile hospital room, and the doctor's gonna walk in and say, is there anyone we should call?
01:47:21.000 And you'll say, no.
01:47:23.000 And he'll say, okay, well, we're around.
01:47:25.000 Call us if you need us.
01:47:27.000 And then you're gonna be sitting in this room as you lay dying, scared, with no one there to be there for you or to comfort you.
01:47:34.000 I didn't say she was miserable.
01:47:35.000 I think she's probably happy as a pig in, well, we've already been swearing in this show, right?
01:47:39.000 She's happy as a pig in shit as she wakes up, does drugs, and masturbates.
01:47:42.000 That sounds like she's having a blast.
01:47:44.000 I mean, maybe she's a good auntie or something.
01:47:45.000 I don't know.
01:47:46.000 Good aunties will have people around them.
01:47:47.000 I heard that argument that you were saying about having kids to have people around you when you're old.
01:47:50.000 I don't know if I like that as an argument for why, because if you just get a bunch of women pregnant and you're gone, and you never see your kids, they're not going to come.
01:47:58.000 No one cares about you.
01:48:00.000 They will come.
01:48:01.000 If you're an amazing human being with no kids, you might have people all around you near the end of your life that just support you and are reminding you that what you did on earth was valuable.
01:48:12.000 Well, you'd have to make friends with people who are younger than you, because by the time you're old, all your friends are dead.
01:48:17.000 Right, like fans of your work and things like that.
01:48:19.000 Or like just people you mentor or something, or like, you know, if you have nieces or nephews or foster families, people that you foster, or like, you know, this is in the arts, there were always young people who were being, you know, friends with older artists and stuff.
01:48:38.000 I don't know.
01:48:38.000 I think that's a good point, though.
01:48:39.000 If you're, like, the cool aunt, then maybe... The cool auntie is a thing.
01:48:43.000 That's a thing.
01:48:44.000 The savvy auntie.
01:48:46.000 The other thing I wonder, too, is, like, maybe she can't have kids, and she doesn't want to come out and say that she's barren, and so she tries to find ways to justify a positive feeling around it, and if that is the case, it's really brutal to mock her and call her miserable.
01:49:02.000 Like, maybe she's trying to make the best of a really bad situation and she really is, deep down inside, sad that it never happened for her or she can't.
01:49:09.000 So she's just like, well, I can do drugs and masturbate and everyone's just ragging on her and mocking her.
01:49:13.000 I'm like, you know, it's kind of brutal, you know?
01:49:15.000 Well, did she even write that bit?
01:49:18.000 Probably not.
01:49:19.000 She probably was like, OK, I'll do that bit.
01:49:21.000 The bit was bad because it was like, does she not have a job?
01:49:26.000 You know, it's like it's not kids that make it so you can't wake up and do drugs and masturbate.
01:49:31.000 It's like responsibility in general.
01:49:33.000 So she clearly has none.
01:49:33.000 Yeah.
01:49:35.000 And that sounds really depressing.
01:49:36.000 Just make funny jokes, basically.
01:49:38.000 I don't know.
01:49:38.000 She doesn't.
01:49:39.000 She's not.
01:49:42.000 You know, did you guys know Sarah Silverman's hosting The Daily Show?
01:49:48.000 Sarah Silverman is not funny.
01:49:49.000 Her bit has always been just to offend you.
01:49:51.000 But then it got funny because once offending people became taboo, she didn't know what to do.
01:49:56.000 And now she's for national divorce.
01:49:58.000 Oh yeah, I think we talked about that.
01:49:58.000 She is?
01:50:00.000 She made that video like last year, yeah.
01:50:02.000 And then we liked her.
01:50:02.000 We were like, oh, okay.
01:50:04.000 Well, you know, we can agree on that.
01:50:06.000 She would live in America 2, I think is what she said.
01:50:09.000 America 2.
01:50:10.000 We could have America 1 because it's important to us.
01:50:12.000 Oh, look at this.
01:50:14.000 Angela McArdle.
01:50:15.000 Is that a silver beanie with blue gems encrusted on it?
01:50:18.000 That's a longtime membership.
01:50:19.000 Sarah's work is valuable to keep the conservative side of the culture war from overcorrecting.
01:50:24.000 Thank you and keep fighting the good fight.
01:50:25.000 See you at the anti-war rally Sunday.
01:50:28.000 Yes, thank you, Angela.
01:50:30.000 She's the chair of the LP, and we are doing the anti-war rally, the Rage Against the War Machine, on Sunday, 1230, starting at the Lincoln Memorial, and it's going to move to the White House.
01:50:43.000 I have a speaking spot at the White House, so yeah, hope to see everybody there.
01:50:48.000 All right, Thomas Sidebottom says, Bing is connected to a live internet with ChatGPT.
01:50:53.000 The training set is closed.
01:50:54.000 Bing is doing the same thing, but with real human response training data.
01:50:58.000 Interesting.
01:50:59.000 Yeah.
01:51:00.000 Crazy.
01:51:01.000 The Bing stuff looks fun, man.
01:51:04.000 I signed up for the early wait list.
01:51:06.000 I'm hoping I can get access to it.
01:51:08.000 RBK says, I called Chase Bank with a question, and the operator was about to ask questions to identify me, and the guy said, oh, never mind.
01:51:14.000 You've been voice authenticated.
01:51:16.000 Scary stuff.
01:51:18.000 Yeah, cause that means somebody could just deepfake your voice and then call in, hi, I'm John.
01:51:23.000 It's like, okay, you're good.
01:51:24.000 What do you wanna do with your money?
01:51:26.000 Give it to Bill.
01:51:27.000 Done.
01:51:28.000 Oh.
01:51:30.000 All right, what do we got here?
01:51:31.000 What do we got here?
01:51:32.000 John White says, last week you did a piece on George Kelly, the AZ rancher, being held on $1 million bail.
01:51:37.000 His family set up a GiveSendGo campaign titled George Allen Kelly Legal Defense Fund.
01:51:42.000 Help get him home.
01:51:44.000 Much more to his story, so keep covering.
01:51:46.000 We should definitely figure that one out.
01:51:48.000 Maybe even send someone down there to figure out what happened.
01:51:50.000 This is the guy that they accused of shooting an illegal immigrant who had multiple felonies or something like that.
01:51:55.000 He, like, kept crossing the border.
01:51:57.000 And people had reported multiple gunshots earlier in the day.
01:52:00.000 So they arrest this 73-year-old guy and they're holding him on a million dollars bail, which, like, makes no sense because where's this guy going to go?
01:52:05.000 I guess because he lives on the border.
01:52:07.000 They're, like, I don't know, 10 feet over the border.
01:52:08.000 It's like, well, the federales will bring him back.
01:52:11.000 What are you talking about?
01:52:12.000 His house is right there.
01:52:14.000 All right.
01:52:15.000 All right.
01:52:16.000 Lightning Fire says, do you think AI will be used by terrorists?
01:52:20.000 Where anything can be possible, will everything be off limits?
01:52:23.000 Would gun control even work with AI when anyone can build one with a 3D printing machine?
01:52:27.000 Imagine how crazy it's going to be in the future when you just go to the AI, and 3D printing advances well beyond just plastics and PLA or whatever, ABS.
01:52:37.000 And it can mill metal and mold metal.
01:52:39.000 And you're like, I'd like an AR-15 mil spec, please.
01:52:42.000 556.
01:52:42.000 And it's like, okay.
01:52:43.000 And then it just starts making all the parts and puts it all together for you.
01:52:47.000 Perfectly form fit to your hand.
01:52:49.000 Hey, for that matter, in Star Trek, how come they never did that?
01:52:51.000 You know?
01:52:53.000 Star Trek was trying to look at us as our best selves.
01:52:57.000 But like, they could make a phaser if they needed one.
01:53:00.000 Could they?
01:53:00.000 They couldn't replicate phasers.
01:53:02.000 Why not?
01:53:03.000 I think it was the replicator didn't do it.
01:53:05.000 I'm pretty sure the replicator- Did the replicator do it?
01:53:07.000 I thought the replicator didn't do it.
01:53:08.000 If it can make food, I think food is more complicated.
01:53:11.000 Yeah, but that's already pretty incredulous.
01:53:13.000 Like, that's ridiculous.
01:53:15.000 Yeah, I guess.
01:53:15.000 That was definitely the fantasy part of the- Replicating food.
01:53:18.000 I think you could do that.
01:53:18.000 I guess the idea of Star Trek was that the nacelles would absorb free hydrogen and then use that matter in the replicators and convert it into denser materials.
01:53:27.000 Yeah, fusion in the matter replicator.
01:53:29.000 Yeah.
01:53:30.000 And bonding, do chemical bonding.
01:53:31.000 And you'd be like, I'd like a cheeseburger, and it would make one.
01:53:34.000 But what do you do with the cup?
01:53:35.000 Like, you know, Picard would be like, tea, Earl Grey, hot!
01:53:38.000 And then it makes the glass, and then he takes the glass out.
01:53:40.000 You didn't see he just throws it in the trash can.
01:53:42.000 He throws it out the window or something?
01:53:43.000 Yeah, what happens to that?
01:53:44.000 I think you put it back in the replicator, yeah.
01:53:45.000 You must put it back in the replicator.
01:53:46.000 Break it back down into atoms.
01:53:47.000 It's like matter reclamation.
01:53:49.000 The crazy thing is, technically, based on the lore of Star Trek, you could replicate
01:53:52.000 people.
01:53:53.000 Really?
01:53:54.000 Well, that's what the transporters do.
01:53:56.000 Like the episode where Riker got split into two people.
01:53:58.000 Oh, yeah, yeah, that's right.
01:53:59.000 That's right.
01:54:00.000 Oh, confirmed from Wikipedia, yes, they can recycle.
01:54:01.000 The replicators do recycle things.
01:54:03.000 Can they make phasers?
01:54:04.000 I don't know.
01:54:05.000 That's probably a big article.
01:54:06.000 They can make bat'leths, for sure.
01:54:08.000 What about a gun?
01:54:09.000 Oh, they did?
01:54:09.000 I think they did.
01:54:10.000 Well, they can make a spoon or a knife.
01:54:12.000 But imagine, like, that'd be a good, a funny parody where it's like, replicate me a Glock 17.
01:54:16.000 I want a Tommy gun.
01:54:17.000 Fully loaded.
01:54:18.000 Right.
01:54:18.000 Well, the holodeck.
01:54:19.000 They did that in the holodeck.
01:54:20.000 Yeah.
01:54:20.000 Yeah, of course.
01:54:21.000 And then they get shot.
01:54:22.000 That's so dumb.
01:54:23.000 And then when you take the safeties off.
01:54:25.000 Yeah, the safeties off.
01:54:26.000 And then you have a whole episode of being scared of the holodeck.
01:54:28.000 We're trapped in the holodeck and they're gonna shoot us.
01:54:30.000 Oh, no!
01:54:32.000 That show was fun.
01:54:32.000 Oh, I love it.
01:54:34.000 All right, where are we at?
01:54:36.000 It's my, like, comfort show.
01:54:37.000 Special shout-out to LeVar Burton and Reading Rainbow.
01:54:40.000 Buddy B says, I've been tentative about membership due to funds, but would 100% throw money at Ian for morning meditation.
01:54:48.000 Factory work these days calls for zen.
01:54:51.000 You know, my first thought is, Buddy, give me money.
01:54:55.000 But if you do have to wake up at 8 a.m., meditating is one of the best things you can do if you wanna clear your mind and be refreshed for the day.
01:55:00.000 Stretching, first thing I do when I wake up is I stretch.
01:55:02.000 What we'll do is we'll record at midnight, because it's technically the morning, and then it'll be uploaded at 8 a.m.
01:55:08.000 for everybody.
01:55:09.000 But it'd be nicer if it was live.
01:55:10.000 Ian's midnight meditation.
01:55:12.000 Yeah, live stream, because I think the power of multiple humans meditating together, regardless of where we are, the entanglement.
01:55:17.000 Instead of just watching something that previously happened.
01:55:19.000 Waffle says, Tim, you're playing Overwatch with 12-year-olds, bro.
01:55:22.000 It's not gonna get better, I'm sorry.
01:55:24.000 I know.
01:55:25.000 I know that's the case.
01:55:27.000 So... Do they dance over you when you die?
01:55:29.000 You can really tell when you're playing against a team.
01:55:31.000 Well, that's the funny thing.
01:55:32.000 It's like, I've played where... I mean, dude, it's such a fast-paced game.
01:55:38.000 If you start teabagging, you're done.
01:55:41.000 Like, I'm gonna walk up, and then you start doing a stupid thing, and then you're instantly wiped out.
01:55:45.000 Is it, does the reticule, is that how you pronounce that word, reticule?
01:55:48.000 That's what I call it.
01:55:49.000 Does it bounce or is it always straight in the middle, no matter how fast you're moving, no matter if you jump, it's always directly in the middle of the screen?
01:55:54.000 Or does it like bounce and wave as you're running?
01:55:56.000 It doesn't bounce.
01:55:57.000 That's why I never got into Overwatch, because I felt like I was playing Borderlands 2, and I liked the gunplay of Borderlands 2.
01:56:02.000 Felt a lot more realistic and challenging, like if you're running, you don't have like perfect aim when you're running.
01:56:08.000 Just felt kind of robotic.
01:56:09.000 Overwatch.
01:56:13.000 BrettAintDead in the member chat says, Tim, what do you play Overwatch on? Either way, add me. Same name on here. We'll crush 86 babies.
01:56:20.000 I play on PS5 and my username is Timcast, and I imagine the next time I turn my PlayStation on, it's going to go bling, bling, bling, bling, friend request, friend request, friend request.
01:56:29.000 And it's fun playing with a real team, and you know you're playing with other 30-year-old men because everyone groups up before running in.
01:56:36.000 You can tell you're playing with little kids because everyone dies sporadically and then run out one at a time.
01:56:41.000 Also, their whole team is there and you just get five-shot, like, you know, five versus one, five versus one, and I'm just like, please just wait 20 seconds for the group to come together before rushing in.
01:56:51.000 Wait for five, wait for five, please.
01:56:53.000 It's like the Leroy Jenkins.
01:56:54.000 Right, that's exactly what it is.
01:56:55.000 I'll play at night, and like, you can tell people are getting drunker and higher the later the night goes on, because like by 3 a.m. no one's even talking on chat.
01:57:05.000 Ant345 says, Bungie is woke as hell, but I love Destiny.
01:57:07.000 Hunter class for life.
01:57:08.000 Yeah, I stopped playing Destiny a while ago.
01:57:10.000 I played Destiny since the beginning of it, and then I can't remember the last one I played.
01:57:17.000 Destiny 2 was one of the last ones I played.
01:57:19.000 Well, there's a bunch of expansions.
01:57:20.000 So the last one I played, I think, was when the darkness was released and you were able to wield the darkness or whatever.
01:57:26.000 I'm not familiar with Destiny.
01:57:28.000 Yeah, Destiny's a fun game.
01:57:29.000 Hey Ian, so when do you wake up?
01:57:32.000 It depends on the day.
01:57:32.000 I woke up at 1 today.
01:57:35.000 I woke up at 9 a.m. yesterday for a 10 a.m. meeting.
01:57:39.000 I try to take late meetings.
01:57:40.000 I usually start my day around 2.
01:57:42.000 I work 2 to 11 basically.
01:57:43.000 So I get up around 1, go to bed about 3 a.m. usually.
01:57:46.000 That's fair.
01:57:47.000 Nelson Nelville says, Cast, has anyone ever recognized you while gaming?
01:57:51.000 Ian, has anybody recognized you while gaming or anything?
01:57:54.000 Uh, not unless they know I'm gaming ahead of time.
01:57:57.000 And I always feel, it feels weird, like I like the anonymity.
01:58:00.000 That was a big problem of why I kind of dipped out on social media for a decade is because, like, I was going to chat rooms and we were all having genuine conversation.
01:58:06.000 Then I started to get well known.
01:58:07.000 And when I'd go into the chat room, the conversation would stop and people would be like, Ian's here!
01:58:11.000 Ian, Ian!
01:58:11.000 And I'm like, no, I just want to have a conversation.
01:58:14.000 So no, the short answer is no.
01:58:16.000 For me, my PlayStation username has always been Timcast, well before I had any substantial amount of followers.
01:58:23.000 So I just log into the same thing every time.
01:58:26.000 But my joke earlier was, I get endorsed every time I play, and I don't think it's because I'm good at the game.
01:58:32.000 I think people are just like, oh, hey, Timcast, you know, and they click endorse.
01:58:35.000 So it's like, you've been endorsed.
01:58:36.000 And I'm just like, yeah, I'm pretty sure I was not doing well.
01:58:39.000 That's the sad thing about being famous is like, do those people really like me?
01:58:42.000 Do they really like me?
01:58:43.000 Or are they just...
01:58:44.000 I hate that.
01:58:45.000 Well, I'm playing casual.
01:58:46.000 I'm not playing ranked.
01:58:46.000 If I was playing ranked, they might be like, hey, it's cool you're here, dude, but you're costing us the game.
01:58:50.000 You know what I mean?
01:58:50.000 True.
01:58:51.000 So playing the casual stuff, like No Limits, where you just get like six Symmetras to all run in at the same time and pepper everything with turrets, is just the most hilarious thing ever.
01:59:00.000 And then the one thing I can't stand is when everyone thinks it's funny to play Mercy.
01:59:04.000 And I'm like, what do you guys think?
01:59:05.000 Like six healers on one person is going to stop you from losing?
01:59:08.000 I don't know.
01:59:09.000 Whatever.
01:59:09.000 It's fun, though.
01:59:10.000 It's all good fun.
01:59:12.000 All right.
01:59:13.000 Mr. Juzno says, isn't drinking distilled water bad for you?
01:59:16.000 I remember hearing that it lacks the minerals, thereby diluting the mineral levels in your cells.
01:59:20.000 That is correct.
01:59:21.000 At least that's what I understand, right?
01:59:22.000 Yeah, you're supposed to reintroduce minerals to distilled water.
01:59:25.000 And that's why our filtration system has added minerals.
01:59:28.000 Oh yeah, and in addition, to filter vinyl chloride out of your water, I heard that you could combine activated charcoal and reverse osmosis, those two things.
01:59:36.000 Oh, here we go.
01:59:38.000 Jason Dixon's got a gold beanie with red jewels encrusted on it, and he says, two years, seven months, 23 days, get on my level.
01:59:47.000 Just get on his level.
01:59:48.000 Yeah.
01:59:49.000 Yeah, I don't know.
01:59:51.000 Maybe there's other stuff we can add to make YouTube memberships mean something more.
01:59:56.000 I mean, the issue, I suppose, is there is a chat now, and we're getting massive feedback from people being like, we can actually have conversations.
02:00:07.000 This is cool.
02:00:08.000 People were using Chicken City.
02:00:09.000 This is the thing people need to understand.
02:00:11.000 They would turn the show on and then open Chicken City and use the Chicken City livestream chat as the chat for this show.
02:00:17.000 These are things that we saw where we're like, okay, this chat clearly isn't working, what can we do?
02:00:21.000 So you can always go hang out in Chicken City livestream chat for free and talk about whatever you want.
02:00:26.000 Or we figured the membership thing is a way to make a clean chat with no limits, so you don't gotta wait five, six seconds or whatever, and you can just chat.
02:00:34.000 I like it.
02:00:35.000 Golden Gaming's saying, free the chat; Almighty Antichrist, nobody cares; dash-Z-dash, love it. Like, you guys are... I can see you now, and oh, Tracer's in the chat.
02:00:43.000 Red beanies equals MAGA beanies.
02:00:45.000 Tracer's got a red beanie.
02:00:47.000 Maybe we should put MAGA on the red beanie.
02:00:49.000 We could do that. Yeah, a little MAGA. There's also the 20-sided die, the 20 on the 20-sided
02:00:54.000 die and the one on the 20-sided die. Yeah. But it's hard to see the number. So that was about
02:00:58.000 rolling 20s. So we'll have to figure that one out. We already got that coming on the update. So.
02:01:01.000 Okay, cool, cool. And then we're gonna, we're gonna create the, uh, the golden cockerel.
02:01:06.000 So smash that like button, subscribe to this channel, share the show with your friends, become a member at TimCast.com.
02:01:12.000 We're gonna have a members-only show coming up for you in about one hour and it's gonna get pretty serious because we got some very serious subjects to talk about that might be too spicy for this family-friendly version of the show.
02:01:22.000 You can follow the show at TimCast IRL.
02:01:24.000 You can follow TimCast News on Facebook where we're gonna be publishing our news articles and you can help by sharing them if you think they're important.
02:01:32.000 And that's TimCastNews on Facebook.
02:01:34.000 And you can follow me personally at TimCast.
02:01:37.000 And my other YouTube channel is YouTube.com slash TimCastNews, for those that aren't familiar, and I've been doing that longer than this one.
02:01:43.000 But I'm now doing six segments per day on that channel, so it's like two hours of content.
02:01:49.000 It's another podcast I have called the Tim Pool Daily Show, which has about half as many viewers as this show, but I don't think there's a strong overlap, so if you want to check it out, you can check it out.
02:01:56.000 Yeah, Sarah, you want to shout anything out?
02:01:58.000 Yeah, so if you want to go find me, just go to my website, which is sarahigdon.com.
02:02:04.000 It has links to all my social media platforms.
02:02:06.000 I'm on just about everything.
02:02:09.000 But, like, my YouTube is youtube.com slash sarahigdon.
02:02:12.000 And then Twitter and Instagram are both just sarahigdon with an underscore after it.
02:02:16.000 And so that's it.
02:02:18.000 And yeah, come on Sunday.
02:02:20.000 And I'll see you guys out there if you guys come out to the anti-war rally Sunday.
02:02:23.000 SomeoneMadeAMiguel is saying, I would do the TimCast after show if it were live.
02:02:29.000 That's a really good point, and we're going to look into that because I think it is possible to do a private members-only live stream on Rumble.
02:02:36.000 We just need to figure out how to do it.
02:02:37.000 Yeah, we can do it.
02:02:38.000 We can definitely do that.
02:02:40.000 I'm pretty sure we can.
02:02:40.000 We've been asked before to do that, I believe.
02:02:42.000 Because we can do the members-only as a live stream on TimCast.com.
02:02:45.000 Yeah, it'd save us some time after.
02:02:46.000 Is there a live chat?
02:02:47.000 Because that'd get wild if people were talking to us.
02:02:50.000 Yeah, probably through Rumble.
02:02:51.000 I imagine so.
02:02:51.000 That would be cool.
02:02:53.000 Let's figure it out, because then we could take questions and stuff from the members as well, and that would be lit.
02:02:56.000 Spicier questions.
02:02:57.000 That would be very fun.
02:02:58.000 Yeah.
02:02:59.000 Anyway, Libby, you want to shout anything out?
02:03:01.000 Sure.
02:03:02.000 I'm at Libby Emmons on Twitter, and you can check out what we're doing at thepostmillennial.com every day.
02:03:07.000 I'm Ian Crossland.
02:03:08.000 Follow me at iancrossland.net.
02:03:09.000 Subscribe to me on YouTube at Ian Crossland, and you can also check out the Cast Castle skit on the Cast Castle YouTube channel and TimCast.com.
02:03:16.000 I just want to give a special shout out to Amish Man, Joseph, RG2 Tracer, Ted Thornton.
02:03:20.000 I know I already mentioned you, Tracer, and Brett Ain't Dead.
02:03:23.000 Tracer is an Overwatch character.
02:03:25.000 You guys in chat, you make this happen.
02:03:27.000 Well, we make this happen and you're here with us making it happen.
02:03:28.000 We're all making it happen together.
02:03:30.000 Happy to be here with you.
02:03:30.000 Thanks for being here.
02:03:31.000 Bye.
02:03:32.000 And I am at Serge.com.
02:03:34.000 Follow me on Twitter.
02:03:35.000 It's at S-E-R-G-E-D-O-T-C-O-M.
02:03:39.000 Not everyone gets it right.
02:03:40.000 They keep telling me about that.
02:03:42.000 Spell it out with an E and I will go argue with you.
02:03:45.000 I do respond to everything.
02:03:46.000 At least I try my best to.
02:03:47.000 Cheers.
02:03:48.000 I want to stress the Cast Castle video, so go to YouTube.com slash Cast Castle for one reason.
02:03:54.000 Ian is arguing with Roberto Jr.
02:03:56.000 on the phone, and he secretly recorded Roberto Jr.'s conversation, and I just, you know, I love chickens, so I find it very funny.
02:04:04.000 I thought it was hysterical.
02:04:05.000 Do you want to play a clip now, or just let him go to Cast Castle?
02:04:08.000 No, let him go check it out.
02:04:09.000 It's so funny.
02:04:09.000 What is that recording device you have?
02:04:12.000 It's awesome.
02:04:12.000 It's Wesley's.
02:04:13.000 What is that thing called, Wesley?
02:04:14.000 I called it a Teddy Ruxpin.
02:04:15.000 I know it's not that.
02:04:20.000 It's not a Talkboy, but that's what, uh, if you saw Steven Crowder's video.
02:04:20.000 It's like a Fisher Price.
02:04:21.000 It's a Fisher Price!
02:04:22.000 That's the word, yeah.
02:04:23.000 With a tape recorder in it.
02:04:24.000 That's the word.
02:04:24.000 Alright, check that out.
02:04:26.000 It's the Teddy Ruxpin era.
02:04:27.000 We will see you all over at TimCast.com.
02:04:30.000 Thanks for hanging out.