Timcast IRL - Tim Pool - October 07, 2022


Timcast IRL - Biden Roasted For HILARIOUS GAFFE, Democrat Confidence In his Brain GONE w-Bill Ottman


Episode Stats

Length

2 hours and 5 minutes

Words per Minute

201.49873

Word Count

25,231

Sentence Count

2,395

Misogynist Sentences

36

Hate Speech Sentences

29


Summary

In this week's episode of Timcast IRL: President Joe Biden makes a bold and powerful statement, and it's hilarious, and then we make fun of him because his brain doesn't work. We're joined by WeAreChange's Luke Rudkowski to talk about the apocalypse and World War III, and a man who is not invisible, but just late.


Transcript

00:00:00.000 President Joe Biden.
00:00:28.000 With a bold and powerful statement, said he had two words, and then he said three, and that's it.
00:00:34.000 And it was hilarious, and it's just, I don't even know how you make such a stupid mistake, but we're gonna play the video, and then we're all gonna laugh, because it's Friday, and everybody needs a good laugh.
00:00:44.000 And because it's Friday as well, it's time to end this week with something a little bit more relaxing.
00:00:48.000 I mean, obviously, we are gonna talk about the apocalypse and World War III, but we're gonna make jokes about it.
00:00:53.000 We're gonna make fun of Joe Biden, because his brain doesn't work.
00:00:57.000 We've also got probably one of the most epic responses in cancel culture history, Matt Walsh.
00:01:04.000 Many leftists have been spreading around these out of context clips, trying to smear him and make accusations against him.
00:01:09.000 So he just issued the most epic response ever, where he was just like, kiss my ass, you should be apologizing to me!
00:01:16.000 And that, I can't even do it justice, it's so good.
00:01:19.000 And so we'll talk about that, plus we do have some pretty crazy stories, really dark stuff.
00:01:23.000 Cori Bush, she's a progressive democrat.
00:01:26.000 An interview came out on PBS where she told the story of where she was the victim of a forced abortion, where she begged them to stop and they wouldn't.
00:01:36.000 And it's just like, dude, this story is like a horror story, man.
00:01:40.000 So we'll talk about that.
00:01:42.000 Before we get started, my friends, head over to TimCast.com and become a member in order to support our work as a member.
00:01:47.000 You'll get access to exclusive segments from the TimCast IRL Uncensored show Monday through Thursday at 11 p.m.
00:01:53.000 Those are all up from this week.
00:01:55.000 We're gonna be chillin' tonight.
00:01:56.000 But you'll also get to watch the Cast Castle vlog, which you're gonna get a kick out of.
00:01:59.000 We've got a fun episode coming up next Tuesday.
00:02:01.000 You're gonna watch the whole Civil War arc.
00:02:03.000 There's like an election, and then battle's coming at 3 a.m., and then people are fighting.
00:02:06.000 You don't wanna miss it.
00:02:07.000 But I'm really excited for what we're filming next week.
00:02:10.000 Which will be, not this coming Tuesday, but the Tuesday after, is probably going to be extremely offensive, and it's going to make, I don't know, I just imagine all the feminists are going to lose their minds at this episode.
00:02:19.000 So it'll be funny.
00:02:20.000 Or they're going to claim they're not really offended, whatever.
00:02:22.000 It'll be fun.
00:02:23.000 So become a member at TimCast.com.
00:02:25.000 You'll also be supporting our journalists and all the work they do.
00:02:28.000 Smash that like button, subscribe to this channel, share the video, be that notification, because people are telling us YouTube's not notifying them anymore.
00:02:35.000 So you guys can be that notification if you want to help push back against the censorship.
00:02:38.000 We're about a month out from the midterm, so it's really important.
00:02:42.000 Joining us tonight is a man who is not, in fact, invisible, but just late, is Bill Ottman.
00:02:48.000 Hey, thanks for having me.
00:02:49.000 You were late.
00:02:50.000 I ran out of gas.
00:02:52.000 Totally spaced.
00:02:52.000 Literally.
00:02:54.000 Man.
00:02:55.000 What do you do?
00:02:55.000 Who are you?
00:02:56.000 My name is Bill.
00:02:57.000 I'm the co-founder of Minds, Minds.com.
00:02:59.000 We're an open-source, decentralized social network.
00:03:01.000 Well, simple enough.
00:03:03.000 Well, Luke's here, too, I guess.
00:03:04.000 Unforgivable.
00:03:05.000 Never do that again, Bill.
00:03:06.000 Thank you.
00:03:07.000 I'm just messing with our guests.
00:03:09.000 My name is Luke Rudkowski of WeAreChange.org.
00:03:11.000 Today, I'm wearing a shirt with my co-host on it, and that, of course, is my dog, Atlas, which reads, no step on dog, which you could exclusively get on thebestpoliticalshirts.com because you do.
00:03:23.000 Thank you so much for having me.
00:03:23.000 I'm here.
00:03:24.000 Luke, I just had to sign his name as Atlas.
00:03:27.000 Her name is Atlas.
00:03:28.000 His and her.
00:03:29.000 So, you know, they'll be friends.
00:03:30.000 Awesome.
00:03:31.000 His son is Atlas.
00:03:32.000 Oh, your son's name's Atlas.
00:03:33.000 That's pretty cool.
00:03:34.000 That's awesome.
00:03:35.000 Cool name.
00:03:36.000 Are you called Addy?
00:03:37.000 That hasn't emerged yet.
00:03:38.000 The nickname hasn't come out.
00:03:40.000 It's not so much a nickname as just the way you say the name.
00:03:42.000 Atlas.
00:03:43.000 Yes.
00:03:44.000 You know, just speak normally until you say his name.
00:03:46.000 Atlas!
00:03:47.000 Can you come here and bring the newspaper?
00:03:48.000 It's becoming more and more of a popular name as, you know, we have a global awakening to the true crimes happening all around the world.
00:03:54.000 Are you going to pile the burden of Earth on his shoulders?
00:03:57.000 It's actually not Earth that he's depicting.
00:04:00.000 That's a myth.
00:04:01.000 It's the celestial bodies.
00:04:03.000 It's the whole universe.
00:04:04.000 Oh, so it's actually worse that it's the weight of the universe on his shoulders.
00:04:08.000 All right.
00:04:08.000 Hey, Ian Crossland, also co-founder of Minds.
00:04:10.000 I don't talk about it enough, probably, on the show.
00:04:12.000 Good to have you here, Bill.
00:04:13.000 Hell yeah.
00:04:13.000 Let's reconnoiter.
00:04:15.000 For sure.
00:04:16.000 Thank you guys all for joining us this evening on this, my last night.
00:04:18.000 I cannot wait to talk about whatever's coming up, whatever dumb thing Biden has said.
00:04:22.000 Let's get into it.
00:04:23.000 All right, everybody.
00:04:24.000 Here's the story from the Daily Mail.
00:04:26.000 Quote, let me start with two words.
00:04:28.000 Made in America!
00:04:29.000 And that was it.
00:04:30.000 And I just started busting out laughing, because, you know, look, we could talk about a lot of stuff.
00:04:34.000 We can talk about Joe Biden destroying the country and getting involved in potentially World War III.
00:04:39.000 But we deserve to laugh.
00:04:41.000 And I don't know if I... Okay, so I don't have it here.
00:04:44.000 Do they got the video?
00:04:44.000 Let's play the video.
00:04:45.000 Let me start off with two words.
00:04:48.000 Made in America.
00:04:51.000 Made in America.
00:04:53.000 He says it twice.
00:04:55.000 And then everybody cheers for it.
00:04:57.000 That's it.
00:04:58.000 Happy Friday, everybody.
00:04:59.000 I hope you're having a good time.
00:05:00.000 This is, you know, are we laughing as everything just spirals down into oblivion?
00:05:06.000 And we just, you know, we're just smiling as it all burns around us?
00:05:09.000 This is fine?
00:05:10.000 It's not all burning around us.
00:05:12.000 It's always been, you know, trying to make the best out of a chaotic universe.
00:05:18.000 But yeah, I'm definitely laughing as I should probably be trying to figure out a new sort of economy that we could transition to.
00:05:23.000 I love how the story from the Daily Mail is highlighting the fact that he said two words.
00:05:28.000 He's like, let me start with two words and then says three.
00:05:30.000 But then it gets like really serious with like letters, you know, from representatives, from Congresswoman Nancy Mace and pictures of, you know, Rand Paul and all that stuff.
00:05:39.000 It's just like, dude.
00:05:40.000 I'll bet you the White House transcript is going to read, he said three words, instead of two, and they're gonna try to whitewash.
00:05:46.000 Few words.
00:05:46.000 Few words, there you go, that'll even make more sense.
00:05:48.000 Let me start with a few words!
00:05:49.000 But that's what they're gonna do.
00:05:50.000 Lydia pointed out he technically did start with two, made in, so maybe he wasn't wrong.
00:05:56.000 Yeah, that's what Karine Jean-Pierre is going to say.
00:05:58.000 Someone's going to be like, the president said to start with two words, but then said three.
00:06:02.000 He started by saying two words and then added a third.
00:06:04.000 Okay.
00:06:07.000 How's it going?
00:06:07.000 How's the economy for you guys?
00:06:09.000 How's the country?
00:06:10.000 Are you excited for the leadership of this man?
00:06:12.000 Oh yeah, absolutely.
00:06:13.000 He's definitely very competent and I'm so happy he's in charge as we are on the brink of Armageddon.
00:06:18.000 As he said himself last night, he was literally talking about how we're closer to Armageddon than at any time since the Cuban Missile Crisis.
00:06:26.000 He's also talking about how Putin is deadly serious about using tactical nukes in Ukraine, and... yeah, I don't know.
00:06:35.000 Look at this tweet, this is a good one.
00:06:36.000 It's one of these gotcha tweets.
00:06:39.000 Joe Biden says, you won't have to worry about my tweets when I'm president.
00:06:43.000 And then under it it says, President Joe Biden says the risk of nuclear Armageddon is at the highest level since the Cuban Missile Crisis.
00:06:49.000 My response was, yeah, what do you mean, come on, no mean tweets.
00:06:52.000 Like, there you go.
00:06:53.000 But I just love that juxtaposition.
00:06:56.000 You don't gotta worry about my tweets.
00:06:58.000 And then the news, the world is about to end.
00:07:00.000 I wonder if he's setting us up to be a hero.
00:07:03.000 Like, he's like, hey, I averted World War III, everybody.
00:07:06.000 Or like, yeah, but you got us close to it, too, in the first place.
00:07:09.000 Hey, but I did get you away from it.
00:07:12.000 I saved you from nuclear war.
00:07:15.000 You almost started it, but I did save you from that.
00:07:18.000 He did come out with the marijuana stuff.
00:07:21.000 What was that, yesterday?
00:07:22.000 Yes.
00:07:22.000 Yeah, but apparently it's not actually exonerating anybody.
00:07:25.000 Apparently the marijuana thing is just going to, like, your past convictions are erased.
00:07:30.000 Yeah, I saw that.
00:07:31.000 Oh, it's not letting people out?
00:07:32.000 No, it's not letting people out.
00:07:33.000 They can't get out of jail.
00:07:35.000 What?
00:07:36.000 Even for federal offenses?
00:07:38.000 It seemed that it was a pardon.
00:07:41.000 The language pardon was in there for thousands of... I don't know if it's current or past.
00:07:46.000 I think it's past.
00:07:47.000 So...
00:07:49.000 I see a lot of these lefties being like, Joe Biden's a great president because he just pardoned all these marijuana convictions or whatever.
00:07:55.000 And I was like, oh, that's cool.
00:07:56.000 I'm in favor of that.
00:07:57.000 I think Trump should have done the same thing.
00:07:59.000 And then my response is like, how many people were released?
00:08:02.000 And everyone said none.
00:08:03.000 It was just exonerating their records or something.
00:08:06.000 And they're trying to reschedule it.
00:08:08.000 Because that's the value.
00:08:09.000 Because if they take it off of schedule one, then they'll stop treating it like heroin and stop throwing people in federal prison for it.
00:08:15.000 Isn't that crazy though that basically the only reason pot became legal is because everyone just decided it was and ignored the law?
00:08:22.000 I was talking about this with Half-Baked.
00:08:26.000 It's literally a movie from the 2000s, or was it late 90s?
00:08:28.000 I don't know.
00:08:29.000 Where a bunch of dudes are just outright committing crimes and it's relatable and funny.
00:08:35.000 It's just weird.
00:08:36.000 Even in it, when Dave Chappelle is at the lab and the doctor rips the huge chunk off and hands it to him.
00:08:41.000 It's a tragedy what they did to marijuana in the 1900s.
00:08:46.000 What was it?
00:08:46.000 Harry J. Anslinger and William Randolph Hearst got together.
00:08:50.000 Hearst owned all these trees, and he wanted to take people off of the hemp paper industry and, well, make the paper industry use trees.
00:08:57.000 So he got Harry J. Anslinger, and I think he was the guy, his Congress point man,
00:09:01.000 to start printing up all this propaganda, like Reefer Madness, to make people afraid. They said it made black people, like, bloodlusty and stuff, crazy nonsensical propaganda that wasn't true. And they made enough people afraid of it that they were able to make it federally illegal, and then William Randolph Hearst had the monopoly on the paper trade after that. It stuck for a long time.
00:09:24.000 I mean, I think that's an interesting story, right?
00:09:27.000 He was like a newspaper magnate or whatever.
00:09:30.000 Is that how you pronounce that?
00:09:31.000 Yeah.
00:09:31.000 You look at the people who control the narratives, when they have the ability to manipulate and control media, and you see what ends up happening because of it.
00:09:37.000 They can just be like, this thing goes against my interest, so have my newspapers tell everybody to do as I say.
00:09:43.000 And they do.
00:09:45.000 Freaking nuts.
00:09:45.000 But he did it with Congress's help, which is what really makes it scandalous.
00:09:48.000 Yeah, but the thing about Congress, You know, the Republicans are a great example.
00:09:53.000 Republicans are more concerned about the opinion of the New York Times than the opinion of their own constituents.
00:09:58.000 And that's exactly the power of the media.
00:10:00.000 So, with the Internet, the reason why we're seeing big tech freak out the way it is, and this whole Elon Musk thing has become such a big deal, is we've, I mean, us, I guess, you know, many of us, especially you guys, Minds, we've taken away their ability to just monopolize the narrative.
00:10:16.000 And they're losing their minds because of it, and they're losing power because of it.
00:10:19.000 And they're, you know, it's like, it's like watching a dude sinking in quicksand, screaming and thrashing around.
00:10:25.000 And you wanna, you wanna feel bad, you're like, I know that they're sinking, and they're like, doomed, but they're also just really evil.
00:10:34.000 So you can only just sit back and be like, maybe we shouldn't have this apparatus anymore, you know what I mean?
00:10:40.000 Yeah, I'd still like to encourage them to not thrash so much, because that's what's causing them to sink.
00:10:44.000 Maybe it's the thrashing that's evil, not the actual person.
00:10:47.000 Yeah, if they just, you know, laid forward, and then your legs come out, and then you can crawl away, but they don't get it.
00:10:54.000 They're just swinging violently, screaming.
00:10:56.000 You know, it's like when we had this raccoon problem.
00:10:59.000 And, you know, we want to feel bad for this raccoon who was trying to kill our chickens, so when we walk up to the trap, and we're like, aw, the poor little thing, it goes, and we're like, get it out of here, and then we have to get rid of it, you know?
00:11:09.000 That's what it's like.
00:11:10.000 The media is failing, it's dying, and these people are losing their jobs and all that, and then I'm just like, yeah, but they're like nasty people who just fling crap at your face, you know?
00:11:19.000 You walk up to them, you feel bad, like, maybe, you know, these journalists, you know, I see they're panicking, maybe they need work, and they just fling feces right at you, and you're like, okay, dude, Okay, go away.
00:11:28.000 You don't deserve to have a job, I guess.
00:11:30.000 You would think Twitter would be changing amid seeing the reaction to the Elon takeover, but they're thrashing.
00:11:38.000 They continue to ban, even despite this lawsuit that's happening.
00:11:44.000 So what is your current take on the situation?
00:11:48.000 Because Twitter wants the court trial to happen still, yeah.
00:11:55.000 Yeah, because Elon was asking for more time, and they said making sure the trial still happens until this
00:12:00.000 finalizes, like, prevents mischief or something, I guess, if they're...
00:12:05.000 Poop tweets.
00:12:06.000 They don't want any more poop emoji from... You know what, man?
00:12:06.000 They're concerned about mischief.
00:12:12.000 Bill was late to the show, so normally what we do is we get a thumbnail where the guest is sitting across from me and then I press record and then I screenshot it, but Bill wasn't here.
00:12:21.000 So I was just like, empty chair it is!
00:12:24.000 And I just want to say, that right there is the big difference between the establishment and the traditional media machine and what we represent.
00:12:32.000 We can just screw around like that and be silly about it and have fun.
00:12:36.000 And just, it is what it is.
00:12:38.000 And I was like, if this was like a major network, they'd be panicking.
00:12:40.000 They'd be like, pull archival!
00:12:42.000 We gotta get something!
00:12:42.000 We can't be, we have to wear our suits and ties!
00:12:45.000 Yeah, you can't have fake bazonkas on national television!
00:12:49.000 Not allowed!
00:12:49.000 Could you imagine a guest going on Tucker Carlson with gigantic fake boobies like Luke did?
00:12:54.000 Yeah, I mean, Fox would probably have, you know...
00:12:58.000 Did you see that Danny Polishuk got a pair?
00:13:02.000 Oh, he did?
00:13:03.000 Yeah.
00:13:04.000 From that high school teacher thing or whatever?
00:13:06.000 Yeah, he was doing videos with it.
00:13:09.000 What's he doing?
00:13:11.000 I just saw a photo of it when they arrived at his house.
00:13:13.000 I didn't see the video yet.
00:13:14.000 What do you think of Elon buying Twitter?
00:13:16.000 I mean, bring it on.
00:13:18.000 It's better than status quo.
00:13:19.000 You know, we saw, did you guys see his text with Dorsey?
00:13:22.000 Some of them.
00:13:23.000 Yeah.
00:13:23.000 So it's like they both are, you know, Dorsey, what's infuriating is that he knows open source, decentralization, encryption, Bitcoin.
00:13:32.000 He knows that this is all where it needs to go, but he had decades to make that happen with Twitter and couldn't do it.
00:13:37.000 He even said in the text, I don't have power.
00:13:39.000 I have 3% of the company.
00:13:41.000 I can't do anything.
00:13:42.000 But he was telling Elon, this is where you need to take it.
00:13:44.000 He was saying there needs to be no company, actually, because the company becomes an attack vector.
00:13:49.000 So, company-less decentralized protocols are what we need.
00:13:49.000 Right.
00:13:55.000 That's what's beautiful about Bitcoin, because Satoshi is totally anonymous.
00:13:59.000 There's no one to go after.
00:14:01.000 Why didn't Dorsey do anything?
00:14:03.000 I know he had that one operation.
00:14:04.000 What was that called?
00:14:05.000 Blue Sky.
00:14:06.000 Yeah, it still exists.
00:14:06.000 Blue Sky, right.
00:14:08.000 They're doing research and stuff, but I don't...
00:14:11.000 You know, and then when the transition happened and the new CEO was coming in, Parag, Dorsey did this blog post where he was like, I feel in my gut that this is really the right decision for the company.
00:14:23.000 You know, we're gonna, you know, things are gonna get better.
00:14:25.000 And then it's just like, nothing changes.
00:14:29.000 There's no, you know, in fact, Parag's statements about free speech are the opposite.
00:14:35.000 Like, Right.
00:14:36.000 You know what would be amazing?
00:14:38.000 Day one, Elon gets the company, it converts into a blockchain, and then Twitter removes itself from any moderation capabilities.
00:14:46.000 It becomes a totally decentralized networking tool.
00:14:49.000 So it's just like, we can't ban anybody anymore.
00:14:52.000 Or he just buys it and deletes it for the benefit of the world.
00:14:55.000 Could he buy it and make all the code free?
00:15:01.000 Well, he should.
00:15:02.000 But we need the version history so we can see all the algorithm changes they made punishing people over the years.
00:15:08.000 Those are the skeletons in the closet.
00:15:10.000 Every code change in Git, there's a version history so you can see, oh, they started punishing this type of content on this date in, you know, the election season.
00:15:20.000 And, you know, that data exists.
00:15:23.000 And, like, it needs to be audited.
00:15:25.000 Can they scrub version histories?
00:15:26.000 Not easily.
00:15:28.000 But they can?
00:15:31.000 I don't, I think that would be unlikely.
00:15:34.000 I think that if he open-sources the code, he said he was going to open-source it, so hopefully he does.
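For anyone curious what "auditing the version history" would actually look like, here is a rough sketch using git's pickaxe search to find commits that introduced or removed a given string in a given date window. The search term, dates, and the use of Python's subprocess module are purely illustrative, not anything Twitter-specific:

```python
# Rough sketch: find commits that added or removed a given string in a date window.
# The search term and dates are illustrative; run inside a checkout of the repo.
import subprocess

result = subprocess.run(
    [
        "git", "log",
        "-S", "downrank",              # "pickaxe": commits whose diff adds/removes this string
        "--since", "2020-01-01",
        "--until", "2020-12-31",
        "--pretty=format:%h %ad %s",   # short hash, author date, commit subject
        "--date=short",
    ],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```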
00:15:40.000 So real quick, real quick.
00:15:41.000 This is important.
00:15:43.000 If Elon Musk does finalize his buyout of this company, he's gonna have logs showing that Twitter was targeting conservatives.
00:15:51.000 Or people on the right.
00:15:53.000 It's funny because it's the weirdest thing.
00:15:55.000 Gizmodo publishes a story Facebook censoring conservative news outlets.
00:15:59.000 I then go, hey everybody, wow, that's crazy, look, Facebook censoring.
00:16:02.000 Then I get smeared as pushing conspiracy theories for believing what Gizmodo publishes.
00:16:08.000 It's gonna be very vindicating for everybody.
00:16:11.000 If Elon Musk does do this.
00:16:13.000 So I certainly hope he does.
00:16:15.000 But I have to wonder as well.
00:16:16.000 I think what's going to come out of Twitter's closet is going to be more than skeletons.
00:16:22.000 There's going to be some real dark stuff.
00:16:25.000 We're gonna, we're gonna, we're gonna, we'll probably see internal communications of them defending child abusers and explaining why this, you know what I mean?
00:16:32.000 Well, this is gonna get complicated because Elon is gonna have a tough decision to make because he now is responsible for this company and the financial success of this.
00:16:39.000 This is why he wants to take it private, which makes sense because he will have a fiduciary duty to potentially not share a lot of the skeletons, which could get dicey.
00:16:49.000 So Bill, if I could ask you, you have a lot of history with social media, obviously with Mines.
00:16:53.000 If you were Elon Musk, what would you do right now?
00:16:58.000 I mean, I think he's doing what he can.
00:17:00.000 He's trying not to buy it though.
00:17:04.000 Let's say the acquisition finally goes through.
00:17:07.000 You have Twitter.
00:17:08.000 What would you do if you were Elon with Twitter?
00:17:10.000 You immediately open source all the code.
00:17:13.000 You immediately make all the messages end-to-end encrypted so thousands of Twitter employees can't read our messages.
00:17:21.000 It's just like so absurd.
00:17:23.000 Didn't Facebook make end-to-end encryption?
00:17:25.000 No, WhatsApp says they did.
00:17:28.000 Or the Saudi government can't read protesters' private DMs and messages of human rights activists like they did before through Twitter.
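As a rough illustration of what end-to-end encryption means here, a minimal sketch using the PyNaCl library: only the two key holders can decrypt, so an operator sitting in the middle (the platform, its employees, or a government with backend access) sees only ciphertext. The names and message are made up for the example:

```python
# Minimal end-to-end encryption sketch with PyNaCl (pip install pynacl).
# Only Alice and Bob hold private keys; anything in the middle sees ciphertext only.
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob with her private key and Bob's public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"only Bob can read this DM")

# Bob decrypts with his private key and Alice's public key.
receiver_box = Box(bob_key, alice_key.public_key)
plaintext = receiver_box.decrypt(ciphertext)
assert plaintext == b"only Bob can read this DM"
```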
00:17:35.000 I got a question.
00:17:36.000 Do you think that Twitter receives money from governments to allow them backdoors?
00:17:43.000 I think that it is, I don't know.
00:17:46.000 Verizon does, AT&T does.
00:17:49.000 So I think there's a strong possibility that one of the reasons Elon may have backed out is he may have gone to Twitter and said, I want to buy the company.
00:17:58.000 And they went, here are the lucrative contracts keeping the company afloat, that you can never reveal, because governments are paying for backdoors and you can't reveal those.
00:18:05.000 And then Elon went, crap.
00:18:07.000 It might not just be contracts.
00:18:08.000 It might be just threats and duress of the government saying, we're going to shut you down, or we're going to make sure you can't do business if you don't do what we want you to do.
00:18:16.000 Well, that's 100% true.
00:18:17.000 And then 15 years ago, we were reporting on specific stories of the NSA, of the federal government, having their own office spaces inside of the headquarters of Verizon, AT&T, T-Mobile.
00:18:28.000 The major cell phone providers in this country had entire floors dedicated to the federal government that was spying and watching on everyone.
00:18:35.000 What's Twitter involved in?
00:18:38.000 It's like that movie, The Santa Clause, with Tim Allen.
00:18:42.000 How, like, being Santa is actually a curse.
00:18:44.000 You guys know what I'm talking about, right?
00:18:45.000 Like, Santa's on his roof, and then Tim Allen kills the guy, and then steals his clothes, and for some reason puts them on, and then becomes Santa, and loses his family, and then it's like a mind virus that makes him happy about it, right?
00:18:59.000 I am somewhat being cute, but no, what I mean is Elon Musk or anybody, it feels like he goes to Twitter and they're like, no, no, no, no, don't buy this company, don't buy this company.
00:19:08.000 And then finally they're like, okay, buy the company.
00:19:11.000 And now they're looking at this like, we are going to get paid to leave and not have to be involved in this behind the scenes national security BS.
00:19:19.000 I would bet, here's a way I'll play it.
00:19:23.000 If I went to a casino and I saw blackjack, roulette, and "Do you think the US government has national security letters sent to Twitter demanding backdoors?"
00:19:32.000 I would put all my money on that.
00:19:33.000 I'm not playing any other game, because that's a sure shot.
00:19:36.000 Elon Musk comes in, asks them, like, okay, show me corporate documents.
00:19:40.000 And he's got an NDA.
00:19:41.000 And not only that, but a national security letter with a gag order from the government, and he went, Crap.
00:19:46.000 And they're like, you can't say anything about it.
00:19:49.000 You have to buy it.
00:19:50.000 And if and when you do buy it, you won't be able to do anything you want to do, because they will force you to give them the back doors.
00:19:57.000 And we already saw the White House going to Facebook and saying ban these people.
00:20:00.000 There is a 0% chance they did not do that with Twitter.
00:20:04.000 And Elon probably saw that.
00:20:05.000 Yeah, it's not just a backdoor to the information.
00:20:08.000 I think it's even more than that.
00:20:09.000 I think it's deciding who gets censored, deciding which voices get downranked in the algorithm.
00:20:14.000 I think the federal government is way more involved, not just with, of course, the intelligence agencies helping give a lot of these big tech social media companies their start, but what do you think of this running your own social media network?
00:20:26.000 What's the possibility of what me and Tim are talking about?
00:20:30.000 Yeah, very likely.
00:20:31.000 I mean, well, we saw the Alex Berenson.
00:20:33.000 All these discovery documents came out recently between Twitter and the White House, where Alex Berenson got banned from Twitter and sued them.
00:20:42.000 Exactly.
00:20:42.000 It's the Slack conversation of the Twitter employees saying, oh, you know, I forget the guy's name at the White House, said, why is Alex Berenson still on the platform?
00:20:53.000 So to amend my earlier statement, when I said there's a 0% chance, no, actually, the evidence has already been released by Alex Berenson.
00:20:59.000 Yeah.
00:21:00.000 The thing I was pointing to is when they were talking about Instagram, I think, when the White House was like, hey, this one's not ours.
00:21:05.000 Can you do something about it?
00:21:06.000 And it was the Anthony Fauci parody.
00:21:08.000 So we know the government, they're doing the wink, wink, nudge, nudge.
00:21:12.000 And I think worse than that, obviously.
00:21:13.000 But I'm wondering if, say the Saudis, they go to Twitter and say, how much will it cost to get access to your back end, to get back doors?
00:21:23.000 Twitter needs money.
00:21:25.000 I think it's possible for Elon to switch paths, but he's going to have to start playing hardball with major countries and risking Twitter getting banned in those countries if he doesn't play ball.
00:21:38.000 It's a marketing dream for SpaceX and Tesla to have Twitter as a marketing engine for those companies.
00:21:45.000 So it makes a lot of sense for Elon to be- It's a question of, in order to do this,
00:21:50.000 Elon has to get banks to back it.
00:21:52.000 Money has to come from somewhere.
00:21:54.000 And no one is willing to pay the full price to take the platform and then just convert it
00:22:00.000 into a blockchain, open source, decentralized network.
00:22:03.000 If that were to happen, you can't ban anybody ever again.
00:22:07.000 They can coexist, though.
00:22:08.000 You can have a centralized infrastructure and a decentralized infrastructure running parallel.
00:22:13.000 We actually just integrated with a new network called NOSTR, which stands for Notes and Other Stuff Transmitted by Relay.
00:22:22.000 There's relays all over the world.
00:22:24.000 Everybody has a crypto key pair that we don't have access to as Minds.
00:22:27.000 That's your identity.
00:22:28.000 You can bring your followers and your content and log into other apps.
00:22:31.000 You can leave us.
00:22:32.000 It's not a blockchain, but it's a distributed system.
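For context on the key-pair identity Bill describes, here is a minimal sketch of a NOSTR-style note following the published NIP-01 event format; the placeholder key and helper function are illustrative, and actual signing (a Schnorr signature over secp256k1) is left as a comment since it needs an extra library:

```python
# Sketch of a NOSTR-style note per the NIP-01 event format.
import hashlib
import json
import time

def event_id(pubkey: str, created_at: int, kind: int, tags: list, content: str) -> str:
    # NIP-01: the id is the sha256 of this canonical JSON array serialization.
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

pubkey = "e5c3...placeholder"  # the user's public key hex: their portable identity
note = {
    "pubkey": pubkey,
    "created_at": int(time.time()),
    "kind": 1,   # kind 1 = short text note
    "tags": [],
    "content": "posted from any client that holds my key pair",
}
note["id"] = event_id(note["pubkey"], note["created_at"], note["kind"], note["tags"], note["content"])
# note["sig"] = schnorr_sign(private_key, note["id"])  # then publish to any relay
```

Because the identity lives in the key pair rather than in any one company's database, followers and content can move between apps and relays, which is the "you can leave us" point.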
00:22:35.000 So, you know, there's other projects out there that are like this.
00:22:39.000 You've got the Fediverse.
00:22:40.000 You've got ActivityPub.
00:22:41.000 You've got Farcaster as a new one.
00:22:44.000 You've got Planetary.
00:22:45.000 You've got Secure Scuttlebutt.
00:22:46.000 There's all these protocols that are, like, in the mix, and Twitter can integrate, and Facebook, and Rumble.
00:22:52.000 Other networks, we're going to talk with them about integrating.
00:22:56.000 We can get all of the alternative tech sites to join as well so we're all participating, but that doesn't mean you have to abandon the existing infrastructure as well.
00:23:08.000 They can run in parallel.
00:23:09.000 And Elon did express interest in working with Rumble.
00:23:13.000 So I think that also is giving a lot of people optimism.
00:23:16.000 But as we're talking about here, Twitter is a major powerhouse.
00:23:19.000 Twitter is a vector of influence that a lot of powerful governments need and want in their fingertips.
00:23:26.000 And it would be foolish to think that their fingertips and fingers and hands aren't already in this larger influence.
00:23:32.000 that Twitter and big tech social media has. Just in 2019 there was an ex-Twitter employee that was
00:23:37.000 found guilty for spying for the Saudi government, specifically going through private messages,
00:23:43.000 going after dissidents of the Saudi empire, and sharing that information directly with
00:23:48.000 Mohammed bin Salman. So again, that's just one example of a government using that.
00:23:52.000 Twitter's revenue model is, Twitter's addicted to surveillance advertising, just like Facebook,
00:23:57.000 and so Elon needs to rip that out.
00:24:00.000 You know, Elon knows how to make money.
00:24:01.000 He knows how to make people pay like, you know, 50 to 100k for a car.
00:24:06.000 Like, he has some of the most profitable, you know, money-making machines in the world.
00:24:10.000 He can figure out how to make Twitter way more profitable without doing all the nonsense.
00:24:16.000 It is possible.
00:24:17.000 Yeah, you do something like Super Minds, where you send somebody an offer, and then they can accept the offer to respond.
00:24:25.000 So back to the question, because we cut you off really quickly.
00:24:27.000 Nice one.
00:24:28.000 You are Elon.
00:24:28.000 Let's get back to this.
00:24:29.000 Yeah, let's talk more about these.
00:24:30.000 You are Elon.
00:24:31.000 Ian, feel free to jump in on this.
00:24:32.000 If you're Elon as well, you have Twitter.
00:24:34.000 That's my question.
00:24:34.000 What do you do?
00:24:35.000 You said he opensource everything.
00:24:37.000 Yeah.
00:24:37.000 Do you, so would that be like... Reinstate everybody, obviously.
00:24:40.000 When you say open source, you mean AGPLv3, like a copyleft license?
00:24:44.000 There's multiple licenses that could work.
00:24:46.000 I hope that he actually does it, because I remember when all that news came out about, oh, Tesla's open-sourcing their batteries.
00:24:51.000 Remember that?
00:24:52.000 That was not real.
00:24:54.000 He decided not to be a patent troll.
00:24:57.000 He decided to sort of open the patent so he wasn't going to go around suing other competitors in the general space.
00:25:03.000 But the code of the software running on Teslas, the battery blueprints, none of that is open source.
00:25:10.000 Yeah, the difference between a lot of the cars is just the software that you're paying for.
00:25:15.000 That's not open source.
00:25:16.000 Yeah, the same software that is running the AI on the new Tesla bots is what's running in the car.
00:25:23.000 It's the same system.
00:25:24.000 I got an idea.
00:25:25.000 So we're talking to this fabricator, metal fabricator guy, about building a one-of-a-kind unique electric car.
00:25:31.000 We should do it with Minds.
00:25:33.000 It should be powered by Minds.
00:25:34.000 So like your digital console is like a Minds console.
00:25:38.000 Oh yeah, because you need an operating system and an app within the car.
00:25:42.000 So let's do it.
00:25:43.000 So we can do it really, we were talking, we can make like a really cheap electric car.
00:25:47.000 And I was like, as a gag, we have like a 2006 Cobalt with 230,000 miles on it.
00:25:52.000 And I was like, yeah, let's do that and not fix anything and be like, boom, one of a kind.
00:25:57.000 Or we could actually make a really cool car, like actually works and everything.
00:26:01.000 We should have it like integrate with mines.
00:26:03.000 Let's do it.
00:26:04.000 I saw some, there's some amazing like refurbishing companies that can take like any car and just make an electric car.
00:26:09.000 Like there's this one page on, uh, I forget.
00:26:13.000 It's, it's, they take Volkswagen specifically and they'll make it any Volkswagen electric from any period of time.
00:26:19.000 I'm gonna make the Flintstone car and beat all of you guys, including Tesla, when the EMP goes off and nuclear armageddon happens.
00:26:25.000 But will you run barefoot?
00:26:26.000 No, no, no.
00:26:27.000 Like, the funny thing with a Flintstone's car is, like, pedals exist.
00:26:30.000 We've discovered that quite a long time ago.
00:26:33.000 Yeah, but it doesn't look as funny in the cartoons when you're pedaling with your feet on the floor.
00:26:36.000 You ever see those cars where there's two sets of pedals in the front and the back and everyone in the car is pedaling and it's going?
00:26:42.000 There you go, man.
00:26:45.000 I was just going to say that I did ask, we sent a message to Elon asking this question.
00:26:51.000 I don't know if he'll answer it, but there needs to be a Tesla that is not subject to just getting shut down from Tesla HQ, like you're talking about.
00:27:00.000 I could see him getting behind that.
00:27:02.000 A Tesla that's sovereign and the software's running locally, so that it's just not, like, a sketchy dystopian possibility.
00:27:08.000 But do you know that, um, I'm not sure, Luke, if you know this, some of them have cameras pointing at you, the driver.
00:27:16.000 And if you aren't paying attention or something, it alerts you and it records that.
00:27:20.000 It uses it against you.
00:27:22.000 That is nightmarishly dystopian.
00:27:24.000 It has that history too, yeah.
00:27:25.000 Yeah.
00:27:26.000 Because I saw in the console of the Model S there's a camera, and I was like, I googled it, and I'm like, that's cool?
00:27:33.000 Because Uber drivers, for instance, they turn cameras on to film their passengers, and I'm like, that could be helpful if there's a car accident, if you're sideswiped or something.
00:27:42.000 You have no access to that camera.
00:27:43.000 It is for them to spy on you as you drive.
00:27:46.000 Creepy.
00:27:47.000 So there's no, obviously you can't log into your Tesla account and check out.
00:27:50.000 Yeah, like what data is in your Tesla account?
00:27:52.000 Maybe I'm wrong.
00:27:54.000 I don't know.
00:27:54.000 I doubt that they give you access to all the footage of your stuff.
00:27:58.000 No, no, no, no.
00:27:58.000 So the side cameras you can.
00:28:00.000 Oh yeah?
00:28:00.000 So I can pull up my phone right now and look at my car in the garage, the Model 3.
00:28:05.000 Yeah, that's really cool, but there's an internal camera and I'm like, what is that all about?
00:28:09.000 The CIA has access to all of it, live, as it's going on, most likely, without a doubt.
00:28:15.000 The NSA.
00:28:16.000 Yeah, there's no way that they don't.
00:28:18.000 So, you know, I would automatically tape up that camera as fast as I can, but I heard some car manufacturers take away your ability to do self-driving if you cover up that camera,
00:28:31.000 because the cameras are focused on your eyes to make sure that your eyes are focused on the road.
00:28:34.000 And if your eyes and hands aren't focused on the road and the wheel,
00:28:38.000 they stop automatic driving and make you drive instead of having the car drive you.
00:28:42.000 You guys know that basically all new cars are self-driving.
00:28:45.000 So Tesla has full self-driving capability only after your car has driven on the highway,
00:28:52.000 on autopilot for 100 miles, you have a high safety score and they approve you.
00:28:57.000 So my Model 3 apparently just got recently, like, congratulations, it's now entitled to full self-driving, and that's where the car will actually stop at the stop sign, check, and then slowly turn right, and the steering wheel is spinning by itself.
00:29:08.000 But most of them, when people think of like, oh, the Teslas can drive themselves, and then you get in the Tesla for the first time, and you're like, autopilot, and you're like, wow, all it does is just the steering wheel moves a little bit left and right on the highway.
00:29:20.000 That's it.
00:29:21.000 It'll speed up, slow down.
00:29:23.000 So when I first saw that, when I got the Model 3, I was like, whoa, this is crazy.
00:29:26.000 I'm like, I'm on the highway, and like, it's moving itself.
00:29:29.000 Then I got a Honda.
00:29:31.000 It's got the same thing!
00:29:33.000 Modern Honda has the exact same thing.
00:29:35.000 No difference.
00:29:36.000 It's auto-steer and cruise control.
00:29:38.000 So I'm on the highway and I'm like, oh.
00:29:40.000 That's what they call it, auto-steer?
00:29:41.000 It's called something like lane correction or something like that, I don't know.
00:29:46.000 So the Honda can't activate a full self-drive like the Tesla can, but Teslas don't come with that anyway.
00:29:51.000 You have to earn a special pilot, you know, a beta or something.
00:29:55.000 Could Honda run a self-driving upgrade?
00:29:59.000 Probably.
00:29:59.000 Remotely?
00:30:00.000 I have to imagine if they can drive themselves on the highway, they can drive themselves in the streets.
00:30:05.000 So I don't know.
00:30:06.000 I think the Tesla's don't use sonar anymore or radar or something like that.
00:30:09.000 And it switched to just camera.
00:30:11.000 And I noticed this because I think when they switched it, it got way worse.
00:30:16.000 It, like, doesn't understand what's going on anymore.
00:30:18.000 And I wonder why.
00:30:20.000 Maybe, statistically, the radar or sonar or whatever it was, was worse, and now, like, cameras are better, but then if you get, like, schmutz on the camera or something, then it just turns off on you.
00:30:30.000 I was driving in the rain, and I have it on self-drive autopilot on the highway, and I'm going 70 miles an hour, and it's making a turn, and then we get close to a semi, and it goes, boop-boom, and just turns off, and I'm like, what the?
00:30:42.000 Like, that's scary.
00:30:44.000 You keep your hands on the wheel the whole time, so you're going with it.
00:30:47.000 When it stops, just like, whoa.
00:30:50.000 It sounds like a tragedy waiting to happen, man.
00:30:53.000 They encourage you to take your hands off the wheel.
00:30:56.000 How is that going to make you more attentive to the road?
00:30:58.000 No, no, no.
00:30:59.000 You can't do that.
00:31:00.000 So they want you to be sitting there like a zombie, but still somehow be more, like, you're going to become less attentive to what's going on if you're not actually supposed to be attending to it.
00:31:08.000 How can that be good?
00:31:10.000 For the self-driving, you have to have weight on the steering wheel.
00:31:12.000 Your hands have to be on it.
00:31:13.000 If you take them off, it'll go...
00:31:16.000 All you should be doing is watching the road and paying attention to the road when you're driving.
00:31:19.000 In which case, there's no difference between you moving the steering wheel a few millimeters
00:31:25.000 and it moving a few millimeters itself. It is convenient when you're changing the radio
00:31:30.000 and you're not scared of dying because it's helping you.
00:31:33.000 But, you know, look, I'm driving in the rain. I turned on self-drive and I'm sitting there
00:31:37.000 the whole time like, oh, yeah, but Yeah, but what about for people who, like, pass out driving?
00:31:42.000 It'll save lives.
00:31:43.000 Yeah, it will.
00:31:43.000 But here's the problem, it won't actually, to a certain degree, if autopilot is on, and then you have a medical episode, you'll probably be safer than if you didn't have autopilot, but this thing won't brake.
00:31:57.000 It's like, it brakes for the stupidest things.
00:32:00.000 The light is green, and it goes, braking for green light.
00:32:04.000 And I'm like, why?
00:32:05.000 And you gotta press the accelerator.
00:32:07.000 There's a stoplight ahead sign.
00:32:09.000 It's a picture of a stoplight.
00:32:11.000 And the Tesla stops for it.
00:32:13.000 And so you have to press the accelerator.
00:32:14.000 Otherwise, here's the craziest thing.
00:32:16.000 It's raining.
00:32:18.000 I'm driving straight, and someone's pulling onto the highway, and they're going slow, waiting for me to pass, and all of a sudden the Tesla slammed the brakes on to zero from 70, and we just, like, lunge forward. This has happened three times just this past weekend, and so it got to the point where we're like, maybe we should not use this anymore.
00:32:36.000 Yeah, maybe wait 15 or 20 years, till after they stop testing nuclear bombs right next to where the tree went to,
00:32:41.000 like, seven. Wow.
00:32:42.000 It went from 70, just, like, slammed the brakes on in the rain, and then the truck trying to come in slams its brakes.
00:32:47.000 I'm like, what are you doing? And I'm just like, that is not sane. The fantasy is you take your driver's
00:32:53.000 seat and you spin it around, and you and your passenger spins theirs around, and there's four of you all hanging out
00:32:59.000 in the back of the car while it's driving you. That's the fantasy that
00:33:01.000 they're aiming for. But I mean, good God, there's no tracks. It's not a train. Yeah, wind can push the car, a kid can
00:33:07.000 knock a basketball out into the road on accident. But there is one
00:33:12.000 thing to add to this: when all cars communicate with each other, then you have a lot less to
00:33:17.000 worry about. But it's gonna have to be more than all cars.
00:33:19.000 It'll be like satellites and houses and telephones and they'll all be telling everyone where everyone is and all the machines will know and they'll be anticipating.
00:33:26.000 But Ian, it'll be okay.
00:33:27.000 You'll be in the metaverse.
00:33:28.000 You'll be one with... You'll be safer that way.
00:33:30.000 You'll be in the machine.
00:33:32.000 You will be the machine.
00:33:34.000 You have always been the machine.
00:33:36.000 Driving a car is very dangerous, and there are a lot of people that die every single day just from driving a car.
00:33:41.000 A lot of people don't realize how actually dangerous, statistically, it is.
00:33:45.000 So I think this self-driving will be sold as, hey, we're gonna make everyone safe.
00:33:49.000 No one's gonna die from car accidents anymore if All the cars are self-driving, therefore none of the cars are going to be crashing into each other.
00:33:56.000 I think that's the latest ploy.
00:33:57.000 That's how they're going to get rid of truckers.
00:33:59.000 That's how they're going to get rid of taxi drivers.
00:34:01.000 That's how they're going to get rid of any for-hire driver.
00:34:04.000 It'll be illegal to drive.
00:34:08.000 They've talked about this.
00:34:09.000 Cars in the future will not have the ability to be driven.
00:34:12.000 Or your insurance will be so drastically more expensive.
00:34:14.000 Not only that, but in Europe they're already limiting cars so they can't go above a certain miles per hour, like kilometers per hour or whatever they have over there in Europe.
00:34:25.000 But they're limiting, you know, roads.
00:34:27.000 They measure things in stone.
00:34:29.000 Exactly.
00:34:29.000 And turtle shells.
00:34:30.000 In totally nonsensical ways.
00:34:33.000 But there's already government shutdown switches that they have in European cars that are going to be developed very soon, where the government will have the access to particular vehicles and to be able to turn them off any time that they want.
00:34:46.000 You guys were saying if there's a health emergency for a driver, their hands come off the wheel?
00:34:50.000 It'll shut off.
00:34:53.000 Self-driving shuts off.
00:34:54.000 It doesn't pull over.
00:34:55.000 And then you crash.
00:34:56.000 No, no.
00:34:57.000 No, it's not.
00:34:57.000 So it doesn't even help that.
00:34:58.000 Right.
00:34:59.000 Oh.
00:34:59.000 No, but like, if you are, for about 30 seconds, I think, it will, it'll start, like if you pass out.
00:35:05.000 It'll wake you up.
00:35:06.000 Well, it starts going, wah, wah, wah.
00:35:08.000 But if you're having a seizure.
00:35:09.000 If you're having a seizure or something, it's not gonna help.
00:35:11.000 So, over hills is hilarious.
00:35:15.000 They cannot drive over hills.
00:35:17.000 Because the cameras can't see anything.
00:35:19.000 So it just, immediately, once you go up a hill, wah, wah, wah, wah.
00:35:22.000 Once you get to the top... If you're driving and there's an error for some reason, I've had auto-drive shut off, like, error, disengaging, and you'll just go straight off the road.
00:35:33.000 They will keep getting better because all of your data is just training data.
00:35:37.000 Right.
00:35:37.000 So over time it's going to improve.
00:35:40.000 And now there's Teslas everywhere.
00:35:41.000 It was crazy.
00:35:42.000 I'm driving around, every third car I see is like a Tesla and I'm just like, wow, they're everywhere, man.
00:35:45.000 Are your Teslas going to be tweeting at you?
00:35:48.000 Tweeting at you?
00:35:49.000 I'm low on gas.
00:35:50.000 You'll get a tweet from... Gas?
00:35:51.000 Tim Pool's Tesla, yeah.
00:35:53.000 Or I'm low on charge.
00:35:54.000 I only have 32% charge, at Tim Cast.
00:35:58.000 They already do that.
00:35:59.000 They don't tweet at me, but my thing buzzes and it's like, Luke was driving my Tesla and he left the door open.
00:36:03.000 And then it was like, your door is open.
00:36:05.000 And I'm like, I don't care, it's in the garage.
00:36:07.000 And then I wake up with like 12 notifications like, your door is open.
00:36:10.000 Dude, Elon used to watch Knight Rider for sure, dude.
00:36:13.000 He's trying to build Kit.
00:36:14.000 It's gonna be like...
00:36:15.000 A very funny joke at Timcast.
00:36:17.000 And you'll be like, oh, thanks, Tesla.
00:36:18.000 It's not going to say that to me.
00:36:19.000 Have you guys played with Stable Diffusion or DALL-E?
00:36:24.000 Oh, DALL-E.
00:36:25.000 So DALL-E is the closed-source version.
00:36:29.000 Stable Diffusion is where it's at.
00:36:31.000 Is that where you can like auto-generate?
00:36:33.000 Yeah, pictures based on text.
00:36:35.000 We did that on the show.
00:36:36.000 I think we did a bunch of AI generation of like Trump and Pelosi, and it was like nightmarishly hilarious.
00:36:41.000 They're scary.
00:36:42.000 A lot of the images are like some horror movie.
00:36:44.000 What's it called?
00:36:44.000 Stable Diffusion?
00:36:45.000 Stable Diffusion.
00:36:47.000 It's not like a company.
00:36:48.000 It's a project, so you can't just... Stability.ai.
00:36:51.000 If you go to replicate.com, you can... Yeah, Stability, they created it.
00:36:56.000 Do they have... Can I use it here?
00:36:58.000 Wow.
00:36:59.000 Replicate.com?
00:37:00.000 Replicate is a good place to use Stable Diffusion.
00:37:03.000 Replicate.com?
00:37:03.000 You do have to pay.
00:37:06.000 Oh, I see, I see, I see.
00:37:08.000 That's a profit market waiting to happen.
00:37:10.000 You don't have to hire employees to do a bunch of your stuff anymore.
00:37:13.000 But Stable Diffusion is dominating DALI now.
00:37:15.000 So OpenAI, which Elon actually was a part of, I'm not sure how involved he is now, but most of their code is not even open.
00:37:24.000 Some of their stuff is.
00:37:25.000 So they do some open source, but they were trying to make the argument that Stable diffusion would be bad for artists because it's going to put them out of business because people can just, if they need a graphic generated, they can just go.
00:37:39.000 Have you seen the video generators?
00:37:41.000 No.
00:37:41.000 Okay.
00:37:42.000 So, wow.
00:37:42.000 I typed in stable diffusion demo, Bill Ottman.
00:37:44.000 This is what we got.
00:37:45.000 Here we go.
00:37:46.000 Nice.
00:37:49.000 Kinda looks like you.
00:37:50.000 I've got some meat on my bones.
00:37:51.000 I don't understand who this is and why they think it's you.
00:37:53.000 It's gotta be Orson Welles.
00:37:54.000 Ottoman Empire, probably?
00:37:55.000 Oh, it's Ottoman, yeah.
00:37:57.000 Oh, I see.
00:37:58.000 Generate image Luke Rudkowski.
00:38:00.000 This really is good.
00:38:01.000 The DALL-E Mini was not that good.
00:38:03.000 We'll get Joe Biden in there in a second, but let's see if we can get Luke Rudkowski.
00:38:08.000 What is this?
00:38:08.000 Luke... Let's see what it pulls up.
00:38:30.000 Or you have to put it in quotes?
00:38:32.000 The word?
00:38:33.000 Would that work?
00:38:34.000 Dream Studio Beta.
00:38:36.000 Oh, hey!
00:38:38.000 That bottom left one looks good.
00:38:40.000 Oh, that's so creepy!
00:38:42.000 What is this?
00:38:43.000 That's going to be you in like 30 years.
00:38:46.000 30 years?
00:38:48.000 Whoa, man.
00:38:49.000 That looks more like Chris in the upper right.
00:38:51.000 That was so creepy.
00:38:52.000 That was wild.
00:38:53.000 It was like some of it was kind of close.
00:38:55.000 It wasn't a pool.
00:38:56.000 Your facial hair.
00:38:57.000 It was not a pool.
00:38:59.000 Clearly, you have more photos in the, you know, the machine learning library.
00:39:07.000 Oh, this is the bottom left one.
00:39:10.000 This is okay.
00:39:10.000 I'm bothered by this.
00:39:11.000 You know why?
00:39:12.000 It's not funny.
00:39:13.000 Well, you have to do a sentence like Joe Biden eating a cat.
00:39:17.000 Yeah.
00:39:20.000 Dude, this is awesome.
00:39:23.000 And they're saying they're making movies.
00:39:24.000 It's a very, it's a very important news show we run here.
00:39:27.000 Joe Biden eating a cat.
00:39:30.000 You'll be in the metaverse and you'll be like, you'll think something and it will appear in front of you.
00:39:34.000 That's crazy, dude.
00:39:35.000 And you'll imagine universes.
00:39:40.000 The one on the bottom right.
00:39:44.000 Here we go.
00:39:44.000 Uh, Donald Trump hug.
00:39:47.000 No, no, wait, wait, wait.
00:39:49.000 High fiving Joe Biden.
00:39:51.000 There we go.
00:39:52.000 It's all about unity.
00:39:52.000 It's all about love.
00:39:53.000 Yeah.
00:39:53.000 Yeah.
00:39:53.000 We got to bring it together.
00:39:55.000 Hi Donald Trump.
00:39:55.000 High fiving Joe Biden.
00:39:56.000 It looks like the sentences take a little longer to render.
00:39:59.000 Yeah.
00:39:59.000 There's more in it.
00:40:01.000 This is great.
00:40:02.000 Let's do a, let's do Nancy Pelosi stealing COVID masks.
00:40:09.000 Here we go, it's loading.
00:40:10.000 Nancy Pelosi robbing citizens.
00:40:12.000 Oh, look at this!
00:40:12.000 That's amazing, dude!
00:40:14.000 Wow!
00:40:15.000 Look at Joe Biden and Donald Trump shaking hands!
00:40:17.000 That's an image I've wanted to see for a long time.
00:40:20.000 And we've made it a reality through the artificial intelligence.
00:40:22.000 Look at that bottom right, man!
00:40:23.000 It'd be funny if you ran news, like for TimCast articles, run the headline through this to generate the thumbnail for the news story.
00:40:31.000 That's actually a good idea.
00:40:32.000 Do we have the rights to that?
00:40:33.000 You do, yeah.
00:40:34.000 Hunter Biden going to jail.
00:40:37.000 Let's just fulfill our wildest dreams with this dream studio here.
00:40:42.000 Luke's taking a picture of it.
00:40:43.000 I'm setting my team.
00:40:44.000 I'm like, this is how we're going to do thumbnails from now on.
00:40:46.000 This would be cool if he went to the jail to see Joe, who was in the jail.
00:40:50.000 But I don't think the AI is that smart.
00:40:53.000 I'm loving the bottom left.
00:40:55.000 Look at that face.
00:40:57.000 Hey, it looks like the guy from Better Call Saul.
00:41:00.000 The bad guy.
00:41:01.000 Do a Vladimir Putin, Joe Biden world peace.
00:41:05.000 I wonder how esoteric this thing gets.
00:41:10.000 Putin's misspelled.
00:41:12.000 Putin.
00:41:13.000 Vladimir Putin signing peace treaty with Biden.
00:41:16.000 It's too busy!
00:41:17.000 No, no, no, no.
00:41:19.000 Don't you do that.
00:41:19.000 This one I cannot do.
00:41:20.000 Some things are too challenging even for me.
00:41:23.000 Impossible.
00:41:23.000 Not gonna happen.
00:41:26.000 What if we do like Donald Trump in drag?
00:41:29.000 Yes.
00:41:30.000 All night.
00:41:32.000 Here we go.
00:41:32.000 Vladimir Putin's... Oh, I put singing peace treaty.
00:41:35.000 That works too.
00:41:36.000 Let's see what happens.
00:41:36.000 Yeah, I'm into it.
00:41:37.000 Singing peace treaty with Biden.
00:41:40.000 Fridays are fun, aren't they?
00:41:41.000 Yeah.
00:41:42.000 Come on, give me.
00:41:42.000 What's this?
00:41:43.000 It's still going.
00:41:43.000 Oh, there we go.
00:41:44.000 Is he signing it with himself?
00:41:46.000 That is weird.
00:41:47.000 It's Putin signing a deal with himself.
00:41:49.000 All of them are.
00:41:50.000 What is this?
00:41:52.000 These two are just Putin signing deals with Putin.
00:41:53.000 Which just proves this war is inter-conflict.
00:41:57.000 This is Putin's inter-conflict, man.
00:41:59.000 And change signing to singing.
00:42:02.000 Singing?
00:42:04.000 Oh man, I think maybe because we did the show, people have started using it.
00:42:08.000 Oh, cool.
00:42:09.000 Type in World War 3.
00:42:10.000 Oh, that's a good idea.
00:42:11.000 It'll show like Joe Biden smiling.
00:42:12.000 Or Woke World 3.
00:42:15.000 Yeah, this will eventually be able to generate a movie probably.
00:42:19.000 It'll just...
00:42:19.000 Yeah, there's an app.
00:42:20.000 What was the one you were saying?
00:42:21.000 There's a commercial for it.
00:42:22.000 I forgot what it's called, but I saw it on Twitter.
00:42:24.000 And it's, you can type in a video and it will make a video for you.
00:42:28.000 And it's a loading bar.
00:42:29.000 It takes some time to generate.
00:42:31.000 But someone wrote, B-roll footage of moving through a forest towards a lake.
00:42:35.000 And then it's like, you see the trees and it was crazy.
00:42:37.000 Dude, people be like, show me a 35-year life of me having three boys and a wife and a beautiful
00:42:44.000 full house and they will live 35 years in their own AI generated environment.
00:42:48.000 You're in it now.
00:42:49.000 Oh, that makes sense.
00:42:51.000 It's gonna be worse than that.
00:42:52.000 You guys, what you don't realize... Now they're both in it.
00:42:54.000 Look at them having fun.
00:42:55.000 But there's two Putins!
00:42:56.000 Look at this!
00:42:58.000 Here you go.
00:42:58.000 Look at this.
00:42:59.000 There's two Bidens.
00:43:00.000 What is this?
00:43:01.000 It's so interesting.
00:43:02.000 It's their body doubles, obviously.
00:43:03.000 Look at this right here.
00:43:04.000 This one's pretty good, actually.
00:43:05.000 The AI knows that they have body doubles.
00:43:07.000 Now, what you don't realize is, Ian, you're wrong about that.
00:43:09.000 It's gonna be a guy being like, I want two big-titty women, and I want them in front of me.
00:43:15.000 It's gonna, you know.
00:43:16.000 You know, it's gonna get worse.
00:43:17.000 The porn industry is gonna take it over.
00:43:19.000 All right, come on, guys.
00:43:20.000 World War III.
00:43:24.000 Show me World War III.
00:43:26.000 I want to see it.
00:43:29.000 Here we go.
00:43:29.000 Oh, it's taking a long time on this one.
00:43:33.000 So what is it doing?
00:43:34.000 Is it using Google or something?
00:43:36.000 I don't actually know the full library of what it's pulling from.
00:43:40.000 This is awesome.
00:43:41.000 Bookmark this.
00:43:41.000 Yeah.
00:43:42.000 This is legit.
00:43:43.000 And there's all different variations.
00:43:44.000 Huggingface.co.
00:43:46.000 Yeah, Huggingface is a cool AI community.
00:43:47.000 Word to war.
00:43:50.000 With two Rs.
00:43:51.000 It distorts language when you put it in.
00:43:53.000 Did you hear that the AI created its own language?
00:43:59.000 Yeah, Facebook AI.
00:44:00.000 No, no, no, no, no, there was, it was DALL-E, I think, I'm not sure, but with one of these auto-generators, you would type in like a bowl of vegetables, and there would be a weird word that would be like vlagabo, and then when you typed that in, it would give you the same kind of vegetable every time.
00:44:15.000 Something about the AI created a word that represented an image.
00:44:19.000 It was crazy.
00:44:20.000 Yeah, because you can do this in reverse as well.
00:44:21.000 So you can feed an image and it'll spit text.
00:44:24.000 Oh, wow.
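[Editor's note: a minimal sketch of the image-to-text direction mentioned here, using a publicly available captioning model (BLIP) through the Hugging Face transformers library; the model id and file name are illustrative assumptions, not anything used on the show.]

```python
# A hedged sketch: caption a local image with an off-the-shelf BLIP model
# from the Hugging Face Hub. Assumes `pip install transformers pillow torch`.
# The model id and the file name are illustrative, not anything used on air.
from transformers import BlipProcessor, BlipForConditionalGeneration
from PIL import Image

model_id = "Salesforce/blip-image-captioning-base"
processor = BlipProcessor.from_pretrained(model_id)
model = BlipForConditionalGeneration.from_pretrained(model_id)

image = Image.open("some_photo.jpg").convert("RGB")    # any local image
inputs = processor(images=image, return_tensors="pt")  # preprocess to tensors
out = model.generate(**inputs, max_new_tokens=30)      # generate caption tokens
print(processor.decode(out[0], skip_special_tokens=True))
```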
00:44:25.000 What's this website called?
00:44:27.000 Hugging Face is an AI community.
00:44:29.000 HuggingFace.co.
00:44:31.000 I'm down to just keep generating images all night because it's hilarious.
00:44:37.000 Everybody's hitting it up now and I can't get it to work anymore.
00:44:40.000 Let's get one with Lydia for her last episode when she comes back.
00:44:45.000 I typed in Ian Crossland.
00:44:46.000 We didn't do him yet.
00:44:47.000 Anything good?
00:44:48.000 Well, it's coming.
00:44:49.000 It's coming.
00:44:50.000 What if it shows, like, Ian on a throne of skulls?
00:44:53.000 Give me Ian Crossland Mushrooms after this.
00:44:58.000 That search result.
00:44:59.000 Oh, it's coming, it's coming.
00:45:00.000 Here we go.
00:45:01.000 I think it'll get it.
00:45:04.000 I think it'll get it.
00:45:05.000 You will.
00:45:06.000 What is this?
00:45:08.000 What is this?
00:45:08.000 There I am.
00:45:09.000 What are you doing?
00:45:10.000 It doesn't look anything like Bill Nye.
00:45:12.000 Inekesidex.
00:45:13.000 Yeah, I do look like Bill Nye.
00:45:13.000 They used to tell me I looked like Bill Nye.
00:45:15.000 Inekesidex.
00:45:15.000 What is that?
00:45:17.000 What is this?
00:45:18.000 He says Ian IO Greece.
00:45:19.000 There's a lot of Ians out there.
00:45:21.000 Who's this guy?
00:45:22.000 Is he selling me a product?
00:45:23.000 People used to be like, you're like Bill Nye.
00:45:25.000 He's like a comedian, but he likes science.
00:45:27.000 And I was like, he's so dorky.
00:45:28.000 The second American Civil War.
00:45:30.000 Hey, final question.
00:45:32.000 One other question I had about the Elon acquisition of Twitter.
00:45:35.000 You said open source everything.
00:45:36.000 Was that hyperbole?
00:45:37.000 Are there like security algorithms that you don't open source when open sourcing an entire library of social networks code?
00:45:44.000 Oh, I mean, yeah, you don't open source the data, but the code, there's no reason not to open it.
00:45:52.000 Like even security stuff?
00:45:54.000 Yeah.
00:45:55.000 No, open source encryption is better encryption.
00:45:58.000 It's generally agreed upon that it's the most audited.
00:46:03.000 It's the most battle tested.
00:46:04.000 Just because you open source that doesn't mean that people can hack it.
00:46:07.000 And when you say don't open source the data, what's the data?
00:46:10.000 Everybody's data.
00:46:10.000 Yeah, check it out.
00:46:12.000 It's a second American Civil War, but it just gives you the first.
00:46:15.000 All right, let's do Lydia from Timcast.
00:46:20.000 Here we go.
00:46:22.000 Here you go, Lydia.
00:46:23.000 We're making a robot.
00:46:25.000 Ian, the only thing they would not want to open source would be, like, the bot detection, you know, because that's basically instructions for bots to get around it.
00:46:35.000 That is it.
00:46:36.000 What is this?
00:46:37.000 It looks nothing like it.
00:46:38.000 There she is.
00:46:39.000 Who is Tamdana Liboda?
00:46:43.000 What in the hell?
00:46:44.000 Fascinating.
00:46:44.000 Thanks, robot.
00:46:45.000 I like that.
00:46:47.000 I'll take it.
00:46:48.000 Who is this?
00:46:48.000 That's so weird.
00:46:51.000 Look at that upper left, is that like a boudoir image or something?
00:46:53.000 Yeah, that's what it looks like, holy cow.
00:46:55.000 That's like you in your spare time.
00:46:56.000 Hashtag never lids, seriously.
00:47:00.000 Luke's turn.
00:47:01.000 There you go.
00:47:03.000 Come on, there we go.
00:47:04.000 Oh cool, you can just spam click it.
00:47:07.000 Well, I did try the Luke Rudkowski.
00:47:08.000 What did we get?
00:47:09.000 We got something weird, didn't we?
00:47:10.000 Yeah, something weird, right?
00:47:11.000 Similarly, somewhat looks like me.
00:47:15.000 That's why. Let's try Luke, We Are Change.
00:47:17.000 I wonder if I do it all as one word or like an at symbol.
00:47:19.000 I used to want to be famous because I wanted to give like an Oscar speech, but now I want to be famous because I want these A.I.s to be accurate.
00:47:25.000 It's gonna be like conspiracy theorists.
00:47:27.000 What is this?
00:47:29.000 We, we, we know.
00:47:30.000 We've, uh, then is.
00:47:33.000 I, that's not a word.
00:47:33.000 We, we were Chang.
00:47:35.000 We, we were Chang.
00:47:38.000 Weird.
00:47:39.000 Weird Chang.
00:47:40.000 Is that a beanie?
00:47:41.000 With me?
00:47:42.000 What is this?
00:47:43.000 It does look like a beanie.
00:47:44.000 It's so weird.
00:47:45.000 I wonder what makes it think this.
00:47:47.000 It's like lots of way like that looks like the Arabic alphabet a little bit on the upper right one.
00:47:51.000 I mean that's just from an American you know.
00:47:53.000 Klaus Schwab eating bugs.
00:47:55.000 Good one.
00:47:56.000 All right after this one.
00:47:57.000 I did Nancy Pelosi eating too many nachos.
00:48:01.000 Oh, I know, we should do Seamus dropping potatoes.
00:48:04.000 Yeah.
00:48:05.000 Seamus Coughlin potato.
00:48:08.000 He'll actually be a potato.
00:48:10.000 I wonder what would happen if I typed in Freedom Tunes, though.
00:48:12.000 Oh, yeah.
00:48:14.000 All right, here we go, guys.
00:48:15.000 Nancy Pelosi eating too many nachos.
00:48:18.000 Whoa, those are large.
00:48:20.000 Wow, that's cool.
00:48:21.000 Those are huge.
00:48:22.000 Those are big nachos.
00:48:23.000 Yeah.
00:48:23.000 Got some big nachos there, Nancy.
00:48:24.000 She has some very huge nachos.
00:48:27.000 Klaus Schwab eating bugs.
00:48:29.000 I really want to see that one.
00:48:32.000 Klaus Schwab eating bugs.
00:48:35.000 Let's get it.
00:48:37.000 Come on.
00:48:37.000 Error, error, error.
00:48:38.000 Because everybody wants to use it.
00:48:40.000 All right.
00:48:41.000 It's going.
00:48:42.000 Oh, there's a queue.
00:48:42.000 I see.
00:48:44.000 Is this just huggingface.io?
00:48:46.000 Or dot co?
00:48:47.000 Yeah.
00:48:47.000 Okay.
00:48:48.000 Now you're making more people use it.
00:48:49.000 Yeah, I know.
00:48:50.000 It won't even load for me.
00:48:51.000 It says no results found.
00:48:53.000 Oh, really?
00:48:53.000 I think I'm doing it wrong.
00:48:54.000 Huggingface.co.
00:48:55.000 That's kind of a scary thought.
00:48:56.000 That's disturbing.
00:48:57.000 Is that from Alien?
00:48:58.000 Yeah, face huggers.
00:48:59.000 I don't like that.
00:49:01.000 Weird.
00:49:01.000 Well, it doesn't really flash well.
00:49:03.000 Maybe when he was younger.
00:49:05.000 How about, I got one, Bill Gates eating cricket.
00:49:11.000 Or man boobs.
00:49:12.000 Man boobs.
00:49:13.000 You can fulfill all your fantasies.
00:49:15.000 That's just a normal picture.
00:49:16.000 You should do like Bill Gates workout.
00:49:18.000 Bill Gates in gulag.
00:49:20.000 Bill Gates hot body.
00:49:22.000 No thank you.
00:49:23.000 Does he do rifts with a six pack?
00:49:24.000 Actually, I'm wondering if he could do that.
00:49:26.000 That'd be interesting.
00:49:28.000 Oh yeah, what if we did, like, Bill Gates' head on giraffe?
00:49:33.000 Oh, that's cool.
00:49:34.000 Alright, Bill Gates eating crickets.
00:49:39.000 Look at this one!
00:49:39.000 He's so happy.
00:49:42.000 Wow, that's so weird.
00:49:44.000 He looks pretty healthy, I don't see those.
00:49:45.000 Yeah, he's a lot skinnier than he is in real life.
00:49:48.000 Yeah, and he's got gigantic bugs.
00:49:50.000 Bill Gates' head on a cricket.
00:49:54.000 Oh, cricket!
00:49:55.000 No, let's do a giraffe.
00:49:56.000 Giraffes are, like, cooler.
00:49:57.000 Yeah, be more noticeable.
00:50:00.000 The application is too busy.
00:50:01.000 He does kind of stick his neck out.
00:50:01.000 Remember when we talked about news and stuff?
00:50:03.000 That was fun.
00:50:03.000 Oh, no, no, no.
00:50:04.000 This is way more fun.
00:50:05.000 This is the future right now.
00:50:06.000 It's Friday.
00:50:07.000 We're giving you the early scoop on what's coming.
00:50:10.000 Yeah, this probably won't work very well on iTunes.
00:50:12.000 People are going to listen to this and be like, I have no idea what they're talking about.
00:50:15.000 But what do you think about the, because, so OpenAI is making the argument that they should keep it closed because it's going to put artists out of business.
00:50:24.000 It did the opposite!
00:50:25.000 It put a giraffe's head on Bill Gates' body!
00:50:29.000 I think the ability for artists to capitalize and make money off of their art is a relatively new thing.
00:50:35.000 Like, only since TV and that kind of thing, when you could control your own distribution of the art.
00:50:42.000 Before that, you had patrons, it was all patronage-based.
00:50:45.000 Because it's easy to do, anyone can do it.
00:50:47.000 I'm kind of shocked that people, there's an industry for it.
00:50:50.000 You know, Ian, you were making the point about how you're going to say, show me my life, 35 years with two kids.
00:50:56.000 It's one thing we didn't actually expect when we were talking about the Metaverse.
00:51:01.000 You know, in the past couple of years, we've been talking about linking your brain into Neuralink and doing the Metaverse, that you're going to be in Lord of the Rings, you're going to be in Skyrim.
00:51:11.000 Little did we realize, what it's actually gonna be like is, it's a blank slate.
00:51:16.000 You're gonna plug in your brain, and it's gonna be a white dead space, and you're gonna go, mythical universe, orc monsters, and I'm a knight fighting my way to save the princess.
00:51:27.000 And then, just creates that universe in front of you, auto generates it, and then you live it.
00:51:32.000 It's like, the metaverse is something you're gonna create in real time.
00:51:35.000 It's like the holodeck, only you're gonna plug your brain into it.
00:51:39.000 But do you think it'll completely take over your visual field?
00:51:43.000 And or it will be like happening in the background and simultaneously you'll have your current visual field but there will also be a parallel one.
00:51:51.000 Oh that's creepy.
00:51:52.000 Because like you know it's like you wouldn't want to have Neuralink just wipe out your whole field of vision.
00:51:57.000 So it's like how is it- Can your brain comprehend an expanded field of vision
00:52:02.000 is the question.
00:52:03.000 Because you'd want to be able to feel, you'd want to be able to see,
00:52:08.000 but at the same time, you'd probably want to be able to still like-
00:52:12.000 You don't want to black out.
00:52:13.000 Yeah, but I don't know, man.
00:52:15.000 I think the sci-fi view of things is that it will take over your mind.
00:52:20.000 Like you see the Black Mirror episode where the guy goes in the video game and bangs his buddy.
00:52:24.000 You know this one?
00:52:24.000 Yes.
00:52:25.000 Yeah, so you've seen it.
00:52:26.000 It's a fighting game.
00:52:27.000 And then he plays as some like Asian dude, but then his friend plays as some chick.
00:52:31.000 And then they end up banging because for whatever reason,
00:52:33.000 the game programmed that in, I guess.
00:52:34.000 I don't know.
00:52:35.000 I play this game called XCOM: UFO Defense. You play as this
00:52:39.000 corporation that's defending Earth against an alien invasion, and there's
00:52:42.000 this enemy group of humans that are plugged into the neural net. They're
00:52:45.000 called EXALT, and they can see each other's visions and stuff, but if you can
00:52:49.000 hack their network, they go blind and they can't hear you. They'll run right
00:52:52.000 past you. So that's a terrifying vulnerability of a
00:52:56.000 neural net: people can shut off your perceptions. Like Ghost in the Shell,
00:52:59.000 dude, in Stand Alone Complex. That anime is amazing. This hacker hacks your
00:53:05.000 brain so you can't see him.
00:53:07.000 Super cool.
00:53:08.000 That show's great.
00:53:08.000 I think it's gonna be both.
00:53:09.000 I think you're gonna be able to see multiple things in parallel.
00:53:12.000 Whether or not you can focus on multiple things is gonna be a learning process for the human brain.
00:53:18.000 But then you're also gonna have the ability to shut it off and go into like a video game.
00:53:21.000 Like looking at a monitor.
00:53:23.000 Like a 360 degree monitor.
00:53:26.000 I got one more ask, and I promise I'm done.
00:53:29.000 I would really love to see if the A.I.
00:53:32.000 could figure this one out.
00:53:33.000 Jeffrey Epstein's client list.
00:53:37.000 If we could solve this crime right now with this artificial intelligence, I'd be very happy.
00:53:42.000 What if we do it and it's just like Bill Clinton, Bill Gates?
00:53:45.000 Kevin Spacey, Chris Tucker, who else?
00:53:50.000 Jean-Luc Brunel.
00:53:51.000 Putin sitting on the throne of skulls.
00:53:53.000 Oh, had no problem doing that.
00:53:55.000 Yeah.
00:53:55.000 Alright, let's do Jeffrey Epstein's client list.
00:54:01.000 Here we go.
00:54:02.000 Can this thing see into the future?
00:54:04.000 Can we solve?
00:54:06.000 Can we be our modern-day Scooby-Doo and solve this real-life crisis?
00:54:10.000 We solved the mystery, gang!
00:54:11.000 How?
00:54:11.000 We put it into an AI generator and it made a list for us.
00:54:14.000 That proves it.
00:54:16.000 Well, the information's out there, it's just being denied to everyone.
00:54:20.000 So maybe the AI has the access to the DOJ's files, and they could release the information.
00:54:28.000 Hey, look!
00:54:29.000 Upper right, who's that?
00:54:31.000 Zoom in.
00:54:31.000 Hold on.
00:54:32.000 Can you zoom in?
00:54:32.000 Okay, here we go.
00:54:33.000 That looks like Bill Clinton right there.
00:54:35.000 Here's Okunio Metua.
00:54:38.000 We got him!
00:54:41.000 Here's one!
00:54:46.000 That's all the possibilities how he really looks right now after all the plastic surgery.
00:54:49.000 It's like when you read in a dream, that's what the letters look like.
00:54:53.000 At least for me they do.
00:54:53.000 They look like swirling shapes like that.
00:54:55.000 That's so weird.
00:54:56.000 You all know who you are.
00:54:58.000 Is our brain just a neural net?
00:55:00.000 Just trying to piece together information as unchaotically as possible?
00:55:05.000 I don't know, man.
00:55:07.000 Jack Posobiec talked about this with the AI.
00:55:10.000 There's a website where it's like, this person is not real.
00:55:14.000 Have you ever seen that?
00:55:15.000 You load it and auto generates a fake face.
00:55:18.000 And he said, if you keep doing it, the demons start peeking in.
00:55:21.000 And we were like, what does that mean?
00:55:23.000 He's like, don't do it, don't do it.
00:55:24.000 So we start doing it.
00:55:26.000 Loading this thing, it's just showing random faces, and they're kind of off, but eventually, if you do it too many times, you start getting weird, like, black hole eyes peeking around the corners and just really creepy stuff. Yeah, dude.
00:55:38.000 Super creepy.
00:55:39.000 Shaggy Rogers.
00:55:41.000 Shaggy Rogers.
00:55:42.000 There you go.
00:55:42.000 That beautiful man.
00:55:43.000 Peter Pan.
00:55:44.000 His eyes are so blue.
00:55:45.000 Yeah.
00:55:45.000 Is that what Shaggy looks like in real life?
00:55:47.000 Is that what Shaggy is?
00:55:48.000 We could only be so lucky.
00:55:49.000 Look at that hat.
00:55:50.000 Backstreet boy.
00:55:51.000 It's like a Luigi hat.
00:55:53.000 Velma Dinkley.
00:55:55.000 Wait, wait.
00:55:56.000 Dinkley.
00:55:56.000 That's her last name, right?
00:55:57.000 Yes.
00:55:57.000 But not gay.
00:55:58.000 I wonder if it can handle negatives like that.
00:56:01.000 It won't be able to find it.
00:56:02.000 Yeah.
00:56:03.000 I don't know if the, do you think the AI can handle a weird phrase like that?
00:56:07.000 It's like, She'll be dressed like Daphne, watch.
00:56:10.000 Dressed like Daphne?
00:56:10.000 I don't know.
00:56:12.000 That's the biggest news of the week.
00:56:13.000 I hope everybody realizes that not only is Velma gay, but they made Velma and Shaggy black.
00:56:17.000 Oh, I rewatched a bunch of old Scooby-Doo last night, at least a little bit.
00:56:20.000 For sure, Scooby and Shaggy are high as hell.
00:56:23.000 They constantly have the munchies.
00:56:25.000 The whole show's about them eating, like, sandwiches.
00:56:27.000 He's eating dog food!
00:56:28.000 Constantly hungry, yeah.
00:56:30.000 So, uh, look at this.
00:56:33.000 That's kind of scary.
00:56:34.000 That's horrifying.
00:56:35.000 Yeah, what is that?
00:56:36.000 She looks like she's 12 years old.
00:56:38.000 She has some biceps.
00:56:39.000 I don't like that.
00:56:40.000 Let's try and do, like, what do you think we can have it generate that would be, like, serious, legitimate, and interesting?
00:56:47.000 Alex Jones on a unicorn.
00:56:49.000 Okay, well, that's kind of weird, but I'll do it anyway.
00:56:52.000 Oh, Atlantis, the lost city of Atlantis.
00:56:54.000 Let's do that.
00:56:55.000 I got that one from the chat.
00:56:56.000 Can you ask it questions?
00:56:57.000 What did Atlantis look like?
00:56:59.000 We're treating it like an oracle.
00:57:01.000 The lost city.
00:57:02.000 It's literally just like a machine smashing things together.
00:57:06.000 And we're like, give us the answers.
00:57:07.000 How about this, you guys, chat or super chat, what's the most profound, interesting thing we could generate that would make us go like, oh, you know?
00:57:14.000 Because we're doing silly, stupid nonsense.
00:57:14.000 I don't know.
00:57:16.000 And I wonder if typing lost city.
00:57:17.000 Someone did say Alex Jones on a unicorn in the chat.
00:57:19.000 I do want to see Alex across my mind.
00:57:21.000 I'm just saying.
00:57:22.000 This is what happens when Bill comes in and he's like, there's an AI generator, it works, and we're like, alright, the whole show is now this.
00:57:28.000 Bill Gates pregnant, demons, people are saying.
00:57:32.000 Fauci Wuhan lab, people are saying.
00:57:35.000 Anthony Fauci in Wuhan Virology Lab.
00:57:41.000 There we go.
00:57:44.000 Atlantis did.
00:57:45.000 Atlantis looked really cool.
00:57:46.000 Atlantis was cool.
00:57:47.000 Someone wrote poop balls.
00:57:49.000 No.
00:57:49.000 Nightmare Fuel.
00:57:50.000 Alien Life.
00:57:51.000 I think some of these generators don't allow NSFW as well.
00:57:56.000 Not safe for work.
00:57:57.000 Oh, yeah.
00:57:58.000 When I go to huggingface.co, is there another button I gotta push to go to that?
00:58:01.000 I don't even know.
00:58:02.000 Google search it.
00:58:02.000 Google search stable diffusion.
00:58:03.000 I actually haven't used this one before.
00:58:05.000 Anthony Fauci in Wuhan Virology Lab.
00:58:08.000 What if it just shows us like an actual photo?
00:58:10.000 Here he is.
00:58:11.000 Wow, that's amazing.
00:58:13.000 These people do not look good.
00:58:13.000 Leaked.
00:58:14.000 Look at this.
00:58:15.000 No, but here's the crazy thing is it actually just shows like Chinese people working in a lab with Anthony Fauci.
00:58:21.000 Look at their faces, though.
00:58:22.000 They're like, these are just actual pictures we got for you.
00:58:28.000 These are real, actually.
00:58:29.000 Oh, look at her face.
00:58:30.000 Well, that's what happens when you're exposed to the viruses in these labs.
00:58:33.000 Yeah, that makes sense.
00:58:34.000 All right, what other one?
00:58:36.000 Someone said Ian Shaftkam.
00:58:38.000 I don't know what that is.
00:58:42.000 Alex Jones on a unicorn.
00:58:47.000 People are riding a unicorn.
00:58:49.000 On a unicorn.
00:58:49.000 All right, on.
00:58:50.000 Here we go.
00:58:51.000 People are demanding it.
00:58:52.000 I'm seeing it in the chat.
00:58:55.000 Dude, their stock is rising.
00:58:56.000 So is there a company behind this?
00:58:58.000 Yeah, I think so.
00:59:00.000 Is that Stability.ai?
00:59:01.000 Well, Stable Diffusion is a model.
00:59:04.000 Hugging Face basically embedded this model into their site.
00:59:07.000 So Hugging Face didn't make Stable Diffusion.
00:59:09.000 I think Stability AI made it. Oh, awesome.
00:59:12.000 But it's open, so anyone can embed it in their site.
00:59:15.000 We're thinking of having this auto-generate avatars by default on Minds.
00:59:21.000 For people who don't have avatars.
00:59:22.000 Excellent.
00:59:23.000 Or based on your current trending stuff that you've been involved in.
00:59:26.000 You'll change it daily or something.
00:59:28.000 Yeah.
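[Editor's note: a rough sketch of that auto-generated-avatar idea, building a prompt from a username and calling a hosted Stable Diffusion model through the Hugging Face Inference API; the endpoint, model id, HF_TOKEN variable, and prompt are illustrative assumptions, not Minds' actual implementation.]

```python
# A hedged sketch of the auto-avatar idea: build a prompt from a username and
# ask a hosted Stable Diffusion model for an image via the Hugging Face
# Inference API. The endpoint, model id, HF_TOKEN variable, and prompt are
# illustrative assumptions, not Minds' actual implementation.
import os
import requests

API_URL = "https://api-inference.huggingface.co/models/CompVis/stable-diffusion-v1-4"
HEADERS = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}

def generate_avatar(username: str, out_path: str) -> None:
    prompt = f"abstract geometric profile avatar for a user named {username}, vibrant colors"
    resp = requests.post(API_URL, headers=HEADERS, json={"inputs": prompt}, timeout=120)
    resp.raise_for_status()          # the endpoint returns raw image bytes on success
    with open(out_path, "wb") as f:
        f.write(resp.content)

generate_avatar("new_user_123", "avatar.png")
```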
00:59:28.000 Oh, no, no.
00:59:30.000 Here's one.
00:59:33.000 Is that a person's head by that unicorn?
00:59:37.000 God before humans.
00:59:39.000 God before humans.
00:59:41.000 It's gonna create like an archetypal or like a stereotypical religious image.
00:59:47.000 What about if you type in Satan was the good guy?
00:59:50.000 I'm gonna do that on my personal computer.
00:59:51.000 Yeah, see what you come up with.
00:59:52.000 Satan was the good guy?
00:59:53.000 Yeah, I wonder if they can fathom that.
00:59:56.000 It's an AI, it's like, it's, I don't know, you could probably put it, I could probably put like, orange, car, jet, yeah, look, it's, what is this?
01:00:05.000 What is this thing?
01:00:07.000 This is interesting.
01:00:08.000 It says, Wattish, Obito, Obito.
01:00:12.000 It's Moloch.
01:00:13.000 The Todd, Tenfieh, Mfordm.
01:00:18.000 Amla Tovane.
01:00:20.000 I'm not gonna finish reading that.
01:00:21.000 What if it's like a demonic?
01:00:22.000 What is the spell?
01:00:23.000 Yeah, I don't like that.
01:00:24.000 It's like speaking demon.
01:00:26.000 So good.
01:00:27.000 Alright, what else is someone... The Edge of the Universe.
01:00:29.000 Biblically Accurate Angel.
01:00:30.000 Someone wrote... Biblically Accurate Angel?
01:00:34.000 Yeah.
01:00:35.000 Is it gonna be like... Scary as heck.
01:00:37.000 It's gonna be a picture of like a spaceship?
01:00:39.000 Something like that, maybe.
01:00:40.000 Someone asked for Alex Jones Gorilla.
01:00:43.000 Alex Jones as a gorilla.
01:00:45.000 Yeah.
01:00:45.000 We can do that.
01:00:46.000 I wonder if it does a different render every time with the same sentence.
01:00:50.000 It does different ones. Or, the best president of the United States.
01:00:53.000 Someone just said that as well.
01:00:54.000 Oh yeah.
01:00:55.000 People are saying you can download the Stable Diffusion AI and just do it on your own computer.
01:00:59.000 There you go.
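[Editor's note: running Stable Diffusion locally typically looks something like this with the open-source diffusers library; a sketch assuming a CUDA GPU and the publicly released v1.4 weights, with the headline-to-thumbnail idea from earlier as the example prompt.]

```python
# A hedged sketch of running Stable Diffusion locally with the open-source
# diffusers library. Assumes `pip install diffusers transformers accelerate torch`
# and a CUDA GPU; the model id and prompt are examples only.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# e.g. turning a news headline into a candidate thumbnail, as floated earlier
prompt = "dramatic editorial illustration of a presidential press conference"
image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
image.save("thumbnail.png")
```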
01:01:00.000 Powerful, biblically accurate angel.
01:01:01.000 And it's like not biblically accurate.
01:01:03.000 Not at all.
01:01:05.000 Alex Jones as a gorilla.
01:01:10.000 Lydia, those aren't like the angels that you've seen?
01:01:11.000 No, those are not like the angels I've seen.
01:01:13.000 Those ones are scary.
01:01:14.000 There's a reason they say do not be afraid when they first introduce themselves to humans.
01:01:19.000 Alex Jones is a gorilla.
01:01:20.000 What do they look like?
01:01:21.000 They're giant serpent monsters or something?
01:01:23.000 No, they have like eight wings.
01:01:24.000 They have like ten faces.
01:01:26.000 They have like a million eyes.
01:01:28.000 It's really horrifying sounding.
01:01:29.000 Yes, they're not like the fluffy little fat angels you see in paintings.
01:01:32.000 They're like, smoke this.
01:01:34.000 Don't panic.
01:01:34.000 It's Alex Jones!
01:01:36.000 It's just a gorilla.
01:01:37.000 Yeah, but it sucks.
01:01:38.000 And it only made two, huh?
01:01:39.000 He's under.
01:01:41.000 All right, let's see.
01:01:42.000 Ask the AI for the meaning of life.
01:01:44.000 Anderson Cooper eating sponge cake.
01:01:46.000 Well, I don't know about that one.
01:01:47.000 No thanks.
01:01:48.000 Ian on the couch after the apocalypse.
01:01:51.000 Oh.
01:01:52.000 It doesn't know me yet.
01:01:52.000 Donald Trump fighting necromorphs from dead space.
01:01:55.000 This is the best one so far.
01:01:56.000 All right.
01:02:01.000 What if it like nails it perfectly?
01:02:04.000 Here we go.
01:02:06.000 Donald Trump fighting necromorphs from Dead Space.
01:02:09.000 I just want to see Dead Space.
01:02:11.000 That sounds cool.
01:02:13.000 It's a game.
01:02:14.000 They're just releasing a new Dead Space, actually.
01:02:16.000 Oh, interesting.
01:02:17.000 Yeah.
01:02:17.000 It's like creepy, right?
01:02:18.000 It's like you go on like an abandoned ship.
01:02:19.000 I think so, yeah.
01:02:20.000 Derelict.
01:02:21.000 Let's do one of Elon after.
01:02:22.000 Oh, yeah.
01:02:23.000 Oh, look at this.
01:02:23.000 Good one.
01:02:24.000 Look at that.
01:02:25.000 I don't know what's going on, but you know.
01:02:27.000 It's awesome.
01:02:27.000 Wow.
01:02:28.000 Elon Musk as the Doom guy.
01:02:32.000 And should.
01:02:33.000 Write a comic, have this thing auto-generate the art for you.
01:02:36.000 Start just a new genre of art, man.
01:02:38.000 So wait, these images it generates aren't yours to use?
01:02:41.000 For real?
01:02:42.000 Yeah.
01:02:42.000 So when we do news stories, we can just be like... I think it depends on the specific model, but I'm pretty sure that you have the rights.
01:02:51.000 So, scroll down.
01:02:52.000 Does it say anything?
01:02:53.000 Yeah, I think it should say license.
01:02:56.000 Yeah, read it.
01:02:56.000 CreativeML Open RAIL.
01:02:57.000 See, every model has no rights.
01:02:59.000 It says no rights?
01:03:01.000 Oh, look at this!
01:03:02.000 What is that?
01:03:03.000 What the is that?
01:03:05.000 Where's the prompt here?
01:03:06.000 Elon Musk as the doom guy.
01:03:08.000 That's creepy.
01:03:10.000 Wow.
01:03:11.000 Elon Musk buying Twitter.
01:03:14.000 I don't like that.
01:03:17.000 Come on, you can do it.
01:03:19.000 There we go.
01:03:20.000 Wait, while we're waiting, let's actually read the license.
01:03:21.000 I want to see what you actually have the ability to do.
01:03:25.000 Forbids you from sharing any content that violates any laws, blah, blah, blah.
01:03:28.000 They claim no rights.
01:03:29.000 You're free to use them and are accountable for their use.
01:03:32.000 There you go.
01:03:32.000 This just spawned a genre.
01:03:34.000 This is amazing.
01:03:34.000 So now for all of our news articles, when it's like Joe Biden does something, we'll type in and just use it.
01:03:40.000 Pulse?
01:03:40.000 Look at this.
01:03:42.000 Pulse.
01:03:42.000 Pulse.
01:03:44.000 Pulsels.
01:03:44.000 Puzzles.
01:03:45.000 Look at this!
01:03:46.000 I love how there's always like for some reason two of them like Elon's interviewing Elon.
01:03:53.000 They do like the persona of Elon and Elon himself.
01:03:56.000 I think I wonder if that's what the AI is doing.
01:03:59.000 I would say that we'd use this for like Timcast thumbnails but they're so creepy.
01:04:02.000 I think it would terrify people and they wouldn't want to share them.
01:04:06.000 It's like, no, I'd prefer not to use the nightmare images for my news article.
01:04:10.000 I typed in Bill Gates with sexy body.
01:04:12.000 It did not.
01:04:12.000 It did not satisfy.
01:04:14.000 It's not.
01:04:14.000 I don't know if there's any.
01:04:16.000 I mean, it shows his body full on.
01:04:17.000 It's not sexy.
01:04:19.000 But it's just crazy.
01:04:20.000 This is brand new.
01:04:21.000 I mean, this came out like in the last couple of months.
01:04:23.000 It's so this is just the beginning.
01:04:25.000 I mean, within a year, it's going to be someone's going to release like one hundred and fifty comics in like two weeks because they're going to have all this art done for them.
01:04:32.000 Elon Musk returning to his home planet.
01:04:36.000 Come on.
01:04:37.000 Oh, I know.
01:04:39.000 I've got it.
01:04:40.000 I know what we're going to do after this one.
01:04:41.000 I hope you guys are ready.
01:04:43.000 You're not going to reveal it.
01:04:46.000 And then we'll grab the super chat ones, because this is the weirdest.
01:04:49.000 We get addicted to these things.
01:04:50.000 Last time we did this, the same thing happened.
01:04:51.000 It's just too funny.
01:04:52.000 Like, you want to see more!
01:04:54.000 Like, I want to know what the machine can do.
01:04:56.000 Elon Musk returning to his home planet.
01:04:58.000 Here we go.
01:04:58.000 Yes, I like that.
01:04:59.000 It's with the parachute!
01:05:02.000 Okay, here we go.
01:05:03.000 I like the spaceship one on the bottom right.
01:05:04.000 That's great.
01:05:05.000 Jeff Bezos.
01:05:07.000 Mark Zuckerberg.
01:05:10.000 Oh, I spelled Zuckerberg really wrong.
01:05:12.000 Zuckerberg.
01:05:13.000 And, um...
01:05:16.000 What else do we got?
01:05:16.000 Then we got Bezos, we got Zuck at Bill Gates.
01:05:18.000 And Bill Gates fused into one person.
01:05:23.000 Why would we do that?
01:05:25.000 It's the Triumvirate.
01:05:26.000 Okay, okay.
01:05:27.000 It's the Bersuckergates.
01:05:28.000 Three-headed snake.
01:05:29.000 In the future, you'll be asking permission of Bersuckergates, the Triumvirate.
01:05:33.000 Dude, the admins on Hugging Face are going, what is happening right now?
01:05:38.000 They've got, like, Red Alert in their Slack channel.
01:05:40.000 They're like, yo, wake up!
01:05:42.000 We got Hugging Face.
01:05:45.000 Ready?
01:05:45.000 Oh, he's a tall guy.
01:05:47.000 Okay.
01:05:48.000 It does kind of look like Zucker and Gates somehow.
01:05:52.000 Like, it does.
01:05:53.000 It's an Amazon factory.
01:05:55.000 This is like the most eerily accurate image yet so far.
01:05:59.000 There's no face, but it's kind of like, oh, that's scary.
01:06:02.000 Didn't even try the face.
01:06:04.000 Yeah, so, uh, how about we actually talk about- I think there will be certain models that can generate, like, last- Ben Shapiro playing tennis?
01:06:11.000 Nightmarish stuff that will feel more- Like cartoons, they could do cartoon imagery?
01:06:15.000 Yeah, there'll be cartoon-specific stuff, yeah.
01:06:17.000 There'll be influencers that are totally AI-generated, that are not real, that will have- They tried that!
01:06:22.000 Yeah, and they will do it, if they haven't done it already.
01:06:26.000 Geez, I was watching gameplay- I mean, you could have a whole army of them working for you.
01:06:28.000 Exactly, or influencing social media for a particular cause.
01:06:34.000 You know, just like bots and sock puppet accounts.
01:06:36.000 I was watching gameplay footage last night and I watched some Japanese game and all these like anime avatars, instead of people streaming their face, it was a bunch of cartoon anime avatars over top of like filters.
01:06:47.000 So I see, yeah, building those.
01:06:49.000 Ben Shapiro playing tennis.
01:06:51.000 That's a good one.
01:06:52.000 Ben Shapiro.
01:06:55.000 Angry after losing tennis match.
01:07:00.000 All right.
01:07:00.000 Why is he wearing a kilt?
01:07:01.000 That was cool.
01:07:02.000 Was he wearing a kilt?
01:07:02.000 Yeah, it was like a red skirt.
01:07:03.000 Oh, I got a good one after this.
01:07:06.000 Jordan Peterson playing bagpipes.
01:07:08.000 I just want to see it, man.
01:07:09.000 One time.
01:07:10.000 Just one time.
01:07:12.000 Just gotta see it.
01:07:13.000 Somebody said AI self-portrait, which would be really cool.
01:07:16.000 What do you look like?
01:07:17.000 Yeah, Picasso, tell us.
01:07:22.000 All right, here we go.
01:07:23.000 Lori Lightfoot from Beetlejuice.
01:07:25.000 Look at this.
01:07:26.000 Ben Shapiro angry after losing tennis match.
01:07:30.000 Well, there you go.
01:07:31.000 Pretty accurate.
01:07:32.000 Oh yeah, it looks very accurate.
01:07:33.000 Image of your self.
01:07:37.000 Dream.
01:07:39.000 Yeah, I think it does really well with non-people.
01:07:41.000 Like, I typed in psychedelic space mushroom.
01:07:44.000 I have a feeling it's gonna be really beautiful, bizarre.
01:07:48.000 Not freakish.
01:07:48.000 What if we, like, ask the AI, like, your deepest desire, and it shows an image of just, like, dead humans everywhere?
01:07:54.000 We'll be like, uh, maybe we should turn it off.
01:07:56.000 It'll just start playing the Terminator.
01:07:58.000 It's just a picture of, yeah, T- what was it, 2000?
01:08:01.000 T-1000.
01:08:01.000 T-1000?
01:08:01.000 Yeah.
01:08:05.000 Here we go, it's going.
01:08:06.000 Image of your dream.
01:08:08.000 What's it gonna look like?
01:08:10.000 What?
01:08:10.000 Boring!
01:08:10.000 It just made a Y cup dream dream?
01:08:13.000 Pop out!
01:08:17.000 One more, one more.
01:08:17.000 Artificial intelligence?
01:08:20.000 AI self-portrait.
01:08:21.000 Yeah, psychedelic space mushroom, super cool.
01:08:25.000 Psychedelic space mushroom?
01:08:26.000 Yeah, just weird looking colorful art.
01:08:28.000 Oh, yeah, that's that's what it is.
01:08:30.000 That's nice.
01:08:31.000 Sure is, yeah.
01:08:31.000 Dude, the internet's so interesting that it can source data so quickly.
01:08:36.000 Like people's faces.
01:08:38.000 So, but how do they do this?
01:08:39.000 Is it, like, it pre-loaded tons of images already, like, going up to a certain year, and then it... Whoa, look at that.
01:08:46.000 So, I hope... This one's okay.
01:08:48.000 This one is the creepiest.
01:08:49.000 Christopher Walken.
01:08:51.000 What is that?
01:08:52.000 Welcome to the future, ladies and gentlemen.
01:08:54.000 It is creepy.
01:08:55.000 Don't like it.
01:08:56.000 Very much like a dream. All right, man. Well, you know what we should do? Now
01:09:00.000 that we've spent 40 minutes talking about this, that was fun, but we're done. We've got
01:09:04.000 to talk about this news, uh, with, uh, Matt Walsh.
01:09:07.000 We have this tweet from Nuance Bro, and he says, this is how you do it, folks. All right. So
01:09:11.000 let me give you some backstory.
01:09:12.000 These, uh, Media Matters types are digging up old comments and videos from Matt Walsh when he had
01:09:17.000 like a radio show when he was younger.
01:09:19.000 They're taking them out of context and trying to smear and defame Matt Walsh, and it's just—it's stupid.
01:09:24.000 There was one video they shared where it's Matt Walsh and his friend joking about burning a book and being—and, like, being Nazis or whatever, but it was funny because they couldn't get the lighters to work, and they were like, Just wait until rednecks figure out fire!
01:09:36.000 And they're like trying to light a book on fire.
01:09:37.000 It's funny.
01:09:38.000 So in response to the smear campaign against Matt Walsh, he responded by saying this.
01:09:44.000 So here's my official answer for the record.
01:09:50.000 Kiss my ass.
01:09:52.000 I do not apologize.
01:09:54.000 In fact, by all rights, you sick freaks should be the ones apologizing to me for lying and defaming me and doing it all because I'm trying to prevent you from sexually mutilating children.
01:10:09.000 You damned monsters.
01:10:11.000 You child-abusing psychopaths.
01:10:15.000 I wouldn't apologize to you soulless parasites if I had a gun to my head.
01:10:19.000 Oh man, dude.
01:10:22.000 Instead, I'd rather just tell you all to piss off.
01:10:25.000 I apologize for nothing.
01:10:27.000 I concede nothing.
01:10:29.000 I will never surrender even a single inch of ground to a pitchfork mob of degenerate morons.
01:10:36.000 You know, the secret they never say out loud is that nobody is truly cancelled unless they consent to it.
01:10:44.000 And they willingly play their assigned roles.
01:10:47.000 Well, I do not consent.
01:10:50.000 And I'm not gonna play the game.
01:10:53.000 I'm not going anywhere.
01:10:55.000 That was absolutely amazing, and that's how it's done.
01:10:59.000 That's how you do it, folks, says NuanceBro, and that's it right there.
01:11:02.000 I don't like the name-calling, though.
01:11:05.000 I like Matt a lot, but calling people, like, insults is just inflaming.
01:11:09.000 I think it's just inflaming.
01:11:10.000 If he really wants to lower tension, I mean, if he wants to... These are people who have intentionally taken audio from him out of context, and then lied about him to try and cause damage to his work when he's trying to help kids.
01:11:22.000 When he said that I'm trying to stop you from mutilating children, I get that, because he is.
01:11:26.000 But when he calls them, like, gratuitously morons and things like that, I don't know.
01:11:30.000 I can understand not wanting to insult someone in a general context of a political argument, but this is different.
01:11:34.000 These are people who are acting outside the bounds of morality and ethics.
01:11:37.000 They are seeking to manipulate and lie to people for political power.
01:11:40.000 What would you say?
01:11:41.000 Yeah, what would you say?
01:11:42.000 What would be your response?
01:11:44.000 So you're Matt Walsh.
01:11:45.000 You're remaking this video.
01:11:46.000 Go.
01:11:47.000 I would have taken one of the people.
01:11:48.000 No, no, no.
01:11:49.000 Go.
01:11:49.000 You're on right now.
01:11:51.000 I'd be talking to an individual.
01:11:52.000 I would say their name and directly talk to them as if they were sitting in front of me.
01:11:56.000 And tell them what I would tell them if they were sitting here.
01:11:58.000 They're organizations.
01:11:59.000 Well, I would pick a person and talk right to that person.
01:12:01.000 You should make videos and talk to Joe Biden.
01:12:04.000 An organization published a video.
01:12:05.000 Who do you respond to?
01:12:07.000 The CEO.
01:12:08.000 Okay, there you go.
01:12:08.000 Okay, so what do you say?
01:12:12.000 Well, I mean, Mr. Poopypants, first you gotta figure out who said it.
01:12:16.000 I mean, if it's, what is it, like, who's the organization? I don't know the specifics, but I would talk to an individual. That's a big part of it: you name them, you talk to them as if they're there, and listen, they hear that and they're like, okay, he's actually talking to me, now I understand. There was someone who came to one of our events and lied.
01:12:34.000 And a leftist, smear merchant, and lied.
01:12:38.000 And so I politely responded on Twitter with like, hey, this thing you're saying didn't actually happen, I hope we can resolve this.
01:12:44.000 And they responded with something like, I hope a bird craps on your face.
01:12:48.000 And so I went, okay, I guess.
01:12:51.000 I was at an event, a political event, and the person who founded this organization was on stage.
01:12:57.000 And so I asked, hey, you say that your organization is engaged in truth-telling and fact-checking.
01:13:02.000 When one of your reporters posted verifiably false statements about me, and I asked for a correction politely, they said they wished a bird would crap on my face, and he says, whatever, ignored me, and walked off.
01:13:13.000 Then you gotta go to the individual that said they want the crap on your face.
01:13:17.000 I did.
01:13:18.000 This is why people like what Matt Walsh did.
01:13:20.000 That's how you do it. You confront people and they love the drama. The crowd loves the drama too.
01:13:23.000 This is why people like what Matt Walsh did.
01:13:26.000 Because the problem we face in the culture war is that whatever this faction is constantly tries to be nice.
01:13:33.000 No, I would not be nice to the person.
01:13:35.000 I would directly confront them.
01:13:36.000 But them, an individual, that's how you get through them.
01:13:38.000 This is my point.
01:13:39.000 They are burning down pregnancy centers, they're firebombing them, and then lying about what the pregnancy centers do.
01:13:45.000 They are lying about Matt Walsh, and for the longest time you get people like, you know, with all due respect, when we had, um...
01:13:54.000 Why am I forgetting it?
01:13:54.000 Rick Santorum on the show, and he's like, no, no, no, we gotta play by the rules, play by the rules, and I'm like, yo, they're burning down buildings, and we're not even saying, we're saying the law enforcement should be dealing with it, but at the very least, we don't just say, I'm so sorry, let's be nice to them.
01:14:08.000 We say these are evil, awful people who are burning down buildings.
01:14:12.000 What they were trying to do with Matt Walsh was accuse him of being a child abuser because of out-of-context comments.
01:14:18.000 From 15 years ago.
01:14:20.000 Who is it?
01:14:21.000 Is it just text comments that were insulting?
01:14:22.000 He was on a radio show and he said something about... He said something like, throughout history, women got pregnant at much younger ages.
01:14:29.000 And now, in today's day and age, they're saying it's a mistake if women do, but that's only because we decided that.
01:14:34.000 I don't know the full context.
01:14:35.000 I just know that they're taking these things out of context, which is why he said, outright, you are lying to defame me, and I will not apologize to you for it.
01:14:44.000 So, look.
01:14:46.000 We constantly have people saying, like, hey, why don't we invite this person to come on the show?
01:14:50.000 Question, how come none of the prominent leftists will come on the show and sit down and have a conversation?
01:14:55.000 Some will, but they tend to have smaller followings.
01:14:57.000 They're trying to establish themselves.
01:14:59.000 But for the most part, they don't.
01:15:00.000 And then what happens when we do have some of these people on?
01:15:03.000 They smack the microphone and freak out and get angry.
01:15:06.000 Yeah, I make videos to those people directly with their name and I look at the camera while I'm talking to them and you can humiliate them to their face via video and everyone gets to watch.
01:15:13.000 It doesn't do anything.
01:15:14.000 Oh, it does a lot.
01:15:15.000 I used to do it in the, I mean, it's pretty aggro, but you can definitely shake someone by doing that.
01:15:21.000 So you have people who make videos and comment on that all day and night.
01:15:25.000 Matt Walsh came out and said, he smacked him back.
01:15:31.000 He made it personal.
01:15:33.000 And people are happy that he did because people are sick.
01:15:36.000 This is why people voted for Trump.
01:15:37.000 Because for the longest time, you get people... Again, I appreciate Rick Santorum coming on, but his attitude of being very deferential and saying, we're the ones who are going to play by the rules.
01:15:44.000 It's like, okay, if you're playing a game of Monopoly, and the person sitting across the table is literally cheating in front of you, and you're like, well...
01:15:52.000 You're cheating, and they go, and?
01:15:53.000 Are you gonna keep playing?
01:15:54.000 I guess.
01:15:55.000 We better play by the rules.
01:15:56.000 Okay, they're not, but you sit there anyway?
01:15:59.000 It makes no sense.
01:16:00.000 So this is where he's finally saying, this is the sentiment that people have been feeling for a long time.
01:16:04.000 I don't care anymore.
01:16:06.000 You don't matter to me.
01:16:07.000 You people are awful.
01:16:08.000 You are sick.
01:16:09.000 They killed people in the summer of love.
01:16:11.000 They burned down buildings.
01:16:14.000 What they're doing to kids, it's just all abhorrent.
01:16:17.000 And I'm actually, I'm very, very happy today.
01:16:19.000 That when I see this stuff, I know that I don't have to fear them.
01:16:24.000 And Matt Walsh doing this, providing this statement saying, I would not apologize to you parasites if there was a gun to my head, showing how successful that is.
01:16:34.000 That he can do that.
01:16:34.000 That we don't have to bend the knee to a psychopathic cult.
01:16:38.000 It is freeing.
01:16:40.000 We're not trying to be mean, we're trying to be nice, we're trying to solve problems, and we're trying to find a path forward.
01:16:46.000 And what do we get instead?
01:16:47.000 Bricks through the window, lies, manipulations, and a refusal to have a conversation.
01:16:53.000 They stopped Ben Shapiro from going to events and speaking.
01:16:56.000 They stopped Ann Coulter from doing it.
01:16:57.000 They spray-painted a death threat to liberals when Milo Yiannopoulos tried to speak at an event.
01:17:04.000 Where are the conservatives burning down universities?
01:17:08.000 Not happening.
01:17:09.000 So at a certain point, there is a group of people saying, you know what?
01:17:12.000 We don't want what they're doing.
01:17:13.000 We don't want violence.
01:17:14.000 We don't want retaliation.
01:17:15.000 But you don't matter to us anymore at all.
01:17:18.000 I don't know.
01:17:19.000 Matt definitely seems concerned.
01:17:20.000 If they didn't matter to him, he wouldn't have responded.
01:17:22.000 So Matt doesn't talk about this very often, but I am sure that his entire family, all six of his children, two of them unborn, and his poor wife are undoubtedly under constant threat from these absolute soulless monsters.
01:17:37.000 These are people who will stop at nothing.
01:17:40.000 They will come to your house, they will break your windows, they will insult you.
01:17:43.000 If you go to try to defend yourself, they will get you arrested, they will call the cops on you.
01:17:48.000 They will use the state against you.
01:17:49.000 They arrested something like 11 old pro-life protesters just for the crime of being at a pro-life protest.
01:17:56.000 They will use everything in the books.
01:17:58.000 Their game is power.
01:18:00.000 They talk about it all the time because it's what they're going for.
01:18:03.000 And they accuse the right of wanting it because it's what they want more than anything else.
01:18:07.000 And when they have power, they will not give you anything back.
01:18:10.000 They don't care about your freedom of speech.
01:18:12.000 And all he did was insult them.
01:18:13.000 Right.
01:18:14.000 There was a riot on January 20th, 2017, where... I mean, you were there, right, Luke?
01:18:20.000 We were both... Yeah, that's right.
01:18:22.000 I got arrested.
01:18:23.000 You got away.
01:18:23.000 Yep.
01:18:25.000 There's a huge line of riot cops.
01:18:26.000 I pushed right through them and got pepper sprayed right in the face.
01:18:29.000 They were forming a line and so I flanked left to try and get away out of the rioters.
01:18:35.000 Luke went forward and then the cops boxed in the left.
01:18:38.000 And then all the rioters surrounded me.
01:18:40.000 But I ended up getting out because I had a press credential.
01:18:43.000 These people, not only did they have the charges dropped after they were smashing windows and setting fire to vehicles, they destroyed the livelihood of an immigrant who had leased a limo to run a company as a driver.
01:18:58.000 They set his limo on fire.
01:19:00.000 The cops were unable to do anything about it.
01:19:02.000 These people got arrested.
01:19:04.000 And then, after they were released and the charges were dropped, they filed a lawsuit against the city and won, and the city paid them money.
01:19:11.000 And that's what we've been dealing with for, what, a decade now?
01:19:15.000 So, for the longest time, you don't see... There is a 0% probability that Matt Walsh's followers storm a university to shut down these speakers.
01:19:25.000 And that's actually really funny, because Matt says, I'm trying to stop you from harming kids, and they won't even protest, you know, to a certain degree.
01:19:34.000 Obviously there are people that are going out and protesting.
01:19:36.000 What I'm saying is, on the scale that the left is engaged in this level of violence, the right goes nowhere near.
01:19:42.000 And all we're getting is Matt Walsh being like, you are sick degenerates.
01:19:46.000 And you are saying like, oh that's bad, he shouldn't do that.
01:19:48.000 Okay, well, I can certainly understand why you'd feel that way, but you should understand why people are like, they burn down a building, and we've insulted them, and you're mad at Walsh?
01:19:56.000 If you want to disperse a mob, you've got to shake one individual up, psychologically.
01:20:01.000 Yelling expletives at the crowd doesn't fix the problem, which is the mob is insane.
01:20:06.000 And this is the point.
01:20:07.000 You take one of them, you make an example out of them by basically humiliating them in front of the crowd, and the crowd's like, oh, I don't want to be associated.
01:20:16.000 When they refuse to come on these shows and have conversations?
01:20:17.000 No, no, you make an internet video directly to you.
01:20:19.000 Oh, come on.
01:20:19.000 Yeah, but Ian, you're sort of saying, like, don't be mean, but then be mean.
01:20:22.000 I'm not saying don't be mean.
01:20:23.000 It's not just that, it's like you're saying scream into the wind.
01:20:27.000 No, you make an internet video.
01:20:28.000 You, especially you right now, you make an internet video talking to someone like Joe Biden.
01:20:32.000 They don't watch the videos.
01:20:33.000 You don't know that, dude.
01:20:34.000 I do know it.
01:20:35.000 If someone hears their name on the internet, they're going to that video.
01:20:39.000 What they do, and this is like, the perfect example is the Young Turks, instead of actually watching the segment, or maybe they do, they just watch a clip from some propaganda channel, or, at the very least, they watch it and then lie about it.
01:20:53.000 Like, every single time the Young Turks has done a segment about something I've said, they've lied about what the context of the conversation was.
01:20:59.000 For example, we had a conversation recently where I said something to the effect of, I'm sure most women Are happy to have careers and engage in work.
01:21:06.000 I'm sure some of them, however, probably end up regretting it, wishing they had families.
01:21:10.000 And what they did was they took that, twisted it, and claimed I was saying something like, women just want to be wives and have babies, which is not what I said.
01:21:17.000 Didn't Cenk, like, approach you one time?
01:21:19.000 Screaming at me in lunacy.
01:21:23.000 They made a smear piece about Dave Rubin, and they put my name front and center for some reason, and I saw Cenk at Politicon, I was like, hey man, how's it going?
01:21:32.000 I was trying to get in touch with you, I've been messaging you, you didn't respond, but you guys put up a video that, like, it was about Dave Rubin or something, but my name was on it, so I was just hoping I could ask you, like, just in the future not to do that, and then he just started screaming.
01:21:43.000 "You're a Trump supporter!"
01:21:45.000 And this was in like 2017 or something.
01:21:47.000 Just started screaming at me.
01:21:48.000 I don't even remember what he was saying at the time.
01:21:49.000 And I was like, Why are you yelling at me?
01:21:51.000 And then a bunch of journalists ran up and started filming.
01:21:53.000 And they're like, What happened?
01:21:54.000 I'm like, I have no idea!
01:21:55.000 I was like, Dude, went nuts, started screaming at me.
01:21:57.000 These people have lost their minds, dude.
01:22:00.000 I've known Cenk for a long time.
01:22:03.000 I remember seeing him at VidCon and being like, hey, how's it going?
01:22:05.000 I've been in his show a couple times.
01:22:07.000 And then one day I go up to him and I'm like, hey, you guys did like, they were making fun of Dave Rubin or something.
01:22:12.000 And they used a report that had Tim Pool in their thumbnail, like right in the middle.
01:22:16.000 And I messaged Cenk, I was like, hey man, I was like, you made a video and it's got my name on it, but it's about Dave Rubin, I'm just, you know, he ignored me.
01:22:22.000 I messaged Anna, who I also have known for a long time, and they ignored me.
01:22:26.000 And then, now they just make weird smear pieces.
01:22:29.000 Like, there's no point in trying to have a conversation with people who will never come and have a conversation.
01:22:35.000 Well, I think there's a point in trying.
01:22:37.000 And eventually, you know, you can get through it.
01:22:39.000 Just because you guys are talking, like right now what's happening is you're talking past each other, you talk about them, they talk about you.
01:22:43.000 Bro, I privately messaged them, But I'm talking about internet video communication.
01:22:46.000 There used to be video responses on YouTube.
01:22:48.000 The whole point was you make a video to someone, then they answer you back and talk to you.
01:22:52.000 There's communication.
01:22:53.000 People get to listen and watch.
01:22:55.000 Then they start to mimic the behavior.
01:22:56.000 Yeah, why do you think YouTube removed video responses?
01:22:58.000 Because Google bought it and they don't understand social networking.
01:23:01.000 That's how Ian and I first met.
01:23:03.000 I did a video response to Ian.
01:23:05.000 That's how we first met.
01:23:06.000 Look, man.
01:23:07.000 You know, knowing these people for like a decade, and then one day Ian's screaming in my face in public, and my response to that is still consistently to politely invite him for a conversation.
01:23:18.000 You said Ian.
01:23:18.000 Cenk?
01:23:19.000 Cenk.
01:23:19.000 Sorry, sorry, sorry.
01:23:20.000 I wouldn't do that to you, baby!
01:23:22.000 Unless you're like far away and I need you to hear me.
01:23:24.000 That'll be in five years.
01:23:25.000 Knowing this guy for as long as I have, having polite conversations, appearing on his show on more than one occasion, then one day, abruptly and for no reason, he starts screaming in my face in public.
01:23:34.000 I made a video about it.
01:23:34.000 I'm like, I have no idea what happened.
01:23:36.000 I have no idea why he's screaming at me.
01:23:38.000 I'm not Alex Jones.
01:23:40.000 I don't have any beef with him.
01:23:41.000 I've never argued with him before.
01:23:42.000 I've never had a negative word about him.
01:23:43.000 He just started screaming at me in public.
01:23:45.000 Cameras everywhere.
01:23:46.000 They're filming it.
01:23:47.000 I'm like, what's happening?
01:23:48.000 I was like, why are you yelling at me?
01:23:50.000 And then, even after that, I still politely say, we'd love to have you on the show at any time.
01:23:54.000 We'll cover all costs.
01:23:55.000 That's a great position.
01:23:56.000 But they never, they will never do it.
01:23:59.000 Never.
01:24:00.000 Because they are not genuine people.
01:24:03.000 This is why when Matt Walsh says, you are degenerate morons, people are cheering for it.
01:24:08.000 Because for too long, we have tried to politely just be like, can we please talk and resolve this?
01:24:13.000 And they say, no!
01:24:14.000 We're gonna burn your city down.
01:24:15.000 We're gonna beat your elderly.
01:24:17.000 We're gonna try and kill Kyle Rittenhouse.
01:24:19.000 You're talking about a lot of different people, though.
01:24:21.000 I'm talking about all of these people aligned with this movement, this cult, this ideology, for whatever reason, and there's various subcultures within it.
01:24:30.000 But for some reason, all of them refuse to have a conversation.
01:24:34.000 A small handful will have a conversation.
01:24:36.000 Jimmy Dore, who's basically an outright socialist, is called far-right.
01:24:40.000 It's nonsensical cult meaninglessness.
01:24:42.000 I wonder if he's changed, because he actually spat in Alex's face on one interview.
01:24:48.000 We were there!
01:24:49.000 Have they resolved that?
01:24:52.000 That was absurd.
01:24:53.000 And he was there with We Are Change.
01:24:55.000 Oh, that was the same event?
01:24:57.000 Yes.
01:24:57.000 The Young Turks were doing this sit-down at the RNC, right?
01:25:00.000 The RNC in Cleveland.
01:25:02.000 And Alex came up laughing and talking to them while they were doing their thing and they got mad. Jimmy spit on Alex.
01:25:07.000 We were standing right there. It was crazy.
01:25:09.000 They were like, what the hell's going on here?
01:25:11.000 So look, I can understand.
01:25:13.000 But now Jimmy actually supports free speech, which is great.
01:25:15.000 Well, he's always supported free speech.
01:25:17.000 Yeah, but like, that seems like obscene behavior.
01:25:20.000 Sure, sure.
01:25:20.000 Look, I think what Jimmy did in that context was wrong.
01:25:24.000 But I think there's a difference between someone coming and interrupting your live shot so you're having a personal beef with them and whether or not you support free speech.
01:25:30.000 Like, I don't think it's inherently free speech that someone walks up to you during your show and starts trying to interrupt it.
01:25:37.000 You know what I mean?
01:25:37.000 That's a different question.
01:25:39.000 But sure, Jimmy should not have got baited and spat on him or whatever.
01:25:43.000 I think Jimmy has consistently called out the establishment, the Democrats, and what he encounters is all of a sudden people are blindly marching in lockstep behind them.
01:25:52.000 And he's like, since when?
01:25:53.000 Since when have we supported this machine, this corporate Democrat machine?
01:25:56.000 And they're like, you're right wing.
01:25:58.000 He's like, what?
01:25:58.000 I did see a segment of Ana and Cenk, you know, pushing back against the defund the police.
01:26:04.000 So it's not all one group.
01:26:07.000 No, no, no.
01:26:08.000 You know, this is funny though.
01:26:09.000 You're right.
01:26:10.000 They recently, and I gave him credit for it, came out saying defund the police is stupid, but you know, they very heavily supported it.
01:26:16.000 Yeah, there were tweets.
01:26:17.000 I saw certain tweets.
01:26:18.000 Right.
01:26:18.000 A bunch of tweets came out and so people were like, they supported this.
01:26:21.000 Now look.
01:26:22.000 Change your mind.
01:26:22.000 Exactly.
01:26:23.000 And that's why I'm like, good on them.
01:26:25.000 But the issue is, it's politically expedient.
01:26:28.000 Defund the police has become unpopular.
01:26:30.000 They're following the polls.
01:26:32.000 That's it.
01:26:33.000 If anything, I have all the reason in the world to just keep doubling down and being like, Trump's the best, Trump's the best.
01:26:38.000 Instead, I'm like, I don't know, man.
01:26:39.000 I think Ron DeSantis, maybe.
01:26:41.000 Why is Trump saying death penalty for drug dealers?
01:26:46.000 That is insane.
01:26:48.000 Just don't.
01:26:50.000 We can break this down.
01:26:51.000 First, yeah, I disagree with the death penalty outright.
01:26:54.000 Second, however, most people interpret that as specifically having to do with traffickers of, like, heroin and opioids and things like that.
01:27:00.000 Or if you're lacing stuff with fentanyl.
01:27:02.000 Right, right, right.
01:27:02.000 Not like someone slinging pot.
01:27:04.000 And the other argument is that when it comes to those things, these people have killed dozens, hundreds, or more.
01:27:09.000 So... But be specific.
01:27:10.000 Like, don't make statements like that.
01:27:12.000 You have to be targeted in your language.
01:27:15.000 No, and that's Trump.
01:27:16.000 Yeah.
01:27:17.000 That's right.
01:27:18.000 So, that's why, you know, when I look at, uh, there's some tweets right now.
01:27:24.000 Someone tweeted at me, Will Chamberlain.
01:27:26.000 It was, it was an amazing show we had here with a handful of people.
01:27:29.000 And, you know, Will Chamberlain said something like he actually thinks the Fed is good.
01:27:33.000 And then everyone else goes like, ah, no, like, it's like the most shockingly offensive thing you can say to anybody in this room.
01:27:40.000 But Will's great.
01:27:40.000 We love having Will on the show, even though we completely disagree on that.
01:27:44.000 And then we have Ilad Eliyahu, who reports for Timcast, and he's talking about how he supports military intervention because he wants a monopolar world with the U.S.
01:27:52.000 in charge.
01:27:53.000 And we got really into it.
01:27:54.000 I was yelling.
01:27:55.000 I had to apologize.
01:27:57.000 But we think he's fantastic.
01:27:59.000 We disagreed to the point where we were, like, really angry.
01:28:02.000 And I was like, I was pissed.
01:28:04.000 Um, but I think he's, I think he does an amazing job, and I think, you know, we're glad to have those conversations.
01:28:09.000 But there's that group of people that claim to be the left.
01:28:12.000 They will not come on, they will not talk, and for some reason, their opinions flow with the wind, however the polls are suggesting.
01:28:19.000 It's amazing.
01:28:19.000 And whatever is the popular group thing.
01:28:21.000 I mean, we even disagree.
01:28:22.000 We have debates on this show, and iron sharpens iron.
01:28:26.000 It's great to have those conversations.
01:28:28.000 It's great to challenge yourself.
01:28:29.000 It's great to actually question things instead of just like, hey, this is what people want me to say.
01:28:34.000 I'm just going to say it because I want to be liked.
01:28:36.000 And that's where a lot of these people come from.
01:28:38.000 This is kind of group herd sheep mentality.
01:28:41.000 And this is why you never bend the knee to the pitchfork mob.
01:28:45.000 This is why you never acquiesce to what everyone in the status quo is going along with, because if you're doing that, you're just degrading yourself.
01:28:53.000 When we should be challenging ourselves, we should be questioning ourselves.
01:28:57.000 We should be, of course, always going up to authority and saying, not today, you SOB, because you deserve some accountability.
01:29:03.000 Some transparency, and you deserve to be questioned outright about what you're doing with the power that you have. Power should always be challenged.
01:29:11.000 These people are just trying to get as much as they can for themselves, truly screwing everyone and doing a disservice not only to their audience but also to themselves by doing this.
01:29:21.000 So, question everything.
01:29:22.000 And this is why I liked what I saw with Matt Walsh, because it comes from that energy.
01:29:28.000 Like, I'm not going to do what you tell me to do.
01:29:30.000 I'm not going to say 2 plus 2 equals 5.
01:29:32.000 And this is why a lot of people seeing this from Matt Walsh today, I don't agree with everything that Matt has to say or think.
01:29:39.000 But when he came out with this message, it was a resounding F you to anyone in the establishment, anyone trying to control him.
01:29:46.000 And that's something that I want to see more of.
01:29:48.000 I think the never apologize idea is generally true, assuming, like, never apologize, you know, for your principles.
01:29:57.000 Never go back against your principles.
01:29:59.000 But the idea of never apologizing as a blanket statement, it makes no sense.
01:30:02.000 Like, you should apologize when you want to apologize, and your core principles tell you that that's the appropriate thing to do.
01:30:09.000 I loved when he was just like, by all accounts, you should be apologizing to me!
01:30:14.000 So he acknowledges that apologies are needed sometimes, yeah.
01:30:18.000 Yeah, that was good.
01:30:19.000 Yeah, you know what, man?
01:30:21.000 I think he hit a home run, and more to the point is, you have people who go on Twitter who lie, trying to get people fired from their jobs.
01:30:30.000 This was one of those circumstances.
01:30:32.000 A lot of people have tried to get him fired, but the Daily Wire
01:30:40.000 is not going to fire Matt Walsh over this. They're going to be laughing about it. And he gets the
01:30:44.000 opportunity to come out and say this. By doing so, it is a stake in the heart of the cancel culture
01:30:49.000 mob. I don't think it's a home run, but I do think he bunted with a guy on third who made it home
01:30:54.000 and scored a run, but he got thrown out at first. So, you know, they're still down by one run.
01:31:00.000 Very specific rebuttal analogy.
01:31:02.000 Nice move.
01:31:02.000 Maybe you could have got there faster, but I think he's, you know... It was anger.
01:31:06.000 It was an angry response, and anger is a very dangerous tool.
01:31:09.000 I'll accept that with another rebuttal analogy, that Matt Walsh is paving the way for more people to stand up against cancel culture, and if that means the runner on third made it home because of what he did, then I agree with you.
01:31:23.000 Yeah, he's doing a lot of good.
01:31:25.000 And not everything has to be all good or all bad.
01:31:27.000 I think that the fact that he's bringing attention to child mutilation of any kind is important.
01:31:32.000 Like, if you're gonna get a kid's, a 13-year-old's boobs cut off and stitched up, like, yo, we gotta really consider what that means as a culture.
01:31:40.000 Yeah, and he is clearly trying to have the conversation with people he disagrees with.
01:31:44.000 I mean, that's what his whole movie was doing.
01:31:46.000 He was trying to find people to have, you know, conversations and debate the issue.
01:31:51.000 So he wants to have the debate and he's willing to talk to anybody.
01:31:54.000 His delivery is the best.
01:31:56.000 It's very just dry and very calm.
01:31:59.000 He's so dry.
01:31:59.000 I love when they said that, when he was talking about the Little Mermaid, and he said that she should have a translucent face and look like a nightmare skeleton floating around the depths or whatever.
01:32:08.000 And they were like, LGBT Nation or whatever wrote that he was having a meltdown.
01:32:13.000 And I'm just like, anybody who's ever watched Matt Walsh knows that that's not possible.
01:32:17.000 His meltdown is just him going like this.
01:32:19.000 That was a meltdown, that's his version.
01:32:20.000 It's him being like, I will not apologize to you.
01:32:23.000 You morons.
01:32:25.000 It's not particularly animated.
01:32:26.000 You'll have to rip my heart from my body.
01:32:28.000 All right, we're going to go to Super Chats.
01:32:31.000 If you haven't already, would you kindly smash that like button, subscribe to this channel, share the show with your friends, and become a member at TimCast.com because your membership keeps all of our journalists employed, keeps our shows up and running, and we're hearing a lot of really great things about the Cast Castle vlog.
01:32:44.000 People are really, really loving it.
01:32:45.000 Ian did an excellent job.
01:32:48.000 Oh, shout out to Wes Leslie Goebel and Chris Poole for producing and writing that thing.
01:32:51.000 And you too, as an executive producer.
01:32:53.000 I love the direction that it's going.
01:32:55.000 It's ridiculous, fun nonsense.
01:32:57.000 But you know, it is still early.
01:32:59.000 So you were mentioning the other day, like you're trying to get sound quality improvements and all that stuff.
01:33:03.000 You shoot on different days, so you have different lighting schemes.
01:33:06.000 So when you have like a portable lighting scheme, and sometimes we shoot at different times of day, so we'll get different lighting outside.
01:33:11.000 You can balance that stuff out in post.
01:33:13.000 Or with the right technology sometimes.
01:33:15.000 We have a special guest coming next week to film, and I'm just so excited for the plotline of this.
01:33:21.000 I hope it's executed right, because the jokes, as they've been written, are some of the funniest and most offensive things ever.
01:33:28.000 It's gonna be amazing.
01:33:29.000 Maybe it's not the most offensive thing ever.
01:33:30.000 It's just like, it's designed to be, you know, I don't know, edgy, I guess?
01:33:36.000 Yeah, I wanna talk about edgy stuff in a compelling way that grandmas and seven-year-olds can enjoy.
01:33:42.000 I don't know about that.
01:33:43.000 That's my goal.
01:33:44.000 Not this one.
01:33:44.000 Bring levity to these intense conversations so everyone can have them.
01:33:47.000 Let's grab some super chats.
01:33:48.000 We got Frank Rizzo.
01:33:50.000 He says, Hey Tim and crew, longtime listener and member, you should try to figure out how to broadcast on shortwave radio.
01:33:55.000 The world will hear your show.
01:33:56.000 Is that like local radio stations and stuff?
01:33:59.000 Is it?
01:33:59.000 I think you can just get a device here.
01:34:02.000 I think it's like pirate radio, no?
01:34:03.000 I don't know about doing anything like that.
01:34:05.000 You gotta get like an FCC bandwidth, like you gotta buy an area on the bandwidth, I know.
01:34:10.000 It's probably worth it.
01:34:11.000 Get on AM.
01:34:13.000 Shortwave.
01:34:13.000 Yeah, AM would be great.
01:34:15.000 Yeah.
01:34:15.000 All right, enlighten me with Ronnie.
01:34:18.000 I can't read the last part.
01:34:19.000 Podcast?
01:34:20.000 So it says, Tim, have a great chat.
01:34:21.000 I've been up too late when you are live.
01:34:24.000 First ever super chat.
01:34:25.000 Really do appreciate that super chat.
01:34:27.000 Scroats McGoats says it's my birthday and I have no desire to celebrate.
01:34:31.000 Feels like I'm watching the apocalypse play out right before my eyes.
01:34:35.000 Well, one way to get ahead of all this.
01:34:39.000 Bill, you gotta take a lot of money, invest in a company that makes pods.
01:34:45.000 and cricket food. They are telling you these things. So it's just like, you know, look at
01:34:51.000 the beginning of the pandemic, Trump's like, we're gonna get a vaccine. If you invested in
01:34:55.000 Moderna at that point, Moderna stock jumped like, what, 400% or something. Some ridiculous number.
01:35:01.000 Don't feed the beast. Voting with your dollars.
01:35:06.000 There's ten ethical and-
01:35:10.000 Someone, a cricket farmer contacted me and said that it requires 1,500 pounds of corn
01:35:15.000 and soy to feed 250 pounds of crickets.
01:35:19.000 So you're getting, what is that, a one to five?
01:35:21.000 So you're basically eating corn and soy.
01:35:23.000 Yeah, essentially.
01:35:24.000 Shocking.
01:35:25.000 11%.
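For reference, a quick check of the numbers quoted in that super chat (the percentages tossed around on air are approximations); a one-line sketch:

crickets_lb = 250
feed_lb = 1_500
print(crickets_lb / feed_lb)  # 0.1667 -> about 16.7% yield by weight, roughly a 6:1 feed-to-cricket ratio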
01:35:27.000 And for the person who just had their birthday, happy birthday, try to enjoy the little things in life and not try to be overwhelmed by the world's burdens and enjoy the friends, the family, the people you have around you.
01:35:39.000 I'm just saying, you know.
01:35:40.000 No!
01:35:41.000 Don't listen to him.
01:35:43.000 Don't do that.
01:35:44.000 It's got to be open source, these pods, because they're gonna be tracking your biometrics.
01:35:48.000 All right, all right, we'll try and grab some more superchats.
01:35:52.000 Topher Studio says, I've been subscribed to Luke for over a year, but I won't watch his videos because of how clickbait the thumbnails are.
01:35:57.000 It's not 2016.
01:35:58.000 Make them serious, not silly clickbait.
01:36:00.000 Love you, though.
01:36:01.000 You don't tell me what to do.
01:36:02.000 I'm gonna do what I want.
01:36:03.000 I'm gonna make them more clickbaity.
01:36:05.000 I'm gonna use that AI tool right now and even make them worse off than you thought, Bob.
01:36:12.000 Daryl Lines says, YouTube shenanigans again.
01:36:14.000 Had to reload the stream three times to get audio to load.
01:36:18.000 Yep.
01:36:19.000 Funny how that works.
01:36:21.000 Quispy Joe says, shaking my head, YouTube not notifying me again.
01:36:25.000 I have video proof, where do I send the link?
01:36:27.000 I think it was shadowbanned at timcast.com?
01:36:32.000 I'll have to double check before we come back for next week, because we were like, we told people if they had evidence of this to send it to us, we're going to go through it.
01:36:38.000 Because someone was suggesting that if we can show a pattern of behavior, then there's a detriment.
01:36:44.000 YouTube's not providing, you know, the proper service.
01:36:46.000 Well, at this point, because I heard you on your other stream, it's like, is it shadowban or shadowbanned?
01:36:49.000 You should just do both, so.
01:36:51.000 Oh, yeah, just copy and paste.
01:36:53.000 Shadowban, shadowbans, shadowbanned.
01:36:55.000 There you go.
01:36:56.000 There we go.
01:36:56.000 Shadowsband.
01:36:57.000 Shadowsband.
01:36:58.000 No, not that.
01:36:59.000 Shadowsband.
01:37:02.000 All right, Raymond G. Stanley Jr.
01:37:03.000 says, Tim, after your sad AF 4PM, I stopped at Weiss.
01:37:09.000 Clerk's eyes screamed out sadness.
01:37:11.000 Me, thanks, hope you are well.
01:37:12.000 Him, I'm here.
01:37:13.000 Me, at least you're alive.
01:37:14.000 Breathing.
01:37:15.000 Him, unfortunately. I felt bad for him.
01:37:17.000 Wow.
01:37:17.000 What was your segment about?
01:37:19.000 23-year-old was euthanized in Belgium for being depressed.
01:37:24.000 And it was like she survived a terror attack and so she said she had PTSD and wanted to die.
01:37:30.000 And so they're like, okay.
01:37:31.000 And then the family sat around her as they gave her the medication and she smiled and then died.
01:37:35.000 And that's what nightmares are made of.
01:37:37.000 Had she been physically injured, other than just the trauma?
01:37:39.000 Nope.
01:37:41.000 She was on like 10 different medications.
01:37:43.000 Oh, wow.
01:37:44.000 Yeah, and it's just like, well, that's probably what's breaking her brain.
01:37:47.000 Maybe that's the problem.
01:37:48.000 Maybe, you know, all the medical interventions we have are making the problems worse and statistically aren't helping.
01:37:53.000 Yeah, maybe ignoring the problem for so long didn't actually solve it.
01:37:57.000 Maybe not focusing on health or diet or exercise or daily activity has mattered.
01:38:02.000 All right.
01:38:02.000 Pinochet's helicopter tour says, I've got two words for you.
01:38:06.000 Let's go, Brandon.
01:38:08.000 That's a good one.
01:38:09.000 Good work.
01:38:11.000 What do we got here?
01:38:13.000 Oh, where are we at?
01:38:15.000 Darrell Lyons says, Biden saved us from World War III just like he brought down gas prices.
01:38:20.000 Yeah.
01:38:21.000 Everybody is saying this.
01:38:22.000 I know. Don Diego says, guys, guys, I think he was saying Maiden America.
01:38:25.000 Like, like a maiden, you know?
01:38:27.000 Like, it was two words.
01:38:29.000 That's what PolitiFact is gonna do.
01:38:30.000 It's gonna be like, he did say two words.
01:38:32.000 Maiden America.
01:38:33.000 It was a reference to... Do AI art for Maiden, for the Maiden America.
01:38:37.000 See what she looks like.
01:38:38.000 Oh, yeah.
01:38:39.000 So what is this?
01:38:39.000 You can download the program to your own computer?
01:38:43.000 Someone was saying that.
01:38:43.000 So there's probably different apps that can run the engine.
01:38:46.000 Because then you don't got to worry about the queue or whatever.
01:38:48.000 You can just, yeah.
01:38:49.000 And like, I'd be interested in trying to generate images for thumbnails.
01:38:52.000 Like if you can get one that's not scary or creepy, unless you wanted it to be.
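For anyone wondering what downloading the program to your own computer looks like in practice: the show doesn't name the tool, but as an illustration, an open text-to-image model such as Stable Diffusion can be run locally through the Hugging Face diffusers library, which is what skips the hosted queue. A minimal sketch, assuming a machine with a CUDA GPU and the (placeholder) model weights available:

import torch
from diffusers import StableDiffusionPipeline

# Load an open text-to-image model locally; the model name here is a placeholder choice.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # runs on the local GPU, so there is no shared queue to wait in

# Generate a candidate thumbnail from a text prompt and save it to disk.
image = pipe("Maiden America, portrait, studio lighting").images[0]
image.save("thumbnail_candidate.png")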
01:38:56.000 Oh, we should totally use that for Shane's new show.
01:39:00.000 Shane Cashman.
01:39:00.000 Yeah, man.
01:39:01.000 Great idea.
01:39:02.000 Because it naturally just makes everything look horrifying and nightmarish.
01:39:06.000 Even normal stuff.
01:39:07.000 Yeah, even normal stuff.
01:39:08.000 So if there's like a story about an apple, it'll make it look creepy.
01:39:11.000 There you go.
01:39:13.000 Yo, can we answer a few Superminds?
01:39:14.000 Made in Americas.
01:39:15.000 Oh, you want to answer some?
01:39:17.000 Yeah, I mean, if you want to shout out.
01:39:18.000 Yeah, so guys, we just launched this new product.
01:39:20.000 It's called Supermind.
01:39:22.000 It's like Superchat.
01:39:23.000 It's half the fees.
01:39:25.000 So, you know, more of your money is going to Tim, the creator.
01:39:29.000 You basically can go to minds.com slash timcast, click the supermind button, ask him a question, and you don't pay unless he answers.
01:39:37.000 So it's basically an offer, and if he answers he can answer it on stream, he can answer it during the week, you can do all these different types of responses.
01:47:44.000 So there's some in there, it'd be cool to... Well, I'll jump over in a second.
01:39:48.000 So here's the thing, with superchats, people have already paid, and they have comments, and we try to read as many as we can.
01:39:52.000 With superminds, you offer to pay, and if we don't read it, you don't pay.
01:39:59.000 There's a balance I try to do, because some people are like, why won't Tim read the big superchat, and I'm like...
01:40:05.000 I do appreciate the big superchats.
01:40:06.000 I do want to read as many as I can, but I also don't want people who can't afford to send tons of money to be cut out.
01:40:13.000 And so we try to just read what we can.
01:40:15.000 Yeah, small ones are great.
01:40:15.000 And the last thing I would say is that like for people, you know, that's kind of the bad part about superchats.
01:40:21.000 You don't necessarily get a response.
01:40:23.000 So people who, you know, money is tight these days.
01:40:26.000 So to know that you're going to get a response from a creator is a big deal.
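To make the difference concrete: a Super Chat is paid up front whether or not it gets read, while a Supermind is an offer that is only charged once the creator actually responds. A rough conceptual sketch of that flow, not Minds' actual API; the class, the field names, and the placement of the 15% fee mentioned later in the show are assumptions for illustration:

from dataclasses import dataclass
from typing import Optional

@dataclass
class SupermindOffer:
    asker: str
    creator: str
    question: str
    amount_usd: float
    answer: Optional[str] = None
    charged: bool = False

    def respond(self, answer: str, platform_fee: float = 0.15) -> float:
        """The creator answers; only now is the asker charged. Returns the creator's payout."""
        self.answer = answer
        self.charged = True
        return self.amount_usd * (1 - platform_fee)

offer = SupermindOffer("viewer", "timcast", "Why did Elon go through with the deal?", 120.0)
payout = offer.respond("Answered on episode 632.")  # with no response, nothing is ever charged
print(offer.charged, round(payout, 2))  # True 102.0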
01:40:31.000 Alright, Falcon Leisure says, Tim, this weekend you need to watch the movie Threads on YouTube.
01:40:35.000 It is a British-made movie about what the aftermath of a nuclear war would be like.
01:40:40.000 There's also a show on the BBC I've not yet watched called Years and Years, I think it's called?
01:40:44.000 Have you heard of this?
01:40:44.000 No.
01:40:45.000 And it's like a show that was meant to, I guess, mock Trump or something.
01:40:48.000 And it's like about a populist who wins and then starts advocating for immigrant genocide or something like that.
01:40:55.000 I don't know.
01:40:56.000 I gotta watch it, but, you know, something that's probably cringe, but I'm interested in watching.
01:41:00.000 Threads, though, that sounds pretty cool.
01:41:01.000 1984 TV show.
01:41:02.000 Is that what it is?
01:41:04.000 Yeah.
01:41:06.000 Clef the Misfit says, think what happened to Dorsey with Twitter is like what happened to Trump in office.
01:41:12.000 Staffed the organization with snakes and ideologues who hampered him instead of helped him.
01:41:17.000 Perhaps, but Dorsey left Twitter and then came back.
01:41:20.000 Was it Dick Costolo was the CEO for a while?
01:41:22.000 And then, what did he do?
01:41:23.000 He wrote Silicon Valley or something?
01:41:26.000 Oh, is that a book?
01:41:27.000 No, no, the TV show on HBO.
01:41:28.000 Oh, no.
01:41:29.000 Costolo wrote that?
01:41:30.000 Yeah, I think he was involved in that.
01:41:31.000 I thought they, like, judged... Right, right, right.
01:41:34.000 And I'm pretty sure... And Costolo?
01:41:35.000 Really?
01:41:36.000 Well, you want to look it up?
01:41:37.000 Yeah.
01:41:38.000 He's got some credit in there, doesn't he?
01:41:40.000 Yeah, they probably had all kinds of consultants from the Valley.
01:41:42.000 But I think he had, like, a significant... Writing for Silicon Valley.
01:41:46.000 I don't know what he did exactly.
01:41:47.000 Good for him.
01:41:47.000 Writing for the show.
01:41:48.000 Oh, he's an ex-comedian.
01:41:49.000 I didn't know that.
01:41:49.000 And then he became the CEO of Twitter.
01:41:51.000 It is an accurate show, to be honest.
01:41:53.000 Dick Costolo.
01:41:54.000 Improv comic in Chicago.
01:41:57.000 All right.
01:41:59.000 Louis Aguilar says, Hi Lids, I'm here for the scandalous announcement.
01:42:02.000 I wish you good things on your new production.
01:42:05.000 I don't think that I have a super scandalous announcement.
01:42:08.000 I will just say goodbye like I always do for the last time, which is a little bit sad, but life goes on and we'll be good.
01:42:13.000 Yeah.
01:42:14.000 This is it, this is your last show?
01:42:15.000 That's right, it's my last show.
01:42:17.000 Well, last show here.
01:42:17.000 Last show here.
01:42:19.000 You're actually, you have a new show, don't you, on Saturday mornings?
01:42:22.000 Um, I don't have anything set in stone.
01:42:24.000 We're still figuring it out, but we're gonna be trying to make a difference in the culture as we go forward, and you guys can check it out later.
01:42:30.000 Right on.
01:42:30.000 Boom.
01:42:31.000 Aaron Tamiki says, Bye, Linda.
01:42:34.000 I'm gonna miss you on the show.
01:42:36.000 Since you are leaving, what was your favorite moment from the show, and who was your favorite guest?
01:42:40.000 Oh my gosh, my favorite guest was definitely Ed Calderon.
01:42:43.000 He's fantastic.
01:42:44.000 He used to be a cop down in Tijuana and he has crazy stories.
01:42:48.000 If you guys aren't familiar with him, you should check out Ed's manifesto on Instagram.
01:42:52.000 My favorite moment from the show that had me paralyzed was probably when R.A.
01:42:55.000 smacked the microphone.
01:42:56.000 That was scary for me.
01:42:58.000 I was like, what is happening?
01:42:59.000 What's going to happen next?
01:43:00.000 So, yeah.
01:43:01.000 Probably a lot of people have that in common, but it's really interesting, moments like that.
01:43:06.000 He hung out for a little bit afterwards.
01:43:07.000 We invited him back.
01:43:08.000 Yeah, he felt bad about it afterwards.
01:43:09.000 Yeah, he apologized.
01:43:10.000 We hugged it out.
01:43:11.000 That's cool.
01:43:12.000 You're welcome.
01:43:13.000 Oh, yeah.
01:43:13.000 Okay, Luke.
01:43:13.000 Yeah, Luke had him invite him on.
01:43:16.000 Alright.
01:43:17.000 What about Immortal Technique?
01:43:18.000 He'd be cool to have on, right?
01:43:19.000 I'll ask him.
01:43:20.000 I think he'll be great.
01:43:21.000 But we probably have political disagreements, right?
01:43:23.000 Probably.
01:43:24.000 But it'll be great to talk it out.
01:43:26.000 What about him and Alex Jones?
01:43:28.000 They had a conversation before.
01:43:29.000 So they met together and they had a video interview.
01:43:31.000 I think, you know, I know we need to do.
01:43:34.000 We need to have this table set up with a person from each of the political quadrants on the political compass.
01:43:39.000 There you go. Yeah. Yeah. Good idea. I have like the themed rooms with each color.
01:43:44.000 They have the same color. Well, just like we'll put plate.
01:43:47.000 We'll put mats of the colors. Well, we did. Red. We did our test. Right. And we're like almost
01:43:53.000 opposite each other on the bottom left.
01:43:55.000 Center Libertarian with Luke on the right quadrant. Yeah.
01:43:58.000 One point and me on the left quadrant.
01:44:00.000 So we would be in that same kind of position, and Ian would be, where would you be?
01:44:05.000 I'm further left.
01:44:06.000 I'm not.
01:44:07.000 I kind of play that role, and I do have a kind of an authoritarian bent.
01:44:12.000 So yeah, you're there, and then we need like a super commie.
01:44:15.000 I mean, we could get real authoritarians on the show, that'd be interesting.
01:44:19.000 I'm pretty far left relative to you guys.
01:44:20.000 I was like halfway to the left and halfway to the bottom.
01:44:24.000 All right.
01:44:25.000 Halfway through that quadrant.
01:44:26.000 How do I... I want to answer a supermind.
01:44:31.000 Yeah.
01:44:31.000 But I need to know... I don't have the URL to the stream.
01:44:35.000 No, just type in answered on stream as a reply.
01:44:38.000 Answered on episode.
01:44:41.000 What are we?
01:44:42.000 632?
01:44:42.000 632.
01:44:42.000 Bam.
01:44:43.000 All right.
01:44:44.000 In this supermind, this is from... Oh, I can't see the name of the user right now.
01:44:48.000 Raymundo, why do you think Elon has suddenly decided to go through with a Twitter deal at the original offer price after months of seemingly trying to get out of it?
01:44:56.000 Do you think the deal goes through and will Twitter actually change?
01:45:00.000 So I have talked about this before.
01:45:01.000 I think, um...
01:45:04.000 He did try to negotiate a price down.
01:45:06.000 We were asking this before, like, why didn't he try to get a cheaper deal?
01:45:08.000 He did.
01:45:08.000 He tried getting 30% off.
01:45:10.000 They said no.
01:45:11.000 He tried saying, okay, what about 10?
01:45:12.000 They said no.
01:45:13.000 And then he finally said, okay, fine, we'll do it.
01:45:16.000 My attitude is I think there's an element of he's gonna lose in court.
01:45:21.000 I think there's also an element of when he said no to World War III and the bots bombarded his poll and favored war, he was probably just like, Okay, I have to do this.
01:45:31.000 You know, net worth be damned.
01:45:33.000 But it was a reasonable argument that the bots are a high percentage of Twitter and also the whistleblower came out and it seems like nothing's really coming of the whistleblower saying that he was trying to get the bots taken care of and... Who's the whistleblower?
01:45:47.000 Their head of security.
01:45:49.000 Twitter's head of security came out and said that there's just all of these unacceptable practices at the company.
01:45:54.000 He'd be a great guest on this show.
01:45:56.000 He would be a great guest.
01:45:56.000 Do you know what his name is off the bottom?
01:45:59.000 If you just look up Twitter Whistleblower, I forget his name.
01:46:01.000 Alright.
01:46:01.000 Did we confirm the Supermind?
01:46:03.000 Did the payment go through?
01:46:04.000 Oh, I don't know.
01:46:05.000 Yeah, yeah.
01:46:05.000 I mean, yeah, it went through.
01:46:06.000 That's awesome.
01:46:07.000 There's a bunch and they're like, it's a lot of money.
01:46:09.000 It's like $120, $250.
01:46:12.000 We blew it up earlier to get people to roll right on.
01:46:15.000 So alright, here we got one.
01:46:17.000 Max Reddick with the Super Chat.
01:46:18.000 He says, Tim, your impersonation of Nancy Pelosi is hilarious.
01:46:23.000 Please keep doing this also.
01:46:25.000 We miss Seamus.
01:46:26.000 Is she coming back?
01:46:28.000 At some point, I imagine.
01:46:29.000 I hope so.
01:46:30.000 Seamus who?
01:46:31.000 I don't know.
01:46:31.000 Who's that?
01:46:32.000 I have no idea.
01:46:33.000 Is that like somebody's dog?
01:46:34.000 Yeah.
01:46:36.000 Wow, okay.
01:46:37.000 That's not who you're talking about.
01:46:38.000 Good one, Ian.
01:46:38.000 I like that, Ian.
01:46:39.000 I misspoke.
01:46:40.000 Rude.
01:46:41.000 I love you, Seamus.
01:46:42.000 Here we go.
01:46:43.000 I need you.
01:46:43.000 Dorktanian says, how are you sure that some Tesla employee in California isn't trying to make your death look like a traffic accident?
01:46:50.000 Uh huh.
01:46:51.000 How do you know?
01:46:52.000 Tim, do you have enough notifications, bro?
01:46:54.000 What, which where?
01:46:57.000 No, that's 282,923 notifications.
01:46:59.000 I wonder what they say.
01:47:01.000 You know, I got like people, people like I try and tell them like when it comes to having a lot of followers on these social media platforms.
01:47:06.000 Yeah, get notifications.
01:47:08.000 You gotta turn them off, yeah.
01:47:09.000 At a certain point, it's just whatever.
01:47:12.000 Yeah, it's just there.
01:47:14.000 282,925.
01:47:15.000 You know what's cool?
01:47:15.000 I actually didn't know this, but we have like 300,000 subs on Rumble, too.
01:47:20.000 TimCastIRL, and so does my TimCast channel.
01:47:23.000 But my Tim Pool channel, for some reason, only has like 70.
01:47:25.000 So I was like, let's just put all my personal show videos on the TimCast one with 300k subs.
01:47:30.000 That's amazing.
01:47:31.000 Rumble's doing a great job.
01:47:33.000 And then on Minds as well, all of my tweets.
01:47:35.000 Go up there.
01:47:36.000 Yeah.
01:47:37.000 The videos do too, right?
01:47:38.000 Yep.
01:47:39.000 I don't know.
01:47:39.000 Yep.
01:47:40.000 It looks like you got a little, a small one from, for 20 tokens.
01:47:43.000 You can do tokens too, but is there a game that you're enjoying?
01:47:45.000 Let's do, let's do this.
01:47:47.000 And we'll, uh, answered on episode 632.
01:47:52.000 We're going to update this FYI in the future.
01:47:55.000 So you can just respond instantly for super, for the live stream use case.
01:47:58.000 So they'll be able to say, 'cause, uh, they want you to answer on stream.
01:48:02.000 You just click once.
01:48:03.000 But what if you go through?
01:48:05.000 And you don't respond to anything and you say, yes, yes, yes... Is there a game you're really enjoying these days?
01:48:27.000 I play one video game and it's Spelunky 2.
01:48:30.000 And I'm currently mapping the Spelunky multiverse.
01:48:32.000 So for those that are familiar with Spelunky 2, it's a roguelite game where you enter into
01:48:37.000 a cave and you go down various levels, and then at the end you actually go on a rocket
01:48:42.000 ship up to Hundun's World or whatever it's called, I don't know, and then there's the
01:48:46.000 Cosmic Ocean.
01:48:47.000 I don't care about the Cosmic Ocean at all, for those that are familiar with the game.
01:48:50.000 What I'm doing is, once you unlock seeds, you can actually type in an alphanumeric code,
01:48:56.000 which will give you a specific generation.
01:48:58.000 So the game is randomly generated, procedurally generated.
01:49:00.000 Every new game is different.
01:49:02.000 But if you unlock seeded runs, you can actually enter a code.
01:49:06.000 I think it's like, I don't know how many digits it is, 10 or something.
01:49:09.000 So what I've been doing, I started with World 1, 2, 3, I'm on 252 right now.
01:49:16.000 So I've played through that many games.
01:49:21.000 And there's a bunch more, I've probably played thousands of games of Spelunky, it's ridiculous.
01:49:25.000 I hope you call it the Spelunkyverse, as you were talking about the Spelunky multiverse.
01:49:29.000 Yeah, just the Spelunkiverse or Spelunkyverse.
01:49:31.000 But there's also, there's 0 through 9, and then A, B, C, D, E, F. So there's a lot of potential world generations in this game.
01:49:41.000 And so I'm just going with numbers, so I've played up to world 253.
01:49:46.000 So I'm actually just going 0-0-0-0-0-0-1, 0-0-0-0-0-0-2, 0-0-0-0-0-0-3, and then I'm up to 2-53.
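In other words, the walk through the Spelunkyverse is just counting through the seed space in order. A tiny sketch of that enumeration; the eight-character width is an assumption (the show only says it's around ten), and the code zero-pads the counter exactly as described above:

def seed_code(n: int, width: int = 8) -> str:
    # Zero-padded, digits-only seed string, e.g. 1 -> "00000001", 253 -> "00000253".
    return str(n).zfill(width)

codes = [seed_code(i) for i in range(1, 254)]
print(codes[0], codes[-1])  # 00000001 00000253

# Since each character can also be a hex letter (0-9 plus A through F), the full space
# of possible seeds is 16**width, far more than the digits-only walk covers.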
01:49:52.000 You beat them all?
01:49:53.000 You beat one before you move on?
01:49:54.000 No, not beat them all.
01:49:55.000 Played through them to a certain point.
01:49:58.000 Some of them are, like, really bad, and you're like, ugh, I will.
01:50:00.000 Like where you can't beat it?
01:50:01.000 Do they have places where you fall?
01:50:03.000 You can beat them all, but some of them are just really annoying where you have no resources.
01:50:08.000 Some of the level design requires you to anger some of the characters you don't want to anger, and then it's just like, you could.
01:50:14.000 Kinda sucks.
01:50:16.000 Some are really easy, like World 47.
01:50:18.000 For those that are Spelunky fans, this one's gonna really help you out.
01:50:21.000 World 47 starts with a jetpack.
01:50:24.000 And in 4-3, when you're dropping the lava to open the secret lair to the Queen's lair to fight the turtle guy, whatever his name is.
01:50:37.000 There's actually, on the ground floor, an escape hatch that brings you to... I don't know what you call it, like a subterranean?
01:50:44.000 Anyway, there's a level where you're supposed to die, if you're not good at the game.
01:50:49.000 Lava kills you, and then the Ankh resurrects you, and then you can go through a secret door.
01:50:53.000 In World 47, it's all zeros and then 47, there's actually a door you can go in, so when the lava lands, you can just go into it behind the door and then leave.
01:51:01.000 So, for those that are, you know, fans of Spelunky.
01:51:02.000 Cheat code.
01:51:03.000 But, uh, sort of, yeah, I guess.
01:51:04.000 But it's the same world over and over again.
01:51:06.000 But more importantly, if you're really good at the game, you can just not die there.
01:51:10.000 So, like, when the lava falls, you can just throw ropes, put it underneath one of the ledges, and then get a jetpack and fly out.
01:51:15.000 There's a bunch of other ways to do it.
01:51:16.000 Anyway, that's the game I'm playing.
01:51:18.000 I know too much about it.
01:51:19.000 And again, like I said, I don't really care for the cosmic ocean.
01:51:22.000 Is that a zone?
01:51:23.000 You lost me there.
01:51:24.000 I like Apex Legends, but... Cosmic Ocean is the second and final secret world in Spelunky 2 that has 99 levels.
01:51:32.000 You just skip it usually when you play?
01:51:34.000 Well, I usually play to it and then just don't care.
01:51:36.000 Like, it's just so boring.
01:51:39.000 Yeah, it's intense.
01:51:40.000 So, yeah, it's a different kind of game.
01:51:41.000 You're trying to pop three orbs and then a jellyfish attacks you and I'm just like, I don't really care about that.
01:51:45.000 I like the game where you're flying around with a jetpack and your shopkeepers are fighting you and you're getting the gold and stuff like that.
01:51:52.000 Anyway, it's a fun game.
01:51:53.000 Okay, now that I've wasted a lot of time talking about that, let's try and read some more Super Chats.
01:51:57.000 The Musicanon says, Tim, people are salvaging Tesla wrecks, gutting them for the batteries and motors, and installing them on old cars like BMWs with custom firmware.
01:52:05.000 Large market for custom electric cars out there.
01:52:08.000 Alright, let's do it!
01:52:09.000 Cool.
01:52:09.000 Let's make the Minds Car.
01:52:11.000 I'm in.
01:52:11.000 Yeah.
01:52:12.000 We'll, we'll, Timcast Minds, we'll integrate it with Minds.
01:52:15.000 So, however that can work, maybe tokens can do something for you.
01:52:19.000 Maybe like, you can charge it with tokens somehow, something like that.
01:52:24.000 Like, I don't know, we'll figure it out, but that would be cool.
01:52:26.000 And then it's like, to charge it, you can pay your normal electric bill,
01:52:31.000 but then based on how much charge it takes, you'll like be granted tokens or something.
01:52:35.000 Who's leading the engineering on this car?
01:52:38.000 We gotta find somebody!
01:52:39.000 We gotta find somebody.
01:52:41.000 And then we'll just sponsor the creation of it.
01:52:44.000 We'll make a really sleek video, and then maybe we'll do an auction for it or something.
01:52:48.000 One-of-a-kind, unique car with logos on it.
01:52:52.000 Let's do it!
01:52:52.000 That'd be super cool.
01:52:53.000 I'm serious.
01:52:54.000 What kind of car should it be?
01:52:56.000 Something cool.
01:52:56.000 I'm wide open on that one.
01:52:57.000 Luke, what do you think?
01:52:58.000 What do you think, Lydia?
01:53:00.000 It's gotta be a Boss Mustang.
01:53:02.000 Prius.
01:53:02.000 It's gotta be a Prius.
01:53:03.000 No, I'm just kidding.
01:53:05.000 I like my idea better.
01:53:06.000 What year Mustang?
01:53:08.000 Like a 68, 67.
01:53:08.000 Oh, yeah, yeah, yeah.
01:53:10.000 Oh, yeah.
01:53:11.000 I hear good things.
01:53:11.000 That's what's up.
01:53:12.000 I'm into the Mustang.
01:53:13.000 All right, we'll figure it out.
01:53:14.000 We'll figure it out.
01:53:15.000 Do it.
01:53:15.000 Do it.
01:53:17.000 Let's grab some super chats.
01:53:18.000 What is this?
01:53:18.000 Someone said something about... Let's see, Fleg Bigums says, Shaggy is a Vietnam vet, canine unit.
01:53:25.000 His cowardice is begging his friends not to get involved.
01:53:28.000 He only opens up to a dog, PTSD like a mofo.
01:53:32.000 Oh, that's so sad.
01:53:33.000 That is so horrible.
01:53:34.000 That is true.
01:53:34.000 It's like his friends are constantly getting in danger and he's like, guys, no, don't do this.
01:53:38.000 And you know what's funny?
01:53:39.000 Cause he's kind of right.
01:53:40.000 Cause every single time they do get attacked and I'm pretty sure often these, these guys in costumes almost do hurt them.
01:53:47.000 You know, pretty serious stuff.
01:53:48.000 There's a lot of violence and Benny Hill music too, right?
01:53:51.000 Yeah, there's laugh tracks too.
01:53:53.000 Really?
01:53:53.000 Yeah, the old Scooby Doo had a lot of laugh tracks.
01:53:57.000 Scooby Doo.
01:53:57.000 Always looking for clues.
01:54:00.000 It's a clue!
01:54:01.000 Don't eat that, Shaggy!
01:54:02.000 It's a clue!
01:54:04.000 Here's a bunch of superchats of people wanting AI image generation.
01:54:07.000 What's this one?
01:54:07.000 Search Adrenochrome.
01:54:09.000 I wonder what that one is.
01:54:12.000 Oh, I did look up Made in America, and it's like Iron Maiden.
01:54:15.000 You typed in the AI to it?
01:54:16.000 Oh, Made in America.
01:54:18.000 Raymond G. Stanley Jr.
01:54:19.000 says PimTool.
01:54:21.000 That actually would be a funny one.
01:54:23.000 He also said Luke Milkers.
01:54:24.000 Damn right.
01:54:25.000 I don't know if that would work.
01:54:26.000 That one's too new of an idea.
01:54:28.000 Too new of an idea.
01:54:29.000 Someone said Cleft the Misfit, Ben Shapiro playing tennis.
01:54:32.000 We did that one.
01:54:32.000 That was actually really funny.
01:54:34.000 RJ says, Ian defeats climate change with graphene.
01:54:37.000 I actually referenced that in my segment earlier today when I was talking about euthanasia, the Great Reset and how they want population reduction.
01:54:44.000 And I pointed out that at the turn of the century, the late 1800s and early 1900s, they were writing about how horse manure would pile up in the streets of New York and it would be a disaster.
01:54:53.000 And the car was invented and it never happened.
01:54:55.000 Now they're saying carbon is destroying the planet and it's gonna end the world.
01:54:59.000 Perhaps a new technology like mining carbon from the atmosphere to make graphene is going to stop that.
01:55:05.000 So it's like, if we've done it before and used technology to overcome these problems, why would we not do it again?
01:55:10.000 And why is the Great Reset the solution?
01:55:12.000 It's not.
01:55:13.000 I think they're lying about it because they want political power.
01:55:16.000 I think their view is just like, oh no, we have to do this.
01:55:19.000 Give us all of your authority and bend the knee to us so we can be in charge.
01:55:22.000 It's either that or it's very short-sighted, and they're not understanding about solutions, and that's also very bad, and you don't want to follow people that are short-sighted.
01:55:31.000 27 said, quote, the end is Bill Nye.
01:55:35.000 Tony Bologna says, Mark of the Beast.
01:55:36.000 Ooh, you want to type that one in?
01:55:37.000 That'd be good.
01:55:38.000 If it's good, I'll search for it too.
01:55:40.000 I typed, we are change milkers, but it's still loading.
01:55:45.000 Christopher says, do Joe Biden sniffing Ian Crosland's hair?
01:55:48.000 Please, please no.
01:55:49.000 No.
01:55:49.000 Give it a couple years and then I'll be, the AI will parse me properly.
01:55:53.000 Which, which, what did you want me to type in?
01:55:55.000 The end is nigh?
01:55:56.000 The end is Bill Nye.
01:55:57.000 Oh, Bill Nye.
01:55:58.000 Raul Hernandez says, this one's for Lydia.
01:56:00.000 Always loved your input and you'll be greatly missed.
01:56:02.000 Long live Chicken Ian.
01:56:03.000 Chicken Ian!
01:56:04.000 Yeah, we love Chicken Ian.
01:56:05.000 Chicken Ian will always be with you.
01:56:07.000 Can't get rid of him.
01:56:08.000 He's great.
01:56:09.000 Marked Ashamed says, so Rekieta Law's channel is back.
01:56:12.000 Seems like Google is doing the practice of blinking channels and websites as a new form of intimidation.
01:56:17.000 Well, that's what I was saying.
01:56:18.000 It was a mass report.
01:56:19.000 So the AI took him down instantly.
01:56:21.000 And then they went whoops and put him back.
01:56:23.000 So they said it was a mistake?
01:56:25.000 Well, I don't know if they said it was a mistake, but that tends to be what happens.
01:56:27.000 How's the jury system going at Minds?
01:56:29.000 Is it still active?
01:56:30.000 Yeah.
01:56:30.000 Because you see the AI Bill of Rights the White House proposed?
01:56:34.000 Did you see it?
01:56:35.000 I didn't.
01:56:35.000 Like this last week, they proposed this AI Bill of Rights.
01:56:37.000 And one of the things is you should have a right to a human in these social networks.
01:56:42.000 And I'm wondering if the human, because Tim was saying it's so expensive to hire somebody to admin, could be the community, the people that you'll have available.
01:56:51.000 You know, they've opted in.
01:56:53.000 Yeah, I mean, honestly, I think that the Birdwatch program at Twitter, the problem with it is that it's enforcing Twitter's ridiculous terms of service, which are censorship-based, but Birdwatch in itself is actually similar to the jury system on Minds.
01:57:08.000 So there are some good things about Birdwatch, but it's enforcing chaos.
01:57:13.000 Like, Twitter's terms are a joke.
01:57:15.000 I want to answer this Super Minds from Gnoldub.
01:57:23.000 What do you think of the Tucker Carlson interview with Kanye?
01:57:26.000 Do you think Kanye has any chance of becoming president?
01:57:29.000 Who has the best chance?
01:57:30.000 Well, okay.
01:57:31.000 I think the interview was great.
01:57:32.000 And Kanye, he says some stuff, but he mentioned that 50%, he said there are more black babies being aborted than born in New York.
01:57:39.000 Fact check true.
01:57:40.000 Fact check true.
01:57:41.000 And that's kind of crazy.
01:57:43.000 That's a terrifying concept.
01:57:45.000 You've got to look at how many are being born and then wonder about the specific population reduction targeting the black community, because a lot of these abortion clinics are in these neighborhoods.
01:57:55.000 And that's just weird and freaky.
01:57:57.000 Um, does he have a chance of becoming president?
01:57:59.000 Sure, but is it a big chance?
01:58:00.000 Probably not.
01:58:01.000 Who has the best chance?
01:58:02.000 Trump.
01:58:04.000 I mean, right now, Trump.
01:58:06.000 So, I don't know what else to tell you.
01:58:08.000 Trump be easy.
01:58:09.000 Alright, everybody!
01:58:10.000 It's Friday night.
01:58:11.000 If you haven't already, would you kindly smash that like button, subscribe to this channel, and share this show with your friends.
01:58:17.000 We've got a bunch of awesome members-only shows up from this past week, and from all the other weeks.
01:58:21.000 You can watch the whole library.
01:58:23.000 You can follow the show at Timcast IRL, and you can follow me at Timcast!
01:58:27.000 Bill, do you want to shout anything out?
01:58:29.000 Yeah.
01:58:29.000 Thanks for having me, man.
01:58:30.000 It was great to see you guys.
01:58:31.000 Yeah.
01:58:31.000 Even though you ran out of gas?
01:58:33.000 Hit me.
01:58:33.000 Yeah.
01:58:34.000 It's worth it.
01:58:34.000 It's worth it.
01:58:35.000 Yeah.
01:58:35.000 Hit me up.
01:58:36.000 Minds.com slash Ottman.
01:58:38.000 Also, you know, if you're a creator.
01:58:41.000 Let's do the superminds thing.
01:58:42.000 You can earn for replying to people.
01:58:45.000 We're really psyched about it.
01:58:46.000 I think that it's a new dynamic.
01:58:48.000 This is really cool because the queries exist outside of any other framework, which means you could incorporate these questions into any YouTube video you do.
01:58:58.000 We're going to do an OBS plug-in too.
01:59:01.000 So, like, I'll say this as a shout-out, for instance, like Jeremy over at The Quartering. Imagine you're doing a segment, and at the end of every segment you say, I'm gonna grab a couple Superminds. It's basically funding the production. Like, these two Superminds were big. I'm not saying every single one is gonna be big, but they can help. This is a way to fund smaller channels, smaller creators, videos and everything. If they're looking for sponsors, someone asking a question could be a form of sponsor.
01:59:25.000 What percent does Minds take?
01:59:27.000 It's 15%, which is half of Super Chats, but this is actually a key point.
01:59:31.000 Less than half.
01:59:31.000 We're doing a commission program, so if anyone signs up to Minds through your referral code, you get 5% of their earnings perpetually.
01:59:43.000 So that comes out of our 15%.
01:59:44.000 Oh wow.
01:59:46.000 So if a signup comes through anyone on the site's referral code, yours, anybody's, you get 5% of their stuff in perpetuity.
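As a rough worked example of that split, under one possible reading of what's said here (assumptions: the 5% is measured against the gross payment, and it comes out of the platform's 15% cut rather than the creator's share):

payment = 100.00                              # a $100 Supermind, for illustration
platform_cut = 0.15 * payment                 # $15.00 kept by Minds
creator_payout = payment - platform_cut       # $85.00 to the creator
referral_share = 0.05 * payment               # $5.00 to whoever referred the creator
platform_net = platform_cut - referral_share  # $10.00 left to Minds after the commission
print(creator_payout, referral_share, platform_net)  # 85.0 5.0 10.0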
01:59:57.000 Actually, you know, this is a funny story, but OnlyFans, when they first started, this caused them to absolutely explode because people were recruiting for them because they would get the commission fee.
02:00:10.000 So commissions are super powerful.
02:00:12.000 Yeah, yeah.
02:00:12.000 Right on, man.
02:00:13.000 Sweet.
02:00:13.000 Well, thank you for coming on.
02:00:15.000 Good to see you as well.
02:00:16.000 I have a Minds channel as well.
02:00:18.000 And I have two words for everyone.
02:00:20.000 Go to LukeUncensored.com and I'm there on the forum.
02:00:24.000 I'm going to be doing an AMA there.
02:00:26.000 I got three masterclasses, exclusive videos, merchandise, LukeUncensored.com.
02:00:31.000 And Lydia, we will miss you.
02:00:33.000 Thank you, Luke.
02:00:34.000 I appreciate it.
02:00:35.000 It's bittersweet, Lydia.
02:00:37.000 It's been quite a ride.
02:00:38.000 Yeah, it has.
02:00:39.000 Thanks for coming along, or thanks for having me along with you.
02:00:41.000 Of course!
02:00:41.000 Yeah, happy to have you.
02:00:43.000 And I'll let Ian say goodbye, but I had a couple things I wanted to say before I go.
02:00:46.000 Do you have anything more to say, Ian?
02:00:47.000 Just, yeah, yeah.
02:00:48.000 Take care of yourself, and this world, this is yours to make.
02:00:52.000 So do your best.
02:00:53.000 Do you want the last word?
02:00:54.000 So should I talk now?
02:01:02.000 That's fine, we'll do it that way, that's fine.
02:01:03.000 I just want to say thank you guys all so much, your very kind, compassionate words.
02:01:07.000 I really appreciate all your nice comments.
02:01:09.000 I'm not dying.
02:01:10.000 I'm still gonna be around, I'm still gonna be on Twitter, I'm still gonna be causing trouble,
02:01:13.000 posting on Instagram, doing all this stuff.
02:01:16.000 I am disappointed and it does pain me to say that I did something that I never thought I would have to
02:01:22.000 do.
02:01:22.000 I started an OnlyFans.
02:01:24.000 You guys can follow me over there at lidsoftiktok.
02:01:27.000 Go over, check it out.
02:01:28.000 We already made a little video for you and I think you guys are gonna like it.
02:01:31.000 Otherwise, you guys can follow me on Twitter and minds.com at sarahpatchlids as well as sarahpatchlids.me.
02:01:37.000 Lids of TikTok, it's actually very clever.
02:01:38.000 That's right, lids of TikTok.
02:01:40.000 So my friends, Lydia, it was a couple of years ago that I was posting nonsense on Instagram and Lydia had commented something pertaining to news, and I had noticed a couple of her comments were insightful.
02:01:54.000 And then I don't know if it was like I messaged you or you messaged me or something and then you started sending me stories you thought were interesting and I found that to be helpful because I was producing segments and I was like, oh, I didn't see that.
02:02:06.000 I missed that one.
02:02:07.000 And so that actually ended up helping me out.
02:02:09.000 Yeah.
02:02:09.000 When we decided we wanted to do some kind of show, I was like, hey, you know a lot of this stuff.
02:02:14.000 You know the stories.
02:02:14.000 You know the insights.
02:02:16.000 Why don't you come out and help us start this up?
02:02:18.000 And you could be, you know, producing and doing the camera stuff.
02:02:21.000 And so that was just like almost two years ago.
02:02:25.000 And then you came and joined the show since then, from the beginning.
02:02:29.000 And then there was another funny moment where, this is, I don't know, last year at some point I think, I saw these amazing videos on Instagram from a rollerblader, Brett Dasovic, and he has a series called Audio On, and it was him skating really unique ways, and I was not a rollerblader, I've been rollerblading more these days because it's a blast.
02:02:51.000 And I saw these clips and they were cool and he's really good.
02:02:55.000 And so then I was like, we need to bring more action sports people out.
02:02:57.000 I invited skateboarders, scooter people, and then I hit him up and I was like, you want to come out and skate?
02:03:02.000 And Brett knew the show and he was like, yes.
02:03:04.000 And then he said, can my friend come with me?
02:03:06.000 And I was like, who's your friend?
02:03:07.000 He's like, Andy.
02:03:08.000 And I said, sure!
02:03:09.000 Yep.
02:03:10.000 Andy came out, and then I meet these two guys, and they're really awesome.
02:03:14.000 And I said, you know, why don't you guys work here?
02:03:18.000 We'll find a way to make this work.
02:03:20.000 You guys are really cool.
02:03:21.000 I think you can help us out.
02:03:22.000 And initially, we had Brett filming, because we needed film for the vlog.
02:03:28.000 And Andy was doing, like, grounds maintenance and control for, like, skate park stuff.
02:03:33.000 And then we found out that Andy's basically like a tech whiz security expert with a degree and everything.
02:03:38.000 And I was like, OK, we are drastically underutilizing the talents of this man.
02:03:41.000 And then he ends up building out this studio and the design and all the cables.
02:03:45.000 And then Brett, we find out, is a pop culture genius.
02:03:48.000 And we're like, Brett, you got to do a show.
02:03:50.000 Andy, we said, you've got to be our CTO.
02:03:52.000 We need your expertise to be able to do this.
02:03:55.000 And then at some point Andy and Lydia fell in love and got married and now are going off into the wild to have a family.
02:04:02.000 So I want to say thank you so much to you guys for everything.
02:04:07.000 It's been an amazing ride and I hope you guys will come back and visit.
02:04:11.000 And we'll make sure to shout out the stuff that you're working on when it's up and running.
02:04:18.000 It's been absolutely amazing.
02:04:20.000 And we have some new people who are going to come and help us, but they will never, unfortunately, I'm going to look you in the eyes, they will never...
02:04:28.000 You'll never be the same!
02:04:31.000 Can I just say before I go too that my replacement is fantastic.
02:04:34.000 You guys are really gonna love him.
02:04:36.000 He's an extreme sports dude.
02:04:37.000 He's an international bro.
02:04:39.000 He's a music professional.
02:04:40.000 You guys are gonna love him.
02:04:41.000 He's fantastic.
02:04:42.000 People are like, oh, you got big shoes to fill.
02:04:44.000 Literally, I have big feet.
02:04:45.000 But it's like, he's fantastic and I think you guys are gonna love him for sure.
02:04:48.000 So don't worry.
02:04:50.000 He's awesome.
02:04:51.000 I'm leaving the show in good hands.
02:04:53.000 I'm looking forward to the future.
02:04:54.000 Right on.
02:04:55.000 All right, everybody.
02:04:56.000 Thank you all so much for everything.
02:04:58.000 Thank you to Andy and Lydia for everything you've done to help make all of this possible and grow this company.
02:05:04.000 And we'll see you all next time.
02:05:06.000 And then we'll make sure that once you guys have whatever show it is you're doing, whatever, we can make sure we can shout it out and everybody knows where to find you.
02:05:12.000 Awesome.
02:05:12.000 Thanks.