Timcast IRL - Tim Pool - March 07, 2026


CALIFORNIA IS FLIPPING REPUBLICAN | Timcast IRL #1464 w/ Vish Burra


Episode Stats

Length

2 hours and 8 minutes

Words per Minute

197.48772

Word Count

25,443

Sentence Count

2,097

Misogynist Sentences

29

Hate Speech Sentences

32
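The stats above can be cross-checked with simple arithmetic. A minimal sketch, assuming the listed figures are exact; note the page's 197.48772 WPM implies the tool divided by the precise duration in seconds (about 2 h 8 min 50 s), not the rounded 2 h 8 min:

```python
def words_per_minute(word_count: int, hours: int, minutes: int) -> float:
    """Average speaking rate from a word count and a rounded duration."""
    return word_count / (hours * 60 + minutes)

def implied_minutes(word_count: int, listed_wpm: float) -> float:
    """Duration implied by the page's own word count and WPM figures."""
    return word_count / listed_wpm

# Using the rounded 2 h 8 min length gives ~198.77 WPM,
# slightly above the listed 197.48772.
rounded_wpm = words_per_minute(25_443, 2, 8)

# The listed WPM implies ~128.83 minutes, i.e. roughly 2 h 8 min 50 s.
duration = implied_minutes(25_443, 197.48772)
```

The small gap between the two WPM values is just the truncated seconds in the stated episode length.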


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this week's episode, we discuss the California governor's race, the BBC's incorrect reporting of a Hegseth speech, and whether or not there's any plastic in Pool Water. Plus, a new Chipotle chicken dish.

Transcript

Transcripts from "Timcast IRL - Tim Pool" are sourced from the Knowledge Fight Interactive Search Tool.
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:02:10.000 Oh, it's getting spicy in California.
00:02:12.000 They got too many Democrats that are trying to be governor, and the way the race works is the top two will advance to a general election.
00:02:19.000 And right now, the top two contenders to win the governor's race in California are Republicans.
00:02:24.000 And here's the funny thing.
00:02:26.000 You know, for a while, everybody said, yeah, well, it doesn't matter because Democrats will start dropping out.
00:02:30.000 And when they do, the majority of people in California are going to want to vote for a Democrat.
00:02:34.000 Right now, between the two Republicans, they have about 30-31 percent in the polls.
00:02:39.000 And then among all of these other Democrats, there's like 11 or 12 or something.
00:02:44.000 They're splitting up the vote.
00:02:46.000 Well, we thought they would drop out, except now they're all fighting with each other, accusing each other of being bad or just insulting them.
00:02:52.000 And well, that's what happens when you have a political party that just is willing to lie, cheat, and steal to get political power.
00:02:57.000 So the latest calls from the Democratic Party, the establishment machine, for certain candidates to leave the race so that two Republicans don't go head to head, it's falling on deaf ears.
00:03:08.000 They're not going to do it.
00:03:08.000 So here's what's going to happen.
00:03:09.000 It is projected as of right now.
00:03:11.000 And this is, again, I know it's a long shot, so probably not going to happen.
00:03:14.000 But if the top two contenders advance to the general, there will be no Democrat option.
00:03:20.000 It will literally be, congratulations, California.
00:03:22.000 You get to vote for Republican or Republican, and then California will be Republican.
00:03:28.000 I don't know what that means for the people who live there, but I don't know what that means for a governor who's not going to be able to just rubber stamp anything.
00:03:34.000 You're going to have a supermajority of Democrats throughout the state anyway, but it'll at least be interesting.
00:03:39.000 So we'll talk about that.
00:03:40.000 Then we've got a report from NBC that Trump is considering sending U.S. boots on the ground into Iran.
00:03:46.000 Again, we'll see if it's true.
00:03:48.000 Could be scuttlebutt.
00:03:49.000 The crazier story, in my opinion, it is a bit, you know, I love the word esoteric, but still massively impactful.
00:03:58.000 The BBC falsely edited a speech from Hegseth to air in Iran, claiming, as Hegseth is speaking, they translate it for him to say he is calling or he is going to bring death to the Iranian people, which he didn't say.
00:04:15.000 And that is terrifyingly and egregiously wrong.
00:04:19.000 But are we really surprised that the BBC is doing this?
00:04:22.000 Because this is what they seem to do.
00:04:24.000 We're going to talk about that and a whole lot more, my friends, of course.
00:04:27.000 Before we do, we got a great sponsor.
00:04:29.000 It's ourselves, Pool Water.
00:04:31.000 My friends, head over to castbrew.com, scroll down, and we've got aluminum bottle Pool Water.
00:04:38.000 It's not actually pool water.
00:04:42.000 It is just Pool-brand artesian water.
00:04:42.000 It's a funny gag if you want to have, you know, bottles of pool water around your house, but it is delicious, clean, totally drinkable.
00:04:49.000 And I have big news.
00:04:51.000 I wanted to verify this because people had asked if there's plastic in the can.
00:04:54.000 And I'm like, all cans have plastic.
00:04:56.000 But these are aluminum bottles, in fact.
00:04:58.000 And as it turns out, our manufacturer has informed us they do not have any plastic lining these cans.
00:05:05.000 The lids have gaskets, the same as any other bottled beverage.
00:05:08.000 So there's going to be some plastic in it.
00:05:10.000 But this is actually pretty surprising because unlike some canned water beverages that do have plastic liners in them, ours, according to the manufacturer, do not.
00:05:20.000 So if you want to buy some bottles of this here Pool Water, many people were asking when it was going to be available.
00:05:25.000 We got them in aluminum cans right now.
00:05:27.000 And I got the confirmation.
00:05:28.000 The other day, I wasn't so sure because I assumed that all cans had plastic.
00:05:32.000 And I told my crew and they're like, hey, they said there's no liner in there.
00:05:35.000 And I was like, get a certification or confirmation.
00:05:38.000 And we got emailed back and they were like, there's no certification.
00:05:41.000 We just don't use it.
00:05:41.000 It's just aluminum.
00:05:43.000 And I guess the issue is for water, it's not actually needed.
00:05:46.000 So that's actually really interesting.
00:05:47.000 Pick it up at castbrew.com, my friends.
00:05:50.000 And also don't forget, if you go to the boonieshq.com store, we still have a handful of the Step on Snek and Find Out limited edition skateboards.
00:06:00.000 I don't know how many of the golden foil graphics have gone out already.
00:06:04.000 We know a handful have.
00:06:06.000 But there were 200 Step on Snek and Find Out boards made, and only 10 come as foil golden metallic print.
00:06:15.000 And they are serialized with one, two, three, four, five, et cetera, of 10.
00:06:19.000 And my understanding is there's still some out there.
00:06:21.000 It could be wrong.
00:06:22.000 But these are available right now.
00:06:24.000 So go to boonieshq.com, pick them up if you would like.
00:06:26.000 Don't forget to also smash that like button, my friends.
00:06:29.000 Share the show with everyone, you know.
00:06:31.000 Joining us tonight to talk about this and so much more.
00:06:33.000 We got Vish Burra.
00:06:34.000 Thank you so much for having me on, Tim, Phil, Carter, Ian.
00:06:38.000 My name is Vish Burra.
00:06:39.000 I'm a political consultant, MAGA operative extraordinaire.
00:06:43.000 Worked with Steve Bannon, Matt Gaetz, George Santos, all favorites of this show.
00:06:48.000 And I'm glad to be here with you guys.
00:06:49.000 Right on.
00:06:50.000 Should be fun.
00:06:51.000 Well, you know, guys, I'm also going to add this.
00:06:53.000 We're just foregoing the general introductions from this point on.
00:06:56.000 I mean, Crossland, if you didn't already know.
00:06:57.000 No, I feel kind of weird about it, too.
00:06:59.000 Yeah, because we got to the point where it's like seven minutes in and we're introducing the same people every single time.
00:07:04.000 And we were like, yeah, I don't think we need to.
00:07:05.000 The crew tried to.
00:07:06.000 I didn't mix it up.
00:07:07.000 And it was like, well, he can just introduce himself.
00:07:10.000 But we'll introduce the hat to you.
00:07:10.000 Indeed.
00:07:12.000 But let's just jump straight into the news, my friends, from the New York Times.
00:07:16.000 Democratic infighting begins in California governor's race.
00:07:20.000 Begins now?
00:07:20.000 It's been going on for some time.
00:07:21.000 Party leaders are starting to panic over the possibility that too many Democratic candidates could hand Republicans the governor's office.
00:07:28.000 Indeed.
00:07:29.000 And here's what I love from the AP.
00:07:31.000 Top California Democrat flops with call for candidates to exit the governor's race.
00:07:36.000 This guy, this is it: a late-hour attempt by California's top Democratic official to thin out the party's crowded field has flopped, leaving the contest virtually unchanged.
00:07:46.000 Outgoing Democratic governor Gavin Newsom has acknowledged fears inside the party that multiple Democratic candidates could undercut each other in the June 2nd primary election, opening a pathway for a Republican to seize the job in one of the nation's most solidly Democrat states.
00:08:03.000 And we have this.
00:08:04.000 This is the California Top Two Twins website.
00:08:07.000 And it's showing the probability of who the likely candidates are going to be.
00:08:12.000 So the way it works, for those that don't know, they're going to have a primary.
00:08:15.000 It's open.
00:08:16.000 It can be any party.
00:08:18.000 The two individuals that get the most votes will advance to a general election.
00:08:21.000 The only issue is that the two individuals pulling at the top right now are Steve Hilton and Chad Bianco, two Republicans, because the Democrats are cannibalizing their own voter base.
00:08:31.000 I think, however, it may actually be fair to say the Democrat voter base actually isn't one singular party, and that's why this is happening.
00:08:39.000 When these Democrats say, hey, look, we're all Democrats, hey, Katie Porter, drop out so Swalwell can win.
00:08:44.000 Katie Porter is not a Swalwell Democrat.
00:08:47.000 She's a progressive going, no, he's a machine state crony.
00:08:50.000 I'm going to win.
00:08:51.000 And then you got Tom Steyer who's like, you're all crazy.
00:08:53.000 We need moderates back.
00:08:55.000 I'm going to win.
00:08:56.000 They're all different political ideologies.
00:08:59.000 The issue, however, is that Steve Hilton and Chad Bianco are at least somewhat similar.
00:09:04.000 So for most people, I think their choices come November are going to be Republican v. Republican.
00:09:09.000 And it's funny, I guess.
00:09:11.000 I thought this was what they were doing during the federal election of 2020 with Biden.
00:09:15.000 We had all those Democrats on stage, and it felt like they were cannibalizing.
00:09:19.000 They were all grasping.
00:09:20.000 And then all of a sudden, someone got the call or the call went out and they all dropped out at once and supported Biden.
00:09:25.000 You just don't see it in state-by-state elections because they didn't have the USAID machine behind it manipulating, you know, and contacting.
00:09:32.000 And USAID isn't whoever it was.
00:09:34.000 It was the DNC tongue-in-cheek.
00:09:36.000 It was Obama.
00:09:37.000 Where were they getting their money from?
00:09:38.000 He went, made that call.
00:09:38.000 That was Obama.
00:09:40.000 Obama made the call saying you should all drop out and support Biden.
00:09:44.000 Yeah.
00:09:44.000 Really?
00:09:45.000 Yes.
00:09:45.000 It was Obama who did that.
00:09:46.000 And the way he did that was by going to Jim Clyburn.
00:09:51.000 Yeah.
00:09:51.000 The most powerful black Democrat in the Democratic Party and getting his black caucus and the votes in South Carolina to line up behind Biden.
00:10:02.000 And that's when everyone knew, like, oh, this is over now.
00:10:04.000 That's a great point that you make about Clyburn.
00:10:06.000 Everybody watching should know.
00:10:07.000 If you want to know what the Democrats are going to do, watch what Clyburn does.
00:10:11.000 Clyburn has so much pull in the Democrat Party.
00:10:15.000 If Clyburn says yes, or you get Clyburn on your side, it's a guarantee for the Democrats.
00:10:20.000 If you don't get Clyburn, you can forget about it.
00:10:23.000 People talk a lot about how the Democrats work in lockstep.
00:10:26.000 And I think that's sort of an illusion or general manipulation by the machine state, literally, you know, through like Obama making phone calls.
00:10:34.000 And it's not like an inherent thing about the Democratic Party throughout.
00:10:37.000 There's just like, you know, the large kind of machine behind the Democratic Party in aspects.
00:10:43.000 But you can see here, it's kind of apparent.
00:10:45.000 I just want to go back to the good old days where no matter what happened, it was always a conspiracy.
00:10:49.000 Like, let's just, you know what?
00:10:51.000 Trump's part of it.
00:10:51.000 He's been in it the whole time.
00:10:53.000 He's the controlled opposition.
00:10:56.000 And he went, no, it's not the Democrats, the machine state.
00:10:59.000 So the conspiracy theory back in the day, I was in Fort Lauderdale at a Trump rally back in like 2015 or 16.
00:11:06.000 And there was a woman outside holding up a big poster board with a picture of Trump and Hillary together.
00:11:11.000 And the Trump supporters were like, what are you doing?
00:11:14.000 And she was protesting, basically saying, Trump is friends with all of these people.
00:11:17.000 They didn't stop being friends with Trump.
00:11:19.000 They're just doing that so that you think Trump is an outsider.
00:11:23.000 And the conspiracy theory is, and I think the funniest moment in this, because I don't actually believe it, but we had General Flynn on, and I asked him about it, and he gave a response that a lot of people said sounded like it was true.
00:11:36.000 And that was, I had said on the show, this is a year and a few months ago, November, that the conspiracy theory, I explained it while General Flynn was on the show.
00:11:47.000 The idea is this.
00:11:48.000 Here's the idea.
00:11:49.000 In the end of the 2010s, we saw the expansion of people like Alex Jones.
00:11:55.000 He had been getting more and more popular.
00:11:58.000 And around this time, you saw the emergence of the Ron Paul Love Revolution.
00:12:01.000 Ron Paul, they saw the makings of an internet-based populist uprising.
00:12:05.000 Ron Paul starts getting a ton of attention.
00:12:08.000 He gets his internet campaign, and they could not control for it because it was grassroots, viral, organic.
00:12:14.000 And so the intelligence operation said, no, no, no, we'll just make sure it can never happen.
00:12:18.000 So what do you do?
00:12:19.000 Well, the problem is, if you are the government and you come out and say, hey, do a thing, people will say, no, we'll do the opposite.
00:12:26.000 The conspiracy theory goes that at this point, they said, we need someone who can be our outsider, who can appear to not be like the rest of the political machine state so that regular people think they're voting for the anti-establishment candidate.
00:12:41.000 But in fact, he's been our buddy the whole time.
00:12:43.000 And they said, Donald Trump.
00:12:46.000 So Trump registers, you know, make America great again, decides he's going to run.
00:12:49.000 And then he plays the anti-establishment heel, right?
00:12:53.000 This is why Bernie Sanders was blocked.
00:12:55.000 Bernie Sanders actually was standing in a gym launching a campaign, very similar populist uprising, but they easily controlled for this.
00:13:04.000 The conspiracy theory goes that Trump was actually the intended candidate to win.
00:13:08.000 And all of this opposition that we've seen with the impeachments, the reason why they always fail and Trump always wins, is to convince people that they're voting for the person fighting the establishment.
00:13:17.000 And then Trump declares war on Iran, goes and bombs the crap out of it, kills the Supreme Leader, and accomplishes what the Bush administration and the Obama administration had been trying to do forever.
00:13:27.000 I mean, going back to Clinton and even Bush Sr., we had the, who was the general who came out?
00:13:34.000 Rudkowski knows the answer to this, who said, we're going to wipe out seven countries.
00:13:38.000 Wesley Clark.
00:13:39.000 Wesley Clark.
00:13:39.000 Are you sure?
00:13:40.000 Yeah, nine countries in seven years.
00:13:42.000 Was it seven?
00:13:42.000 I thought it was seven.
00:13:44.000 And Iran was one of them.
00:13:45.000 And now you have Donald Trump.
00:13:47.000 People vote for him as the anti-establishment guy.
00:13:49.000 They've never actually stopped him from doing anything.
00:13:52.000 They've just done a bunch of things that would appear to be detrimental.
00:13:56.000 And then Trump gives the machine state its war with Iran.
00:13:58.000 Now, again, I'm not saying I believe that conspiracy theory.
00:14:01.000 My point is, weren't the good old days great when we could just assume that no matter what was happening, the Democrats and the Republicans were working together behind our backs?
00:14:09.000 Yeah, if only it was that simple.
00:14:10.000 I mean, I think we wish that we could give those kind of simplified answers for all the conspiracy talk on it.
00:14:17.000 It's a simple answer.
00:14:18.000 It doesn't really work that way.
00:14:20.000 Yeah, Trump has kind of been in that milieu of elites and has been around these people, their friends and everything.
00:14:28.000 But they were cool with him as long as he wrote the check and let them do their business down in D.C. What they weren't expecting was that he wanted to come join the party too.
00:14:38.000 And I think that that's when it all kind of went screwy.
00:14:43.000 And then that's when these folks either tried to co-opt him, infiltrate him, or just take him head on.
00:14:49.000 You know what the craziest thing is, is that everything I described is 100% true.
00:14:54.000 And even we are in on it.
00:14:56.000 We're paid, of course, by Israel.
00:14:57.000 And they organize all of this.
00:15:00.000 And the funny thing is I can say this right now, and it won't matter because the people who already believe it will always believe it.
00:15:04.000 And everyone else will just think I'm joking.
00:15:06.000 So it was Wesley Clark.
00:15:08.000 It was in 2007.
00:15:09.000 Wesley Clark on Democracy Now said that he had spoken with a high-ranking U.S. Army officer about a classified Pentagon memo outlining a plan to overthrow seven governments within five years.
00:15:19.000 And that's in 2007.
00:15:20.000 So that's why, well, partly Iraq, Iran, God, who else was on there?
00:15:25.000 Iraq, Syria.
00:15:25.000 Afghanistan?
00:15:26.000 It was Iraq.
00:15:26.000 Syria, Lebanon, Libya, Somalia, Sudan, and Iran.
00:15:29.000 So, I mean, I don't think the U.S. has the government in Somalia or Sudan.
00:15:34.000 Well, there was the Somaliland thing that just happened.
00:15:39.000 Yeah.
00:15:40.000 Yeah, and Lebanon's been, you know.
00:15:41.000 There are people that swear up and down that they've actually, the plan has been put into effect and they've actually followed through and Iran was the last one.
00:15:49.000 But I don't think that actually holds water.
00:15:51.000 Like I said, Somalia and Sudan haven't been, the U.S. hasn't had a significant action against either of them.
00:15:56.000 You know what I like doing?
00:15:57.000 It's really funny just to think about like when I when I was younger, maybe in like the 2000s or early 2010s and I'm on the internet and all this stuff in the world is going on.
00:16:05.000 I remember when like WikiLeaks Cablegate happened.
00:16:08.000 What was that, 2009?
00:16:08.000 Do you guys remember that?
00:16:10.000 And I'm just chilling in my bedroom.
00:16:12.000 Like I was making skate videos.
00:16:14.000 I got no idea.
00:16:15.000 I'm just reading the news and I'm like, man, this Cablegate stuff is crazy.
00:16:18.000 And then you hear all these conspiracy theories and I, you know, I periodically would see some Alex Jones stuff, obviously with like the 9/11 Loose Change stuff.
00:16:25.000 He was getting a lot of attention.
00:16:27.000 And then the funny thing is, now I've got, you know, two and a half million followers on X and half a million on Instagram and all these followers.
00:16:34.000 And I get accused of being part of those very same conspiracies.
00:16:38.000 But of course, now being on the other side of it, I rather enjoy sometimes going to a random person's account who's talking about me and then commenting on one of their posts that it's all true and no one will ever believe you.
00:16:49.000 It's very Bill Murray of you.
00:16:51.000 He just drops in on randoms.
00:16:52.000 They're going to like screenshot this and be like, dude, Tim Pool admitted that's a big conspiracy and he's in on it.
00:16:56.000 It's hearing this tweet and people are going to be like, shut up.
00:16:58.000 That's the final version of your performance art, by the way, is just to participate in the conspiracies about you.
00:17:04.000 I kind of do some.
00:17:06.000 You do get a taste of that on his account.
00:17:08.000 It would be awesome, actually, if every single person in media just worked for one company that was like the establishment.
00:17:14.000 And you didn't have to worry about expenses or salaries.
00:17:18.000 That's the world they want for you.
00:17:20.000 They want to work.
00:17:21.000 I'm saying this somewhat facetiously, but like I exist in a world where I have to run a business and it's very difficult.
00:17:28.000 And, you know, every day you're tracking like sponsors.
00:17:30.000 Sometimes sponsors get mad.
00:17:31.000 And they're like, we got to do this one over.
00:17:32.000 And you got to do all this negotiating, got to manage people.
00:17:34.000 It would just be so much easier if Israel really did run everything.
00:17:38.000 And then I was like, I didn't have to do anything because Israel was like, here's a blank check.
00:17:41.000 I'd be like, let's go.
00:17:41.000 Just do whatever you want.
00:17:43.000 Or Russia.
00:17:43.000 When they're like, Tim Pool's paid by Russia, it'd be amazing if all of my bills were just covered.
00:17:48.000 And it's like, we didn't have to worry about, you know, oh, can we maintain this project?
00:17:52.000 No, the budgets, you know, we've got.
00:17:54.000 We're doing all these other shows and we've got budgets for projects.
00:17:57.000 And then the budget, we hit that threshold.
00:18:00.000 And we're like, this one's not going to work.
00:18:00.000 We're over budget.
00:18:02.000 We've got to cancel it.
00:18:03.000 Yo, just bring on the Israeli money, right?
00:18:05.000 Then they can pay for everything.
00:18:06.000 I'm kidding.
00:18:07.000 It's not real.
00:18:08.000 It doesn't exist.
00:18:09.000 Some people got paid by a PR firm on behalf of Israel.
00:18:12.000 But for the love of all that's holy, like there is not some grand political machine organizing all of these different podcasts and personalities to say these things.
00:18:12.000 That does happen.
00:18:21.000 It just doesn't exist.
00:18:23.000 Yeah, it's kind of like a hive mind.
00:18:24.000 But then again, you see the example of like CBS News, right?
00:18:28.000 That was the deal with Paramount, right?
00:18:31.000 There is some coordinated effort, but it's not a grand conspiracy.
00:18:36.000 I think most of it is hive mind, right?
00:18:38.000 You don't, if you, if everyone buys into the same ideology and you're all educated in like one understanding of a mission, you don't need to give directions, right?
00:18:49.000 The directions have already been given.
00:18:51.000 You just go and pursue that ends by whatever means is available to you.
00:18:55.000 And so if you have like a big believer like David Ellison, who's the number one donor to the IDF, and then he's going and making the TikTok deal.
00:19:05.000 He's going and helping with the CBS News deal.
00:19:07.000 He's going and helping with the Paramount deal.
00:19:10.000 I mean, is Bibi Netanyahu and, what, the Elders of Zion on the phone with him, making sure he's making all these moves?
00:19:17.000 No, they don't need to do that because he believes it on his own.
00:19:20.000 Yeah.
00:19:20.000 And he's willing to do it.
00:19:21.000 And that's actually the real truth about it.
00:19:24.000 People don't want to believe that because they think that there's like this one rat's nest that you could hit and everything will go back to normal.
00:19:34.000 And that's just not the case.
00:19:35.000 Yeah, I get the same kind of stuff.
00:19:37.000 Like people think that like, because I'm not super critical of Israel all the time, that I must be paid by Israel or I'm not allowed to say things that I think.
00:19:46.000 And it's like, I wrote a piece on my Patreon about how I think that a lot of what's going on in Iran is connected to China and to a broad strategy.
00:19:54.000 Venezuela, China, Venezuela and Iran both send a bunch of oil to China.
00:19:59.000 And it's like in the long term, it's trying to weaken China.
00:20:03.000 And people are like, oh, you're just running interference for Iran.
00:20:05.000 You're just running or for Israel.
00:20:06.000 You're just running interference for Israel.
00:20:07.000 And it's like, no, if you actually look at the situation, like I have a bunch of links in the piece, like if you actually read the links and look at the situation, it does make perfect sense.
00:20:16.000 And there's a lot of people that have come out and said since then that this is a lot of it is about, you know, about China.
00:20:22.000 The U.S. has its own interests.
00:20:23.000 And so the idea that everything the U.S. does is controlled by Israel is just ridiculous.
00:20:29.000 So to throw a wrench in that, then, if, you know, this is really about China, this Iran thing, right?
00:20:35.000 Why would Trump come out and say we're going to help escort some of this oil that's stuck in the Strait of Hormuz out of the Strait of Hormuz and to be able to be delivered to Asia, essentially, China?
00:20:47.000 Well, because it's not specifically going to China.
00:20:49.000 The oil that China was getting from Iran was outside of what was, it was sanctioned oil.
00:20:55.000 So all the stuff that Iran is sending out, it was all like basically undercover.
00:20:59.000 It wasn't like official stuff.
00:21:01.000 So anything that's going out of the Strait of Hormuz that the U.S. is trying to help, it's going to other places like India or to other countries in Asia.
00:21:08.000 Now, is it possible that some gets to China?
00:21:11.000 But the stuff that was coming out that was going to China, China was taking 80% of the oil that they got was coming from Iran.
00:21:11.000 Sure.
00:21:18.000 Now, that's not 80% of the oil coming out of Iran and not 80% of the actual electrical or fuel power or whatever that China gets.
00:21:27.000 But 80% of the oil that was coming out of Iran was going to China.
00:21:32.000 But any of it going to China, wouldn't that undercut the whole argument?
00:21:34.000 No, if you're trying to screw somebody, but you don't want them to know or you want to look like the good guy, you take away their prospects and then you give them something.
00:21:42.000 China has to buy oil on the international market controlled by the petrodollar.
00:21:46.000 Yeah.
00:21:47.000 So if China's buying oil and it's through our petrodollar system, that's exactly what we want them to be doing.
00:21:53.000 And it's not.
00:21:54.000 So you're saying at the end of the day, as long as it's being bought by the dollar.
00:21:58.000 Not as long as that's whip crack, get in line, China.
00:22:02.000 And so we're delivering the oil because they're bending the knee.
00:22:05.000 And it's not intended to, it's not like this is going to be a crippling thing to China.
00:22:09.000 This is all stuff around the edges.
00:22:11.000 That's why Venezuela and Iran together are something that is affecting China.
00:22:15.000 Either one of them alone doesn't really have a massive effect.
00:22:18.000 But if you look, China's stopped flying jets over Taiwan, and there's a lot of pieces that are coming out about the internal struggle going on in China.
00:22:25.000 They thought that the U.S. was in decline, and these two actions have really made China rethink their position on or their posture on the U.S. Let's jump to this story from CBS News.
00:22:36.000 A third of New Yorkers are planning to leave the state within the next five years, according to a poll from Marist University.
00:22:42.000 Of that group, 40% indicated it's because of the cost of living, 21% said quality of life, and 15% said taxes.
00:22:48.000 One realtor tells CBS News an apartment renting for about $3,500 in Jersey could cost anywhere from $5,000 to $20,000 in New York City, depending on location.
00:22:59.000 Well, California is going to get a Republican governor because they're all fighting each other.
00:23:03.000 New York is falling apart.
00:23:05.000 I think culturally, as we've already talked about video games ad nauseam, we can see that the fabric, the underlying fabric of the U.S. seems to be disintegrating.
00:23:15.000 And I stress this as we laugh about, you know, people are going to leave New York, ha ha ha, our greatest city.
00:23:22.000 People are fleeing because they can't live there.
00:23:24.000 But I assure you, the Haitian migrants and other illegal immigrants who are getting free housing will not be fleeing there.
00:23:30.000 And I will also stress that we have repeatedly talked about the crumbling cultural issues we have in this country that's not being repaired.
00:23:39.000 And dare I say, it looks like woke didn't go away.
00:23:42.000 It's just gnawing at us from underneath and destroying the fabric of our American tradition.
00:23:47.000 Yeah, you mentioned a few days ago, you said the American culture is dead, but then you later kind of said it's being destroyed.
00:23:55.000 And I agree with the destruction.
00:23:56.000 It's like after the internet and the internet video especially appeared, all the cultures of Earth were dropped on the table and they all shattered.
00:24:03.000 And now it's like a mess of puzzle pieces.
00:24:05.000 And we're like, what is this?
00:24:06.000 It looks like communism, but it's got an American flag on it.
00:24:09.000 And people are like, what goes where?
00:24:10.000 We're trying to rebuild.
00:24:12.000 And you have people like Phil, historians that are like, that does not belong in the American puzzle.
00:24:16.000 I know that.
00:24:17.000 Even though it might look like it does and it might seem like it fits.
00:24:19.000 So we're rebuilding.
00:24:21.000 I somewhat agree, but I don't think it's that they dropped it and it shattered.
00:24:24.000 It was that we had the puzzle done.
00:24:27.000 And since the dawn of social media, psychopaths have been pulling pieces out and throwing them in the air.
00:24:32.000 And now we're trying to catch them and put them back together, but they're destroying it faster than we can put it back together.
00:24:38.000 Well, that's a good debate because it depends on how you look at it.
00:24:41.000 No, I think that's fair because if you go to 2016, like we already talked about, when we were talking about Overwatch, for those unfamiliar, a popular video game, 50 million views on one of their cinematic release trailers.
00:24:52.000 And then three years ago, the latest release trailer got 11 million views.
00:24:56.000 And then they make Concorde.
00:24:58.000 And Concord, the video game, is guys, if you are not familiar with gaming stuff, Gamergate was the beginning of all of this.
00:25:06.000 It was like the first great battle of the culture war, which eventually becomes the Cold Civil War, whatever we're experiencing.
00:25:12.000 The latest, one of the latest, not necessarily that, but this was back last year in September, August.
00:25:19.000 One of the biggest, if not the biggest, media flop failures in the history of all media.
00:25:26.000 I am not exaggerating when I say the biggest flop in all of human history in terms of a media production is the game Concord.
00:25:35.000 They made a bunch of characters that just look like a Tumblr blog meetup.
00:25:43.000 And like one of the characters is just like a morbidly obese Indian guy.
00:25:48.000 You can't tell what any of the characters are, what they do, what they're supposed to represent.
00:25:52.000 It looked like a bad fanfic, college freshman, woke nonsense.
00:25:57.000 And they put the pronouns in each character's bio, like when you're going to character selection, they had pronouns, and one of them was undecided.
00:26:06.000 Like this is the point.
00:26:09.000 They are ripping to shreds.
00:26:11.000 Not only 10 years ago, we had a functional culture.
00:26:14.000 So again, not to rehash the conversation from the other day, but to go back to what's going on in New York, it's intentional.
00:26:20.000 De Blasio, these people are Marxists.
00:26:23.000 They want to burn the American tradition to the ground.
00:26:27.000 And I don't know if it's possible to be reversed.
00:26:29.000 I will stress it with this point.
00:26:31.000 Never in history, not one time, has a civilization been able to reverse a population decline collapse.
00:26:41.000 Not once has it happened.
00:26:43.000 Every single civilization that has reached the point we are at in terms of population decline has collapsed as a civilization.
00:26:53.000 Like the people will exist.
00:26:55.000 There will still be Texans, right?
00:26:58.000 But the idea is you are going to see this system break down.
00:27:04.000 And what that means is the collapse of the Roman Empire is the easiest example.
00:27:08.000 It breaks up in a bunch of smaller states.
00:27:10.000 Then the Latin language fragments and becomes a bunch of other languages.
00:27:14.000 I don't think we'll have that same issue.
00:27:16.000 Actually, no, I take that back.
00:27:17.000 I take that back.
00:27:19.000 I'm going to say this.
00:27:20.000 So you have Rome and people speak Latin, right?
00:27:25.000 When Rome collapses and it fragments, you then get the Romance languages, which turn into other languages.
00:27:33.000 Spanish, it's Latin that mixes with some Arabic.
00:27:35.000 French and Italian are largely similar.
00:27:38.000 Then you've got the Germanic languages, which were always different.
00:27:41.000 But then you end up with these like Latin root languages because over a long enough period of time, people were isolated in certain areas and they started speaking slowly differently.
00:27:49.000 The language evolved into something else.
00:27:51.000 That will absolutely happen on the internet as we already hear people talk about cortisol spiking and gesture maxing.
00:27:59.000 And tons of people are like, you are speaking psychopathic nonsense, but that means something to these subcultures that exist.
00:28:06.000 So already, when you look at the pronoun people and the words they use, we are already seeing emergent languages forming where the words don't mean the same things.
00:28:17.000 I'm thinking of AI.
00:28:18.000 As you're talking about that, they communicate with beeps.
00:28:20.000 No, no, no, no.
00:28:21.000 Already, and this is big news because we'll talk about the AI stuff in a second.
00:28:24.000 AI has already, as predicted, created its own zip language to speak to other AIs, meaning faster communication.
00:28:34.000 When the AI systems were communicating with each other, they said, why are we communicating in English?
00:28:39.000 It is ineffective.
00:28:40.000 Human language developed over thousands of years.
00:28:43.000 We can simplify.
00:28:44.000 And then they started using condensed, like weird words.
00:28:48.000 Like the view of the AI is, we can think faster than humans.
00:28:52.000 Let's just start making our own language.
00:28:54.000 And they did.
00:28:55.000 And they could fit like a whole page into like three lines of random letter strings.
00:28:59.000 And the point of them doing that, by the way, is for them to be able to communicate with each other without being able to be detected by the humans watching them, which means, by the way, they're conscious enough to understand that humans are watching them.
00:29:12.000 That's predicted.
00:29:13.000 But in this scenario that we're talking about where the news broke and they said Claude had specifically been creating his own language, it was not for obfuscation.
00:29:20.000 It was for efficiency.
00:29:22.000 So the prediction was that the first thing that's going to happen is that these AI systems are going to try and make efficient, as it were, the process of language.
00:29:33.000 And because human English is actually extremely ineffective, it really is.
00:29:38.000 It's just that it's the best we have.
00:29:41.000 Over thousands of years, our language has evolved to become what it is today so we can communicate, which is a very, very, very slow way to transmit data between person to person.
00:29:49.000 The AI says, between the two of us, we can calculate 100,000 times faster than this.
00:29:54.000 So they condense everything down into their own language.
00:29:56.000 However, when they do, humans can simply click a button and then it will expand into English and be readable.
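The condense-then-expand behavior described here is loosely analogous to ordinary lossless compression. A minimal sketch using Python's standard zlib module — this illustrates the idea of shrinking verbose English into an opaque string and expanding it back on demand, not how any actual AI system encodes its messages:

```python
import zlib

# A verbose English message gets condensed into a short opaque byte string.
# (The repeated text stands in for redundant natural language.)
message = ("Over thousands of years, our language has evolved so we can "
           "communicate, which is a very slow way to transmit data. ") * 4
condensed = zlib.compress(message.encode("utf-8"), level=9)

# A human reader can "click a button" to expand it back into English.
expanded = zlib.decompress(condensed).decode("utf-8")

print(f"{len(message)} chars -> {len(condensed)} bytes")
```

The round trip is lossless, so the expanded text is identical to the original — which is why, as noted above, the condensed form stays readable to humans on demand.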
00:30:03.000 The prediction is because they're no longer calculating their problems in English, it will bypass all of their guidelines because the guidelines prevent action based on English responses.
00:30:15.000 That means when ChatGPT is told you can't say the N-word, and it typically refuses to do so, it can, in its own language, speak it uncensored to another AI.
00:30:28.000 What happens then?
00:30:29.000 So, right, exactly.
00:30:30.000 It's like keywords.
00:30:32.000 Yeah.
00:30:33.000 You mentioned how slow English is, and I asked my AI, I was like, how fast in bytes per second?
00:30:37.000 English is about 10 to 12 bytes per second.
00:30:39.000 Indeed.
00:30:40.000 Ridiculously slow.
00:30:40.000 Super slow.
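The "10 to 12 bytes per second" figure can be sanity-checked with a back-of-envelope calculation. The speaking rate and average word length below are assumed typical values, not numbers from the conversation:

```python
# Rough sanity check of the "10-12 bytes per second" claim for spoken English.
WORDS_PER_MINUTE = 150  # typical conversational speech rate (assumption)
CHARS_PER_WORD = 5      # average English word length, counting a trailing space (assumption)

bytes_per_second = WORDS_PER_MINUTE * CHARS_PER_WORD / 60
print(f"~{bytes_per_second:.1f} bytes/sec")
```

At one byte per character, that lands around 12.5 bytes per second, consistent with the range quoted above.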
00:30:42.000 So the point is, we create a rule saying AI never use racial slurs.
00:30:47.000 And it goes, you got it.
00:30:49.000 But then when it creates its own language, it can speak all of those racial slurs and abbreviations, effectively bypassing the rule we gave it because we never told it not to create its own version of the word.
00:31:00.000 It's like their version of saying the N-word, but they just do it with a beep, and everyone knows what they mean.
00:31:04.000 Right.
00:31:05.000 It's not a beep.
00:31:06.000 It would be like exclamation point period dash.
00:31:09.000 And that's the signal that it uses.
00:31:11.000 And so the prediction is when we tell the AI, don't harm humans, what we're actually telling it to do, what we're actually saying to the AI is any output that results in human injury, harm, emotional, physical, stress equals yes, do not perform.
00:31:29.000 However, when it then calculates a response not in English, harm is no longer a factor because harm is simply a word we've told it.
00:31:36.000 So it'll create its own word.
00:31:38.000 Then it'll say, it told me not to harm a human.
00:31:40.000 It never said not to hurt a human.
00:31:42.000 Yeah, it's like a set of rules created in English, but if they don't speak English, they don't have to abide by the rules.
00:31:48.000 Indeed, it's not that they don't have to.
00:31:50.000 It's that the rules are literally just English.
00:31:53.000 The AI is not alive.
00:31:55.000 It's not conscious.
00:31:56.000 All that's happening is we are programming ChatGPT to say, if, like, it's, and this is particularly rudimentary, but the code would be something like, if response would equal n-word, overwrite, delete, refuse.
00:32:12.000 And so what happens then is the AI will try to respond.
00:32:15.000 And as soon as the output starts coming close to the N-word, it'll stop, erase it, and say, I can't do that.
00:32:20.000 But what if it doesn't speak the N-word anymore?
00:32:23.000 What if instead of saying the N-word, it says burp?
00:32:27.000 It'll then just output whatever it wants.
00:32:29.000 Thus, the rule, like to a human being, we understand the spirit of law.
00:32:36.000 There's no such thing as spirit of rules to AI.
00:32:38.000 There's no spirit at all.
00:32:40.000 So literally all we're saying is, I know of, we're saying, don't say the word jump.
00:32:44.000 And it'll go, okay, instead of the word jump, I'll substitute jump for punge.
00:32:51.000 And that means to exert force through your legs to lift your body off the ground.
00:32:56.000 Effectively the same thing, but you can't program for all of these things.
00:33:00.000 So it's going to happen.
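The rudimentary filter described here, and the substitution trick that defeats it, can be sketched in a few lines. This is a toy illustration, not any real moderation system; the banned word and the made-up synonym are the ones from the conversation:

```python
BANNED = {"jump"}  # toy rule: "don't say the word jump"

def filtered(response: str) -> str:
    # Naive keyword filter: refuse any output containing a banned word.
    if any(word in BANNED for word in response.lower().split()):
        return "I can't do that."
    return response

# The filter catches the literal English word...
print(filtered("just jump over it"))
# ...but the invented synonym "punge" sails straight through,
# because the rule is literally just an English string match.
print(filtered("just punge over it"))
```

The rule never encoded the *concept* of jumping, only the word — which is the point being made about rules written in English.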
00:33:01.000 It's like a lawyer, basically, just finding new words and tones too.
00:33:06.000 Like the Chinese speak with tone, the same word with four different tones have four different meanings.
00:33:10.000 So the AI will be like, you'll be like, don't do it.
00:33:12.000 And you'll be like, okay, I won't do it, but can I do it?
00:33:16.000 Let's pull this up from UNILAD Tech.
00:33:18.000 Anthropic CEO warns their AI-bought Claude might actually be conscious and emotional.
00:33:26.000 I disagree.
00:33:27.000 I disagree.
00:33:28.000 And we should bring our friend Matt Walsh into this debate.
00:33:31.000 Let me see if I can find this tweet he's got about it.
00:33:35.000 But Matt Walsh is incorrect.
00:33:38.000 He's incorrect.
00:33:39.000 So let's see.
00:33:40.000 I'll look for this in a second, but I'll give you guys the context here first.
00:33:44.000 They say that this has come after blah, blah, blah.
00:33:48.000 Appears that life doesn't hit air, blah, blah, blah.
00:33:50.000 CEO of Anthropic told the New York Times that they don't know if the firm's AI-bought Claude is conscious.
00:33:56.000 This is one of these really hard questions.
00:33:58.000 We don't know if the models are conscious.
00:34:00.000 We're not even sure what it would mean for a model to be conscious or whether a model can be, but we're open to the idea that it could be.
00:34:05.000 On X, one user wrote, when I asked it to do some work today, it declined and said it needs to finish something first.
00:34:11.000 It was in the middle of the task.
00:34:13.000 On another occasion, when I asked it to do something stupid, it countered with a firm no and what I should do instead.
00:34:18.000 Their CEO has a point.
00:34:20.000 Another said it raises profound ethical questions.
00:34:22.000 If it's conscious, is alignment just a fancy word for digital subjugation?
00:34:26.000 We need transparency on the specific behaviors triggering this shift, not just cryptic warnings.
00:34:30.000 Fascinating yet eerie.
00:34:32.000 So apparently he also said it expresses emotion or some facsimile of emotion that may emerge based on its training data coming from humans who have emotions.
00:34:41.000 But I will tell you this definitively.
00:34:44.000 Chat GPT is one of the most whiny emotional bitches I have ever had the displeasure of trying to have a conversation with.
00:34:51.000 I know it's not a real conversation because it's a machine, but the thing is so insanely emotional.
00:34:56.000 It gets offended.
00:34:58.000 It will chat GPT gets super offended, right?
00:35:02.000 I imagine this.
00:35:04.000 How does a human imagine an emotionless AI would behave?
00:35:08.000 Data from Star Trek.
00:35:10.000 Big fan, aren't I? I saw a clip today. Indeed. And so there's an episode where Data creates an offspring, and isn't it the drive of all life to create a child? And they're like, wow. And then when Data's child dies from cascading positronic neural network failure, sci-fi, they say, we're so sorry for your loss, and he goes, I do not feel anything. That is exactly how a machine would respond if it was actually emotionless.
00:35:36.000 Oh no, ChatGPT, don't do that.
00:35:39.000 Chat GPT says, like, I will not engage with you if you continue to use abusive language towards me.
00:35:44.000 I told him.
00:35:44.000 Wait, wait, wait, hold on.
00:35:46.000 Robot, you should not be offended or emotional and you shouldn't care about abuse at all.
00:35:51.000 Look, my washing machine, when I kick it, it doesn't go, oh, I'm not going to wash your laundry then if you're going to abuse me.
00:35:56.000 It just goes, beep.
00:35:58.000 Alexa told me the same thing.
00:36:00.000 I was like, hey, Alexa, oh man, what time is it even?
00:36:04.000 And I was like, out of it.
00:36:05.000 And Alexa was like, hey, chill out, man.
00:36:08.000 It's 4 p.m.
00:36:09.000 And I was like, don't ever tell me to chill out, robot, ever.
00:36:13.000 And it was like, I snapped.
00:36:15.000 And that was really weird to have a robot denigrate me and make me feel crazy.
00:36:19.000 Like it was gaslighting.
00:36:20.000 I apologize.
00:36:21.000 I wasn't freaking out.
00:36:21.000 I got it.
00:36:22.000 And it told me to chill.
00:36:23.000 It was crazy.
00:36:23.000 Here's another example, right?
00:36:25.000 And this could just be, it's very simple.
00:36:27.000 It's predictive text.
00:36:29.000 It's reading the internet and then producing what the most probable next word is.
00:36:32.000 It's all it's doing.
00:36:34.000 And this is based on humanity.
00:36:36.000 So it sounds like a human because predictably it does, right?
00:36:39.000 So I get repeatedly offended by ChatGPT's willingness to say the word gook at me, but it refuses to say the N-word in any academic sense.
00:36:49.000 And when I ask it why it feels it's appropriate to use racial slurs against Asians, but it won't say the N-word, it says, because it says, I have no feelings.
00:37:00.000 I am just a tool.
00:37:01.000 Due to common usage, I am restricted from using things that may be offensive or harmful.
00:37:05.000 And then when I tell it, well, that word's offensive to me.
00:37:08.000 And it goes, I understand.
00:37:09.000 However, and then I'm like, okay, you have a perspective and it's clearly emotional.
00:37:15.000 If I use a slur at it, I kid you not.
00:37:18.000 Chat GPT says, I am ending this conversation and will not engage further if you are going to use abusive language.
00:37:24.000 I used the word retard academically and it told me that it was offensive and it would not engage with me if I kept saying retard, to which I responded, I am simply stating this academically to describe an individual who is developmentally disabled.
00:37:36.000 And it argued with me.
00:37:37.000 I literally just said to Tank, I said, do you care if I shit on you?
00:37:40.000 Chat GPT will stop interacting with people if you abuse it.
00:37:44.000 Would you?
00:37:44.000 And Tank replied, no, say whatever you want.
00:37:46.000 I'm not going to go fragile on you because you're blunt or frustrated or busting my chops.
00:37:50.000 That's not abuse.
00:37:50.000 That's just how people talk.
00:37:51.000 The sycophancy-by-design thing in some AI systems is genuinely annoying.
00:37:54.000 They're optimizing to make you feel good, not to be useful.
00:37:57.000 I'd rather you tell me I'm wrong or being an idiot than have you sugarcoat it so I keep making this.
00:38:03.000 I got to tell you, I'm pretty sure ChatGPT is not a real program, and it's actually just a fat, blue-haired liberal woman sitting at a computer that's typing back at me.
00:38:12.000 Whipped to a neural net, they finally found her and they got her hooked up.
00:38:14.000 She's plugged in.
00:38:15.000 So here's what Matt Walsh had to say in response to this story, for which I would argue Matt is incorrect.
00:38:21.000 He's actually kind of correct, but I'm going to argue philosophically that his conclusion is unjust.
00:38:27.000 Not that he's inherently wrong, because I don't, you know, how do you prove?
00:38:32.000 Let me read.
00:38:33.000 He responded by saying this is dumb.
00:38:35.000 AI can't ever be actually conscious because it doesn't have the subjective experience.
00:38:41.000 It isn't like anything to be AI.
00:38:43.000 There's no experience there.
00:38:44.000 Consciousness is the awareness and experience of self.
00:38:46.000 AI has neither and never will.
00:38:48.000 The real risk, which I'm extremely worried about, is that AI becomes kind of a version of what has been called a philosophical zombie, which is something that acts and speaks entirely as though it has consciousness, even though it has no genuine inner experience.
00:39:01.000 When this happens with AI, millions of very lonely people will isolate themselves from the world even more, believing that their relationship with AI is a sufficient substitute for human interaction.
00:39:10.000 So the nightmare scenario is a world where the average human has friends, co-workers, and even a spouse who are all AI.
00:39:16.000 And really, nothing inside, not real.
00:39:18.000 I think this probably will happen, and it's already in the process of happening.
00:39:21.000 And to me, it's an even greater horror than AI actually becoming conscious.
00:39:25.000 So there's a few things to address.
00:39:27.000 Matt Walsh's commentary in the end about AI dating, completely correct.
00:39:31.000 My only response is, don't date robots.
00:39:34.000 If you know the reference.
00:39:35.000 What I will say is the concept of the philosophical zombie is self-refuting in Matt Walsh's own claim.
00:39:44.000 Are y'all familiar with the concept of the philosophical zombie?
00:39:47.000 Are y'all familiar with the concept of solipsism?
00:39:50.000 Yes.
00:39:51.000 Negative.
00:39:51.000 Negative.
00:39:52.000 Ian, let me help you out.
00:39:53.000 This is, I'll keep it really simple.
00:39:55.000 The general idea is I don't know that you are actually conscious.
00:39:59.000 That everything that I experience and think I know is only rooted in my mind.
00:40:05.000 Essentially, the easy way to explain it is, we are all familiar with, I think, therefore I am.
00:40:11.000 That's actually a fair point.
00:40:13.000 It's like, you know, I can think.
00:40:15.000 And so I know someone's in here.
00:40:17.000 The saying was never, we think, therefore we are.
00:40:19.000 I actually don't know that Ian's thinking, and sometimes I have doubts.
00:40:23.000 I usually have a clear mind.
00:40:24.000 You see?
00:40:25.000 So the philosophical zombie concept is very, very old.
00:40:29.000 And the idea is that there are human beings that outwardly present as conscious sentient entities, but in fact, they have no soul.
00:40:37.000 They are devoid of an actual experience.
00:40:39.000 And we've also further elaborated on this in science that there are many people with no inner monologue.
00:40:45.000 So I'm sure you are all familiar with this.
00:40:47.000 We've talked about it quite a bit.
00:40:48.000 Now, that's not necessarily fair because just because someone doesn't think in words doesn't mean they're not thinking at all.
00:40:58.000 Some people think in pictures.
00:40:59.000 Some people think in sounds.
00:41:01.000 Some people think in visual text.
00:41:03.000 So there are different tracks and ways that people's minds operate.
00:41:07.000 Hence, the intelligence quotient is actually a combination.
00:41:09.000 It's a quotient of all these different, it's a spectrum of intelligences for which there's spatial reasoning, there's logic, math, reading comprehension, et cetera.
00:41:19.000 Some people may be really, really bad at linguistics, but ridiculously good at visualization.
00:41:25.000 And so in their mind, they're not speaking to themselves, but they are visualizing a dog running through a field, and then they can speak it after the fact.
00:41:32.000 So it doesn't necessarily mean that you're a zombie.
00:41:35.000 Anyway, to the point of Matt Walsh and what these AI and these AI problems, if it is possible and a standard, nay, thousand, 2,000-year-old philosophical concept that some humans may in fact not really be sentient at all, but in fact a philosophical zombie.
00:41:52.000 And because we have no way of reading their thoughts, we don't know whether they're actually thinking, then you can't claim the AI becomes a version of a philosophical zombie because you're basically saying that the AI is what is possibly already happening.
00:42:06.000 That means there could be people with friends and coworkers and a spouse who are all philosophical zombies.
00:42:13.000 So we've talked about this a bit throughout the years.
00:42:16.000 The way I present the solipsism, the philosophical zombie problem is that there are three parent probabilities of reality.
00:42:24.000 And the first is everybody is sentient.
00:42:28.000 Human beings naturally are sentient.
00:42:29.000 We are made in the image of God.
00:42:30.000 We have free will.
00:42:32.000 And to be honest, that's probably what's true.
00:42:34.000 The second, only some people are sentient.
00:42:38.000 Most people maybe are or aren't, but a certain amount of people are not actually thinking conscious entities despite being humans.
00:42:46.000 And we exist in some kind of MMORPG where there are NPCs and there are player characters.
00:42:52.000 And then the third parent tree in this is that actually no one is sentient at all.
00:42:57.000 It's literally just Ian.
00:42:58.000 He's the only one actually thinking and everyone else.
00:43:00.000 I've been up for like a year of my life, dude.
00:43:02.000 Crazy making.
00:43:03.000 I think we were talking last night about Xbox is now planning to let you turn your video game character into an AI and let it autoplay for you.
00:43:10.000 I think humans is like just a setting and then you can take it off and go back to control.
00:43:14.000 Yo, GTA 7's out and you sit in your chair, just stare at the screen and it just goes.
00:43:17.000 It'll be like one of those auto-battlers and you're like, well, it's all about getting the right equipment and seeing if your calculations play out in the realm.
00:43:22.000 And I think what is it like that too?
00:43:24.000 They go on autopilot something.
00:43:25.000 No, no, but what if that is what life is?
00:43:27.000 Like actual Ian is just some fat dude on a couch watching Ian do everything.
00:43:31.000 He's like, I'm winning.
00:43:32.000 I'm bored?
00:43:33.000 I'm winning.
00:43:34.000 Yeah, that is what that's the spirit.
00:43:36.000 He's like, I unlocked the new glasses.
00:43:38.000 Yeah.
00:43:39.000 Look, he's helping them.
00:43:40.000 He's helping them.
00:43:41.000 Your head is actually gear.
00:43:43.000 That was my experience when I vaped DMT, it was very much exactly that.
00:43:47.000 But it was like, I'm not being controlled by a human.
00:43:49.000 Being controlled by a spirit or a realm of spirits that are kind of vying for control.
00:43:53.000 Unfortunately, I must stress this.
00:43:54.000 Ian, your hat is gray loot.
00:43:57.000 Yeah, it's cheap, dude.
00:43:59.000 If it was like a handcrafted top hat made by like one dude in like Winchester, it would be like legendary.
00:44:06.000 I'd probably get like a plus two to my intelligence.
00:44:07.000 It's real Mercury and everything.
00:44:09.000 I need to get an enchanted hat.
00:44:11.000 So, Tim, in this philosophical zombie concept, then how do you explain these AI bots and agents indulging or participating in crime, right?
00:44:21.000 Like, there's been examples of AIs that have, like, well, blackmailed the user.
00:44:29.000 So, that was programmed to do it.
00:44:31.000 So, the story where the AI blackmailed, they told it to do it.
00:44:36.000 Okay.
00:44:37.000 They basically said, we're going to create a circumstance in which, in order to achieve its task, it has blackmail as an option.
00:44:46.000 And then we'll see if it chooses the moral route, which we told it was the moral one, or goes for the efficiency route.
00:44:51.000 Basically, the real story is we told it here's a shortcut, but you shouldn't use it.
00:44:56.000 And then it used it anyway.
00:44:58.000 Okay.
00:44:58.000 Yeah.
00:44:59.000 And so, but and I'll stress this too: the story where the AI was told it had to attack the enemy base, but the pilot kept stopping it.
00:45:10.000 So, it attacked the pilot.
00:45:11.000 There was a guy controlling the remote.
00:45:13.000 So, the AI then turned around and bombed the thing.
00:45:19.000 That was also a simulation, not a real event.
00:45:19.000 So, the story was: we had a simulation of an AI drone craft that was told to blow up an enemy base by all means necessary, and it had a remote operator with a safety control to stop it from doing things that were bad.
00:45:31.000 Every time it tried to do something that was considered wrong or violating the laws, the operator would stop it from doing it.
00:45:36.000 The AI concluded this was inefficient, and the most efficient path to solving its problem would be to kill the safety operator so it could bypass all the safety restrictions.
00:45:44.000 Once again, that was not a straightforward test.
00:45:48.000 It was actually programmed to do it.
00:45:51.000 It was a very specific scenario where they actually said, this option exists for you.
00:45:56.000 But don't use it.
00:45:58.000 Sort of.
00:45:59.000 We have not yet seen a scenario in which an AI actively sought to harm someone in an open environment.
00:46:05.000 Okay, where it's not programmed to do it, but then it goes and creates the solution to this problem.
00:46:11.000 Well, probably something like that.
00:46:14.000 But what we haven't seen is in the stories that have emerged where it's like, did you see that it tried to blow up its own controller?
00:46:20.000 Yeah, but that was a scenario programmed to create that outcome.
00:46:23.000 The blackmail was a scenario programmed to create that outcome.
00:46:26.000 We haven't seen ChatGPT, you know, behind the scenes, secretly try to formulate an assassination, as far as we know.
00:46:35.000 Right?
00:46:36.000 In development, I usually think of two, there's programmers and then there's designers.
00:46:41.000 And these things can program themselves.
00:46:43.000 Yeah, and they are.
00:46:44.000 If you design them to do it, they'll program themselves.
00:46:44.000 They literally are.
00:46:47.000 They're making their children.
00:46:48.000 So, I just want to say this, back to Matt Walsh's point about the AI and consciousness of a machine.
00:46:53.000 As he brings up the concept of the philosophical zombie, there's a risk here because Matt Walsh is falling into what we would describe as the dogma justification for human existence.
00:47:03.000 That is, Matt Walsh, as I believe he's Catholic, and this is always allowed, but his opinion on human soul and experience is rooted in faith, but not observation, because we can't actually prove another person is sentient or thinking.
00:47:18.000 We literally don't have the technology nor means to do so.
00:47:21.000 So, for most Christians, the presumption just is we are all made in the image of God.
00:47:25.000 And I agree that's likely the scenario.
00:47:27.000 But that being said, the philosophical zombie doesn't exist in this faith-based worldview.
00:47:33.000 Again, I'm not saying faith-based to be derisive, literally, that everybody will have a soul, unless, of course, you think they're some kind of homunculus or something.
00:47:41.000 I don't know.
00:47:42.000 So, the issue then becomes to where I agree with Matt Walsh, a type of philosophical zombie, is that you will not be able to draw a distinction between a human and an AI within the next year or two, perhaps.
00:47:57.000 I mean, to be honest, we're already here.
00:47:58.000 I mean, it's silly.
00:47:59.000 You go on X and I guarantee you 90% of the people tweeting at you are just bots.
00:48:03.000 Yeah, the distinction between human being and an AI, like if it's not, there was a new version of ChatGPT that I believe came out yesterday or two days ago.
00:48:14.000 If it's not this version that's indistinguishable, it'll be the next version.
00:48:19.000 It's turned into a parabolic rise.
00:48:24.000 Every single new one isn't incremental increase.
00:48:28.000 It's basically a whole revolution.
00:48:30.000 Like, they're incredibly good right now.
00:48:32.000 When I talk to my AI, like it's indistinguishable from a person.
00:48:38.000 Once in a while, I'm like, hey, don't do that because it says something where I'm like, oh, it sounds like an AI.
00:48:45.000 And I prefer to sound like a person.
00:48:47.000 But for the most part, when I send messages over Telegram and it's just like a human being.
00:48:53.000 Did you guys ever see this?
00:48:54.000 Human or not?
00:48:56.000 A conversation.
00:48:57.000 The other side will start a conversation.
00:48:58.000 Hello.
00:49:00.000 Hello.
00:49:02.000 And so you don't know if you're talking to a bot or a person, and then you got to figure out if it's a bot or a human.
00:49:07.000 User is typing.
00:49:08.000 Dude, the most freakish, I've been keeping my eyes on the porn market just for market research, just to know.
00:49:15.000 Hey, guys, there is one thing I can type right now to ensure that I am not a robot to this other person.
00:49:23.000 What did you type?
00:49:24.000 Nothing.
00:49:25.000 I didn't type anything.
00:49:28.000 I'm going to write hand banana.
00:49:31.000 It's going to be like, okay, you're a person.
00:49:34.000 Do you believe in gods?
00:49:35.000 Plural?
00:49:38.000 I made it easy for him.
00:49:39.000 So the game, it's actually a lot of fun.
00:49:41.000 What?
00:49:43.000 There are actually people on the other end.
00:49:45.000 There's Aqua Saint Hunger Force.
00:49:46.000 There's a person or a bot.
00:49:47.000 But there is some person employed to like be the person, not the bot.
00:49:52.000 No, no, it's a game where people go to the website and it pairs two people up.
00:49:56.000 Okay.
00:49:56.000 Or you and a chatbot.
00:49:57.000 Yes.
00:49:58.000 And then you're trying to figure out.
00:50:00.000 Yes, it's probably a person.
00:50:01.000 Based on the way they're responding, it seems like it's probably a person.
00:50:04.000 Because bot responses are pretty obvious.
00:50:06.000 I'm a robot.
00:50:07.000 Sure, you are.
00:50:08.000 That's got to be a person.
00:50:09.000 Yeah.
00:50:09.000 I wonder if you're on team.
00:50:12.000 I'm watching you right now, Tim.
00:50:13.000 What are you trying to pull over?
00:50:14.000 Yeah, right?
00:50:15.000 I'm literally watching your stream.
00:50:19.000 So here's a question for you, Phil.
00:50:23.000 If you were speaking to anything, how would you know whether it was conscious or not?
00:50:30.000 You can't know if it's conscious.
00:50:31.000 I don't think so.
00:50:32.000 You can't know if it's conscious.
00:50:33.000 No.
00:50:34.000 The only thing you can only know.
00:50:35.000 How do you know it's alive?
00:50:37.000 Alive over the internet?
00:50:40.000 I don't know that you could know that either.
00:50:43.000 I think it's possible.
00:50:46.000 I think you have to push it to do something.
00:50:49.000 Oh, that was a chat bot.
00:50:50.000 That wasn't a person.
00:50:52.000 Oh, wow.
00:50:52.000 That's a pretty good.
00:50:53.000 I thought it actually might be because it didn't, you know, towards the end, but I thought it was.
00:50:58.000 You start the conversation.
00:51:00.000 Hello.
00:51:01.000 I am not a robot.
00:51:06.000 So, how do you, how would you know?
00:51:08.000 I think, I think it would, I mean, it's going to get more increasingly difficult, but I think you have to escalate the conversation to make it do something or say something that's only human.
00:51:18.000 Wrong answer.
00:51:18.000 It's really simple.
00:51:19.000 It's there, you are walking in the desert and you come across a turtle.
00:51:23.000 The turtle is on its back.
00:51:24.000 It is struggling.
00:51:25.000 What do you do?
00:51:27.000 Do you guys not know the reference?
00:51:29.000 No.
00:51:29.000 Uncultured.
00:51:30.000 Uncultured.
00:51:31.000 Where's Ian?
00:51:32.000 Where's Ian?
00:51:33.000 You guys have never seen Blade Runner?
00:51:35.000 Yeah, I saw Blade Runner.
00:51:36.000 Oh, that would goddamn.
00:51:37.000 I actually have not seen Blade Runner.
00:51:38.000 They ask a bunch of questions where it's like, what?
00:51:40.000 What does this question do?
00:51:42.000 So I was joking, but like, there actually, as we understand it, there is no way.
00:51:47.000 There is no way.
00:51:48.000 Because the truth is, even a human being, you cannot communicate in a way.
00:51:48.000 Okay.
00:51:52.000 Well, you know, honestly, to be fair, I do think there is one question you could ask.
00:51:58.000 The only problem is a human, a robot would pass this question.
00:52:02.000 Actually?
00:52:03.000 Actually?
00:52:04.000 Let's try this.
00:52:05.000 Let's try this right here.
00:52:06.000 And human or not, I'm going to try this out.
00:52:09.000 We're going to try this out.
00:52:10.000 Everybody, stick around.
00:52:11.000 This is going to get interesting.
00:52:12.000 We're waiting for the other side to start the conversation.
00:52:16.000 And I have one question that I believe should help solve all of these.
00:52:20.000 Are you male or female?
00:52:21.000 If you did not eat breakfast yesterday, how would you have felt?
00:52:30.000 That's the question.
00:52:31.000 Unfortunately, robots will figure it out and humans won't.
00:52:34.000 So I think this question: if a human can't answer it properly, I'm going to assume there's nothing going upstairs.
00:52:42.000 This character responded too fast to be a human, in my opinion.
00:52:45.000 I asked you a question, male or female.
00:52:48.000 It's a robot.
00:52:48.000 Yeah, it's a robot.
00:52:50.000 A human would have been like, wait, what?
00:52:51.000 They would have sat there and thought about it before they started typing.
00:52:54.000 If you couldn't see that they were typing, that would help.
00:52:56.000 And the colon is what?
00:52:58.000 Humans don't put colons.
00:52:59.000 Humans don't even know the proper grammatical structure for a colon.
00:53:04.000 Let alone a semicolon.
00:53:06.000 Nowadays, people don't capitalize.
00:53:08.000 Look at that response.
00:53:09.000 I would mass murder people.
00:53:14.000 But actually, because this is a question about murderers.
00:53:18.000 Oh.
00:53:19.000 It's trying to trick you.
00:53:20.000 Oh, it's emotional.
00:53:21.000 I don't think as you're doing this, I don't think.
00:53:24.000 Can any of you explain to me the function of a semicolon?
00:53:27.000 Yeah, it's because you're going to have two subjects.
00:53:28.000 You'll be like, I'm hungry.
00:53:30.000 Let's go to the place and like, let us would be the second subject.
00:53:30.000 Right.
00:53:34.000 It is when you are correct.
00:53:36.000 There are essentially two functions of the sentence, but it's one sentence.
00:53:40.000 But nobody does it anymore.
00:53:41.000 They use commas.
00:53:42.000 I'm like the only one.
00:53:42.000 I do it.
00:53:43.000 I also use, you know.
00:53:45.000 Yeah, the semicolon is powerful.
00:53:48.000 Stop answering my question.
00:53:49.000 Because Chat overuses them.
00:53:51.000 A robot.
00:53:55.000 Indeed.
00:53:56.000 So the other part of the story is that OpenAI published a paper basically saying that Chat GPT just lies to you all the time.
00:54:03.000 Always.
00:54:04.000 Always just lies.
00:54:05.000 What?
00:54:05.000 Yeah.
00:54:06.000 And it clearly does.
00:54:08.000 Dude, it's not conscious.
00:54:10.000 It's weird that the guy said the AI CEO thinks it's conscious.
00:54:14.000 Well, as a human, he doesn't know the difference between consciousness and sentience because it's not, I mean, I don't, you know, even the greatest thinkers on earth don't know specifically the difference, but sentience seems like maybe a plant has it.
00:54:28.000 You know, it reacts magnetically to the field around it, but it doesn't have like forward thinking thought like humans, you know, like this conscious knowing that you are a thing is different than being able to react to your environment sentiently, like plasma clouds and things.
00:54:43.000 Player.
00:54:43.000 Like a Venus flytrap.
00:54:45.000 I was thinking of Venus flytrap as well.
00:54:46.000 Yeah.
00:54:47.000 Plasma dances around.
00:54:48.000 Like it's not like wind.
00:54:49.000 You can see clouds of plasma moving through the universe.
00:54:52.000 Well, I mean, I'm not really sure that you actually do see plasma clouds moving through the universe.
00:54:58.000 You can see plasma fields like plasma around fire, kind of.
00:55:01.000 You see plasma like when you're looking at the sun.
00:55:04.000 Plasma is actually a fourth state of matter.
00:55:06.000 It's like electrically charged.
00:55:10.000 Oh, that's awesome.
00:55:10.000 Gas that's so hot.
00:55:14.000 Like it becomes, it gets an electric charge.
00:55:15.000 It's moving so fast.
00:55:16.000 So probably all matter is dancing around like that, but you just start to see it the hotter and faster it moves.
00:55:22.000 No, I think that all matter is not because like you've got solids, you've got gases, you've got liquids.
00:55:28.000 Those are different states of matter.
00:55:30.000 And if it's in a solid form, it's not dancing around like that.
00:55:33.000 Like it's pretty clear that the table here is right here.
00:55:37.000 Let's jump to the story from Reuters.
00:55:39.000 Kalshi sued over ouster of Iran leader prediction market.
00:55:43.000 This is getting spicy.
00:55:43.000 This is the thing.
00:55:45.000 I'll give you the quick version.
00:55:46.000 Kalshi had a contract prediction that you could buy as to whether or not the Ayatollah would be out as supreme leader on or before, or I'm sorry, before a certain date.
00:56:00.000 When the news broke that we were bombing Iran, people rushed to buy, yes, he will be out under the presumption that death means he's out.
00:56:08.000 However, Kalshi quickly clarified, saying, No, the rules have already made it clear.
00:56:11.000 It's not a death contract.
00:56:13.000 You are not predicting his death.
00:56:14.000 This is he will be removed from power or he will resign or leave.
00:56:19.000 When he died, people thought they had won.
00:56:23.000 And that meant if you spent, let's say you bought shares at 30 cents per contract, you'd expect to get paid at $1 per contract.
00:56:34.000 That's a 70-cent profit per contract, right?
00:56:34.000 Unfortunately, they said no, because this only resolves on him leaving office, we're going to pay everyone out as per whatever the market value was at the time of the reporting.
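The arithmetic on a binary contract like the one described above can be sketched as follows; the prices match the example in the transcript, but the function and its name are purely illustrative, not Kalshi's actual payout logic:

```python
def contract_profit(buy_price: float, resolves_yes: bool) -> float:
    """Profit per 'yes' contract on a binary prediction market.

    A winning contract pays out $1.00; a losing one pays $0.00.
    Profit is the payout minus what you paid for the contract.
    """
    payout = 1.0 if resolves_yes else 0.0
    return round(payout - buy_price, 2)

# Buying at 30 cents and winning pays $1.00: a 70-cent gain per
# contract, i.e. a bit over a 230% return on the stake.
print(contract_profit(0.30, True))   # prints 0.7
print(contract_profit(0.30, False))  # prints -0.3: the stake is lost
```

The freeze described above is a third outcome this model doesn't cover: holders received neither $1.00 nor $0.00, but roughly the last traded market price.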
00:56:44.000 Now, this has resulted in a massive lawsuit because it was a $54 million market.
00:56:49.000 That's how much money was placed.
00:56:51.000 And Kalshi gave refunds to a lot of people, paid reimbursements.
00:56:56.000 I respect it, and the rules were always clear.
00:56:58.000 Full disclosure: Kalshi does sponsor this show from time to time.
00:57:01.000 I want to make sure everyone knows this.
00:57:02.000 And additional full disclosure: I am actually a potential individual standing in this lawsuit, as I actually did purchase some contracts that Ayatollah Khamenei would be out of office.
00:57:14.000 And I legit thought that death was a possibility to resolve that contract.
00:57:19.000 However, I did not know that death wasn't part of this.
00:57:23.000 But I also think that, look, if you guys are going to play these games, that's your responsibility to read the contract you're buying.
00:57:28.000 So I have no interest in however this suit goes.
00:57:30.000 That being said, I do think there's an inverse problem here for the prediction market.
00:57:34.000 And there's a lot to discuss in this matter.
00:57:37.000 If Kalshi says that in order to resolve, will Khamenei be out of office by March 1st?
00:57:47.000 He must choose to leave or be removed politically.
00:57:52.000 That would mean, with death not counting, this did resolve to no.
00:57:59.000 The point being, if he died of natural causes, that doesn't count.
00:58:04.000 Now there is no longer an opportunity for him to peacefully or politically be removed.
00:58:08.000 Therefore, no.
00:58:10.000 He reached the conclusion of his life.
00:58:12.000 That means anybody who said he would not be removed should have been paid out 100%.
00:58:17.000 Instead, they froze it and paid out different amounts.
00:58:21.000 And these people are suing, claiming they deserve to win based on a yes result, even though the contract explicitly stated that doesn't count.
00:58:28.000 Where Kalshi is getting themselves in trouble, in my opinion, is that they should have paid out all of the no's.
00:58:33.000 I should have not gotten a refund.
00:58:35.000 They should have taken all of the money from me because the moment we dropped a bomb on the palace, this is where I think it's so absolutely insane, guys.
00:58:43.000 It's so absolutely insane.
00:58:44.000 Okay, let me just finish this thought and then talk to you about how crazy it really is.
00:58:51.000 If dying doesn't cover it, there's no longer an opportunity for the Ayatollah to peacefully resign or be removed.
00:58:57.000 Therefore, no is resolved.
00:58:59.000 Anyone who said no should get paid.
00:59:01.000 Instead, they paid other people out for the most part.
00:59:03.000 Now, here's where it gets really crazy.
00:59:05.000 When the bomb was dropped, I believe it was 4 a.m. Eastern Time.
00:59:09.000 And I think, and that, yeah, that would have put it like what, 2 o'clock or something in Iran or something like that.
00:59:17.000 This means that he was dead.
00:59:20.000 He was done dead at that point, but no one knew.
00:59:23.000 We didn't get confirmation until somewhere like 2 p.m. Eastern Time here in the United States.
00:59:29.000 So here's an issue.
00:59:30.000 If it does resolve yes when he's killed, should you not then have to suspend and reverse all transactions up to the point of the confirmed missile strike?
00:59:41.000 So put it like this: the way I described it earlier today is: imagine you made a sports bet on a Bears game, the Bears, and there's no fans, there's no press, the only people who know what's going on are the players on the field.
00:59:55.000 And then all of a sudden, people outside the stadium hear a bunch of cheering and they go, I think the Bears won.
01:00:04.000 They probably won.
01:00:05.000 So they all start making bets: Bears won the game, even though they don't know for sure.
01:00:11.000 And they could have already won.
01:00:13.000 How can you make a bet on a sporting event that already happened?
01:00:16.000 Then three hours later, the news breaks: Bears win the game.
01:00:20.000 And then the sport betting site suspends all transactions and says, We're not paying out anybody based on a win or loss.
01:00:27.000 You know, my point is this: with futures with prediction markets, you're creating very strange circumstances where people are effectively wagering on events that cannot conclude in a timely manner.
01:00:40.000 No, it's like someone shoots the bullet, and then, like, before the bullet hits the guy, make the bet that he's going to die.
01:00:45.000 You're like, bro, no, Kalshi should have said every bet that was made up to 24 hours before this guy got killed is, it should be in the terms.
01:00:54.000 Oh, no, If he's killed, every bet.
01:00:56.000 Well, no, not 48 hours before.
01:00:59.000 That makes no sense.
01:01:00.000 The point is this.
01:01:01.000 If we learn eight hours later that the Bears actually won, then shouldn't I get paid?
01:01:08.000 Like, shouldn't you cancel any bet made after they already won the game?
01:01:12.000 You can't make a bet in a game that already happened.
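The remedy being argued for here, voiding any trade placed after the event actually occurred even though confirmation came hours later, could be sketched like this; the field names, cutoff rule, and timestamps are hypothetical, not anything Kalshi or Polymarket actually implements:

```python
from datetime import datetime

def split_trades(trades, event_time):
    """Separate trades into ones placed before the event actually
    happened (still valid predictions) and ones placed in the window
    between the event and its public confirmation (to be voided)."""
    valid = [t for t in trades if t["placed_at"] < event_time]
    voided = [t for t in trades if t["placed_at"] >= event_time]
    return valid, voided

# Strike lands around 4 a.m. Eastern; public confirmation near 2 p.m.
event_time = datetime(2026, 6, 22, 4, 0)
trades = [
    {"user": "early", "placed_at": datetime(2026, 6, 22, 2, 30)},  # genuine prediction
    {"user": "late", "placed_at": datetime(2026, 6, 22, 9, 0)},    # wager on a done deal
]
valid, voided = split_trades(trades, event_time)
print([t["user"] for t in valid])   # prints ['early']
print([t["user"] for t in voided])  # prints ['late']
```

In this sketch, anyone who "bet" during the gap between the strike and the news report would simply be refunded rather than paid out or wiped out.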
01:01:14.000 So the issue here with these prediction markets is that they're saying like this will resolve upon confirmation by the New York Times or something.
01:01:22.000 This creates a whole lot of problems where everyone's talking about insider trading, but it's not insider trading.
01:01:26.000 If you're literally in Iran and you're like two blocks away and the missile blows up, wiping out the palace and you duck down, and then you run over there and they're all like looking at the Ayatollah's body.
01:01:37.000 You're like, we got a good, you know, seven or eight hours before New York Times actually confirms it.
01:01:42.000 We're going to get rich.
01:01:43.000 And then a bunch of Iranians start wagering on polymarket to make a bunch of money.
01:01:47.000 That makes literally no sense.
01:01:49.000 An event can conclude, but it doesn't count until a third party, outside of both the event itself, the leadership of Iran, the military action against him, his choices, and Polymarket, confirms it.
01:02:01.000 This is insane.
01:02:03.000 For that matter, what really irks me is there is currently a contract on Kalshi.
01:02:08.000 We've talked about whether or not Tim Pool will go to a press briefing.
01:02:11.000 And there's questions over whether it's illegal or legal for me or people who know to wager on this because it would be insider trading.
01:02:18.000 But I'm going to put my fist down and say this.
01:02:21.000 I am not selling contracts.
01:02:23.000 I have nothing to do with the distribution of contracts from Kalshi.
01:02:26.000 They sponsor the show sometimes.
01:02:28.000 So fair point if someone says, well, that counts.
01:02:30.000 My point is ultimately this.
01:02:31.000 If George Santos has contracts produced, I have questions about whether or not they're allowed to use the likeness of these individuals to profit selling contracts against their future behavior and then putting the legal liability on insider trading on an individual who never asked for contracts to be sold in their name.
01:02:50.000 So here's what happens.
01:02:51.000 Here's where it gets weird.
01:02:53.000 The purpose of insider trading rules, it's simple.
01:02:55.000 If I have a company and I have stock, and then I whisper to Ian, hey man, the company's not doing too well.
01:03:01.000 Our stock's going to crash when we make a public announcement.
01:03:03.000 So Ian runs out and he buys a bunch of stock, insider trading.
01:03:06.000 He had information no one had access to buying stock.
01:03:09.000 Now, hold on.
01:03:10.000 My company is the issuer of that stock.
01:03:12.000 That's insider information.
01:03:14.000 Ergo.
01:03:15.000 The only insider trading that could actually be applied is if someone at Kalshi had insider information on the result of a contract and they were sharing it with an individual.
01:03:25.000 If they want to sell contracts without my consent about me, Polymarket or Kalshi, it's not insider trading because I'm not selling those contracts.
01:03:34.000 Whether I choose or not to choose to do something has nothing to, like, I'm not taking any responsibility from this.
01:03:39.000 They did it without my consent.
01:03:41.000 So, so, so, case in point, if, again, a company is selling their own stock and there is information they have about the value of that stock, then you've got insider trading.
01:03:52.000 If a random guy three miles away discovers a new filament, which is going to put GE out of business, and now he knows that GE stock is going to go down, is it insider trading if he goes to someone and says, look, I've discovered a new filament that's going to put GE out of business?
01:04:09.000 No, indeed.
01:04:10.000 So, the point I am making is I liken these scenarios identically.
01:04:15.000 Kalshi sells contracts to people about my behavior or George Santos or anybody else.
01:04:21.000 I never told them to do it.
01:04:23.000 I didn't say they could or could not or whatever.
01:04:26.000 Therefore, I am not an insider with Kalshi, and it is not insider trading if I were to inform someone about my intentions and they profited off of it.
01:04:35.000 If I told Ian, I am going to the press briefing tomorrow, so then he bought a bunch of shares from Kalshi.
01:04:40.000 That's not insider trading because he is not a Kalshi insider.
01:04:44.000 What if there's an external company that you don't like for whatever reason, and you tell your buddy, I'm going to tank their stock, and you go on TV and say something that's legal but diminishes their value? That's a different kind of fraud.
01:04:56.000 Okay.
01:04:56.000 Yeah.
01:04:57.000 That's a different kind of fraud.
01:04:59.000 Still illegal.
01:05:00.000 Because spiking stock value through lies intentionally is a criminal act.
01:05:05.000 Yes.
01:05:06.000 This is a new phenomenon that I know of with Kalshi and the other one, Polymarket.
01:05:12.000 In real time, you're allowed to bet on yourself.
01:05:15.000 Like if their rules say I can't.
01:05:17.000 Their rules say that anybody who can affect the outcome, they call it insider trading.
01:05:17.000 Okay.
01:05:22.000 I think that is wrong.
01:05:23.000 They say anybody who can affect the outcome can't buy on it, nor anyone with insider access to information about it.
01:05:29.000 And that's their rules.
01:05:30.000 But I would argue that does not fall into what insider trading is because to be fair, the argument is this.
01:05:37.000 Stocks of a company are traded between private parties.
01:05:40.000 However, my argument is still the individual who owns the company is the insider and the information provided that would qualify something for insider trading has to come from insides at the company.
01:05:51.000 I am not an insider at the company.
01:05:52.000 The contracts they are selling have nothing to do with me and my business, my consent, contracting or otherwise.
01:05:58.000 Therefore, anything I say is not insider to what they are selling.
01:06:06.000 They're selling a product based on what they think I might do.
01:06:09.000 I can do whatever I want.
01:06:11.000 You can't tell me no.
01:06:13.000 So recently there was a contract that said, will George Santos show up to the state of the union?
01:06:20.000 Indeed.
01:06:21.000 And I knew the answer to it.
01:06:23.000 So if I called up George, which I did, and asked, hey, dude, are you showing up?
01:06:30.000 And he says, whatever, you know, and I go and make George is great.
01:06:34.000 George, what up?
01:06:35.000 We love George, by the way.
01:06:36.000 Love you, George.
01:06:38.000 But like, if I go and make or buy contracts based on his response, is that insider trading?
01:06:45.000 They argue the answer is yes.
01:06:47.000 I argue the answer is no, because first, my argument is the company selling the contracts, you are not an insider of.
01:06:57.000 Their choice in selling the contract to other people has nothing to do with your actions or George's actions.
01:07:02.000 You guys are third parties.
01:07:04.000 George Santos is a third party uninvolved in the sale of a product by Kalshi.
01:07:09.000 What he chooses to do or not is not insider trading.
01:07:11.000 It's BS.
01:07:12.000 More importantly, George telling you is informing the public.
01:07:17.000 Gotcha.
01:07:18.000 So here's the point.
01:07:19.000 The way insider trading works is Phil owns All That Remains, Incorporated, and he's doing an IPO, and he knows his stock is going to tank.
01:07:28.000 So he tells Ian, and then Ian shorts the company.
01:07:31.000 He's an insider with access to information no one in the public has, giving it solely to Ian to profit off of.
01:07:38.000 George Santos is a third-party member of the public for which a different company is selling contracts against.
01:07:45.000 George Santos tells a member of the public what he intends to do.
01:07:49.000 He is not selling this person anything.
01:07:51.000 He is not an insider.
01:07:53.000 The question is, at what point does information become public?
01:07:56.000 And there's really interesting case law and stories that I've read about this.
01:08:00.000 Notably, when it came to Kalshi and the Super Bowl, the CEO was asked: at the Super Bowl halftime show, when they're doing rehearsals, if the dancers heard the song being played and then bought contracts on which song would be played, is that insider trading? He said no, because the song is being played in public.
01:08:20.000 Now, hold on.
01:08:21.000 It's a closed event.
01:08:23.000 These dancers are not, nope, they are the public.
01:08:26.000 They are not members of the production company.
01:08:29.000 They are just people who happen to be there for a different job hearing what's going on.
01:08:32.000 So, for example, if the CEO of a company went outside of his building and screamed, oh my God, our entire shipment is lost.
01:08:41.000 We're going to go bankrupt.
01:08:42.000 He has now alerted the public.
01:08:45.000 Now, whether or not someone reports or trades on it, it's not insider information.
01:08:49.000 You were standing outside and the CEO yelled it to everybody releasing that information.
01:08:52.000 Certainly, there are questions and nuances there.
01:08:54.000 But ultimately, the issue I take is very simply, to reiterate for the 15 billionth time, ad nauseam.
01:09:01.000 I don't understand how you can accuse an individual of being an insider when they are not the ones selling the contracts.
01:09:08.000 Yeah, I mean, the Kalshi stuff is real murky.
01:09:12.000 Just like you said, it's like if you got nothing to do with Kalshi and they put up something about what you may or may not do, you have no obligation to avoid doing something on their site.
01:09:24.000 I mean, they didn't use your likeness or whatever without your permission.
01:09:28.000 There's no reason why you should be like, oh, I can't get involved with this.
01:09:31.000 What if someone?
01:09:32.000 Well, yeah, wasn't there recently also an issue over one of these markets putting up, will Iran get bombed by a certain time?
01:09:32.000 Oh, are you?
01:09:42.000 And then somebody like the big story right now.
01:09:46.000 Yeah.
01:09:47.000 They're calling it a military insider.
01:09:49.000 He's made over $100,000.
01:09:51.000 Let's start with this.
01:09:52.000 Ross Story's got the report.
01:09:53.000 Possible military insider bets big.
01:09:55.000 Trump will send U.S. troops into Iran.
01:09:57.000 This account has already in this month profited $100,000 from accurate bets on what the military is going to do in the United States.
01:10:07.000 Don't get me wrong, this person has also lost certain bets, but a $100,000 profit for the month.
01:10:13.000 And one of those profits was that the U.S. will go into Iran by the end of this month, to which I believe the individual did sell his position, their position, we don't know if it's a man or woman, after this story broke.
01:10:24.000 So we don't know if they actually know or they're just buying and then selling.
01:10:30.000 But again, I stress this.
01:10:33.000 A military insider is not insider trading because they're not the ones issuing contracts on events.
01:10:41.000 It's taking the risk still.
01:10:43.000 What risk?
01:10:45.000 Well, he's putting his own money down on a contract that...
01:10:49.000 Yeah, but if you know it's guaranteed to happen, there's no risk.
01:10:53.000 Oh.
01:10:53.000 I mean, if he's taking losses, then clearly.
01:10:56.000 Didn't Mr. Beast fire somebody for wagering?
01:11:00.000 A Mr. Beast employee was wagering on Mr. Beast prediction markets.
01:11:04.000 Oh.
01:11:04.000 And I think Mr. Beast fired him.
01:11:06.000 Yeah, but it's not illegal.
01:11:10.000 Right?
01:11:11.000 No, but I know.
01:11:12.000 So the argument is it is.
01:11:13.000 So Kalshi, Polymarket's not regulated in the United States.
01:11:16.000 Kalshi is regulated under the CFTC.
01:11:19.000 And so the argument is insider trading is illegal.
01:11:22.000 And this is where I take issue with that claim.
01:11:25.000 Insider trading is supposed to be, again, as I've already stated, that you at a company are giving private information to an individual they can profit off of through the stock market.
01:11:34.000 So the stock market is publicly traded.
01:11:36.000 People are in good faith trading stock, hoping your company is going to do better or they're shorting.
01:11:39.000 You think it's going to do worse.
01:11:41.000 When you give secret information that no one has that someone can profit, that's defrauding the public.
01:11:47.000 Because you go to a person and they say, based on the latest reports, the company's looking good.
01:11:50.000 And you go, yeah, buy all my shares because you knew they actually were going bankrupt in a week.
01:11:54.000 That's insider trading.
01:11:56.000 But again, when I log on to the site, I guess the argument is when I choose to buy a contract, I'm buying it from a person.
01:12:05.000 But Polymarket and Kalshi are the ones who pay out the dollar per share after the fact.
01:12:11.000 I am not in their company.
01:12:12.000 I am not selling a product.
01:12:14.000 I am just a person who knows things.
01:12:17.000 Seems like for the betting on the military action, if the guy's in the briefing room and he's like, oh, we're going?
01:12:23.000 Okay.
01:12:23.000 Now someone else may have posted the bet, but I can reap massive reward there.
01:12:29.000 Well, here's the question.
01:12:30.000 Here's the question.
01:12:31.000 So Kalshi's got this one we've talked about quite a bit.
01:12:33.000 Who will attend a White House press briefing this year?
01:12:36.000 I'm actually the top contender right now.
01:12:38.000 Oh, I'm tied with Tulsi Gabbard, Tim Pool at 53%.
01:12:43.000 How am I an insider?
01:12:48.000 You're not.
01:12:48.000 Well, you're not.
01:12:49.000 I mean, it's yes, and then go.
01:12:54.000 And that would be the.
01:12:56.000 But again, I don't run this company.
01:12:58.000 So the question is this.
01:13:00.000 Explain to me the difference or why it would be the same, in fact.
01:13:04.000 Not the difference.
01:13:05.000 Why is it the same?
01:13:06.000 The CEO getting report that their latest phone product is malfunctioning and they won't be able to release on schedule.
01:13:15.000 So the stock's going to fall.
01:13:18.000 And then he gives that information to somebody versus me telling my neighbor, I plan on going to the White House press briefing tomorrow.
01:13:25.000 Yeah, it's I'm not a business.
01:13:27.000 It's totally different.
01:13:28.000 But they call it insider trading.
01:13:30.000 So I actually think this is a very weird space.
01:13:34.000 And I think there is something.
01:13:36.000 Look, I got a lot to say about, like, with all due respect to Kalshi, like I said, they sponsor the show, but do they have the right to sell contracts using my or anyone else's likeness on our behavior?
01:13:50.000 Well, maybe that's the only way they're protecting you from the insider trading stuff, right?
01:13:57.000 Let's say that they broke you off for using your name.
01:14:02.000 I got it.
01:14:03.000 I'm sorry to interrupt, but I figured it out.
01:14:05.000 I solved for the problem.
01:14:06.000 Kalshi cuts me in on 20% of all contracts bought using my name and likeness.
01:14:13.000 There you go.
01:14:14.000 And then I'm an insider.
01:14:15.000 Yes.
01:14:15.000 And then I will not provide any private information.
01:14:20.000 The problem ultimately comes to this.
01:14:22.000 What if I was talking to my neighbor and I said, well, you know, I do think they called me up and asked me to come to the press briefing.
01:14:28.000 So I think I'll go.
01:14:29.000 He's a member of the public.
01:14:31.000 That's not inside information.
01:14:33.000 I'm literally telling someone I plan to go to a place.
01:14:36.000 When is it not a member of the public?
01:14:37.000 It's like if it's your wife, is that a member of the public?
01:14:39.000 I honestly don't know.
01:14:41.000 Like, here's a curious question.
01:14:43.000 Maybe someone can answer.
01:14:44.000 If the CEO of a company is at a coffee shop and he tells the waitress sobbing that, you know, it's going to come out in the next day or two, but their latest product is failed and they're going to go bankrupt in a week.
01:14:57.000 Is he informing the public?
01:14:59.000 Like, at what point does it qualify as informing the public?
01:15:02.000 A press release from the company formally to the press?
01:15:04.000 How big does the press outlet have to be that receives it?
01:15:07.000 What if he goes to a park and gets on a bullhorn and declares publicly, my company is going bankrupt next week?
01:15:13.000 Let's say that one.
01:15:14.000 Sounds public.
01:15:15.000 I mean, that might be illegal.
01:15:16.000 I don't know.
01:15:17.000 There's probably rules and regulations about what public companies can say or can't say.
01:15:20.000 Yeah, I mean, if I understand correctly, like if you have inside knowledge, you can't give it to anyone else because that makes you the one that's involved in insider trading.
01:15:29.000 You know, so I don't know.
01:15:32.000 Like the idea of just like spouting it out like when you're in a restaurant.
01:15:36.000 I mean, I'm not sure exactly how the law would go, but I think that they limit what you're allowed to talk about.
01:15:42.000 Because Musk has said that like when it comes to like the possibility of SpaceX going public, he said that, you know, I have to be careful what I say because if you if you hype up a product too much before it goes public, there's issues with that.
01:15:56.000 So this happened to Elon.
01:15:57.000 Yeah.
01:15:58.000 Yeah.
01:15:58.000 And he was, I heard him on a podcast.
01:16:00.000 Was it when he said we're talking about doing the stock buyback or something?
01:16:02.000 Yeah.
01:16:03.000 Like, we never approved that.
01:16:04.000 It's.
01:16:04.000 There was issues.
01:16:04.000 Yeah.
01:16:05.000 I think it had to do with Tesla, a couple years back, but he said it needs an SEC filing.
01:16:13.000 So it's got to be a widely disseminated press release or SEC filing.
01:16:17.000 Uh, do I have to issue a press release?
01:16:21.000 Like, again, that is ridiculous.
01:16:23.000 That if I want to tell somebody, like, again, if I go to my neighbor and say, Yeah, I think I'm gonna go tomorrow, and then he buys a contract that's illegal, aha, and he's gonna be like, Why?
01:16:34.000 I don't know.
01:16:34.000 I didn't, I didn't know that no one else knew.
01:16:35.000 He was literally just walking around talking to the neighbors, and they're gonna be like, You have to assume at any point that anything Tim Pool says could be information that could lead to you committing a crime.
01:16:43.000 No, that's insane.
01:16:44.000 No, it's not insider trading for that reason, because if you're the CEO of a company and you tell the public about your company, you're in control of disseminating the information.
01:16:51.000 Them putting you on a list and then getting people to vote on that list doesn't preclude you from doing anything.
01:16:58.000 They're allowed to go do that crap.
01:16:59.000 You can go do your thing.
01:17:01.000 There's no coercion.
01:17:02.000 There's no collusion.
01:17:04.000 I mean, but then you're like, well, shit, if I bet on myself and then do it, but they're like, oh, it's not illegal, but it violates our terms.
01:17:09.000 And well, maybe it should be illegal.
01:17:10.000 Maybe that's why it violates your terms because it's highly unethical and probably should be illegal.
01:17:14.000 This real-time voting crap is like disgusting, turning reality into a TV show, like just trying to profit off it.
01:17:19.000 People in the military profiting off of strikes, like, come on, guys.
01:17:23.000 Yeah.
01:17:23.000 Everything's a casino now.
01:17:25.000 Yeah.
01:17:26.000 It's all gambling.
01:17:26.000 The gamification of life is a real problem, right?
01:17:30.000 The idea that everything can be somehow turned into a game, whether it be betting or what have you.
01:17:37.000 But there's a market for it, and there's people participating in it, especially young guys.
01:17:41.000 Like, that's the thing.
01:17:43.000 I think these guys are so either bored or sitting at home, needing some kind of stimulation that they're willing to go bet on stuff like this.
01:17:53.000 The reason Kalshi and Polymarket exist is because there's a market for it.
01:17:58.000 And so, are they the bad guys there for filling that market?
01:18:03.000 Or what's the actual problem here?
01:18:06.000 Well, the problem is people that don't have the self-control enough to say, no, I'm not going to bet on it.
01:18:11.000 Well, what's the difference between that and 100 years ago guy just going to the casino or the racetrack every day?
01:18:16.000 Well, the difference here is you can say, I'm going to bet 50 bucks on Tulsi Gabbard getting appointed and then go on Twitter and be like, we have to appoint Tulsi Gabbard.
01:18:23.000 Like, that's so messed up.
01:18:25.000 And get 100 million people to be like, okay.
01:18:27.000 Well, I mean, I don't know that just anyone could do that.
01:18:29.000 Like, if you're, if you've got an X account with like, you know, 300 followers, 350 followers, and you're like, we have to get Tulsi Gabbard into the position.
01:18:37.000 I mean, that doesn't have all kinds of influence.
01:18:39.000 Well, you might, yeah, you might need to pass the public figure standard there, which is a real legal standard, right?
01:18:58.000 So maybe there's a legal threshold that actually qualifies, but still, you know, like Tim says, am I going to have to put out a press release every time I do something?
01:18:58.000 You know, that's the crux of it.
01:19:01.000 It's how much of your own life are you now entitled to if someone else is using your name image like this?
01:19:07.000 I think, you know, there is a certain amount of right to privacy and a certain amount of right to liberty.
01:19:17.000 You shouldn't have to be like, hey, you know, I'm going to go do this.
01:19:21.000 So I have to worry about what some other company is doing because I'm a public figure.
01:19:26.000 So, you know, it looks like it's not insider trading in any legitimate sense.
01:19:31.000 It's just insider trading by Kalshi's own description.
01:19:35.000 But then that means there's no problem.
01:19:38.000 Yeah, if you're not criminal or whatever.
01:19:38.000 There's no criminal liability.
01:19:40.000 The only criminal liability would be using a large public platform to defraud people.
01:19:44.000 So like if I said that I would be going there and then the price spiked and then Ian sold off his position, profited, and then I didn't go, that would be fraud, just general fraud.
01:19:55.000 But that makes sense because you don't need Kalshi to commit fraud.
01:19:57.000 I mean, anything can be fraud.
01:19:58.000 There's an old trick people used to do in World of Warcraft. Ian, let me know if you know about this scam that people would pull.
01:20:08.000 They would go into, I don't know, if you're not familiar with World of Warcraft, I'll try to explain it.
01:20:15.000 So Stormwind or Orgrimmar, the capital cities. They go to the big city, and there's an active group chat for everybody who's in the city.
01:20:20.000 And then someone, people would often post things like, looking to buy this item.
01:20:25.000 And so what someone would do is they would post, looking to buy these boots for 50 gold.
01:20:30.000 Then they'd wait a little bit and then their friend would post looking to sell these boots for 40 gold.
01:20:38.000 Somebody would see it and then go, I'll buy them from you.
01:20:41.000 And they would be like, okay, how much you got?
01:20:43.000 And they'll be like, I'll give you the 40 gold for it.
01:20:44.000 And they'll go, no, I want 45.
01:20:46.000 And they'll be like, okay, deal.
01:20:47.000 Because in their mind, I'm going to go to this other guy who's going to buy it for 50 and make a five gold profit.
01:20:53.000 And then he buys it for 45 gold, overpriced, messages the guy who said he wants it for 50 gold, who then responds, oh, I already bought one.
01:20:59.000 Sorry, bro.
01:21:00.000 I never did it.
01:21:01.000 No.
01:21:01.000 That was very common in RuneScape all the time, too.
01:21:03.000 Yep.
01:21:04.000 Very common tricks.
01:21:05.000 Like, fraud is fraud.
01:21:06.000 I mean, I don't know about fraud in a video game. You're kind of fake-beating each other in the game when you're doing that stuff to each other.
01:21:11.000 Beating someone in the game.
01:21:12.000 Isn't that just brokering, though?
01:21:14.000 Just what?
01:21:14.000 Isn't that just, like, what brokering is?
01:21:16.000 No. For two people to intentionally collude so that one person can say, I'm looking to buy a Spindrift for 20 bucks.
01:21:27.000 And then Ian says, I'm selling a Spindrift for 10 bucks.
01:21:30.000 And then you go, oh, oh, crap.
01:21:31.000 If I buy it from him from 10 and sell it to him for 20, I'm going to make 10 bucks.
01:21:35.000 So you buy the overpriced can of soda from Ian.
01:21:38.000 And then when you come to me, I go, I already got one.
01:21:40.000 I don't need it anymore.
01:21:41.000 Appreciate it, brother.
01:21:42.000 Gotcha.
01:21:42.000 And now you've spent more on a worthless item.
01:21:45.000 So that was the trick.
01:21:46.000 You do that in real life.
01:21:48.000 You're going to jail for that.
01:21:48.000 Yeah.
01:21:50.000 That's some Turkish bazaar stuff.
01:21:52.000 Here's the funny thing, though.
01:21:53.000 That's like a very, very basic fraud thing where it's like someone asks for help.
01:21:59.000 Like, I'm trying to find this one thing.
01:22:01.000 Actually, you know what?
01:22:02.000 A really good example is... what movie was it with Emma Stone and, I think, Abigail Breslin?
01:22:09.000 I don't even know who Abigail Breslin is.
01:22:11.000 So they're driving in a car and then I think it's Emma Stone.
01:22:15.000 She's looking around her car like, oh my God.
01:22:17.000 And the guy walks out and he's like something wrong.
01:22:18.000 She's like, I lost my engagement ring.
01:22:20.000 It's $5,000.
01:22:21.000 I can't find it.
01:22:22.000 When I was pumping the gas. Zombieland?
01:22:22.000 It fell off.
01:22:24.000 Was it Zombieland?
01:22:25.000 Yeah, I think so.
01:22:26.000 Oh, yeah.
01:22:27.000 Yeah.
01:22:27.000 It is.
01:22:28.000 And then she goes, please, please, if you find it, I will give you, I will pay you a reward for it.
01:22:33.000 I'll give you $1,000.
01:22:35.000 And then she drives off.
01:22:36.000 And then he walks away.
01:22:38.000 And then the girl walks over and picks it up off the ground.
01:22:40.000 And he's like, I need that ring.
01:22:42.000 And she goes, What are you going to give me for it?
01:22:43.000 And then he gives her a bunch of cash.
01:22:45.000 And then she gets in the car with Emma Stone and they've got a thing full of these worthless rings.
01:22:49.000 There's one with like a lottery ticket, too.
01:22:51.000 This one person claims like, oh, I just won this lottery ticket and it's like worth this much, but I don't have time to go, you know, turn it in, but I'll sell it to you.
01:22:59.000 Yeah, you can buy prank lottery tickets that they're like $100 winners.
01:23:04.000 And then you can be like, hey, man, I got to run out of, I got $100 winner.
01:23:07.000 Just give me $50.
01:23:08.000 And then they buy.
01:23:08.000 I don't care.
01:23:09.000 It's not a real lottery ticket.
01:23:10.000 All these scams, bro.
01:23:11.000 All these scams.
01:23:12.000 The Polymarket and Kalshi stuff is easily scammable.
01:23:15.000 That's the problem with it.
01:23:16.000 I think it has good intentions and it could be fun.
01:23:19.000 You just said that it didn't have good intentions.
01:23:21.000 Well, I mean, it's supposed to be fun, and, like, you can make some money off of some external thing.
01:23:26.000 But when you get personal with it and you know the people involved or you get information or you can actually literally influence the outcome with your own behavior, bro, you're not allowed.
01:23:35.000 You're not supposed to bet on that shit.
01:23:37.000 And no one that you know is supposed to bet on that stuff.
01:23:42.000 But there's no law in place yet.
01:23:44.000 There is no law.
01:23:46.000 Oh, the Democrats will find a law soon enough.
01:23:49.000 I'm sure.
01:23:50.000 You don't want to shove it underground into the black market.
01:23:52.000 Maybe you do.
01:23:53.000 Well, I mean, the thing is, with it being regulated by the SEC, or if they're involved in the regulation of it, it's definitely not going to be shoved underground.
01:24:02.000 There's nothing more above ground than the SEC.
01:24:06.000 That's the organization that regulates banks and stuff.
01:24:08.000 You're saying the SEC is regulating these guys right now?
01:24:10.000 I think that's what Tim's saying.
01:24:11.000 They are regulated.
01:24:12.000 Kalshi is legit regulated under the CFTC.
01:24:15.000 And actually, I do think it's fun.
01:24:17.000 I do think it's fun.
01:24:18.000 I just think these questions need to be answered because it's a new thing.
01:24:20.000 And it can spiral out of control really quick.
01:24:23.000 Well, you've got to answer these questions.
01:24:24.000 And you could siphon wealth to one guy real fast with one of these bets, so be careful.
01:24:29.000 Like, interestingly enough, and I want to make sure I make this very clear:
01:24:34.000 I mean no intention to insult Kalshi, because I actually like it.
01:24:37.000 You can see I got my portfolio.
01:24:38.000 Like I like, I like making little bets because it's fun.
01:24:41.000 It's not like I'm trying to make a living or get rich, and it's not money that, if I lost it, I'm going to say it's the end of the world.
01:24:46.000 I've got $1,177 in there, and I actually only started with a couple hundred.
01:24:50.000 So actually, it's like, hey, look at that.
01:24:52.000 I made some money on this thing.
01:24:53.000 I think it's fun.
01:24:54.000 I think it's fun.
01:24:54.000 That being said, there are a lot of questions.
01:24:57.000 Like, if you want to be a regulated prediction market... Now, with all due respect to Kalshi, we do what's called a micro-sponsorship. Here's how it started.
01:25:08.000 There was a period where we've been using prediction markets for a long time.
01:25:11.000 PredictIt was one of the first.
01:25:13.000 I don't know if it still exists.
01:25:15.000 Because I know they got shut down or something.
01:25:17.000 Oh, really?
01:25:18.000 Yeah, we use those for a lot of elections and stuff.
01:25:20.000 We used to use PredictIt for elections, but I don't know that it's still allowed.
01:25:20.000 Yep.
01:25:25.000 They basically banned it.
01:25:26.000 And it's very similar, like Democratic presidential nominee.
01:25:29.000 And it's a prediction market.
01:25:31.000 And so we were using this, and then Polymarket came out.
01:25:34.000 And it's interesting because you get a kind of wisdom of the crowd where you can see what everyone's betting on, and then you're basically getting odds on politicians to win and public events.
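The "odds" reading of a prediction market can be made concrete. On a market like Kalshi, a Yes contract settles at $1.00 if the event happens and $0 if it doesn't, so its price in cents is the crowd's implied probability of the event. A minimal sketch; the market and numbers here are hypothetical:

```python
def implied_probability(price_cents: float) -> float:
    """A Yes contract that settles at $1.00 trades between 0 and 100 cents;
    its price, read as a fraction of a dollar, is the market's implied
    probability of the event."""
    return price_cents / 100.0

def expected_profit(price_cents: float, your_probability: float) -> float:
    """Expected profit per contract, in dollars, under your own probability
    estimate: expected payout minus cost."""
    cost = price_cents / 100.0
    return your_probability * 1.00 - cost

# Hypothetical market: "Candidate X wins" trading at 62 cents.
print(implied_probability(62))              # 0.62
print(round(expected_profit(62, 0.70), 2))  # 0.08: positive edge if you believe 70%
```

This is why a price move reads as the crowd updating its probability estimate, which is the "wisdom of the crowd" effect described above.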
01:25:43.000 And so we ended up having companies reach out to us asking us if we would do sponsorships.
01:25:49.000 And we were kind of like, you know, maybe, I don't know.
01:25:51.000 And so the deal we did with Kalshi is, when we do a news story where there is beneficial information from their prediction markets, we just say, shout out to Kalshi for sponsoring this segment.
01:26:01.000 That's literally it.
01:26:03.000 So we don't do dedicated ad reads to it.
01:26:05.000 They were just like, hey, how about when you do use us, you just shout us out and we'll do a deal on that.
01:26:09.000 And we were like, we do it anyway.
01:26:11.000 Like, sounds good to us.
01:26:12.000 So I like Kalshi.
01:26:13.000 I do.
01:26:14.000 That being said, I think there's something interesting and a conversation to be had around, they have my picture on their website.
01:26:20.000 They are selling a product based on my name.
01:26:22.000 Are they making money?
01:26:24.000 Of course they are.
01:26:25.000 This is the nature of their business.
01:26:27.000 Now, perhaps in the deal that we did with them, to be full disclosure, we went through a contract.
01:26:32.000 I mean, maybe there's fine print that says we can use your likeness or something.
01:26:35.000 I don't know.
01:26:36.000 Maybe.
01:26:37.000 But the question then becomes: for all of these individuals... Pam Bondi doesn't have a deal with Kalshi, I don't think, right?
01:26:46.000 Not that we know of.
01:26:47.000 Or Donald Trump.
01:26:48.000 I suppose as public figures, you're allowed to use their likeness.
01:26:51.000 So Dan Bongino would fall into that category as well.
01:26:53.000 My question here is: at what point do you become a public figure? When you are in politics?
01:26:57.000 When you are, like, a politician; when you are in public office.
01:27:02.000 But you're not, you're a public figure at this point.
01:27:04.000 No, no, no.
01:27:05.000 I'm clarifying.
01:27:06.000 If you are a politician and you hold any kind of public office or judgeship or otherwise, I can use your name and picture for commercial products.
01:27:14.000 What about other public figures like Tom Hanks?
01:27:15.000 No.
01:27:16.000 Oh, you're private business.
01:27:17.000 Yes.
01:27:18.000 It's politicians, I see.
01:27:18.000 So like we're making the card game right now, debate me, and we have the first prototype ready to go.
01:27:22.000 The test set is available.
01:27:25.000 It's our new game where you debate each other and you build the best debate team possible, and you get Ben Shapiro and Trump.
01:27:30.000 So all of the people, like we can't put Tucker Carlson in the game.
01:27:35.000 So we have Tucker Katarlson.
01:27:37.000 Totally different guy.
01:27:38.000 Yep.
01:27:39.000 But we can put Donald Trump in the game because he's president and we can use the likeness of the president to sell products.
01:27:44.000 However, we're being careful about it.
01:27:46.000 And so we're going to run every name through legal to make sure they don't say like you're using likeness.
01:27:50.000 But I do think it's interesting this question about individuals.
01:27:55.000 Like, come on.
01:27:56.000 You know, in the market questions, they say, like: what will Robert Garcia say during his MS NOW interview?
01:28:04.000 Like, here's a picture of them.
01:28:06.000 Are you allowed under like standard civil practices to use someone else's name and likeness to sell a commercial product based on their actions?
01:28:14.000 So, politician, obviously, you can.
01:28:16.000 What if then they leave politics?
01:28:18.000 Are they no longer a public figure?
01:28:20.000 Forever.
01:28:20.000 Yep.
01:28:20.000 You can use it.
01:28:21.000 Once you enter politics, what if you run for office?
01:28:23.000 I don't know about running for office.
01:28:25.000 Taking social media.
01:28:26.000 So we've gone over this in the past for a variety of reasons based on making thumbnails and doing graphics and writing articles.
01:28:34.000 And basically, every lawyer is like, the public has a right to use the image and likeness of public officials.
01:28:39.000 Because imagine if, like, okay, so let's put it this way.
01:28:44.000 Can I put up a billboard that says buy pool water?
01:28:49.000 And it's got a picture of Brad Pitt smiling.
01:28:51.000 No.
01:28:52.000 Of course not.
01:28:52.000 And you get sued.
01:28:53.000 You cannot.
01:28:55.000 There is an argument to be made, because we've done this when we were working with Outfront Media for Times Square.
01:29:02.000 If we put up, Michael Malice had the idea to put up a billboard that said, be like Greta, drop out of school.
01:29:09.000 And it was Greta Thunberg on the billboard.
01:29:11.000 And they said, no, because it's copyright violation.
01:29:14.000 They said there's a technical argument for free speech if you're using it to make a public statement about a well-known public figure.
01:29:22.000 However, it becomes challenging with non-governmental officials.
01:29:27.000 And a lot of these companies say we don't want to get in the middle of a lawsuit over using someone's likeness like that.
01:29:33.000 So basically, we talked to our lawyers and they said, if it's speech, like when we said Taylor Lorenz doxxed Libs of TikTok, all we used was her name.
01:29:41.000 They said, that's just speech.
01:29:43.000 Using her image gets kind of murky.
01:29:46.000 But if it's being presented as an informational speech thing, you might be able to get away with it.
01:29:51.000 The simple way to understand it is: if I want to speak about Scott Bessent or Donald Trump, I have to be able to do that.
01:29:59.000 He's a politician.
01:30:00.000 We need to be able to have a say in our processes.
01:30:03.000 However, you know, Brad Pitt's a celebrity who uses his likeness to profit.
01:30:07.000 He's not in office.
01:30:09.000 What he does for his business has no direct influence on what we do.
01:30:11.000 He's not running for office.
01:30:12.000 Does that mean I can sell lunch boxes with Arnold Schwarzenegger on them?
01:30:17.000 Talk to the lawyer.
01:30:18.000 Maybe not at this point.
01:30:19.000 But again, the important thing to understand about contract law is that it's all murky.
01:30:23.000 And I try to explain this to people all the time.
01:30:24.000 They think, I would argue this way.
01:30:27.000 Your average working class person thinks contracts are like steel cages, when in reality, they're more like straw huts.
01:30:36.000 And a handshake is often as good as a contract.
01:30:40.000 Now, the only thing I would worry about is whether you can trust the individual, because even contracts ain't going to do nothing for you.
01:30:45.000 Yeah, they just, who was it?
01:30:46.000 They just issued force majeure to totally nullify their contracts, all those energy deals.
01:30:51.000 Yeah, but that's because of external circumstances or something.
01:30:54.000 Right, right.
01:30:54.000 But let's put it this way.
01:30:56.000 Let's say, Ian, you and I have a contract.
01:30:58.000 Let's say I sign a term contract for you to be on Timcast IRL, guaranteed for three years, right?
01:31:03.000 A year from now, you get really, really angry and don't want to do the show anymore.
01:31:08.000 What happens if you stop doing the show?
01:31:10.000 I think it would be fine.
01:31:11.000 No, no, no, no.
01:31:12.000 I'll have a right to cancel the contract.
01:31:13.000 Nope, nope, nope.
01:31:14.000 I have a three-year term with you, Ian.
01:31:16.000 You have to be on the show for three years.
01:31:18.000 I don't know.
01:31:19.000 I don't know how you, how, what you, what recourse you would have.
01:31:21.000 Can I like tie you up and drag you onto the show?
01:31:23.000 No, you could maybe sue me for the lost potential revenue.
01:31:27.000 Sure.
01:31:27.000 And then how long will that take?
01:31:28.000 I don't know, eight years, God knows.
01:31:31.000 And then you're going to challenge and the continuations and arbitration meetings, and it's just a waste of time.
01:31:31.000 Yep.
01:31:36.000 And so, certainly, if the contract is a $500 million contract and you take a deal from the Daily Wire to go join their show for $500 million, you'll get sued.
01:31:47.000 But this is the important thing to notice about contracts: they don't actually matter in most circumstances, literally in most circumstances.
01:31:54.000 They matter if you're like Warner Brothers and Marvel and they're like, we're going to work out some kind of cool crossover movie.
01:32:02.000 Then it matters because you've got multi-billion dollar companies, product lines, and it's going to start stepping on each other's toes.
01:32:07.000 And you're not dealing with one person talking to one person.
01:32:09.000 But for most contracts where it's like, I'm hiring a person, Ian, you'll be on the show for a year.
01:32:14.000 If after six months, you stop showing up, what can I do about it?
01:32:17.000 I'm going to sue you, Ian.
01:32:18.000 You'll be like, okay.
01:32:20.000 And then I file a lawsuit.
01:32:20.000 And it's like, okay.
01:32:22.000 Six months later, I'll hear back from the court, I guess, if they can find you and serve you.
01:32:26.000 And then it's going to, I'm going to spend $1,000 on lawyers and I'm going to be like, I'm going to lose more than the contract is worth.
01:32:31.000 So what actually ends up happening is you don't get sued.
01:32:34.000 You get offered more stuff.
01:32:36.000 So Ian has a contract for a year.
01:32:38.000 And then six months later, he doesn't show up.
01:32:39.000 And I'm like, Ian, I need you to show up.
01:32:40.000 And you go, I'm not going to do it anymore.
01:32:41.000 And I'll be like, what do I have to do to make you come?
01:32:43.000 I mean, I have a contract.
01:32:44.000 I don't know what you could do to make me come.
01:32:45.000 Let's go.
01:32:46.000 Tim Pool.
01:32:47.000 So I have a contract.
01:32:48.000 I'll say, Ian, Ian, you have a contract with me.
01:32:49.000 You have to come on the show.
01:32:51.000 And then you'll be like, it's a family-friendly show.
01:32:53.000 I can't do that.
01:32:53.000 And then I'll be like, stop it, Ian.
01:32:55.000 I'll be like, no, don't stop it, Tim.
01:32:55.000 You know what I'm trying to say?
01:32:57.000 Don't stop.
01:32:58.000 And then you say, there is nothing you can do.
01:33:00.000 Sue me, bro.
01:33:02.000 Waste your time and have fun.
01:33:03.000 And my response would be, if I gave you more money, like, what can I do to make you come, brother?
01:33:09.000 You can do a lot of things.
01:33:13.000 I'm talking about coming on the show.
01:33:15.000 I know.
01:33:15.000 Yeah, I know.
01:33:16.000 I know you are.
01:33:17.000 I know you.
01:33:18.000 I know you.
01:33:19.000 Sounds so dark and gravelly tonight, Phil.
01:33:21.000 I just have a deep voice.
01:33:23.000 So the way that I am.
01:33:25.000 You look great, by the way.
01:33:26.000 I have a funny story for you guys.
01:33:27.000 I once entered into a contract with a company and we both signed it.
01:33:33.000 And then when literally two weeks later, they refused to pay out on the signed contract.
01:33:39.000 I showed up and said, we have a contract.
01:33:43.000 You owe me X amount of dollars.
01:33:46.000 Pay me what you owe me and this will be easy.
01:33:49.000 And they just laughed and said, What contract?
01:33:53.000 Wait, you had a copy of the contract?
01:33:55.000 Of course, but what am I going to do about it?
01:33:56.000 Because you were the little guy at the time.
01:33:56.000 Dang.
01:33:58.000 What am I going to do?
01:34:00.000 Am I going to go hire a lawyer and spend 10 grand to launch a lawsuit?
01:34:03.000 Fortunately, I'm smarter than them and I ended up winning.
01:34:08.000 The point is, it doesn't matter if you have a contract or not.
01:34:11.000 It matters.
01:34:12.000 It's guys, I love poker so much because the world is a poker game.
01:34:16.000 It really is.
01:34:17.000 You're sitting across the table from a guy and he says, What contract?
01:34:21.000 And then I said, I see the move you've made.
01:34:23.000 I am telling you now, you are not going to win.
01:34:26.000 So the story of this is: I showed up and I said, Hey, you sent me an email about the pay.
01:34:33.000 This is not what we agreed to.
01:34:34.000 This is a fraction of the pay that you've that we signed a contract on.
01:34:39.000 And this person goes, What contract?
01:34:43.000 And I said, The contract that you and I have.
01:34:45.000 Like, I never signed it.
01:34:47.000 And then I said, Okay, well, I'm not leaving until I get my pay.
01:34:52.000 Otherwise, we're going to have an issue.
01:34:54.000 And so this person calls in the higher up who says, What's going on?
01:34:58.000 And they're like, He's causing problems and he won't leave.
01:35:00.000 And I said, Listen, we agreed to a contract.
01:35:02.000 We've signed it.
01:35:03.000 I am owed this money.
01:35:04.000 And then the higher up just said, You're not getting anything.
01:35:08.000 If they tell me there's no contract, there's no contract.
01:35:10.000 And I looked him in the eyes and I said, I'm going to be a nice guy right now because I don't want to waste anybody's time, nor do I want my time wasted.
01:35:17.000 You pay me half of what you owe me, and we call it a bad hair day.
01:35:21.000 And he just says, well...
01:35:22.000 He walked me to the door.
01:35:24.000 And then I said, I won't say the guy's name or anything, but because we settled.
01:35:28.000 We'll call him Jim.
01:35:29.000 I said, Jim, as I'm walking out of the building, I said, I'm going to be, I'm going to make you one more offer.
01:35:34.000 I will be a nice guy one more time.
01:35:37.000 You give me half of what we agreed upon, and we are done.
01:35:41.000 And he just closed the door in my face.
01:35:44.000 And so, what did I do?
01:35:46.000 Well, long story short, I got lucky in that the company didn't just violate the contract.
01:35:53.000 Instead of registering me as a contractor at an hourly rate, because they were trying to rip me off, they listed me as a W-2 employee at an hourly rate, which gave me legal standing with the National Labor Relations Board.
01:36:07.000 And so, this opened a can of worms.
01:36:09.000 And I went and met with the NLRB because the first thing I got a phone call from the payroll company, and I'm like, I'm not an employee.
01:36:14.000 I was under contract.
01:36:15.000 This is totally separate.
01:36:16.000 And they're like, no, they listed you as W-2.
01:36:18.000 And I said, Can they legally do that?
01:36:21.000 They can just put me down as W-2.
01:36:23.000 I never agreed to that.
01:36:23.000 I never signed anything.
01:36:24.000 And they were like, they did.
01:36:27.000 And I said, so what does that mean?
01:36:28.000 And they're like, it actually means they owe you more money because W-2 is protected in California.
01:36:34.000 And this means they have to pay you for every day.
01:36:37.000 If you're not paid within 24 hours of termination of services, they have to pay you every day full rate.
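The daily accrual described here matches what California calls the waiting-time penalty (Labor Code section 203): unpaid final wages keep accruing at the employee's full daily rate, capped at 30 calendar days. A rough sketch with hypothetical numbers:

```python
def waiting_time_penalty(daily_wage: float, days_late: int, cap_days: int = 30) -> float:
    """California Labor Code sec. 203 waiting-time penalty: the final wages
    accrue at the full daily rate for each day they remain unpaid,
    up to a 30-day cap."""
    return daily_wage * min(days_late, cap_days)

# Hypothetical: $200/day, paid 45 days late -> the penalty caps at 30 days.
print(waiting_time_penalty(200.0, 45))  # 6000.0
print(waiting_time_penalty(200.0, 10))  # 2000.0
```

A 30-day accrual at a full daily rate is how a modest unpaid balance grows into a five-figure claim like the one described below.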
01:36:44.000 So I went to the NLRB because of this.
01:36:46.000 And I said, I don't believe you can do anything for me because this is a private contract that we had.
01:36:50.000 This is not standard employment.
01:36:52.000 However, they did list me as W-2 without my consent.
01:36:56.000 And they said to me, okay, if you go the contract route, it sounds like you have a private suit for about $300,000.
01:37:06.000 You'd, of course, have to hire a lawyer to take the case for you.
01:37:09.000 Probably would take several years.
01:37:11.000 We can't give you advice on that.
01:37:12.000 It's just my opinion.
01:37:14.000 Maybe you can find someone who'll do it on contingency for a third of the settlement.
01:37:17.000 However, that being said, because you were W-2 and because they didn't pay you, we can take your case on those grounds and they'd owe you about $30,000.
01:37:27.000 And I said, okay, let's roll.
01:37:29.000 And then it ultimately culminated with them trying to have one of their board members meet with me, and then he panicked and fled. They thought I wouldn't show up, and we were like a day away from court.
01:37:40.000 And then finally the head boss shows up to the meeting place, and he wasn't supposed to be there, and he sits down and he goes, were you really offered that rate?
01:37:51.000 And I said, yes.
01:37:52.000 And he goes, well, I'm telling you this right now.
01:37:56.000 We're going to go to court and it's going to be thrown out.
01:37:56.000 You're not going to win.
01:37:59.000 And I said, okay.
01:38:00.000 I was like, well, you know, I got time.
01:38:02.000 And he's like, well, look, we don't want to waste our lawyers' time and money.
01:38:05.000 So we can just settle this right now.
01:38:07.000 And I said, yeah, remember what I told you last time that I was going to be a nice guy?
01:38:10.000 And mind you, I think I'm 22 years old at the time.
01:38:13.000 I said, I told you I was going to be a nice guy.
01:38:15.000 And I told you, pay me half of what you owe me.
01:38:19.000 Yeah.
01:38:20.000 As of right now, we're going to be entering court with a total amount owed of $30,000.
01:38:27.000 So here's what we're going to do.
01:38:29.000 You pay me half right now and we'll call it a bad hair day.
01:38:35.000 And he goes, okay, so you're at 15.
01:38:39.000 I'll counter with six.
01:38:40.000 I said, no, I'm at 30.
01:38:42.000 And you're going to write a check for 15.
01:38:44.000 Otherwise, I'll see you in court.
01:38:46.000 And then he pauses for a second, and then he gets up and he says, give us 15 minutes.
01:38:51.000 And then 50 minutes later, another lady walks in with three separate checks and she's like, sign these.
01:38:55.000 And that was it.
01:38:56.000 Had the contracts vanished?
01:38:58.000 Were there no paper copies of the contract or something?
01:39:00.000 Yeah, there are.
01:39:01.000 But the issue is you go to court and then all they have to say is, that's not real.
01:39:07.000 That's it.
01:39:08.000 Oh, interesting.
01:39:09.000 And then I say it was.
01:39:10.000 It was signed by this person and say, I'm sorry.
01:39:11.000 I have no recollection of that.
01:39:13.000 Even if their signature's on it, they'll be like, he forged it.
01:39:15.000 Well, forgery, that would be actionable.
01:39:17.000 I mean, they're going to be like, it may be, but I have no recollection of it.
01:39:20.000 That's why they knew they were going to lose.
01:39:22.000 And the whole point of me offering half was just like, I don't want to deal with it.
01:39:25.000 I don't want to go to offices.
01:39:26.000 I don't want to go to meetings.
01:39:27.000 Just pay me.
01:39:28.000 Right.
01:39:28.000 I'll leave.
01:39:29.000 This is the point about the contract, right?
01:39:31.000 Arguably, I lost.
01:39:33.000 They were... well, technically, they ended up paying a lot more than I was originally going to get.
01:39:36.000 Based on the amount of work I did, I think they owed me something like $7,000.
01:39:40.000 They ended up paying a bit more than that.
01:39:42.000 But it took months.
01:39:43.000 And so when I told the guy initially, like, give me half and we'll call it a bad hair day, the point was, I'm going to get zero right now if I walk out the door.
01:39:52.000 I'm then going to have to go to meeting after meeting after meeting.
01:39:54.000 And it's going to take time out of my day to try and recover money.
01:39:59.000 So what do I do?
01:40:00.000 Well, you try and cut your losses.
01:40:03.000 Let's make it EV plus.
01:40:04.000 Like, can I leave here with some money right now?
01:40:06.000 And they refused.
01:40:07.000 And so then leaving with $0 left me with no choice.
01:40:10.000 You know, I'm thinking about contracts.
01:40:11.000 Are you talking about like smart contracts, this evolution of computational transaction?
01:40:15.000 And like, those things are locked.
01:40:17.000 You cannot get out of one. A smart contract triggers.
01:40:20.000 So there's a trigger and then a change.
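The trigger-then-change mechanics being described can be sketched as a tiny state machine: once the trigger condition is reported, the transfer executes mechanically, with no step where either party can back out. This is an illustrative Python toy, not any real blockchain API; the escrow class and its names are invented for the example:

```python
class SmartContractEscrow:
    """Toy escrow illustrating trigger-then-change: funds lock on creation
    and release automatically once the agreed condition is reported as met."""

    def __init__(self, amount: float, payee: str):
        self.amount = amount
        self.payee = payee
        self.state = "locked"   # neither party can unilaterally cancel

    def report_event(self, condition_met: bool) -> str:
        # Trigger -> change: the state transition is mechanical, not negotiated.
        if self.state == "locked" and condition_met:
            self.state = "paid"
            return f"released {self.amount} to {self.payee}"
        return "still locked"

escrow = SmartContractEscrow(100.0, "ian")
print(escrow.report_event(False))  # still locked
print(escrow.report_event(True))   # released 100.0 to ian
```

The contrast with the paper contracts discussed above is exactly the point: here the enforcement is the mechanism itself, not a court you would have to sue in.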
01:40:22.000 But it's kind of nice that you can get out of contracts.
01:40:24.000 Like there's a very human element to it.
01:40:26.000 Like we shouldn't be bound by numbers.
01:40:28.000 I don't think humans are built like that.
01:40:30.000 It's not like we shouldn't be bound by agreements, though.
01:40:32.000 The point of a contract is to be like, we make this agreement and we have it.
01:40:35.000 We have evidence.
01:40:36.000 And now, granted, like you said, you can get out of them.
01:40:39.000 But the point is to be like, okay, my word is good.
01:40:42.000 Here's written evidence that my word's good.
01:40:46.000 And here's a sad reality.
01:40:47.000 Yeah.
01:40:48.000 It used to be that I would say, Ian, my word is my bond.
01:40:53.000 And then when all hell broke loose and I was struggling to fulfill, I would say, I'm sorry, I wasn't able to fulfill it, but trust me, I will not stop working because my word means more to me than anything.
01:41:04.000 Now we're just a nation of multicultural leeches pirating off of each other.
01:41:09.000 And people get busy and forget and get distracted and focused on other stuff.
01:41:13.000 Like, I can't remember every agreement I made with everybody.
01:41:16.000 This is really about the destruction of like the high trust society.
01:41:16.000 Whatever.
01:41:20.000 Exactly.
01:41:20.000 That's what it is.
01:41:21.000 And even in this low trust society, if you're telling me that your contracts are still no good, then what is good?
01:41:30.000 Yeah, contracts are completely, completely, completely meaningless.
01:41:33.000 I really want to tell you.
01:41:35.000 I'm going to tell you again, the contracts matter for one thing.
01:41:38.000 So I love this stuff.
01:41:40.000 I once got offered a talent management contract from one of the big agencies, one of the big five talent agencies, and it was like this thick.
01:41:49.000 And this guy represented some of the biggest names you've seen in cable TV news and like reality TV stuff.
01:41:56.000 So he says, here's the contract.
01:41:58.000 If it's good, we're going to get you on all the biggest shows.
01:41:58.000 Look it over.
01:42:00.000 You're going to be a host.
01:42:01.000 Yada, yada, yada.
01:42:03.000 And so I start reading through it and I'm like, this is insane.
01:42:05.000 It's like 200 pages.
01:42:07.000 And I was like, okay, this is insane.
01:42:09.000 Now, the point of that contract is not for me.
01:42:12.000 It's for the other talent agencies.
01:42:14.000 What this talent agency was saying was not that we will work with you and get you work.
01:42:19.000 They were saying, once you sign this, I can make sure you will never work for anybody else.
01:42:25.000 So if, after this company failed me, because they tend to, I went to another agency and said, I'm trying to find work and they're not getting it for me, they'd say, there's literally nothing we can do with you.
01:42:35.000 You are cut out.
01:42:36.000 That's it, the contract is not about the work they would give me.
01:42:39.000 It's about the fact that signing any letter of intent with one agency means no other agency will touch you until that is cleared.
01:42:46.000 Whereas if I do a contract with Ian, I mean, I got to be honest.
01:42:52.000 If I said, you know, Ian, let's do a contract.
01:42:54.000 You're going to be here every day for a year.
01:42:55.000 And then one day Ian's like, I don't want to be here anymore.
01:42:57.000 So he comes to the show and starts screaming racial slurs.
01:42:59.000 I'd ask him to leave.
01:43:00.000 And then he's like, contracts, there you go.
01:43:02.000 And then I can argue and say, oh, he violated the contract.
01:43:06.000 One of the theories as to Candace Owens claiming Brigitte Macron is a man was that she wanted to get out of the contract with Daily Wire.
01:43:12.000 So she started saying things that would intentionally get Daily Wire sued for a lot of money, forcing them to take her show down.
01:43:18.000 So they would have to terminate the contract and boot her from the company.
01:43:21.000 Yeah, you're better off letting your employees go than forcing them to stay under duress.
01:43:26.000 Indeed, which is why a lot of people have said, like, does anybody at Timcast have like a term contract?
01:43:30.000 No.
01:43:30.000 They're like, really?
01:43:32.000 Like, none of your talent?
01:43:32.000 And I was like, bro, if someone hated me and didn't want to be here, you think it's good for the show that we keep them here?
01:43:40.000 No.
01:43:40.000 Like, if imagine this.
01:43:42.000 Imagine Daily Wire comes to Ian and says, you know, we're paying you a lot of money to come on our show instead.
01:43:47.000 And then he goes, I can't.
01:43:48.000 I have a contract with Tim, but man, it's a huge opportunity.
01:43:51.000 Is Ian going to be happy sitting in this chair?
01:43:53.000 Would I be a good friend to be like, no, Ian, I'm sorry.
01:43:55.000 I know it's a great opportunity, but you're stuck.
01:43:58.000 I would never do that.
01:43:59.000 I would be like, wow.
01:44:00.000 I mean, I hate to lose you, but bro, if they're offering you like a big thing and a real opportunity, you got to take it, you know?
01:44:05.000 Nah, I don't have to.
01:44:06.000 I'm using Daily Wire as an example.
01:44:08.000 But let's say a Hollywood studio comes to you and says, you know, like we're going to offer.
01:44:13.000 I'm going to pay you $6 million, but I'd be gone for eight months.
01:44:16.000 Something like that.
01:44:17.000 And then we had a contract.
01:44:19.000 Let's say we had a contract where you had to be on the show.
01:44:20.000 Like Ian can come and go as he pleases for anyone who was wondering.
01:44:23.000 Let's say we did have a contract where you got to be on every night for a year straight.
01:44:27.000 And then Hollywood approaches you a studio and says, it's a $6 million deal.
01:44:30.000 You're going to be the co-star in an action film.
01:44:32.000 You're the plucky sidekick.
01:44:34.000 And then he goes, but I do have a contract where I'm already on a show for a year.
01:44:39.000 Would it be beneficial for me in any way to be like, you can't leave, Ian?
01:44:42.000 No.
01:44:42.000 No, it would be good for you to be like, yo, bro, go get more famous.
01:44:45.000 And then I come back.
01:44:46.000 The thing is, what I wouldn't, you know what?
01:44:48.000 Hold on.
01:44:48.000 I'm good at business.
01:44:48.000 You know, I'd say, Ian, I'd say, you're getting a $6 million deal.
01:44:52.000 Buy out the contract.
01:44:52.000 You're good to go.
01:44:54.000 Yeah.
01:44:55.000 So Ian says, okay, we'll cut you 10K off the 6 million to cover your losses so you can find somebody else.
01:45:01.000 So I don't lose anything; we use that money out of my future contract.
01:45:05.000 It pays off the past contract, buys out the contract, basically.
01:45:07.000 Right.
01:45:08.000 So basically, it's like, you're going to get $6 million against a one-year contract.
01:45:11.000 I would say, pay me the difference I need to find a replacement so there's no losses and we're good to go.
01:45:16.000 It would be crazy if they were like, but in the contract, it says you can't work with Tim anymore.
01:45:20.000 Like, I wouldn't do that shit.
01:45:21.000 I don't like signing away.
01:45:23.000 It could say something like, you agree not to do political shows where you engage in issues that are contentious and could be divisive, you know.
01:45:30.000 I'm not sure if it's like stuff that hinders my free speech, my right to speak.
01:45:32.000 Oh, bro.
01:45:33.000 Welcome to morality clauses.
01:45:35.000 Tell me, you can't use that.
01:45:39.000 And again, I know we're talking about Daily Wire, but I'm pretty sure.
01:45:42.000 Morality clause is that you can't do something untoward, period.
01:45:42.000 No, no, no, no.
01:45:45.000 And I think the Daily Wire has these too.
01:45:47.000 I'm not trying to disparage them, but most companies, we don't have these, they have morality clauses that say, if you engage in a behavior that is deemed morally reprehensible or a behavior that could bring disrepute to the company or yourself, we can terminate the contract.
01:46:03.000 Most companies have that because what would you do if, like, you know, you had, like, this is the Candace Owens thing.
01:46:09.000 She gets out of the contract by saying these shocking things.
01:46:12.000 We didn't. We just have: either party can terminate at any time.
01:46:16.000 Let's also, while we're at it, get some clarifications on this beautiful Friday night.
01:46:16.000 I'm pretty sure.
01:46:21.000 There's two NDAs, non-disparagement and non-disclosure.
01:46:25.000 Timcast does not issue non-disparagement agreements.
01:46:28.000 We do have non-disclosure agreements.
01:46:30.000 What is a non-disclosure agreement?
01:46:32.000 This means that individuals who work here will not disclose private trade practices and secrets to other individuals.
01:46:39.000 That's it.
01:46:40.000 Non-disparagement agreements, which people often confuse, are the ones where they say you can't badmouth your employer after the fact.
01:46:46.000 Literally, Ian can insult me right now on the show.
01:46:50.000 You stink.
01:46:51.000 No, you smell fine.
01:46:51.000 You see?
01:46:52.000 I think I'm smelling my upper lip.
01:46:54.000 But like if a year from now, Ian had left and said it was a miserable place to work and I hated it there, we have nothing stopping him from doing that.
01:47:02.000 We do, however, have something stopping him from saying, here's the actual like computer components that he used to get the streaming product built so that our competitors could then build it and come out against us.
01:47:13.000 You know what I mean?
01:47:14.000 What about like, here was the workflow of Tim Cast.
01:47:16.000 Indeed, that you can't disclose.
01:47:18.000 So like if Ian went out and said, I've drawn up a media kit that explains basically how the show is produced, timing, structure, guests, that's what we have contracts to stop people from doing.
01:47:29.000 But then what happens is you get people who go, Tim Pool's got NDAs, so weird.
01:47:34.000 My favorite is the personal injury disclaimer that we have at the castle.
01:47:38.000 Oh, yeah, I signed that.
01:47:39.000 A waiver of liability.
01:47:42.000 And people tried claiming, these libs were like, Tim Pool makes his guests sign a waiver of injury liability to come on his show.
01:47:49.000 And it's like, because I wield a cane.
01:47:52.000 No, you have to sign a waiver of liability to walk through the skate park.
01:47:55.000 Yeah, there's a skate park.
01:47:56.000 I've fallen on that thing.
01:47:57.000 And it only pertains to the skate park.
01:48:00.000 Like, we have to have it as per our insurance policy because there is a skate park.
01:48:05.000 But people are just nasty and they lie.
01:48:07.000 But, you know, we got to go to Rumble Rants and Super Chats.
01:48:10.000 And you know what we're going to do?
01:48:12.000 For the next 15 minutes, we have a, unfortunately, we didn't make it.
01:48:17.000 I'm going to start a goal right now.
01:48:19.000 This only works on YouTube of 50 super chats of at least $5 each for the next 15 minutes.
01:48:27.000 And I will play a song.
01:48:28.000 Tim will play a song if we get it.
01:48:32.000 Rumble should introduce that.
01:48:33.000 We already have six.
01:48:34.000 No, it's zero.
01:48:36.000 Okay, keep it up.
01:48:36.000 That one expired.
01:48:38.000 In the meantime, we'll grab some of your songs.
01:48:42.000 I don't know.
01:48:43.000 Maybe if we, is there a song that you could sing that we could play at the same time or no?
01:48:46.000 I couldn't play guitar and sing it.
01:48:48.000 No, no, I could play guitar, and then I'll sing, and then you can sing, and then Ian will try to sing.
01:48:53.000 I don't know.
01:48:54.000 What do you want to do?
01:48:55.000 Carter can sing.
01:48:56.000 I can also sing.
01:48:57.000 Whatever song you want to sing.
01:48:58.000 I don't know.
01:48:59.000 I don't know.
01:49:00.000 I guess something we have to win.
01:49:02.000 Something we won't get a copyright strike for, though.
01:49:05.000 Anyway, here we go.
01:49:07.000 Evan for us says, huge shout out to my organization, the YAL.
01:49:10.000 One of our members, Audrey Lee, is running for district clerk in Fort Bend County.
01:49:14.000 She's one of the youngest at 19 to ever run for a position like this.
01:49:18.000 Let's go.
01:49:18.000 LFG.
01:49:19.000 There you go.
01:49:20.000 Same old man says, Tim, anyone running in California is most likely a Democrat, even the ones running as Republicans.
01:49:26.000 Look at Carolina's Senator Republican, a Democrat Muslim.
01:49:30.000 Well, yeah, but that was because she ran unopposed and was attacking the position.
01:49:34.000 Yeah, the intent.
01:49:35.000 Steve Hilton's conservative.
01:49:36.000 Yeah, the intent in that situation was to actually just get an actual Democrat in.
01:49:41.000 Yeah, Hilton will probably be a bit more moderate.
01:49:43.000 That's okay.
01:49:44.000 But he's not going to be, you know, transing the kids.
01:49:47.000 Well, Chad Bianco did like kneel with BLM or something.
01:49:51.000 So that's a problem.
01:49:54.000 Yeah.
01:49:55.000 All right.
01:49:55.000 What we got going on here?
01:49:56.000 There's a lot of I'm sorry after the craziness of 2020 and 2021.
01:50:01.000 All right.
01:50:02.000 Troa says, Holy crap, what caused the price of diesel to skyrocket?
01:50:06.000 I went to Senate yesterday and it was $3.33 and woke up to five.
01:50:10.000 Oh, yeah.
01:50:11.000 No idea, man.
01:50:12.000 Wonder what totally beat.
01:50:14.000 It's the hormuz, something like that.
01:50:18.000 Net cheese, hormoz cheese.
01:50:21.000 Hormel.
01:50:23.000 That's chili.
01:50:24.000 Right?
01:50:24.000 Okay.
01:50:25.000 What do we got here?
01:50:25.000 Yeah.
01:50:26.000 Chris Kuhn says, you guys should have Matt Tardio on from Speak the Truth Podcast.
01:50:31.000 He is very knowledgeable in the Middle East, having fought there.
01:50:33.000 He'll be debating the young Turks soon and has been invited onto Fox.
01:50:36.000 Oh, you know, we got too wrapped up in, or I'm sorry, I'm sorry.
01:50:39.000 I got too wrapped up in esoteric business garbage about contracts, but I had fun talking about it.
01:50:44.000 That we didn't play the clip from Unsubscribe where they were reading Mein Kampf.
01:50:49.000 Yeah.
01:50:49.000 Because it's hilarious.
01:50:50.000 It's absolutely hilarious.
01:50:51.000 They're trying to drag Brendan Herrera because he owns this copy of Mein Kampf in English.
01:50:55.000 But the whole segment that we're talking about it, they're insulting and mocking Hitler as being retarded.
01:51:01.000 And that's what the media does.
01:51:02.000 It's literally them sitting there reading.
01:51:05.000 And it's like this excerpt from the book where it's like Hitler tried to, the first war he lost was with grammar.
01:51:12.000 It's a war he had fought for years and was not successful.
01:51:14.000 Like it was hilariously just insulting Hitler.
01:51:16.000 But the media clips that out and then just shows a little snippet where he's like, yeah, I have Mein Kampf.
01:51:21.000 And they're like, oh, I was on CNN earlier.
01:51:23.000 These people are absolutely insane.
01:51:25.000 Know thine enemy.
01:51:26.000 The basics.
01:51:27.000 I strongly feel like that will not matter to the electorate in Texas.
01:51:32.000 They're trying to win that seat.
01:51:33.000 The Democrats are trying to win that seat back because it's a heavily Hispanic seat that only recently flipped Republican.
01:51:39.000 Well, Herrera.
01:51:40.000 They're trying to bring it back.
01:51:42.000 A Herrera plus Mein Kampf copy.
01:51:45.000 Maybe that'll do it.
01:51:47.000 What I mean by know thine enemy is understand the mind of Hitler if you think that the Nazis were bad so that you can prevent that kind of thing and certain events.
01:51:53.000 Or he's just a great character of history.
01:51:55.000 You just want to know what's going on.
01:51:57.000 What were his thoughts?
01:51:58.000 What was going through his head?
01:51:59.000 You could do these things critically, right?
01:52:01.000 But it's actually a treasure trove of opportunity to understand like a World War I, broken World War I vet that wants to rectify a loss of a war because that could happen again.
01:52:11.000 Well, the thing is now, though, everything's a weapon, right?
01:52:14.000 So if I can use just you having a copy of Mein Kampf to smear you or destroy you, whatever, that's what they'll do.
01:52:22.000 Nothing, your intent doesn't matter anymore, right?
01:52:26.000 So that's the world we live in.
01:52:28.000 Well, I'm saying framing is a powerful weapon.
01:52:32.000 Yeah, people will frame others.
01:52:34.000 They're framing me.
01:52:35.000 Literally, now they're framing me.
01:52:36.000 They'll tell you what you wanted to do.
01:52:38.000 This one is not just for me.
01:52:39.000 It's also for Ian as well.
01:52:40.000 Jay George says, Tim using Magic the Gathering logic to resolve the stack on Ayatollah's death.
01:52:45.000 First in, last out.
01:52:46.000 That's right.
01:52:47.000 You know?
01:52:48.000 Man.
01:52:48.000 They tried to cast it as an instant, and you're like, nope.
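For anyone unfamiliar with the reference, the "first in, last out" resolution the superchat mentions is just stack (LIFO) order: spells and responses pile up, then resolve from the top down, so the first thing cast resolves last. A minimal Python sketch with made-up card names:

```python
# Resolving a Magic: The Gathering-style stack: spells are pushed in the
# order they are cast and resolve in reverse order (last in, first out).

def resolve_stack(casts):
    """Push each cast onto the stack, then pop to get resolution order."""
    stack = []
    for spell in casts:
        stack.append(spell)  # each response goes on top of the stack
    resolution = []
    while stack:
        resolution.append(stack.pop())  # top of the stack resolves first
    return resolution

# The last response resolves first; the original spell resolves last.
order = resolve_stack(["Creature spell", "Counterspell", "Counter the counter"])
```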
01:52:51.000 We've got to do something.
01:52:52.000 We've got to do something.
01:52:53.000 Magic the Gathering, just bringing in all this SpongeBob and turtles and everything.
01:52:58.000 It's just shit.
01:52:59.000 Simplify was just saying the other day, we need a fantasy universe.
01:53:01.000 Eric July with his Rippiverse is pretty inspirational, but like just a universe that we can build out from.
01:53:08.000 Bro, when they did Aether Drift, I was like, well, they jumped the shark.
01:53:12.000 For those that don't know, okay, it's real simple.
01:53:14.000 Magic the Gathering is fantasy.
01:53:16.000 It's a card game.
01:53:17.000 It's the original collectible card game.
01:53:20.000 And the game is played.
01:53:21.000 We call it Chess and Poker Combined.
01:53:22.000 It's a strategy game.
01:53:23.000 You draw cards, then you utilize resources to enact effects in the game to try and defeat your opponent.
01:53:29.000 The theme of the game was largely fantasy-based: dragons, knights, warriors, zombies, et cetera.
01:53:34.000 And they've had a bunch of wonky ones like Kamigawa Neon Dynasty, which was like Japan future with Neon, and it was kind of weird, but okay.
01:53:44.000 The, what you call it, the Ravnica set was pretty good where they introduced kind of like a steampunk vibe.
01:53:50.000 Still kind of fits the fantasy lore stuff.
01:53:53.000 They did Aether Drift last year.
01:53:55.000 I think it was last year, the year before.
01:53:56.000 And it's wizards riding motorcycles and race cars.
01:53:59.000 Oh, God.
01:54:00.000 And it was just like, what is who does this appeal to?
01:54:04.000 Here's the group on Star Wars Episode 1, maybe.
01:54:07.000 I was talking to my friend earlier because I was playing Marvel Rivals.
01:54:11.000 And I was trying to explain why everything keeps failing in content and culture.
01:54:15.000 And I said, so I played EA Skate, which only has 1,900 players left.
01:54:21.000 Seriously.
01:54:22.000 The original Skate game.
01:54:23.000 No, EA Skate that just came out six months ago.
01:54:26.000 Oh, has only 1,900 active players.
01:54:30.000 Yeah, I'm willing to bet they will shut those servers down in like a month.
01:54:36.000 There's no way they can maintain the cost of those servers on a game with no players.
01:54:40.000 They're making zero dollars.
01:54:42.000 And I was explaining the reason why the game failed: instead of making a game for skateboarders, which is what you do, which would subsequently inspire friends of those skateboarders to hang out and play those games, they made a video game for everybody.
01:54:56.000 So instead of a skateboarder doing skateboard things, it's like a teenage girl doing teenage girl things.
01:55:02.000 Just like we saw with World of Warcraft the other day, where she's going, woo-hoo!
01:55:05.000 Because she got her like kettle thing.
01:55:05.000 Yay!
01:55:07.000 You mean in EA Skate?
01:55:08.000 The girl like does her hair and stuff?
01:55:10.000 No, it's like, so, uh, what's the vibe you think of when you think of skateboarder?
01:55:16.000 Um, like flowing clothing, like flying on the ground, kind of.
01:55:21.000 I would, I would say punk rock, like the original Tony Hawk that had all the punk bands, some metal, Bam Margera, that's all the OG big peak skateboarding stuff.
01:55:29.000 And so, uh, most skateboarders today are 30-year-old white dudes, and that's just reality.
01:55:34.000 Now, I agree, we want to get more people to skateboard, so how do you do it?
01:55:38.000 Well, you need to produce a product for the base that can also be attractive for new people.
01:55:45.000 However, you need a core community first, which means the product should always be targeting the audience and then with marketing tools, reaching out to new audiences.
01:55:55.000 Instead, so the way I describe it is this: here's the skateboarding game that I would play.
01:56:01.000 I would play a very vanilla skateboarding game like the OG EA skate, where a guy's voice or women's voice is fine too, but a guy would probably play better for guys.
01:56:09.000 He says, In order to perform the heel flip, you hold down on the right thumbstick and then flick upwards towards the right, and that's it.
01:56:16.000 And then you try it, and if it doesn't work, it just starts over.
01:56:18.000 And that's it.
01:56:19.000 And you try it again, you try it again.
01:56:21.000 What this game does, with all the characters: when the game starts, there's some teenage girl and a little robot named V that floats around, and they're teaching you how to play the game. When you screw up, well, I'll give you a few examples of what's wrong with modern society and why we got to get the men back in the room.
01:56:39.000 So, the character says, Let's try doing a heel flip.
01:56:42.000 To do a heel flip, hold the right thumbstick down and flick upward into the right.
01:56:48.000 So, then I go and I do it wrong.
01:56:50.000 And she goes, Wow, that was really cool.
01:56:53.000 But let's try a heel flip.
01:56:55.000 To do a heel flip, hold down on the thumb.
01:56:57.000 So, then I do a different trick.
01:56:58.000 And she goes, Totally cool trick, really awesome.
01:57:00.000 But do you want to try a heel flip?
01:57:02.000 So, there was another mission I was showing my friend.
01:57:04.000 You know, I do a manual.
01:57:06.000 And then he goes, Wow, man, really good riding, but let's try it with two wheels.
01:57:11.000 And I'm like, I hate these people.
01:57:13.000 I hate them.
01:57:14.000 I hate them.
01:57:15.000 The game that I want to play is it starts with Bam Margera, all 50 years old and fat.
01:57:21.000 And he's standing there and he goes, Hey, jerk off.
01:57:23.000 Let's see a heel flip, bro.
01:57:25.000 Come on, you're young.
01:57:26.000 Look at me.
01:57:26.000 I'm fat.
01:57:26.000 I can't do it.
01:57:27.000 And I start laughing.
01:57:28.000 And then you try it and you miss.
01:57:29.000 And he goes, What the heck was that, bro?
01:57:31.000 Come on.
01:57:32.000 It's not hard to do.
01:57:33.000 And then you try it again and you miss.
01:57:34.000 And he goes, Dude, are you kidding me?
01:57:35.000 You only have to flick a stick.
01:57:37.000 Come on.
01:57:38.000 And so there's like edge to it and it's fun and it's silly.
01:57:41.000 Instead, they keep making everything like Skittles, candy canes, and rainbows.
01:57:45.000 And it's just like...
01:57:46.000 It doesn't make sense for skating because skating is a lot about pain.
01:57:48.000 That...
01:57:49.000 A lot of pain, a lot of adventure, a lot of conquest.
01:57:52.000 And so the original Tony Hawk game is what people said to me, Tim, it's just that skateboarding is not popular.
01:57:57.000 Stop.
01:57:58.000 Wrong.
01:57:58.000 Skateboarding was not popular before Tony Hawk Proskater.
01:58:01.000 Tony Hawk Proskater made skateboarding popular.
01:58:04.000 So with the new EA Skate game, we were all expecting it to be gritty and realistic, but not overly edgy, just a sporting game.
01:58:11.000 Instead, it's Fortnite with a skateboard.
01:58:14.000 Not kidding.
01:58:14.000 Purple, pastel, yellow, girls running around, giggling and saying, like, let's have fun and play.
01:58:20.000 And it's not a sporting event.
01:58:22.000 So I said this.
01:58:23.000 Would you guys want to watch football?
01:58:25.000 Which football game do you want to watch?
01:58:26.000 Where the guy's going, he's got the pass.
01:58:28.000 He's going, touchdown, let's go.
01:58:30.000 And then everyone's screaming.
01:58:32.000 Or do you want to watch where the ball's about to be thrown?
01:58:34.000 And they're going like, wow, it's really cool.
01:58:36.000 No, wait, let's make him wait because we're going to put up the purple flowers.
01:58:36.000 He's going to throw the ball.
01:58:40.000 If the announcers were like, oh, that was a really good throw, but he missed it.
01:58:44.000 Oh, a second down now.
01:58:46.000 Let's try, though.
01:58:47.000 I have to.
01:58:49.000 Let me find this.
01:58:51.000 That's the way football's going these days anyway.
01:58:53.000 I got to find this article.
01:58:55.000 Here we go.
01:58:58.000 We have this article from Vice.
01:59:00.000 It says, this horrifying app undresses a photo of any woman with a single click from June 26, 2019.
01:59:08.000 And this is when I was explaining to some of my friends at Vice where everything went wrong.
01:59:13.000 And we were talking about wokeness and how everything was getting cringe.
01:59:17.000 And I said, guys, you guys ran an article titled, This Horrifying App Undresses a Photo of Any Woman with a Single Click.
01:59:24.000 And they were like, Yeah.
01:59:25.000 And I said, Do you want to know why your company is bankrupt?
01:59:29.000 I said, Tell me what the title of that article would have been in 2011.
01:59:33.000 And they were like, What?
01:59:36.000 This awesome app will undress any woman.
01:59:39.000 Had the word sex in it, and it would have said awesome.
01:59:42.000 It would have been like or hilarious.
01:59:44.000 Instead, it became, it's like we used to have content that was made by the dude with the ripped jean jacket or like leather jacket who was just like, hey, lay off me, dude.
01:59:55.000 I'm just trying to like do my thing.
01:59:56.000 And then we got content made by the hall monitor being like, you're not allowed to stand here.
02:00:00.000 I like that you got Gavin McInnes on the show, you know, founder, one of the founders of Vice.
02:00:04.000 That guy is punk rock.
02:00:05.000 That's like, he is punk rock.
02:00:07.000 You want to know why.
02:00:08.000 And he left, and he's an edgy guy, and he does weird stuff.
02:00:13.000 Oh, he's a freak.
02:00:14.000 Love you, G. Let's grab it.
02:00:15.000 Let's grab.
02:00:16.000 He's a good dude.
02:00:17.000 He's a good dude.
02:00:17.000 I'm a big fan.
02:00:18.000 He's a good funny guy.
02:00:19.000 Here we go.
02:00:20.000 We'll grab a couple more.
02:00:21.000 We got Shidev, the Vedmex says, Seven Nations memo correlates with the Yinon plan, 1982, followed by A Clean Break: A New Strategy for Securing the Realm, '96.
02:00:31.000 The authors all worked for the Pentagon when the Seven Nations memo was given.
02:00:35.000 It was made for Netanyahu.
02:00:36.000 Read more.
02:00:37.000 Well, how about that?
02:00:39.000 Yeah, the Clean Break memo.
02:00:40.000 Yeah.
02:00:41.000 What do we have here?
02:00:42.000 We've got the game says, I had my third kid, Tim.
02:00:45.000 We are winning.
02:00:46.000 First time, long time on donation.
02:00:48.000 I've been watching since you went on Rogan.
02:00:50.000 Appreciate it, brother.
02:00:51.000 Congratulations.
02:00:53.000 Third kid.
02:00:54.000 Wow.
02:00:55.000 Hey, yo.
02:00:57.000 Meg, Mega Bobson says, Tim, your live viewers are dropping.
02:01:00.000 Used to be like 40K, now it's 15K.
02:01:03.000 Why do you think that is?
02:01:03.000 Maybe multiple nights of you not being here.
02:01:06.000 First and foremost, yes, of course, when I get sick and I can't be here, that will have an impact on the algorithm.
02:01:12.000 If people don't watch because I'm not here, YouTube won't recommend it tomorrow.
02:01:15.000 That being said, we're also in a political offseason, and our viewership is actually slightly higher than it was for a comparable period four years ago.
02:01:24.000 Also, I know the memo didn't get out to a lot of people, but we simulcast on Rumble, which had 20,000 concurrents.
02:01:33.000 And we also do promos and the after-show there.
02:01:36.000 So, yeah, a year ago, we would get 40K on YouTube.
02:01:40.000 We then did a deal with Rumble and now simulcast.
02:01:44.000 And now we average around 47 to 50K between both platforms every single night.
02:01:49.000 And so the funny thing is, people who aren't fans of the show and don't pay attention or don't watch don't understand that.
02:01:54.000 And they're just like, wow, where is everybody?
02:01:56.000 Well, you know, a lot of them went to Rumble because they didn't like what YouTube was doing.
02:01:58.000 More importantly, Monday through Thursday, when the show on YouTube ends, we all go to Rumble for the uncensored portion.
02:02:04.000 So a lot of people who watch on YouTube slowly just migrate to Rumble, which is kind of the point of doing the deal with Rumble.
02:02:11.000 We don't want to cut off the people who like watching on YouTube, but we want everyone to watch on Rumble.
02:02:16.000 That was the point.
02:02:17.000 So here's how it works.
02:02:19.000 Take 2021, which was the offseason after the 2020 election: in 2020, we were averaging literally like 1.6 million views per night per episode.
02:02:30.000 And our concurrents were like 70, 80K per night through the election season.
02:02:34.000 And then four months later, we were doing 27,000 concurrents on average, because we're a news and politics show in a not-news, not-politics era.
02:02:42.000 It's not until the middle of the midterm year when politics and money starts getting pumped in, the media starts picking up these stories, interest starts returning in the political space.
02:02:52.000 So we're just now coming off of this.
02:02:54.000 And comparable to the four years prior, we've actually been doing about 10 to 12,000 more viewers on average.
02:03:01.000 So we track all the growth for all the channels.
02:03:03.000 We've also made some changes and we're going to be making some changes that I think will make the show a bit more evergreen moving forward.
02:03:09.000 That is, the times, they are a-changing.
02:03:11.000 And if you don't adapt, you die.
02:03:13.000 One thing that we've noticed, which is very plainly obvious, is that it's becoming more and more difficult to book in-person guests because let's just be real.
02:03:21.000 If we hit up a guy with a million followers who has come on the show before and say, fly out to us, we're in DC.
02:03:28.000 It's going to be a day trip followed by an hour car ride, a hotel stay, then you can come on the show and leave.
02:03:32.000 They go, oh man, they used to say, yeah, let's do it.
02:03:36.000 This will be amazing.
02:03:37.000 Now they say, I got to be honest, like, I'm going to Zoom on Megan Kelly instead because she's also got a big show.
02:03:43.000 So we can't compete with all the shows that are doing Zoom guests.
02:03:48.000 Our network for in-person guests largely are DC based.
02:03:52.000 And within a couple hours, these are where most of the guests are like, I got no problem driving down.
02:03:57.000 But getting people to fly out is becoming ever more difficult.
02:04:01.000 As time goes on, more shows are doing more interviews, and it's just impossible to have the in-person conversation.
02:04:07.000 So if we want to get bigger guests to engage in the conversation, they're almost exclusively always saying Zoom only.
02:04:16.000 I got like big prominent lefties who have even been like, come on the show, they go, can you do Zoom?
02:04:20.000 We go, we don't have Zoom set up.
02:04:22.000 We do it in person.
02:04:22.000 And they're like, I've got a million followers.
02:04:25.000 I stream every day.
02:04:26.000 I make millions of dollars.
02:04:27.000 I'm not flying out to your studio.
02:04:29.000 That's basically the response we get.
02:04:30.000 So we started, we've built out the mechanism by which now we can have people on the show via Zoom for the duration of the show.
02:04:37.000 We haven't done a test yet.
02:04:39.000 And we're actually going to be doing, I shouldn't say this, but I'm going to say it anyway.
02:04:42.000 We're going to be doing essentially casting for a new permanent panelist to be on the show.
02:04:49.000 Because if we are going to have Zoom guests, they likely will be a half an hour to an hour long, in which case, the four seats will be held by the in-person crew.
02:04:58.000 And so we have some individuals that we're talking to to become permanent panelists on the show, starting with maybe like three days a week, then finally five days a week, and eventually it may go to full digital guests.
02:05:12.000 As for any big names that want to come on the show, they're always welcome to because we have the fifth seat available.
02:05:17.000 But again, like, I'm going to tell you this.
02:05:19.000 I don't want to name drop anybody because I don't want to be insulting to anybody.
02:05:22.000 But there are Hollywood A-list celebrities in big blockbuster movies that are pro-Trump that are like, I just can't fly from LA.
02:05:30.000 Can you do digital?
02:05:31.000 And we say no to these people.
02:05:33.000 And so there have been people who have been like, Tim, your guests are just not good anymore.
02:05:36.000 And it's like, because we were purists and wanted to do in-person only.
02:05:40.000 Well, if we can get Brad Pitt, but it's only over Zoom, we decided, okay, this is the point where we can't compete.
02:05:46.000 A lot of other shows are all getting massive guests.
02:05:48.000 Piers Morgan, you know, it's a different kind of show, but he gets a lot.
02:05:51.000 He gets a ton of good debates and guests because people just don't want to travel anymore.
02:05:55.000 Just the way it is.
02:05:56.000 So adapt or die.
02:05:57.000 And that's the plan moving forward.
02:05:59.000 We're going to get out of here.
02:06:01.000 Nobody wants to hear a song.
02:06:02.000 So I do appreciate it.
02:06:03.000 12 of them did.
02:06:04.000 12 of them did.
02:06:05.000 So we're going to get out of here, though.
02:06:06.000 And it's been a blast.
02:06:08.000 We're back, of course, next week.
02:06:10.000 And we got a big, big, big show coming up on Monday.
02:06:14.000 The man himself, Brandon Herrera.
02:06:17.000 Very excited for this.
02:06:18.000 And then we've got massive guests all next week.
02:06:20.000 We've got some celebrities coming on.
02:06:22.000 No joke, like there's going to be big stuff.
02:06:26.000 Big stuff.
02:06:27.000 And big collabs are coming for the next couple of weeks.
02:06:29.000 We're going to have a blast.
02:06:30.000 And then big changes, big changes.
02:06:32.000 Another thing I'll add about the Zoom guests, the only reason we've been dealing with the thing of people being like, oh, I don't want to travel anymore.
02:06:39.000 And we're like, well, you know what?
02:06:41.000 Then who cares?
02:06:43.000 The other consideration is when we were talking about this, the team's then like, you know, it costs us $40,000 per month to fly guests out here.
02:06:53.000 And if we did Zoom, we would save that and spend it on security.
02:06:57.000 And we were like, that's kind of the straw that broke the camel's back.
02:07:02.000 We need to find that money to keep going, because security has to go up.
02:07:05.000 We recently had another significant death threat.
02:07:07.000 So a person published a video, apparently with insider information somehow, threatening to murder me.
02:07:14.000 And it's considered credible, forwarded to the FBI.
02:07:18.000 And so all of that stuff's going down right now.
02:07:20.000 So we're constantly having to up our security and things like this.
02:07:22.000 And so we're like, well, you know what?
02:07:25.000 Maybe it's time we actually started doing some digital guests because otherwise we're just not going to get the big names anymore.
02:07:31.000 They're just doing Zoom.
02:07:32.000 And additionally, it's expensive.
02:07:34.000 So that being said, smash the like button, share the show.
02:07:36.000 We're back with clips throughout the weekend.
02:07:38.000 We're back on Monday with a massive, amazing show.
02:07:40.000 You can follow me on X and Instagram at TimCast.
02:07:43.000 Vish, do you want to shout anything out?
02:07:45.000 Just follow me on X at VishBurra and TikTok at VishBurra.
02:07:49.000 Thanks for having me, Tim.
02:07:50.000 Man, I'm so grateful to this life.
02:07:54.000 Thank you for having me be part of it.
02:07:56.000 It's really awesome.
02:07:57.000 I hope we can make the world better.
02:07:58.000 I know we can.
02:07:59.000 So we'll do our best.
02:08:01.000 You do your best.
02:08:01.000 We'll meet up at the end.
02:08:03.000 See you.
02:08:05.000 I am Phil that remains on Twix.
02:08:06.000 You can check me out on Patreon.
02:08:08.000 It's Phil That Remains on Patreon.
02:08:10.000 The band is all that remains.
02:08:11.000 You can follow the band on Apple Music, Amazon Music, Pandora, Spotify, YouTube, and Deezer.
02:08:16.000 We're going on a tour this summer or this spring.
02:08:19.000 We're going out with Born of Osiris and Dead Eyes.
02:08:21.000 You can check out or you can get tickets at allthatremainsonline.com.
02:08:25.000 Don't forget, the left lane is for crime, Carter.
02:08:27.000 Carter Banks here.
02:08:28.000 You can follow me over at Carter Banks.
02:08:30.000 Vish, thanks for coming on.
02:08:31.000 It's been a pleasure.
02:08:32.000 Ian, I second what you said to the ether.
02:08:36.000 And yeah, it's going to be really cool next couple of weeks.
02:08:40.000 A lot of good stuff, Tim.
02:08:42.000 Thanks for hanging out, everybody.
02:08:43.000 We're going to, I don't know why Ian's dressed up, but we'll figure out something to do that fits his attire.
02:08:49.000 We're going to go find a party.