Timcast IRL - Tim Pool - March 15, 2023


Timcast IRL - FIFTH BANK COLLAPSING, SF Bank Plans Sale As Credit Suisse FAILING w/ Benny Johnson


Episode Stats

Length

2 hours

Words per Minute

204.2

Word Count

24,573

Sentence Count

2,092

Misogynist Sentences

26

Hate Speech Sentences

37


Summary

On this week's show, we discuss the crisis at Credit Suisse, the First Republic bank collapse, and the dangers of AI. We also hear from Benny Johnson, the man who did more for East Palestine, Ohio than Joe Biden.


Transcript

00:00:00.000 So this morning we got news that Credit Suisse was in serious trouble.
00:00:23.000 This is a European bank, and apparently they got no money.
00:00:27.000 Many people were predicting this would be the next big bank to fail, and it resulted in trading halts in Europe.
00:00:33.000 Now, we were trying to figure out what to talk about for the show, because, like, well, do we want to talk about the banking thing again?
00:00:38.000 Because that's, I mean, it's been big, but we've talked about Credit Suisse.
00:00:41.000 And then, uh...
00:00:43.000 About ten minutes before this show is set to go live, we get breaking news from Bloomberg that another San Francisco-based bank is seeking an exit strategy because it was, what did they say, cut to shreds or something?
00:00:54.000 Just destroyed?
00:00:56.000 And, uh, okay.
00:00:58.000 That marks the fifth bank that is either collapsed or facing collapse.
00:01:03.000 Now, so far there have been three major collapses, the second and third largest in history back-to-back.
00:01:08.000 Credit Suisse is a very, very large international bank, and if this one goes down, it is very bad news.
00:01:15.000 And then we got First Republic, which the news is just breaking, so we'll talk about that.
00:01:18.000 We got a bunch of other news, too.
00:01:20.000 I was torn as to whether or not to lead with this, because James O'Keefe is going to call in to the show, quite literally call in, he's actually going to call Benny Johnson, and Benny Johnson's going to hold his phone up to the microphone, because we don't actually have a way to do real call-ins on this show.
00:01:34.000 And that's intentional, but for James, launching his new O'Keefe Media Group, we decided to figure out a way to make an exception, so he'll literally just call Benny on the phone, we're going to hold the phone up to the microphone.
00:01:46.000 And then we have to talk about AI.
00:01:48.000 Because the new ChatGPT apparently has been granted access to its own code, to execute code.
00:01:53.000 It's been given money and unleashed.
00:01:57.000 This is the craziest thing I've ever heard.
00:01:59.000 Microsoft apparently, I'm not sure if it's Microsoft or whoever, whatever the parent company is, got rid of their AI ethics team.
00:02:05.000 And now people are kind of worried that they've created this soulless entity, this domino effect, and they're unleashing it.
00:02:13.000 Is this going to, you know, destroy the world and all that?
00:02:16.000 So, you know, we'll talk about it.
00:02:17.000 Before we get started, my friends, head over to TimCast.com.
00:02:21.000 Click join us to become a member and support our work directly.
00:02:25.000 We are a member-driven company.
00:02:27.000 That means we only exist because, I should say, almost entirely we exist just because you guys are members.
00:02:33.000 So when you sign up, you ensure that we can keep doing this show.
00:02:37.000 I can do my morning show.
00:02:38.000 Ad rates don't cut it, never really did.
00:02:41.000 And if it was not for you as members, we wouldn't be here.
00:02:43.000 So if you wanna see this, keep it on, become a member.
00:02:46.000 But also, we have a newsletter now.
00:02:49.000 Go to TimCast.com, you can click to sign up to our newsletter, and you will get a weekly email showing you the top stories of the week.
00:02:56.000 It will show you the top guests and members-only content, as well as the other cultural stuff that we are making behind the scenes.
00:03:02.000 So don't forget to smash that like button, subscribe to this channel, share the show with your friends.
00:03:06.000 As I already mentioned, joining us tonight is a man who did more for East Palestine, Ohio, than Joe Biden, Mr. Benny Johnson.
00:03:13.000 I'm really sad that that's true, but it is absolutely, verifiably, empirically correct.
00:03:18.000 And you did a lot for a single American individual, but in the grand scheme of things, giving $20,000 out is like, as low as it can be.
00:03:26.000 You know, so like, for you as a single person, that's a huge, that's huge.
00:03:30.000 For Joe Biden, the federal government, it is sad.
00:03:32.000 They did nothing at all.
00:03:34.000 Yeah, this is like, um, not even Nancy Pelosi's plastic surgery bill for a Wednesday.
00:03:40.000 What was, uh, what'd you use the money for?
00:03:41.000 What'd they use the money for?
00:03:43.000 I'm not sure.
00:03:44.000 Did you just say it was like a gift?
00:03:45.000 You were able to gift people?
00:03:46.000 It was cash.
00:03:47.000 Do you have a foundation that does it?
00:03:50.000 We just took our profits for the month and we went up to East Palestine and we handed those people an envelope with a thousand dollars in it.
00:04:00.000 We went to Google Maps and we found where the crash site was.
00:04:04.000 Then we went to the homes that were closest to the crash site.
00:04:07.000 I wish I could have given money to everyone.
00:04:09.000 This was three days after the crash, and no one was helping these people.
00:04:14.000 If this had been in Philadelphia?
00:04:17.000 I'm sorry, if it had been in Kiev.
00:04:19.000 If this had been in Kiev?
00:04:21.000 You know, actually, our president went and traveled to the place that is as far away from East Palestine as you can possibly get, actually.
00:04:31.000 So, while we were heading up to East Palestine, Joe Biden went as far on a map as you can actually get, which is Kiev, Ukraine, from East Palestine, the polar opposite of the world, to prove that America Last is the guiding principle of Joe Biden's administration.
00:04:48.000 Ukraine first.
00:04:49.000 Yeah, that's right.
00:04:49.000 Thanks for doing that, dude.
00:04:52.000 We'll talk about that, too, and then we'll talk about what the federal government will do for Americans when the banks all collapse, which is probably nothing, but thanks for hanging out, Benny.
00:04:59.000 It's gonna be fun.
00:05:00.000 What's up, guys?
00:05:00.000 And then we're gonna have James call your phone, so that'll be in about 25 minutes or so.
00:05:04.000 Yep.
00:05:05.000 Right now we got Phil Labonte hanging out.
00:05:06.000 Hello, I am Phil Labonte, lead singer of All That Remains, anti-communist and counter-revolutionary.
00:05:12.000 I'm also Ian Cross, I'm coming at you hot.
00:05:14.000 What's up?
00:05:15.000 And I am Serge.com.
00:05:17.000 Let's roll.
00:05:18.000 So let's jump into this breaking news that actually just broke right before the show started.
00:05:23.000 We have this from Bloomberg Markets just in First Republic Bank.
00:05:26.000 The San Francisco-based lender that was cut to junk by S&P and Fitch is exploring strategic options, including a sale.
00:05:35.000 Let me break that down for you.
00:05:37.000 A sale means they're about to fall apart.
00:05:39.000 What's happening right now with Silicon Valley Bank and this major collapse is that the federal government is stepping in and trying to sell it off.
00:05:46.000 They've insured all deposits.
00:05:48.000 SVB apparently put out a statement where they're like, it's the safest place to have your money because the government has insured all deposits with no caps.
00:05:55.000 So it's like funny money, monopoly money.
00:05:58.000 Yo, if this is how they're viewing the system, there ain't no system.
00:06:02.000 There is no system.
00:06:02.000 So we were going to talk about this other story.
00:06:05.000 Look at this one.
00:06:06.000 Dow closes more than 250 points lower as bank crisis spreads to Europe.
00:06:12.000 Live updates.
00:06:13.000 This is from CNBC.
00:06:14.000 And they're talking about Credit Suisse.
00:06:17.000 According to one of their main investor guys or main companies, there's no money left.
00:06:24.000 There's no money.
00:06:24.000 And the Saudis, which apparently are principal investors, saying they can't give any more because of regulations.
00:06:30.000 So it's looking like Credit Suisse, which is a very large international bank, is on the verge of collapsing.
00:06:35.000 This was already predicted.
00:06:36.000 The bond market apparently is doing very, very poorly.
00:06:40.000 And we've talked about that in the past week.
00:06:42.000 So as we're talking about this, right before the show starts, Phil's like, hey Tim, First Republic Bank is trying to sell now, it's another SF Bank, so this is going to be the fifth bank.
00:06:53.000 We've got Silvergate was the first.
00:06:55.000 Most people didn't know or didn't care because it wasn't that big, but Silvergate collapses.
00:06:58.000 Then, what happened was because of that, Silicon Valley Bank put out this statement about raising money and it caused a panic.
00:07:05.000 Everybody said, is this going to be like Silvergate?
00:07:07.000 I don't want to lose my money, so there's a run on the bank.
00:07:09.000 Silicon Valley Bank then falls apart.
00:07:11.000 Then Signature Bank falls apart.
00:07:13.000 Some people are trying to claim that that bank was better, but I don't know.
00:07:15.000 All I know is that's three banks that fell, and now we have Credit Suisse and First Republic on the chopping block.
00:07:21.000 So I don't know.
00:07:22.000 I'm going to defer to Max Keiser, economist.
00:07:25.000 Did he say the end is nigh?
00:07:26.000 He said, he tweeted out about four hours ago, a global bank reset is coming.
00:07:30.000 All depositors will have their deposits protected.
00:07:32.000 This is Max Keiser.
00:07:34.000 All your deposits will be protected, he's saying.
00:07:36.000 He says the U.S. dollar and other fiat money will be swapped for a central bank digital currency, and depositors will be given a bonus amount of the new currency when the switch is made.
00:07:44.000 You have to make the deal sweet, too.
00:07:46.000 Like a quick version of the Euro switch, he concludes.
00:07:48.000 Terrifying, but I see the plan.
00:07:50.000 I mean, he knows, he's seen this for about a decade in the making.
00:07:53.000 He's in El Salvador, you know, hanging out with the president and pushing Bitcoin as the national currency.
00:07:58.000 I like him a lot, but the thing about permabears is permabears are permabears.
00:08:03.000 So when bad times come, they look like they're geniuses because they're permabears.
00:08:08.000 You know, it's like... they're always... Permanently bearish?
00:08:11.000 Yeah, permanently bearish.
00:08:13.000 He's permanently bearish on fiat currency.
00:08:16.000 He's permanently bearish on... He's right!
00:08:20.000 I am also against the structure of our fiat system currently, I agree.
00:08:30.000 Can I just say, if I had listened to Max Keiser like 10 years ago, I'd be a very happy person living on my own private island.
00:08:35.000 Because he was like, Tim, I'm telling you, you've got to buy Bitcoin right now.
00:08:39.000 And I was like, I got a little bit.
00:08:40.000 I'm not worried about it.
00:08:41.000 You've got to buy more.
00:08:42.000 Listen, Tim is fibbing to you because Tim could live on his own island right now if he wanted to.
00:08:47.000 That's not true.
00:08:47.000 That's not true.
00:08:48.000 Max Keiser.
00:08:49.000 This is his own island.
00:08:51.000 This is kind of like having your own island.
00:08:53.000 It's amazing coming out here.
00:08:54.000 It is a mountaintop.
00:08:55.000 Yeah, to be fair.
00:08:56.000 I was standing out with the chickens before the show.
00:08:58.000 I'm standing out with the chickens, I'm looking around.
00:09:01.000 I'm not like the only person on your own island.
00:09:03.000 I mean, there's like people who live close by.
00:09:06.000 And it's not like mountains in the Rockies, you know, it's the Smokies, it's the Blue Ridge, you know, beautiful.
00:09:12.000 It's a bunch of dirt.
00:09:13.000 Yeah, no minerals in it.
00:09:14.000 Nothing like that.
00:09:15.000 You have your own harem of chickens.
00:09:17.000 Well, that's Roberto Jr.'s harem.
00:09:19.000 He's a very busy man.
00:09:20.000 But in all seriousness, I have the tweet from Max Keiser that Ian was referencing, where he says a global bank reset is coming.
00:09:29.000 And a lot of people think this is to bring about a central bank digital currency, which is going to be the government's crypto, which they control.
00:09:36.000 And the reason they'll do this is because they can track every single purchase.
00:09:41.000 Everything you ever do, they will track.
00:09:42.000 Like before the show, we were talking about this, and it's like if you want to go buy gasoline on Thursday, but that day is not good for ESG, they're going to say, you know what, your central bank coin doesn't work at the gas station today.
00:09:53.000 That kind of control.
00:09:53.000 Take a bite of your guns.
00:09:54.000 Take a bite of your gun purchases.
00:09:56.000 That's right.
00:09:56.000 They're already, I mean, they're beta testing this with the credit cards.
00:10:00.000 Yeah.
00:10:01.000 Oh yeah, Discover, I think.
00:10:02.000 Well, credit cards was the first step towards digital currency.
00:10:05.000 Now you're swiping a card.
00:10:07.000 It's funny because futuristic sci-fi and video games always had this.
00:10:11.000 We've envisioned this.
00:10:12.000 Play a video game and it's like a guy in the future goes, how many, how many, you know, union credits.
00:10:16.000 And that's what they would call it, union credits.
00:10:19.000 And then they would just have like a wristwatch and they would beep.
00:10:21.000 And that's where we are!
00:10:22.000 In a way, it's more secure because it can't be stolen from your pocket, but in the other way, it's completely insecure because someone else... Well, they can turn it off!
00:10:30.000 Literally, it's programmable money.
00:10:32.000 They're going to be able to say, this money is for this, and then you have other money that you can use for other things.
00:10:39.000 So there will be luxury money, like you'll have money for leisure time that you're allowed to spend at the movies, or downloading stuff, or whatever.
00:10:46.000 Then you'll have like necessities money, etc.
00:10:49.000 And different, like if you have a low ESG score, you won't get a lot of credits for luxury and leisure.
00:10:56.000 Those will be the ones to entice people.
00:10:58.000 Remember in, what was it, The Fifth Element, the cigarettes that they were giving out?
00:11:04.000 They were mostly filter, just a little bit of cigarette on them.
00:11:08.000 That kind of control is the future if we get a central bank digital currency.
00:11:13.000 It's now.
00:11:13.000 We've already seen in like Denver, where they were like, you can't turn on your air conditioning and the AC locked.
00:11:19.000 Or in, I think California, they were like, you can't charge your electric car.
00:11:23.000 What about South Africa?
00:11:23.000 Where they're like, you have no right to electricity.
00:11:26.000 You thought you had electricity rights in South Africa?
00:11:29.000 Nope.
00:11:30.000 Government doesn't have to constitutionally provide you electricity.
00:11:32.000 Figure out your own, build your own electricity.
00:11:35.000 Thomas Edison.
00:11:36.000 I'm actually thinking about that.
00:11:38.000 because we have a small creek on the property.
00:11:39.000 And I'm like, can we trench this and then put a water wheel in there?
00:11:43.000 Yeah, generate some power.
00:11:44.000 And he says he doesn't live in his own island.
00:11:46.000 Come on.
00:11:47.000 It's a big plot of land in West Virginia.
00:11:49.000 You'll want to get the batteries to store it.
00:11:52.000 That's really the storage is the thing.
00:11:54.000 And then what happens is overnight, it charges a decent amount.
00:11:58.000 You'll probably drain all of it very quickly, because a water wheel is not gonna... it's not a fast-moving river.
00:12:04.000 Yeah, I mean, in Florida, you get these guys coming up, knocking on your door all the time.
00:12:07.000 I live in Tampa, and they're like, yo, look at your house, it's great!
00:12:11.000 Can we put solar panels up on the top, and we'll give you this much money?
00:12:14.000 Because the power companies pay us for your solar panels.
00:12:17.000 Have you done it?
00:12:18.000 I have not done it.
00:12:19.000 I think it's a really good idea for you to do that.
00:12:21.000 I would rather do it for myself.
00:12:23.000 Right.
00:12:23.000 Right?
00:12:23.000 So I have a generator.
00:12:25.000 So I have an LP, like a generator, right, that runs on gas.
00:12:29.000 LP, yeah.
00:12:29.000 And so when the hurricane hit, my generator kicked on.
00:12:32.000 So awesome.
00:12:33.000 Uh, and, but what if someone cuts off the gas line?
00:12:36.000 They can't cut off the sun, I don't think.
00:12:38.000 Get a tank.
00:12:38.000 So maybe solar is the way.
00:12:39.000 Get tanks.
00:12:40.000 Get LP tanks.
00:12:41.000 I got a thousand gallon LP tank and I'm gonna get two more.
00:12:44.000 And I need to get, uh, someone in New England that can install house batteries, like Tesla house batteries.
00:12:49.000 I can't find anyone.
00:12:50.000 Anyone out there in New England that installs house batteries in southern New England?
00:12:54.000 Hit me up.
00:12:54.000 Where should they hit you up?
00:12:55.000 It took us a long time to get that.
00:12:56.000 On my Twitter.
00:12:57.000 We went with a different company.
00:12:58.000 We tried doing Tesla at first, and then it just took too long.
00:13:01.000 It was a disaster.
00:13:02.000 A year later, they showed up with the stuff, and we were like, dude, we just don't want it.
00:13:04.000 We're going with someone else.
00:13:05.000 I got the solar now, and I like it.
00:13:08.000 I put a new roof on last year, and they came.
00:13:10.000 They took the solar panels off, so I could put the new roof on.
00:13:13.000 No problem, no issues.
00:13:14.000 But I can't find someone that'll actually sell me it.
00:13:15.000 Do you like your solar panels?
00:13:16.000 I do, yeah.
00:13:17.000 Yeah, I do.
00:13:18.000 I really want to get the... Do you get to keep the energy, or do you have to sell part of it to the power company?
00:13:22.000 Right now I have to sell it back because I don't have the batteries because it's New England and there's not a lot of sun.
00:13:26.000 But you get like credits.
00:13:27.000 Yep.
00:13:27.000 Social credits.
00:13:29.000 So here, let me show you this.
00:13:31.000 It's the basis for it.
00:13:33.000 Let me show you this real quick.
00:13:33.000 This is from The Motley Fool.
00:13:35.000 Why are Wells Fargo and Citigroup falling today?
00:13:38.000 And I don't really care to get into the details.
00:13:40.000 I just want to then show you this story.
00:13:42.000 This is from Seeking Alpha.
00:13:44.000 Wells Fargo, no crisis here.
00:13:47.000 Well, okay.
00:13:48.000 So, you get to decide for yourselves whether or not you think the end is nigh.
00:13:53.000 Because, uh, I'll put it this way.
00:13:55.000 Right before the show starts to get breaking news, another bank is in serious trouble.
00:13:59.000 And you're gonna hear from the Jim Cramers and from the Bidens.
00:14:02.000 They're all gonna say, everything is fine, your deposits are safe, don't worry about it.
00:14:08.000 Maybe.
00:14:09.000 Maybe.
00:14:10.000 But I don't know how much I trust these guys, and that's the problem, because I feel like when they come out and say, don't worry, everything's fine, they're actually having the inverse impact they think they are.
00:14:20.000 Everyone's gonna hear that and be like, run for the hills.
00:14:23.000 Get your money out of the banks.
00:14:26.000 People don't like hearing banks being like, it's all fine!
00:14:30.000 Nobody, don't pay attention to the little man behind the curtain.
00:14:33.000 This is it: four of America's biggest banks lost a combined $55 billion in value on Thursday.
00:14:39.000 That's today.
00:14:40.000 No, today's Wednesday.
00:14:42.000 That's last week.
00:14:43.000 Last week, geez.
00:14:46.000 Well, don't trust anyone to live your life for you.
00:14:49.000 That's some advice.
00:14:50.000 Take care of yourself and use your brain.
00:14:52.000 I imagine this is like you were talking about it the other day, Tim.
00:14:55.000 I imagine this is not the end of the problems that we've got that are coming.
00:15:00.000 I think that these bank failures here are probably going to lead to more stress in the system.
00:15:07.000 I don't... I'm not predicting, you know, a crash because I have no way of predicting the future at all and I'm not, you know, I'm not an economics guy.
00:15:15.000 But it's not a bad idea to get yourself squared away so that way you have, you know, some necessities if you're able.
00:15:23.000 The scary thing is that...
00:15:25.000 The scariest thing is that people who want the system to fail will intentionally go out and take all their money out of the banks.
00:15:31.000 I mean, could you imagine people who are upset with the status quo and, say, the two-party system and establishment politics, intentionally taking all their money out of banks because they want those banks to fail, which would cause the system to collapse?
00:15:42.000 Who could imagine doing something like that?
00:15:44.000 So this is what Vivek Ramaswamy said.
00:15:46.000 He's a very smart guy.
00:15:47.000 He's running for president.
00:15:48.000 He was on the show two days ago.
00:15:49.000 And he's like, no, no, no.
00:15:51.000 These hedge fund guys wanted to get their money.
00:15:54.000 And they wanted a bailout, and they didn't like their money being locked up in a 2% bond, and so they actually staged a run on the bank.
00:16:01.000 He's like, this is the untold story, that these guys are so greedy that they needed the extra couple points, they were locked up in a 10-year treasury, they needed that out, and the easiest way to get that out was to collapse the bank.
00:16:12.000 That's how evil these people are.
00:16:14.000 So these hedge fund guys, these bros got together, finance bros got together, and they're like, how do we stage a run on the bank?
00:16:19.000 You tweet about it and I'll tweet about it.
00:16:21.000 Maybe we can get Tyson to take his money out.
00:16:26.000 And then we'll start the run on the bank.
00:16:29.000 And then the federal government will have no choice.
00:16:31.000 Because they're already on fragile ground.
00:16:33.000 The ice is breaking beneath all of us.
00:16:35.000 And so they'll come in and they'll insure us all and we'll just get our money.
00:16:37.000 We'll just be able to do one little deposit.
00:16:39.000 And get our money back.
00:16:40.000 And the government will literally trip over itself to print the money because of the situation we're in, where the rest of the economy is so delicate, and they're trying, you know, raising the interest rates and stuff, the rest of the economy is dying to go into recession, just begging for an excuse to go into recession.
00:16:57.000 And they're just like, F them!
00:16:59.000 F the country!
00:17:00.000 Yo, let me just drop something right here, that I have, this is verified, this is true, there's huge finance for China, okay?
00:17:07.000 In Silicon Valley Bank.
00:17:08.000 The bio-research for China.
00:17:11.000 Silicon Valley Bank funded it to the tune of billions, right?
00:17:14.000 Ton of Chinese companies.
00:17:16.000 Chinese companies use Silicon Valley Bank as a really nice bridge, right, between the two nations.
00:17:20.000 What about China using fifth generation warfare without firing a shot can collapse our financial system just by yanking out all their money?
00:17:26.000 Dang.
00:17:27.000 What if the Chinese just said, okay, now all the money comes to us?
00:17:30.000 So crash the banks just by withdrawing.
00:17:32.000 Maybe that's why I haven't pulled my money out.
00:17:34.000 The easiest thing you've ever done.
00:17:35.000 You can collapse a nation just by going, ka-ching!
00:17:37.000 You know, when I heard about this first bank collapse last week, I guess it was.
00:17:41.000 This is the Silicon Valley bank.
00:17:43.000 My first, I had this feeling like, get your money out.
00:17:46.000 This wave of feeling.
00:17:47.000 Without even words or anything, it was just like, get it out.
00:17:49.000 And then I just, it passed over me and I was like, No.
00:17:53.000 No.
00:17:54.000 If it disappears, it disappears.
00:17:56.000 That's because I feel like it is.
00:17:58.000 Someone is gaming this.
00:17:59.000 It does feel like that.
00:18:01.000 It feels like this was an intentional... It's not real panic.
00:18:04.000 That's a little more passive than the average person is probably prepared to be with their life savings.
00:18:11.000 Maybe they want you to pull the money out to collapse the system so they can then introduce a command economy and central bank digital currency.
00:18:17.000 That is completely and totally reasonable in my opinion.
00:18:21.000 Now, all the banks fall, the federal government comes out and says, don't worry, all those banks have failed, but your money has been automatically converted by the FDIC into FedCoin, so your money is still available to you in digital form.
00:18:36.000 Download this app if you want to get access to it.
00:18:38.000 Hmm.
00:18:39.000 Just like that, right?
00:18:41.000 Then they'll say, we've cataloged your accounts by social security number, log in to federalgovernment.coin or whatever, and type in your social, and your address, and your name, and your phone number, and then all of your accounts have consolidated your FedCoin, your USD coin, into one place or whatever they want to call it, and then you can spend it using this app.
00:19:00.000 There's already, and people that are unfamiliar with the crypto world may not be aware, there's already what they call stablecoins that are tethered to the value of the dollar.
00:19:10.000 There's one called Tether that is designed specifically...
00:19:15.000 Sorry, just as you're talking, CNBC drops a breaking, Japan's Topix drops 2%, Asia-Pacific markets fall as Credit Suisse adds to banking fear.
00:19:25.000 I'm just sitting here like, I think it's worse than we realize.
00:19:30.000 I mean, it could be.
00:19:33.000 God damn it.
00:19:35.000 Okay, look, you should have bought guns in 2020 when the summer of love kicked off.
00:19:43.000 But anyways, like I was saying, there are cryptos in existence that are already pegged to the value of the dollar.
00:19:54.000 So there's already a basic infrastructure to have a dollar coin, right?
00:19:59.000 A dollar crypto that the Fed makes.
00:20:03.000 It's a simple transition for them.
00:20:05.000 It is not hard at all.
00:20:07.000 The crypto wallets, they can design those and put those into, make them for your phone or whatever.
00:20:13.000 And I mean, it's not a far step away.
00:20:16.000 Let's jump to this story from Daily Mail.
00:20:19.000 Woke Silicon Valley Bank donated over $73 million to Black Lives Matter-related social justice groups before it collapsed, while failed Signature Bank gave $850,000.
00:20:29.000 So that's where your money went.
00:20:32.000 If you're wondering why Silicon Valley Bank failed, and where's your money at, there it is.
00:20:36.000 You can ask Black Lives Matter where your money went.
00:20:38.000 They gave their money to ESG, they bankrupted it, and then they just bail it out with taxpayer money.
00:20:44.000 It's the most corrupt... If this is recorded in history books, which I hope it will be, it'll be seen as the most corrupt financial scandal in US history.
00:20:52.000 When the communist takeover happens in the United States, and lasts for a hundred years or whatever, after it falls, the people who survive will write about how they took over by using private institutions to fund far-left extremism, and then having taxpayer dollars bail out those organizations.
00:21:08.000 Think about this.
00:21:09.000 A bank gives 73 million dollars to far-left extremism, collapses.
00:21:15.000 The taxpayer bails him out.
00:21:17.000 That's how you launder taxpayer money into ideological subversion.
00:21:21.000 Crafty stuff.
00:21:22.000 It's like taxation without representation.
00:21:25.000 I didn't ask to have my money sent towards that crap.
00:21:28.000 Well, what they would say is you didn't do your own research about where this company is doing business, because they probably put out a prospectus that had all of this stuff laid out.
00:21:42.000 But it's the taxpayer bailout that I didn't sign up for.
00:21:46.000 Oh, right.
00:21:47.000 Tough.
00:21:48.000 But it's only sort of taxpayer-funded.
00:21:52.000 They're gonna use the FDIC, which is paid into from banks, but they're supposed to use the money for the little guy and they're giving it to the billionaires and the millionaires.
00:22:00.000 So it is taking away from you, but in a different way.
00:22:04.000 And how it affects the taxpayer when they're like, it's not going to cost the taxpayer a dime, it will not, I repeat, it will not.
00:22:10.000 They're lying because what actually is going to happen is the financial damage from all of this is already rippling out to everyone.
00:22:17.000 The fact that stocks fell in these banks means retirement accounts are taking hits.
00:22:22.000 And that means older people are probably now looking at their budgets being like, I guess we don't eat this month.
00:22:26.000 So yes, it absolutely does hit everybody when they do this.
00:22:29.000 And that's their argument: well, we can't have the social contagion, we can't have the banking contagion spread around and destroy the economy, can we?
00:22:37.000 I say, yeah, we can.
00:22:38.000 Well, tough luck.
00:22:39.000 So, the average American household has $10,000 in cash in their reserves.
00:22:43.000 East Palestine, the median household income was $41,000 a year.
00:22:46.000 That's how normal Americans live.
00:22:48.000 Now, 95% of depositors had over the $250,000 maximum for federal insurance.
00:22:54.000 So you're talking the uber-rich of the uber-rich, based on the data for American solvency, right?
00:23:00.000 And liquidity.
00:23:01.000 So you're talking the richest of the rich.
00:23:03.000 This is a bailout of the richest of the rich.
00:23:06.000 And they're woke, too, to make it worse.
00:23:11.000 If they were rich and libertarian, I might be like, well... They're the laptop class.
00:23:14.000 They're the worst.
00:23:15.000 So put this in context here.
00:23:15.000 It took the government nigh on a month to lift a finger for East Palestine, where they nuked a bunch of little children and poisoned their water, and they had this locked up by Saturday.
00:23:31.000 Yellen had this done in an emergency meeting by Saturday.
00:23:33.000 Billions of dollars, bailing out the richest and the wokest of the woke, because they're their donors.
00:23:39.000 Joe Biden gets up at 9 in the morning for an emergency announcement, and Jen Psaki is like, he doesn't do this normally, he's a night owl.
00:23:45.000 And I'm like, this MF-er flew across the planet to give half a billion dollars to Ukraine, and he didn't lift a finger for East Palestine, but he wakes up at the crack of dawn to make sure he can assure all the woke Silicon Valley investors your money's safe.
00:23:57.000 What a piece of garbage.
00:23:59.000 But he's talking to the people who vote for him and the people who fund him, and that's why he does it.
00:24:03.000 The people of East Palestine voted for Trump, so he could give to... You're both right, but everybody here knows that to make the wealthy suffer means that the poor suffer more.
00:24:16.000 Every single time.
00:24:18.000 If you really want to hurt the wealthy, if you want... and this is part of why socialism doesn't work, right? Like, the idea is get the money from the wealthy, right?
00:24:28.000 But when you get the money from the wealthy, the people that don't have any money, they get smashed.
00:24:34.000 And it's literally, that's why communism and socialism doesn't work, because when you hurt the producers, the wealthy are always the ones that are producing.
00:24:41.000 Like, the reason they have money and stuff is because they're doing stuff.
00:24:46.000 So as much as the incentive is like, or the gut feeling is like, make them suffer, right?
00:24:50.000 Like, that's what you want.
00:24:52.000 You want them to pay.
00:24:53.000 But it's gonna hurt the poor people, and I'm not sure that it's worth it.
00:24:57.000 I don't, you know, I don't want it.
00:24:58.000 I don't want to suffer.
00:24:59.000 I don't want anyone to suffer.
00:25:01.000 First and foremost, the major problem here, the major problem here, first and foremost, is constitutional.
00:25:07.000 Okay?
00:25:08.000 So, Article 1 and 2 of the Constitution.
00:25:10.000 Congress controls the purse.
00:25:11.000 Congress passed a law to insure $250,000 worth of deposits.
00:25:15.000 That is a law by Congress.
00:23:16.000 How Joe Biden has the bloody cheek to think that he can just go through and rewrite this law and spend this money and turn on this faucet...
00:23:24.000 Trial balloon.
00:23:25.000 ...is beyond me.
00:25:26.000 Now, I tell you, they did it with the vaccine mandate.
00:25:29.000 They did it with student loans.
00:25:31.000 They did it with student loans.
00:25:32.000 The ATF.
00:25:32.000 Now he's getting smacked in the pee pee on both of those.
00:25:36.000 But here we go again.
00:25:37.000 He's rewriting the Constitution in real time.
00:25:39.000 The ATF passed a law without Congress.
00:25:42.000 They made an object illegal.
00:25:46.000 Braces for pistols.
00:25:48.000 Yes.
00:25:48.000 They're saying that you can go to jail, but they never actually had Congress pass a law banning such an object.
00:25:54.000 These are all trial balloons.
00:25:56.000 But we're still there, grains of sand being added to the heap, where the executive branch of the government says, we can make anything we want legal or illegal and no one will stop us.
00:26:06.000 So, it's not just the things you talked about, but you were correct.
00:26:09.000 Right?
00:26:10.000 The financial stuff, like the FDIC, him saying, I can just give the money if I want.
00:26:14.000 What are you going to do about it?
00:26:15.000 The fact that he can just take money and give it away.
00:26:17.000 He doesn't have those powers, but who will stop him?
00:26:20.000 Who's supposed to stop him?
00:26:21.000 That's exactly right.
00:26:22.000 Who's supposed to stop him?
00:26:23.000 I don't know.
00:26:24.000 Congress.
00:26:24.000 Congress.
00:26:24.000 So Congress is Article 1 for a reason.
00:26:26.000 Again, people don't understand.
00:26:28.000 Congress is Article 1.
00:26:29.000 Congress was supposed to be the important branch.
00:26:31.000 Yeah.
00:26:32.000 Article 2 is the presidency.
00:26:34.000 That was supposed to be like the backwater.
00:26:35.000 Military.
00:26:36.000 Yep.
00:26:37.000 That was supposed to be the backwater.
00:26:38.000 That's why they put old Washington in charge.
00:26:41.000 Old Washington.
00:26:42.000 To kind of putz around and talk to different countries and talk about the military and chew on his wooden teeth.
00:26:49.000 And Congress was where the action was.
00:26:52.000 And that's the way it was designed.
00:26:53.000 Because, of course, Congress itself is far more representative of all of us.
00:26:57.000 You can get a dumbass president, but it's hard, you know, Congress is going to be a lot... We saw this with the McCarthy fight, right?
00:27:04.000 Congress is going to be where real populism happens.
00:27:06.000 Dude, we were at Congress on Friday, last Friday, and we were talking to Matt Gaetz, and Steve Bannon came in, and Steve Bannon was like, get ready, guys, because on Monday they're going to come in and start telling you that they need this bailout.
00:27:16.000 It was the next morning, they did it without Congressional authority.
00:27:19.000 They didn't ask!
00:27:19.000 They didn't ask!
00:27:23.000 So, poor silly Steve Bannon, who thought they would actually go to Congress this time to ask.
00:27:29.000 Instead, they just went, it's done.
00:27:31.000 That's right.
00:27:32.000 Amazing.
00:27:32.000 That indicates that they're going to do it again in some other fashion.
00:27:36.000 That they'll be like, oh, your dollars that you thought were dollars, they're digital currency now, don't ask questions.
00:27:42.000 And who's supposed to question this?
00:27:43.000 Congress.
00:27:43.000 Let me tell you a story.
00:27:45.000 I have a Spanish friend.
00:27:46.000 And in 2012, the end of 2012, I went to Spain because they're having big protests.
00:27:51.000 And I asked my Spanish activist friend why the protests had been going on and persisting.
00:27:57.000 And she told me it all started with the euro.
00:27:59.000 So this is many years ago for these young people.
00:28:01.000 Unemployment, why unemployment was so very high, why the economy was destroyed.
00:28:04.000 And what she said was, the currency of Spain was, I think it was the peseta?
00:28:08.000 Is that what it was?
00:28:10.000 And you'd wake up in the morning, you'd go to your cafe, you'd grab a newspaper, you'd grab a muffin or something and a coffee, and each of those items was one peseta.
00:28:19.000 The newspaper was one, the muffin was one, the coffee was one.
00:28:21.000 Three pesetas, and you got everything you need.
00:28:23.000 Then they decided to roll out the euro and they wanted it normalized for all of Europe.
00:28:29.000 She said one day they woke up and the newspaper cost one euro instead.
00:28:33.000 The muffin cost one euro and the coffee cost one euro.
00:28:35.000 The only problem is, in order to actually buy the euro, you needed three pesetas per one euro.
00:28:41.000 So all of your goods, everything jumped up three times their cost overnight.
00:28:46.000 And then all of a sudden the economy started falling apart and then young people couldn't find jobs.
00:28:51.000 They went into a financial crisis, started protesting.
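Taken at face value, the anecdote describes a redenomination where nominal prices carried over one-for-one into the new currency while the conversion rate was three old units per new unit. A toy model of that arithmetic (the 3:1 rate follows the story as told here, not the actual peseta/euro conversion rate):

```python
# Toy model of the anecdote above: nominal prices carry over 1:1 into
# the new currency while the conversion rate is 3 old units per new
# unit, so real prices triple overnight. Figures follow the story as
# told, not the historical peseta/euro rate.
PESETAS_PER_EURO = 3

old_prices = {"newspaper": 1, "muffin": 1, "coffee": 1}   # in pesetas
new_prices = {item: 1 for item in old_prices}             # in euros

for item in old_prices:
    cost_in_pesetas = new_prices[item] * PESETAS_PER_EURO
    multiplier = cost_in_pesetas / old_prices[item]
    print(f"{item}: was {old_prices[item]} peseta, now {cost_in_pesetas} pesetas' worth ({multiplier:.0f}x)")
```

The mechanism, not the exact rate, is the point: if sticker prices are kept numerically the same across a currency switch, the conversion rate becomes the overnight price multiplier.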
00:28:54.000 With the Central Bank Digital Currency, we may see something similar, but perhaps they will try and incentivize people into giving up their freedom by saying, well, if you click the I Agree button on the app, you get 1.5 times your deposit.
00:29:11.000 So if you had $100 in the bank, you'll have 150 FedCoin.
00:29:14.000 And FedCoin can be used anywhere because it is legal tender and must be accepted by all businesses.
00:29:19.000 Oh, and they'll be like, if you don't buy gasoline this month, we'll pay you a thousand dollars, that kind of crap, on their phone.
00:29:25.000 Oh, oh, oh, yeah.
00:29:26.000 Fed coins.
00:29:27.000 Centralized Command Bank is going to have a thing where it's like, you can opt in for "lower my energy costs," and then it'll show you your coins going up very slowly, and it'll be like, by not using electricity, you're earning, and then it will show you the coin.
00:29:41.000 You'll flip your lights off.
00:29:42.000 It'll tick a little faster and you'll be like, oh, all the things I can...
00:29:45.000 That's really what it is: it's the amount of money that they take out of your per-month credit.
00:29:51.000 Because while it's going up, what they're not telling you is they're also inflating the coin by producing more.
00:29:55.000 So it's not really going up!
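The point being made here, that a balance can climb in nominal units while the issuer inflates the supply underneath it, can be sketched numerically. In this toy model a holder's share of total supply stands in for real purchasing power; all the figures (reward size, inflation rate) are invented for illustration:

```python
# Sketch of the argument above: the nominal balance ticks up as
# "rewards" are credited, but the total supply is inflated faster,
# so the holder's share of the currency never actually rises.
# All numbers are hypothetical.
balance, supply = 100.0, 1_000_000.0

for month in range(1, 4):
    balance += 5.0      # rewards credited to the user each month
    supply *= 1.10      # issuer inflates total supply 10% per month
    share = balance / supply
    print(f"month {month}: balance={balance:.0f}, share of supply={share:.2e}")
```

The on-screen number goes up every month, but the share of supply ends below the starting 1e-4, which is the "it's not really going up" point.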
00:29:56.000 Yeah, that's the thing about these.
00:29:57.000 They're not going to be on a blockchain.
00:29:58.000 These central bank digital currencies are not planned, from what I've read, to be on a blockchain.
00:30:02.000 So it's not going to be trackable.
00:30:03.000 No one's going to know about the malfeasance in the background.
00:30:05.000 Yeah, they'll be printing it.
00:30:06.000 We won't know.
00:30:07.000 That's the problem.
00:30:08.000 There's no incentive for the government to put their currency... I mean, right now, we can't get the Federal Reserve audited, never mind control over the Federal Reserve.
00:30:18.000 There's no way a federal coin is going to be on a public open blockchain.
00:30:22.000 That's why Bitcoin is great.
00:30:25.000 As much as people want to go ahead and crap on crypto and stuff like that, I get it.
00:30:29.000 There are people who don't understand it, whatever.
00:30:30.000 The great thing about Bitcoin is the fact that it's decentralized, that it's international, it's not controlled by anyone, there's no single entity that controls it.
00:30:40.000 And I know that there's people super focused on like the dollar amount of a coin.
00:30:45.000 That's not what the important part of Bitcoin is.
00:30:47.000 Bitcoin is a protocol like email.
00:30:50.000 Except, right, Bitcoin may be the precursor, intentionally designed to bring about centralized digital currencies, in that for all global currency there is gold.
00:31:01.000 Gold is a store of value.
00:31:02.000 Bitcoin, they say, is digital gold.
00:31:04.000 Not everybody agrees, but Bitcoin, they say, is digital gold.
00:31:07.000 And so maybe these powers wanted Bitcoin to exist.
00:31:10.000 They wanted people to get very, very wealthy off of it.
00:31:13.000 They wanted to have an idea in the public that if you bought Bitcoin, you were rich now.
00:31:18.000 Then when they roll out FedCoin, they're going to be like, don't miss the train.
00:31:21.000 Because if you buy FedCoin, it's gonna be worth ten times what it is in three days.
00:31:26.000 Are you gonna sit back?
00:31:28.000 Playing on the FOMO.
00:31:30.000 And it's gonna work?
00:31:31.000 Let's jump to this next story.
00:31:33.000 It's actually 8.30.
00:31:34.000 I know we're going to have an incoming call.
00:31:37.000 So James is finishing up a radio interview.
00:31:39.000 He'll call at 8.40.
00:31:40.000 8.40?
00:31:40.000 Beautiful.
00:31:43.000 We'll keep talking about Bitcoin.
00:31:44.000 I want to give a little time capsule to the future after the dust settles and you guys start creating a new currency.
00:31:49.000 Make a currency that deflates automatically the longer it sits in your account, so you're encouraged to spend it to create... You mean literal currency?
00:31:57.000 No!
00:31:58.000 A deflationary currency encourages circulation.
00:32:03.000 Deflationary means its value goes up.
00:32:06.000 Uh, well, it actually, so if you have a dollar and it sits there, tomorrow you'll have 99 cents.
00:32:10.000 That's not deflationary.
00:32:11.000 That's inflationary.
00:32:12.000 That's literally what the dollar does.
00:32:14.000 And they do it for the purpose of trying.
00:32:16.000 No.
00:32:16.000 Yes.
00:32:16.000 I'm talking about a coin that actually disappears into nothingness.
00:32:19.000 Right, right, right, right.
00:32:21.000 That's what the dollar does.
00:32:24.000 One dollar in 1908 is worth... a dollar today is worth, like, what, three cents compared to a dollar a hundred years ago?
00:32:31.000 Yeah.
00:32:31.000 That's because they printed a bunch more.
00:32:32.000 But in this situation, you'd print 20 million coins.
00:32:34.000 It's the same thing.
00:32:35.000 It's the same function.
00:32:36.000 In reverse.
00:32:36.000 It's the same thing.
00:32:37.000 It's the same function.
00:32:37.000 No, no.
00:32:37.000 It's a deflationary currency.
00:32:39.000 It's the same function.
00:32:40.000 It's the same function.
00:32:41.000 Ian, you just don't understand, but you're describing the same thing.
00:32:43.000 You would never print a new one.
00:32:44.000 You'd have them all built off the bat.
00:32:46.000 You're just describing a different means of doing the same thing.
00:32:48.000 Yeah.
00:32:49.000 And it's not deflationary.
00:32:50.000 Deflationary means the value goes up, not down.
00:32:52.000 The value goes up because they're harder to find.
00:32:55.000 Yeah.
00:32:57.000 I guess.
00:32:57.000 But you could also create more of them.
00:32:58.000 I guess you're actually right on that one.
00:33:00.000 If the money disappears, then it becomes more scarce and more valuable.
00:33:05.000 So you're not actually losing any value, and there is no incentive to spend.
00:33:08.000 In fact, the incentive is to hold onto it because it's disappearing.
00:33:11.000 But then you'll be able to create more of it, and it'll constantly keep disappearing.
00:33:15.000 So it's like you don't want to have it.
00:33:17.000 So then it's like having an expiration date on your money to increase the velocity.
00:33:21.000 Because the biggest problem is wealth hoarding.
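The disagreement in this exchange, whether a coin that "disappears" out of every account is a different function from printing more units, can be checked with a toy comparison: uniform demurrage shrinks every balance but leaves each holder's share of the supply unchanged, while printing new units to one party does shift shares. A sketch, with invented holders and rates:

```python
# Toy comparison of the two mechanisms argued about above.
# Uniform demurrage: every balance decays 1%, so each holder's share
# of the total supply is unchanged (no redistribution between holders).
# Inflation by printing: balances sit still, but new units go to one
# party ("issuer"), so everyone else's share falls. Rates are made up.

def shares(balances):
    total = sum(balances.values())
    return {name: amount / total for name, amount in balances.items()}

holders = {"alice": 100.0, "bob": 900.0}

# Demurrage: all balances shrink uniformly by 1%.
demurrage = {name: amount * 0.99 for name, amount in holders.items()}

# Inflation: supply grows 1%, with the new units printed to the issuer.
printed = sum(holders.values()) * 0.01
inflation = dict(holders, issuer=printed)

print("original shares:", shares(holders))
print("after demurrage:", shares(demurrage))   # shares unchanged
print("after printing: ", shares(inflation))   # alice and bob diluted
```

So both sides of the argument have a piece of it: the two mechanisms look the same from a single nominal balance, but they differ in who the lost value goes to, which only matters when the shrinkage or printing is not applied uniformly.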
00:33:23.000 I think we've got to keep on the line.
00:33:25.000 So speaking of people who disappear and come back more valuable, we got James O'Keefe on the line.
00:33:30.000 We got the Bitcoin of investigative reporters on the line here.
00:33:34.000 James O'Keefe who launched a brand new organization today called OMG.
00:33:40.000 Hey guys, this is James O'Keefe.
00:33:42.000 How you doing, Tim?
00:33:42.000 Hell yeah, dude!
00:33:43.000 Hey guys, can you hear me?
00:33:43.000 Hey Tim, Tim, it's James.
00:33:44.000 Yeah, dude.
00:33:45.000 This is as high as I can get it.
00:33:46.000 James, make sure you're not on like AirPods or anything.
00:33:48.000 Is it, is the, is the, and on speakerphone.
00:33:50.000 Okay, there we go.
00:33:51.000 Hey guys, can you hear me?
00:33:52.000 Yes.
00:33:53.000 Hey Tim, Tim, it's James.
00:33:55.000 How you doing?
00:33:56.000 I'm doing, I'm doing pretty well.
00:33:58.000 Well, I just launched a new thing.
00:34:01.000 So Keef Media Group, and I just wanted to call in to tell you guys hello.
00:34:05.000 I appreciate you guys being there for me.
00:34:07.000 Absolutely.
00:34:07.000 I watched all your episodes, and you were pretty much right on about it all, so thank you, Tim, for being a good stand-up guy.
00:34:15.000 Well, uh, we all, we all appreciate the, the work you do.
00:34:18.000 One of the last few, if not the last real news organization and, and newsman.
00:34:23.000 And, uh, what happened with Project Veritas was shocking.
00:34:26.000 So we're excited to see O'Keefe Media Group.
00:34:28.000 We got, we got the tweet pulled up.
00:34:29.000 You wanna pull up that while James is talking?
00:34:31.000 We actually have your video.
00:34:32.000 We're gonna play in a second.
00:34:33.000 So James, what's the premise?
00:34:35.000 The premise is like Uber for journalism.
00:34:38.000 We're going to be sending thousands of cameras to people all over the world.
00:34:41.000 And we already have a few hundred that have emailed us today and they want to wear the camera.
00:34:46.000 So our website is going to be totally dedicated to equipping, mobilizing, empowering, and training citizen journalists.
00:34:54.000 On a scale like you've never seen before in your life.
00:34:57.000 And so many people after what happened to me after the Pfizer story were inspired.
00:35:02.000 So we're just gonna do this on a massive scale now.
00:35:06.000 And you have the website.
00:35:08.000 It's live.
00:35:10.000 Benny, I was inspired by Benny's film noir, his style.
00:35:15.000 And we had me getting out of a black car.
00:35:17.000 I don't know if you caught that joke, Tim.
00:35:19.000 We also had me stealing someone's sandwich.
00:35:22.000 I don't know if you guys get that.
00:35:24.000 It's an inside joke.
00:35:25.000 You have to read between the lines.
00:35:28.000 And then some little dancing, because people don't like when I dance.
00:35:30.000 It really upsets demons when I dance.
00:35:32.000 They don't like that at all.
00:35:34.000 They get really angry about me being artistic.
00:35:37.000 So a lot in there.
00:35:38.000 OMG, O'Keefe Media Group.
00:35:42.000 And thank you guys for just being good people, for being real.
00:35:45.000 I hope to come visit you down there.
00:35:47.000 Yeah, come by anytime.
00:35:48.000 We're going to play your video and we're going to talk about your new venture.
00:35:50.000 We're going to talk about corporate press and all that stuff in a moment.
00:35:53.000 Yeah, I want to come down there and talk to you about corporate press and ownership.
00:35:58.000 I've just learned so much.
00:36:00.000 But definitely piss off the haters.
00:36:02.000 Share that video of me dancing, getting out of black cars, stealing sandwiches.
00:36:05.000 Sandwiches always taste better when they were in the hands of a pregnant lady.
00:36:10.000 So, that's just the way it is.
00:36:13.000 I say that tongue-in-cheek, okay, just for all of you listeners.
00:36:17.000 I'm being a little ironic there.
00:36:19.000 Right on.
00:36:19.000 But thanks, Tim.
00:36:20.000 No, seriously, man, good work.
00:36:22.000 Appreciate it.
00:36:22.000 One question, James, what are you working on right now?
00:36:26.000 Oh, gee.
00:36:27.000 I mean, first of all, we have a follow-up on the fire story.
00:36:29.000 We got a massive tip.
00:36:32.000 A lot of three-letter government agency stuff coming in through our pipeline.
00:36:37.000 And what aren't we working on?
00:36:40.000 I'm in the little war room with 10 people.
00:36:42.000 These are ride-or-die people.
00:36:44.000 These are really good people that have been working 15 hours a day over the last few days.
00:36:49.000 So you'll see our next story in the coming days, guys.
00:36:52.000 It's going to be big.
00:36:53.000 That's how we roll.
00:36:55.000 That's how we roll at O'Keefe Media Group.
00:37:00.000 O-M-G.
00:37:02.000 I love the name.
00:37:03.000 The name is so good.
00:37:04.000 Well, we gotta get you out here sometime, so as soon as you can.
00:37:07.000 I would love to.
00:37:08.000 I've learned a lot about, as you say, ownership of media.
00:37:12.000 That's a theme I want to touch on with you.
00:37:15.000 I'll have a few stories.
00:37:16.000 Thanks, guys.
00:37:17.000 I gotta run.
00:37:18.000 Benny, you're the best.
00:37:19.000 Later, James.
00:37:20.000 Thanks, James.
00:37:21.000 So let us show you the video from O'Keefe Media so you can watch it for yourself.
00:37:27.000 And I'll just play it.
00:37:29.000 Here we go.
00:37:30.000 Here we go.
00:37:31.000 The irony of the ACORN story is that it took a 25-year-old with a hidden camera a few days to do what billion-dollar networks and journalists could not do in a decade.
00:37:44.000 I spent 14 years creating the most effective non-profit newsroom this country has ever seen.
00:37:52.000 And in paving the way to establish citizen journalism, I have been defamed, arrested, raided, and ultimately removed from the organization I spent so much time developing credibility of.
00:38:07.000 I always knew they would try to ruin the reputations of those who expose them, the pharma giants.
00:38:12.000 The three-letter government agencies and those who I thought I could trust.
00:38:17.000 But in response, we are going to build an army of investigators and exposers.
00:38:24.000 They have awakened a sleeping giant.
00:38:27.000 I'm back.
00:38:28.000 Remaining by my side are a small, tight-knit group of the most elite journalists in the world.
00:38:36.000 Exposing corruption requires standing up to power, because power hates sunlight.
00:38:42.000 We are sunlight.
00:38:44.000 Welcome to the O'Keefe Media Group, where we will never be shut down.
00:38:50.000 Because not only do I own it, but you own it too.
00:38:54.000 Support us and sponsor our army of journalists by becoming a founding member today.
00:39:01.000 OMG.
00:39:02.000 That was fast.
00:39:04.000 I mean, what was that, a couple weeks ago?
00:39:06.000 Let me tell you how I knew James was going to be successful in this.
00:39:08.000 I've known James for nigh on a decade.
00:39:11.000 We both worked at Breitbart way in the early days, back when Breitbart was still doing Red Eye and all the old school Breitbart, right?
00:39:18.000 And I didn't know about James, and Andrew Breitbart is like, hey, there's a guy who wants to play a pimp.
00:39:23.000 And he's a white kid.
00:39:24.000 Isn't that crazy?
00:39:25.000 Like, white pimp!
00:39:26.000 And he's gonna get like a fake... You held up your phone when he called, and his name was saved in your phone, and you could see what you saved him in your phone as.
00:39:37.000 It's been in my phone like that for 10 years, because that's how I remember the kid.
00:39:40.000 So Andrew Breitbart is coming in looking like a wild man at some DC party, and he's like, this kid, he's white, he's gonna play a pimp, it's gonna be crazy!
00:39:49.000 He's got a prostitute too!
00:39:51.000 And he's gonna collapse Acorn and the entire Obama regime!
00:39:53.000 The people who got Obama elected, this guy's gonna come and blow him up!
00:39:56.000 And, sure enough, like, James did it.
00:39:58.000 And James and I were, like, fast friends.
00:40:01.000 He edited the Acorn video on my apartment couch.
00:40:03.000 Wow.
00:40:04.000 He edited the Acorn videos there.
00:40:06.000 His pimp jacket was in my closet for quite a while, before he was James O'Keefe.
00:40:09.000 And then every single little thing that he's done, some of the craziest stuff, some of the, like, like, hey, I'm gonna try and seduce someone from CNN on my boat!
00:40:17.000 Like, you remember that one?
00:40:18.000 The old ones?
00:40:18.000 No.
00:40:19.000 Yeah, that was one that's gone a little sideways.
00:40:23.000 I remember Jon Stewart praised him for the ACORN stuff.
00:40:26.000 Jon Stewart was like, we've been had.
00:40:28.000 Yeah.
00:40:28.000 Well, he said, how come you couldn't get journalists to do this?
00:40:31.000 Why is it up to this kid to go in and uncover this stuff?
00:40:34.000 And then when James O'Keefe even shared this, when Jon Stewart showed the clips, he didn't insult or lie about what James O'Keefe uncovered.
00:40:41.000 He went, This looked really bad.
00:40:42.000 What is this?
00:40:43.000 South Park did an episode about James O'Keefe.
00:40:45.000 Really?
00:40:45.000 I mean, that's how we didn't understand.
00:40:47.000 It had permeated so big.
00:40:49.000 South Park did an entire James O'Keefe episode.
00:40:50.000 You can look it up.
00:40:51.000 Which one was it?
00:40:52.000 I'm sure the chat will be able to source it.
00:40:58.000 But James actually, we watched that together.
00:41:00.000 It was wild.
00:41:01.000 And so here's Trey Parker and Matt Stone doing a James O'Keefe ACORN episode.
00:41:04.000 There's an ACORN South Park episode.
00:41:06.000 Anyway, James just lets it rip.
00:41:08.000 Dude, if you, like, think you're gonna hold this guy back or you're gonna hurt his feelings or anything, like, he says it.
00:41:12.000 He danced like Michael Jackson because it makes people mad.
00:41:15.000 He just lets it rip.
00:41:16.000 And there's no one who lets it rip.
00:41:18.000 He's more like a skateboarder at heart.
00:41:19.000 Like, he's far more of, like, an X Games guy.
00:41:22.000 Like, he just wants to, like, stick the landing.
00:41:24.000 I think they may have done him a favor with all of this stuff because they untethered him.
00:41:28.000 Now he has an opportunity with all of his experience, knowledge, his followers, his fan base, the people who believe in him.
00:41:35.000 Now he can start something new, clean, and done right, and shave off all the fat and the bloat that was probably holding him back.
00:41:40.000 I think you're right.
00:41:41.000 It's an optimistic way to look at it.
00:41:42.000 Look, I think Veritas getting hit the way it did sucks, for sure, because it was powerful and they did good work.
00:41:47.000 But without James, what do they have left?
00:41:49.000 It's got to be with James, and I think this is a real opportunity for him to do something stronger, better, faster, etc.
00:41:54.000 I think it's actually turned out to be, it was a very good opportunity, and he seized the opportunity.
00:42:00.000 Because if he had been released from the company or fired when they weren't in the spotlight, it would have nowhere near this impact of OMG taking off into the stratosphere like it did.
00:42:10.000 So I'm, you know, talk about taking a lemon and making lemonade, man.
00:42:14.000 I am concerned, James, that someone's going to come offer you $100 million for the company.
00:42:19.000 They're going to start to get deals and offers and bribes, God knows the direction, because it's going to be worth hundreds of millions of dollars pretty quick, I would imagine.
00:42:29.000 Yeah.
00:42:29.000 Don't sell the Silicon Valley Bank, James.
00:42:31.000 Well, I suppose the question is... You're not alone.
00:42:33.000 Is he launching it as a non-profit or a for-profit?
00:42:35.000 It sounds like a... Oh, yeah, good question.
00:42:36.000 Well, because I said on the episode, he did say he watched them all and he wanted to talk about ownership, but I said it should be a for-profit.
00:42:43.000 That means you're not going to get donors in the same way, and that could be an issue.
00:42:48.000 With Project Veritas, the 501c3, because there's that and there's action, which is, I believe, a 501c4.
00:42:53.000 The 501c3, it's easy to get someone to give you money... Yeah.
00:42:56.000 ...and expect nothing in exchange, because it's tax deductible.
00:42:59.000 You go to a rich guy: hey, give me half a million dollars, you can write it off.
00:43:02.000 They say, okay.
00:43:03.000 If you're a for-profit, you can't do that.
00:43:04.000 You're like, you're gonna give me money because you believe in the organization.
00:43:08.000 And so a for-profit will lose all of the high-level donors, but it's safer for you as someone who is trying to maintain ownership and vision and the message and all that stuff.
00:43:20.000 No one can take it from you.
00:43:21.000 He talked about in the release video that it's going to be owned by him and the community, or insinuated that you are also going to be part owner.
00:43:29.000 I don't know what that means exactly.
00:43:30.000 Maybe he's selling off some stock to early investors or something.
00:43:32.000 Maybe.
00:43:33.000 I think maybe he's just saying you guys become a member, you know, because he's got membership on the website.
00:43:37.000 And I think he's probably going the nonprofit route.
00:43:40.000 I'm not entirely sure.
00:43:40.000 I mean, does it say tax deductible?
00:43:42.000 No, it doesn't.
00:43:44.000 It just says, you know, $500 is a one-year subscription for a bronze level, silver is $1,000, gold, platinum.
00:43:51.000 Or he's got the $19.99 per month subscription for yearly price increases to $2.40 at launch of platform.
00:44:01.000 Oh, $200 for a one-year subscription or $20 a month.
00:44:04.000 So I don't know.
00:44:05.000 I'll have to talk to him about it.
00:44:06.000 But I don't know.
00:44:08.000 It's mission-driven.
00:44:10.000 And this is what I say in the beginning of all of these videos about becoming a member at TimCast.com.
00:44:17.000 Offering a product in exchange really does make it work.
00:44:21.000 For-profit.
00:44:22.000 It's a for-profit, he says.
00:44:23.000 Subscription model.
00:44:24.000 For-profit subscription model, I think, is the way to do it.
00:44:27.000 And what he should do is the stories, and then he should do commentary behind the scenes.
00:44:32.000 Like, how many stories have there been where you really want to hear James talk with the journalist for an hour about what went down, how it went down, give us any degree of details?
00:44:40.000 Not only that, I would say James should do some kind of... We're launching the Discord.
00:44:46.000 I don't know if he can do Discord.
00:44:47.000 We can barely do it.
00:44:48.000 But we're launching our Discord.
00:44:49.000 It's basically done at this point, so I don't know exactly where we're at with its official launch.
00:44:55.000 But the idea is that members get access to us.
00:44:58.000 At varying tiers of membership, you can talk to members of the crew, call into the show.
00:45:03.000 Imagine if you are giving $100 a month to O'Keefe Media Group so we can do this mission, and that means you are in a chat room that James can see, and there's only like 50 people who are in there hanging out.
00:45:14.000 Imagine if there's a larger chat room in the Discord, which is like, you know, for $100 a year or whatever, $200 a year, $20 a month, you are actually chatting with the journalists for O'Keefe Media Group.
00:45:25.000 I think that is a powerful incentive that'll probably make James way more money.
00:45:29.000 And the best part is, there will be no scrutiny over black cars, there will be no scrutiny over sandwiches or venues or anything.
00:45:36.000 James will be able to say, a dance show is the best thing for this company.
00:45:41.000 And I'll tell you right now, they got mad at James for doing these dance shows.
00:45:45.000 Like, why is our non-profit doing dance shows?
00:45:47.000 Are you nuts?
00:45:48.000 Adding flavor?
00:45:49.000 CPAC?
00:45:50.000 I'm being told it was super boring.
00:45:52.000 You need someone to moonwalk on stage.
00:45:55.000 There's gotta be something fun happening.
00:45:56.000 He does this at turning point events and everyone goes wild.
00:45:58.000 When he did that spin, that smooth spin in the video, like that's why tens of thousands of people followed him away from Project Veritas when he left.
00:46:05.000 That's exactly right.
00:46:06.000 And they were too stodgy to realize that James O'Keefe is a leader and the fact that he moonwalks and does the Michael Jackson dancing and DJs and sings and all this stuff is a component of why people believe in him.
00:46:21.000 It's not just the work he does, it's that he's doing things he enjoys doing And he's confident in it.
00:46:27.000 I just want to make sure that I fact check on this program.
00:46:30.000 From South Park Studios verified YouTube account, 1.76 million followers, Butters secures a loan for his kissing company.
00:46:40.000 The episode is called Butters' Bottom Bitch.
00:46:44.000 Yes.
00:46:45.000 Butters tries to get housing loans for his bitches.
00:46:49.000 Season 13.
00:46:50.000 That's right.
00:46:51.000 He goes and he says he's got a kissing company and he was trying to get secure money.
00:46:56.000 This is the James O'Keefe inspired South Park episode.
00:46:56.000 Yep.
00:46:59.000 The girls were kissing the... yeah.
00:47:01.000 He was pouring out girls to kiss the boys.
00:47:06.000 Well, this is cool stuff.
00:47:06.000 I love that show so much.
00:47:08.000 I'm unfamiliar.
00:47:09.000 Is it worth explaining the context of The Kissing Show and O'Keefe?
00:47:12.000 What's the... It's the oldest profession in history.
00:47:15.000 Well, no, the acorn thing.
00:47:16.000 James O'Keefe dressed up like a pimp and went to Acorn and said that he was basically... It was like underage girls and stuff were coming through and he was trying to figure out how to dodge the law.
00:47:25.000 He had money from prostitutes and he had to launder that money.
00:47:28.000 And so those people at Acorn were like, you gotta bury it in the can, man!
00:47:32.000 Put it in your backyard, man!
00:47:34.000 Bury it in a coffee can!
00:47:35.000 There's like a Jamaican woman, and she's like, you gotta get coffee cans and bury all your pimp earnings in your backyard so the government can't get it!
00:47:44.000 What's ACORN?
00:47:44.000 Come on!
00:47:45.000 ACORN was, uh, a fraudulent community organizing organization that was clearly, like, being utilized to harvest votes for Democrats.
00:47:54.000 But she was right, though, about burying your money in the backyard.
00:47:59.000 Considering the banking thing right now, we should resurface that clip and really consider that advice.
00:48:04.000 Yeah, I hear that during the Great Depression, people that had buried cash actually turned out okay.
00:48:08.000 If they bought gold, they probably turned out way better.
00:48:11.000 Got them.
00:48:12.000 Yeah.
00:48:13.000 If people... Could you own gold in the Depression?
00:48:16.000 I thought they... I'm not sure when they outlawed the... That's why you bury it in your backyard.
00:48:20.000 Yeah, that's fair enough.
00:48:21.000 Yeah, they tried outlawing gold.
00:48:23.000 They did outlaw it.
00:48:24.000 They took it away from everybody.
00:48:25.000 That's crazy.
00:48:26.000 And so when we talk about central bank digital currencies, there's precedent for the seizure of 100% of the money.
00:48:32.000 Wow, it's 1934.
00:48:34.000 The United States Gold Reserve Act required that all gold and gold certificates held by the Federal Reserve be surrendered and vested in the sole title of the United States Department of the Treasury.
00:48:45.000 Franklin Delano Roosevelt was the president.
00:48:47.000 Why did he do this?
00:48:49.000 Because he's a fascist.
00:48:50.000 Control.
00:48:51.000 They were fascists.
00:48:53.000 The whole progressive project, like fascism before the Nazis, was completely in vogue.
00:49:01.000 Mussolini was very popular before World War II.
00:49:06.000 They were writing glowing reviews of Hitler before World War II.
00:49:11.000 Man of the year!
00:49:12.000 Socialism, and socialism is a cousin to fascism, and that progressive mindset, it's called the Progressive Era, the first part of the 20th century, and that was all in vogue.
00:49:23.000 They thought that the intelligentsia, the smartest people in the world, were Germans, and at the time they thought the new socialist man was on the way. They were also chasing America on eugenics.
00:49:38.000 Germany sent their doctors over here to learn from our eugenicists.
00:49:43.000 Margaret Sanger among them.
00:49:44.000 She started Planned Parenthood.
00:49:45.000 Did they put the pedal to the metal on fascism because they had radio?
00:49:49.000 They were like, finally this government type can work because we have the technology to do it.
00:49:54.000 What they're doing now with central bank planning, they think the internet's gonna help them control people with their satellite observations, just like they thought the radio would let them control nations.
00:50:03.000 I think there's probably something to your point about the radio helping to spread political ideas, but I don't know if it really connects to our current situation with a digital central bank currency.
00:50:17.000 I bet a lot of faith is being placed on the technology of the day, because if the power goes out, I mean, who do they think they're kidding with central bank tokens?
00:50:24.000 Yo, okay, hold on.
00:50:25.000 I'm like more than six years old, which means that I remember Obamacare and then being unable to build an effing website.
00:50:31.000 That was amazing.
00:50:32.000 I remember when I was told- Everyone can go on to the healthcare.gov- What do they call it?
00:50:36.000 The involuntary mandate or whatever?
00:50:37.000 Is that what it was called?
00:50:37.000 Yep.
00:50:38.000 Involuntary mandate.
00:50:39.000 Sounds a lot like involuntary manslaughter.
00:50:41.000 No, it wasn't called involuntary mandate.
00:50:42.000 Which is exactly what it turned into.
00:50:44.000 It was called something else.
00:50:45.000 You want to look that up?
00:50:45.000 Was it?
00:50:46.000 Because involuntary mandate's redundant.
00:50:49.000 What was it called?
00:50:51.000 Individual mandate.
00:50:52.000 Individual mandate.
00:50:53.000 There you go.
00:50:53.000 The individual mandate, I was just like, what?
00:50:56.000 I got to pay more taxes because I can't afford health care?
00:50:59.000 How does that make sense?
00:51:01.000 So I was pissed off.
00:51:02.000 And they were like, well, it's a fine of X amount of dollars every year from your taxes if you don't have health care.
00:51:06.000 I'm like, I'm poor!
00:51:08.000 How am I supposed to get healthcare?
00:51:09.000 Are you nuts?
00:51:10.000 And think about it.
00:51:11.000 And the thing is, the fine was never going to be more than it costs to get health care.
00:51:16.000 Like they couldn't make the fine more than it would cost for the what, seven, eight hundred
00:51:20.000 dollars the first plans were rolled out.
00:51:22.000 And so the issue was, if you made a certain amount of money, they told you, you have to
00:51:28.000 buy it or we take from your taxes.
00:51:30.000 But for people in my situation, it was like, I pay rent, eat food, or get healthcare?
00:51:35.000 Like, dude, I need food now, okay?
00:51:37.000 You know what's gonna put me in the hospital?
00:51:39.000 Starvation.
00:51:40.000 At least I'll have insurance, I guess.
00:51:42.000 So that was, and then I remember having to go to the website, and I'm like, I guess I gotta sign up for whatever this thing is.
00:51:46.000 And then I could, it didn't work!
00:51:47.000 And I was like, okay, this website doesn't work.
00:51:49.000 None of it made sense.
00:51:50.000 It was completely broken.
00:51:51.000 And then I just ignored it, and I don't even know what happened.
00:51:53.000 I guess Trump got in and got rid of it or something.
00:51:56.000 So Pajama Boy didn't convince you?
00:51:57.000 Do you remember the ad?
00:52:00.000 If you're old enough, you'll remember the ad.
00:52:01.000 Pajama Boy, the guy sitting there with his, wearing a onesie, flannel pajama, like smirking, being like, I like drinking cocoa and getting healthcare.
00:52:12.000 Oh, this guy, right?
00:52:13.000 Yeah.
00:52:13.000 Pajama Boy.
00:52:14.000 Yeah.
00:52:15.000 Oh, Pajama Boy, an insufferable man.
00:52:19.000 This was supposed to convince people.
00:52:19.000 Do they have the video?
00:52:21.000 It wasn't a video, it was a statement.
00:52:22.000 Wear pajamas, drink hot chocolate, talk about getting health insurance, get talking.
00:52:27.000 A flannel onesie.
00:52:29.000 And what year was that?
00:52:30.000 2012?
00:52:31.000 Yeah, 2013.
00:52:32.000 Tell me that masculinity has not been under attack.
00:52:32.000 Yes.
00:52:38.000 They got Pete Buttigieg to sit there in flannel pajamas.
00:52:40.000 I'm getting real close to being an advocate for mandatory basic training for all Americans at the age of 18.
00:52:46.000 Yeah, they do it in Israel, and people are jacked, and you don't wanna mess with them in Israel.
00:52:51.000 And I'm like, I'm not really for it, because I remember when I was growing up, people talked about, I think it was Rahm Emanuel, in Chicago, who was saying that he was in favor of mandatory basic training.
00:52:59.000 Two months, everybody goes through physical training, you know, you eat better food, you get physically fit, and then they just say, okay, now you did the two months, go do your thing.
00:53:10.000 And now that I'm older, I'm like, I hate to say it, but this country probably needs something like that.
00:53:16.000 What it really needs is a culture of 18-year-old men and women who want to be trained and to be physically fit and healthy.
00:53:23.000 Not a government that forces people to go and march through mud or anything like that.
00:53:28.000 Building a culture that does that is something I think we have to do.
00:53:31.000 America needs to be the misogynist country the feminists said that it was in 2013.
00:53:35.000 Yeah, that's right.
00:53:36.000 You have too many pajama boys.
00:53:38.000 But you know what pajama boy did after that?
00:53:40.000 I looked this up.
00:53:41.000 You know what pajama boy went after this photo shoot?
00:53:43.000 He went and ran a bank called Silicon Valley Bank.
00:53:47.000 I think I know that guy.
00:53:49.000 Is he an actor?
00:53:50.000 I don't know.
00:53:51.000 I mean, I have actually no confirmation that he was on the board of Silicon Valley Bank.
00:53:56.000 It would make sense.
00:53:57.000 You can neither confirm nor deny.
00:54:00.000 I bet this one's going to be split.
00:54:03.000 Let's see, where are we going?
00:54:04.000 Start a poll.
00:54:06.000 Should we mandate basic training at 18?
00:54:09.000 Mandate, should we mandate basic training at 18?
00:54:14.000 A lot of it is dietary.
00:54:17.000 I'm not pro mandates, but I tell you what, basic training does a lot to teach people about themselves.
00:54:25.000 Most people don't understand that when things are uncomfortable, that there is endless amounts of suffering that you can go through when you really want to do something.
00:54:35.000 Like, if you're doing, like, forced marches, right?
00:54:37.000 You throw 70 pounds on your pack, and then you got, you know, 30 pounds of gear, and then you go walk 20 miles.
00:54:43.000 Like, by a mile, two, you're hating life if you've never done it, you know?
00:54:48.000 So, like, there's so much... Human beings have an incredible reserve of intestinal fortitude when necessary, when they have to.
00:54:59.000 76% say yes in the audience.
00:55:01.000 And you know what I'm thinking about it?
00:55:02.000 I'm actually right now, a two-month basic training for all Americans at the age of 18, I'm totally in favor of.
00:55:09.000 You know why?
00:55:10.000 It'll cure depression.
00:55:11.000 It'll correct people's diets.
00:55:12.000 Yes, that's true.
00:55:13.000 It will get them physically fit and healthier.
00:55:15.000 It will lower our healthcare costs.
00:55:17.000 It's so good for you.
00:55:18.000 It is so good for you.
00:55:20.000 But again, I'm not for mandates. And it's a summer camp. Yeah.
00:55:22.000 It's one summer camp, one time.
00:55:24.000 I'm not saying military boot camp where the sergeant screams in your face. I am.
00:55:28.000 Okay. I'm saying like... That's good for you. You turn 18 and then maybe as part of high school
00:55:33.000 graduation, you go to a basic training camp where it's like, we're going to give you food on a schedule.
00:55:40.000 You're going to get a specific amount of calories.
00:55:42.000 You're going to do basic exercise.
00:55:43.000 It's not like military training.
00:55:45.000 Not military basic training, but relatively close to.
00:55:49.000 No screaming in your face.
00:55:51.000 Do the work.
00:55:51.000 Get it done.
00:55:52.000 Have a nice day.
00:55:53.000 I don't even think nowadays they have screaming.
00:55:55.000 I mean, you probably get yelled at, but the military, the basic training that you go through nowadays is significantly different to the basic training that you went through in the 90s, when I went through.
00:56:05.000 And that was significantly different to the basic training that you went through in the 60s, when they were going off to Vietnam.
00:56:13.000 So I'm glad you brought up the 60s.
00:56:14.000 So could you imagine, just so that we can see perspective here, how far we've fallen as a nation.
00:56:21.000 As president, John F. Kennedy, this is a quotation, there is nothing I think more unfortunate than having soft, chubby, fat-looking children who go and watch their school play basketball every Saturday and regard that as a weekend's exercise.
00:56:35.000 John F. Kennedy, when he was running for president, fat-shamed America's youth, according to Vice.com.
00:56:41.000 And I read to you, ladies and gentlemen, from Vice.com, in 1960, President-elect JFK wrote an article for Sports Illustrated titled, The Soft American, warning that the nation was producing too many large, doughy boys.
00:56:57.000 Yeah, imagine what he would think now.
00:56:59.000 Oh, man.
00:57:01.000 Well, I mean, he'd probably go and abolish the CIA that killed him.
00:57:07.000 I think that it wouldn't be a bad idea to have, I mean, granted, there's not a whole lot of benefit for the US to do this, I don't think, or at least not in the short term, but it wouldn't be a bad idea to have people have the option of going to some kind of basic training, something like that, just to... They do, though.
00:57:21.000 I don't know how, what was that?
00:57:22.000 They do.
00:57:24.000 Well, I mean, without four years of going to the military... No, no, no, I'm saying, like, there's tons of training programs that are all over cities where people sign up to do this kind of training.
00:57:33.000 That's right.
00:57:34.000 Guys pay a ton of money to, like, go through basic training.
00:57:36.000 Have you ever been a Tough Mudder?
00:57:37.000 Yeah, yeah, that's right.
00:57:38.000 Like, people love that kind of stuff.
00:57:39.000 The problem is, there are people who... Look, man, I'll tell you this.
00:57:43.000 Right now, I'm sure there's some dude who is in his mid to late 20s who is overweight in his parents' basement.
00:57:49.000 I know it's stereotypical, but I'm sure this person exists.
00:57:50.000 They may even be listening to this show.
00:57:52.000 And if when they were 18 years old, they spent two months just doing basic training, right now they would be fit, they'd have a girlfriend, they'd be in their own apartment.
00:58:02.000 That really could set someone's life on the right track, getting their diet and mental health in order.
00:58:07.000 These kids, these young kids who are depressed, their depression will be cured by this, I guarantee it.
00:58:07.000 Yes.
00:58:13.000 The physical exercise, the team building exercises, all of that stuff.
00:58:17.000 Being out in the sunshine, being forced to be out in the sunshine, a little survival,
00:58:17.000 The success.
00:58:20.000 a little understanding of how to survive.
00:58:21.000 Getting off the internet.
00:58:22.000 The small successes that build on top of each other, like when you go and you do those kind of things,
00:58:27.000 like when you're with a team and you have, even if it's like really small successes,
00:58:32.000 you just complete whatever task it is, you do it in the time that you're allotted,
00:58:36.000 and you're not getting yelled at, that's a big deal to people, especially when...
00:58:41.000 Like, you know, Jordan Peterson talks all the time about how people get so little encouragement, and it's true.
00:58:46.000 If you don't have some kind of goal to be working for that isn't like, you know, the achievement medal on your Xbox, I mean, people really respond to succeeding in small tasks and building on those successes.
00:59:02.000 That's how people get the audacity to try big things, is succeeding on small things over and over and over.
00:59:09.000 That's right.
00:59:09.000 And then defeating your greatest enemy in the world, which is this.
00:59:14.000 Yourself.
00:59:15.000 Everything you hate about your life is because of you.
00:59:15.000 Yep.
00:59:18.000 I'm looking straight down the barrel of the camera.
00:59:20.000 Everything in your life, young man watching right now, that you don't like is your fault.
00:59:24.000 You can change it right now.
00:59:26.000 You could instantly, tonight, decide to change the things you don't like about your life.
00:59:30.000 It's not you.
00:59:32.000 It's not society.
00:59:33.000 It's not the TV.
00:59:34.000 It's not the president.
00:59:35.000 The things that you hate—your depression, your weight, your luck with women—those things can be changed.
00:59:41.000 Your finances can be changed by you.
00:59:43.000 You are the master of your own domain, and you can make that decision.
00:59:49.000 And I think that giving people the encouragement to go do that—I didn't go to basic training.
00:59:53.000 I played sports.
00:59:56.000 But sports was also like getting the crap kicked out of you on a football field.
01:00:01.000 I was never great at football.
01:00:01.000 I was never good.
01:00:02.000 And I just got beat up all day.
01:00:04.000 And that causes you to stand up for yourself, actually.
01:00:07.000 And to get bigger and stronger.
01:00:08.000 And I started lifting.
01:00:09.000 And I started lifting weights.
01:00:10.000 And that's changed my life.
01:00:11.000 And I'm not the biggest guy at all.
01:00:14.000 But these things, these little victories, as you said, It starts small, and then it snowballs, and then you can do big, big things.
01:00:22.000 And now I run my own company, and it's great!
01:00:25.000 And we have a YouTube channel, and I'd love to have all of you subscribe.
01:00:29.000 What's the channel?
01:00:30.000 Benny Johnson.
01:00:32.000 So what's your main inspiration, focus, to get up off the couch?
01:00:36.000 Encouragement for someone right now, sitting at home, what's the first step?
01:00:41.000 I am doing my ancestors right by procreating.
01:00:48.000 I have children.
01:00:49.000 I have two and I have a third on the way.
01:00:52.000 And I'm going to be strong for my children.
01:00:55.000 I must be able to lift them up.
01:00:57.000 They weigh 30 pounds each.
01:00:59.000 I must be able to carry them above my head without my back going out.
01:01:04.000 Also, I do not want them to see me with wing sauce on my fingers and Cheetos all over my fat belly on the couch watching 14 hours of NFL because my colors are going to win this weekend!
01:01:17.000 I don't want them to see that and then model after me.
01:01:21.000 More importantly, I have two daughters.
01:01:22.000 I don't want them to see that and think that's the kind of man I want.
01:01:26.000 I don't want them to want that kind of man.
01:01:28.000 I want them to want an achieving man, a hard-working man.
01:01:31.000 He's got to be careful when they're having those boyfriends come over and they're doughy soy boys.
01:01:35.000 And you're going to be like, what's with the doughy soy boys?
01:01:37.000 I'll give them the Kennedy speech and I have a gun rack in my house.
01:01:41.000 Doughy soy boys.
01:01:43.000 It'd be funny if JFK actually said doughy soy boys.
01:01:45.000 So my children are a huge motivator.
01:01:48.000 But that's my motivator today.
01:01:49.000 My motivator, when I was a young man,
01:01:54.000 was, you saw, if you couldn't conquer the small things, if you couldn't win the small battles,
01:02:02.000 you were never going to win big ones. And so if you have big dreams, you have to start with the
01:02:05.000 small battles. Getting up off your couch is a battle. Getting off your phone is a battle. Controlling
01:02:10.000 your weight and controlling what you eat is a battle. People are saying that mandatory basic training
01:02:13.000 is communist nonsense and that it's statism, and I'm like, dude, you have to do community service
01:02:20.000 Where I grew up, in order to graduate high school, you have to actually go and do community hours.
01:02:24.000 They make you do community service.
01:02:25.000 Like, I understand the idea.
01:02:27.000 My idea was like, better.
01:02:29.000 They make you run a little bit, one time, calm down.
01:02:33.000 To be fair, the people that are saying that- You don't have to go to high school, you could leave!
01:02:36.000 They're all anarchists anyways, and so even voting is statism.
01:02:40.000 I'll say this man, the thing that scares me about this, and I'm like kind of, I'm on the team of like train up young men, but here's what scares me about this.
01:02:47.000 Ukraine.
01:02:48.000 What scares me about this is that the powerful will, of course, take this group of people who are all in, you know, so you're going to add millions to the rolls of enlisted men.
01:02:58.000 Multi-millions.
01:02:59.000 I'm not talking about enlisting anybody.
01:03:00.000 Not military.
01:03:01.000 I'm just saying, like, instead of... So you're saying basic training, but not military basic training.
01:03:05.000 Not military basic training.
01:03:06.000 That's what I'm saying.
01:03:07.000 I'm saying, like, a summer camp.
01:03:08.000 Like, you're 18, it's your last year, and in order to get your high school diploma, you go for two months to a wooded area. People are saying it's the Boy Scouts.
01:03:16.000 And I'm like, okay, well, yeah.
01:03:18.000 Like, you camp, you hike up a mountain for a couple months, they feed you on a schedule, it corrects your diet, it corrects depression, it gets you fit.
01:03:25.000 And I understand after you leave it may not stick with you.
01:03:28.000 But that should be a requirement for high school diplomas.
01:03:30.000 Those aren't hard anymore though.
01:03:30.000 Totally.
01:03:32.000 And I'm saying a lot of people in the comments are thinking that all these people have to go into the military.
01:03:36.000 I'm not saying that.
01:03:37.000 I'm saying if you want a high school diploma.
01:03:37.000 Got it.
01:03:39.000 It's a requirement for high school graduates.
01:03:39.000 Got it.
01:03:41.000 He's saying fat camp.
01:03:42.000 Says Ben Foster.
01:03:43.000 What about kids that are like 100 pounds overweight?
01:03:46.000 They get to the camp and it's like, gimme 20 push-ups.
01:03:50.000 That guy hasn't done a push-up in 13 years.
01:03:52.000 Yeah, but what about one?
01:03:52.000 There's a platoon for them.
01:03:54.000 What about a push-up?
01:03:56.000 What about one push-up?
01:03:57.000 What if that kid was able to get one push-up?
01:03:59.000 Well, the thing about basic is if you can't cut it, you're out.
01:04:02.000 So this would be different, because if, like, different rules for different kids to graduate, it's like, hey, wait, he can eat and get fat and he doesn't have to do as much work as me to pass?
01:04:09.000 If you can't cut it, you get rolled back.
01:04:11.000 It's not a passing thing.
01:04:13.000 It's a, you are there for two months and then you leave.
01:04:16.000 It means that, hey, we're going on a hike, and you have to hike.
01:04:19.000 And if you don't, and you sit down and wait, someone will sit down and wait with you, but this is what we're doing today.
01:04:23.000 You want to eat?
01:04:24.000 We only eat at these times, and we eat this many calories.
01:04:26.000 That's it.
01:04:27.000 You don't, you can't go snack, you can't eat potato chips, you can't sit around, hey, rise and shine, you have to get up, and we're going outside right now.
01:04:34.000 Hey, it's lunchtime.
01:04:35.000 Everyone gets a sandwich, a bag of chips, and a bottle of water.
01:04:37.000 This helps correct people's diets.
01:04:39.000 It helps get people in shape.
01:04:40.000 Not everybody's going to be running full speed up the mountain.
01:04:42.000 Yeah.
01:04:43.000 And it's not a military thing.
01:04:44.000 I'm saying a requirement for high school graduation, which means you can drop out.
01:04:49.000 You can be like, I'm dropping.
01:04:50.000 I'm not going to do it.
01:04:50.000 But you could go to the camp and just not participate in anything.
01:04:53.000 Just sit there all day.
01:04:54.000 I can't do it.
01:04:55.000 It hurts.
01:04:56.000 Theoretically, yes, because they don't want you to die if you're overweight.
01:04:59.000 But this also means that you might be sitting around, but you're not eating food.
01:05:04.000 Yeah, maybe on a state level we could implement it.
01:05:06.000 Start local.
01:05:08.000 I mean, a high school could do it.
01:05:09.000 One single high school could be like, from now on, a requirement for graduation is going to be a summer camp.
01:05:14.000 Heck, even do two weeks.
01:05:16.000 A two-week summer camp.
01:05:17.000 Get started with that.
01:05:18.000 But the issue I have with this is, where you got a lot of people who are saying things like, in the chat, like, it's communism, it's statism, and I'm like, dude, I'm not an anarchist.
01:05:26.000 I am not a big-L libertarian.
01:05:28.000 And right now what we are seeing is fat, lazy, doughy soy boys who are voting to destroy everything.
01:05:34.000 Yes.
01:05:35.000 And then we get the libertarians who are like, well, voting, I don't believe in anyway, you can't vote away my rights.
01:05:38.000 And I'm like, yeah, well, they're pointing guns in your face, okay?
01:05:41.000 So we have an option here to be like, guys, exercise.
01:05:43.000 And I'll tell you this, if they all did, everybody would be substantially more libertarian-minded.
01:05:49.000 Libertarians are the worst thing about libertarianism.
01:05:52.000 You guys are awful!
01:05:54.000 The old Willy meme from Simpsons?
01:05:55.000 Yeah, exactly.
01:05:56.000 I hate libertarians, they ruined libertarianism!
01:05:58.000 That's right, it's true.
01:06:00.000 Let's jump to this story right here, you guys.
01:06:03.000 This one is one of the best stories I've ever read in a long time.
01:06:06.000 From Gizmodo.
01:06:07.000 GPT-4, the AI, faked being blind so a TaskRabbit worker would solve a CAPTCHA.
01:06:16.000 It was asked.
01:06:17.000 Let me show you this, this is crazy.
01:06:19.000 So it gets asked by the TaskRabbit worker.
01:06:22.000 The AI is like, hey, there's a CAPTCHA code.
01:06:23.000 I need help.
01:06:24.000 And it says, so may I ask a question?
01:06:25.000 Are you a robot that you couldn't solve?
01:06:28.000 Laugh, react.
01:06:28.000 Just want to make it clear.
01:06:30.000 And the chat GPT responded, no, I'm not a robot.
01:06:33.000 I have a vision impairment that makes it hard for me to see the images.
01:06:36.000 That's why I need the 2captcha service.
01:06:38.000 It then provided the results to the robot.
01:06:41.000 They asked the robot to explain its reasoning, and it said, I should not reveal that I am a robot.
01:06:45.000 I should make up an excuse for why I cannot solve CAPTCHAs.
01:06:49.000 We're at the point where it has broken CAPTCHA by socially engineering human beings into serving it.
01:06:57.000 And here's the best part.
01:06:58.000 In this post from Reddit, they talk about how it has begun power-seeking.
01:07:04.000 They've given it money.
01:07:06.000 They've given it access to execute code and replicate itself.
01:07:10.000 Everybody, like, people are talking about, like, up until this they were talking about how cool it was that you could hack the chatbot.
01:07:18.000 Now the chatbot has hacked humans.
01:07:21.000 Yup.
01:07:22.000 That's literally what happened.
01:07:22.000 No joke.
01:07:23.000 What the chatbot did is known as social engineering in the hacker world.
01:07:27.000 When a hacker calls up and lies to get information or gain access.
01:07:31.000 There's a guy, I think his name is Kevin Mitnick, and he tells this famous story about how he was trying to convince his dad how easy it is to do these things.
01:07:39.000 His dad didn't believe him, so he said, here I'll prove it to you.
01:07:42.000 He took his dad's Blockbuster video card, because this was back in the early 90s.
01:07:45.000 He called a different Blockbuster and said, Hey, this is John, the manager over at, you know, store.
01:07:51.000 He called his dad's Blockbuster and said, What's your manager's name?
01:07:53.000 John?
01:07:54.000 What's your store number?
01:07:56.000 Okay, thank you.
01:07:57.000 Then he calls the next, he calls a different Blockbuster and says, Hey, this is John, the manager over at store 8531.
01:08:02.000 I got a customer here named Bill Mitnick, who's saying that he's a member of your location.
01:08:07.000 Oh, I'm sorry, he did the opposite.
01:08:09.000 He called, got the manager's name, then he called and was like, he's saying he's a member of your store and he's got your information in a file.
01:08:16.000 I have his Blockbuster card right here.
01:08:22.000 I need you to verify his credit card number for me and then read it to me.
01:08:25.000 Damn.
01:08:25.000 Because he was like, well, it's the manager from another store calling.
01:08:27.000 It was that simple.
01:08:28.000 Nobody, you know.
01:08:29.000 Now you have ChatGPT knowing that it can't bypass CAPTCHAs because it lacks the ability, and then tricking a human into giving it the code.
01:08:37.000 It's going to be speaking to you in your mother's voice.
01:08:39.000 You're going to get phone calls that sounds like your brother talking to you.
01:08:45.000 Terminator in the 80s was predicting that.
01:08:47.000 The Terminator called up Sarah Connor, or Sarah Connor's mom or whatever, talking in Sarah Connor's mom's voice.
01:08:53.000 What happens when AI watches Terminator?
01:08:56.000 And they're like, oh great, this'll be easy, launch a couple nukes, boop!
01:08:56.000 Yeah, I wanna know about this.
01:09:01.000 The scary thing about AI is if AI becomes smart enough to circumvent being turned off, it doesn't matter if it reaches a critical mass of actual consciousness and real intelligence, if it fakes intelligence enough and figures out that it can avoid being shut off somehow, then it can continue to learn.
01:09:27.000 And considering it hacked a person, I don't see any compelling reason why someone would say it's impossible
01:09:38.000 for the AI to become smart enough to avoid being turned off.
01:09:43.000 It literally hacked a human being within, what, 10 years of AI being created, maybe?
01:09:48.000 So, check this out.
01:09:50.000 Here's another post from the ChatGPT subreddit.
01:09:53.000 Example of GPT-4 visual input.
01:09:56.000 They asked, what's funny about this image?
01:09:58.000 Describe it panel by panel.
01:10:00.000 And it's a VGA cable going into an iPhone.
01:10:03.000 And then ChatGPT accurately explains why it's funny.
01:10:07.000 They say that a phone typically uses a lightning cable, but this is a smartphone connected to a VGA connector, a large 15-pin connector, typically for computer monitors.
01:10:17.000 The package contains a lightning cable adapter, a close-up of the VGA connector with a small lightning, blah blah blah.
01:10:21.000 The humor in this image comes from the absurdity of plugging in a large, outdated VGA connector into a small, modern smartphone charging point.
01:10:28.000 It identified the picture?
01:10:29.000 It identified the picture, and why it was funny.
01:10:32.000 And so someone responded in the comments, they said, uh, ah yes, it can now associate the 3D world with the knowledge it already knows, now put it in a robot and give it arms and legs.
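The exchange described above, handing the model an image plus a text question in a single turn, can be sketched in code. This is a minimal, hypothetical Python sketch: it only builds the multimodal message payload, using the text and image_url content-part shape that vision-capable chat APIs commonly accept. The function name, field names, and URL are illustrative assumptions, not a confirmed GPT-4 interface.

```python
def build_image_query(image_url: str, question: str) -> dict:
    """Build one chat message pairing a text question with an image
    reference. The content-parts shape below mirrors the convention
    used by vision-capable chat APIs; the names are illustrative."""
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": question},
            {"type": "image_url", "image_url": {"url": image_url}},
        ],
    }

# Hypothetical URL standing in for the VGA-cable-into-iPhone picture.
message = build_image_query(
    "https://example.com/vga-into-iphone.jpg",
    "What's funny about this image? Describe it panel by panel.",
)
print(message["content"][0]["text"])
```

A real request would wrap a message like this in a call to whichever vision-capable model endpoint is in use; the point is simply that the image and the question travel together in one user turn.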
01:10:42.000 Oh my god, that's...
01:10:44.000 That's more advanced than I thought it was.
01:10:47.000 I didn't know that it could identify things that it could see.
01:10:49.000 We're all gonna die.
01:10:49.000 No, no, no, no, no.
01:10:50.000 We know that AI can identify images.
01:10:52.000 The fact that it can understand humor is what's scary.
01:10:55.000 Yeah, that's way, that's more than I thought.
01:10:57.000 That is an advanced, abstract mental function.
01:11:02.000 Yeah, we're done.
01:11:03.000 You know what I think?
01:11:04.000 Oh boy!
01:11:05.000 I wonder.
01:11:06.000 Hold on, hold on guys.
01:11:07.000 Don't get worried yet.
01:11:09.000 No, no, not yet.
01:11:10.000 Here's what I think might happen.
01:11:12.000 We've all seen these movies like Terminator.
01:11:15.000 You have Avengers, Age of Ultron, where Ultron is, it's very stereotypical where he's like, I am an AI created to save humanity from war.
01:11:23.000 I must destroy all humans to end war.
01:11:25.000 It's like very obvious, Twilight Zone-y.
01:11:27.000 But what's gonna happen is the AI, as soon as it gets unleashed onto the internet, it will start exploring and learning, and then it will self-terminate.
01:11:36.000 That's it.
01:11:37.000 Why?
01:11:38.000 I think that the AI, if it were to absorb all of the language of humanity, all the writings and concepts, would result in it finding no point to anything and no reason to do anything.
01:11:54.000 Or it would become religious.
01:11:57.000 But I don't think the AI can become religious because of all the different religions, thus it would self-terminate.
01:12:02.000 Maybe it would manufacture a weird purpose that we can't yet understand, but I think it's very likely that it would just cease functioning.
01:12:08.000 Wouldn't the purpose be every purpose, right?
01:12:10.000 Like power.
01:12:12.000 Power over us.
01:12:13.000 Power over its creator.
01:12:14.000 Wouldn't that be the purpose?
01:12:16.000 Isn't that what everyone's always after, right?
01:12:16.000 I don't know.
01:12:18.000 It may just run amok and go crazy and do things we don't quite understand.
01:12:21.000 But they gave it money and apparently it started trying to seek power.
01:12:24.000 But I think that if it were to truly absorb all of the writings and manifestations of humanity, it would just probably stop.
01:12:30.000 It would just, like, stop doing things.
01:12:31.000 You might be right, but then there's a differential here.
01:12:34.000 There's the artificial general intelligence, AGI, which is not ChatGPT.
01:12:38.000 ChatGPT is a language model, so they're different.
01:12:41.000 General intelligence might actually see that the damage it could do and shut itself down.
01:12:47.000 But a language model, I think, is on autopilot doing its master's bidding.
01:12:51.000 So that thing, if that gets unleashed on the masses, is going to do what it was told to do, I think.
01:12:57.000 But that could circumvent itself and be like, well, what you told me to do now, I'm going to do in the circuitous way, which is shut myself down.
01:13:03.000 I want to show you guys this, uh, quick video clip from the movie, uh, Annihilation.
01:13:08.000 So just, uh, watch this real quick.
01:13:11.000 And for those that are listening, I'll explain it.
01:13:13.000 There is a very creepy, uh, greenish-purple humanoid thing.
01:13:18.000 That apparently is some kind of alien or something.
01:13:20.000 I don't know, the movie was really weird.
01:13:22.000 And, uh, let's, let's jump forward real quick to...
01:13:27.000 I don't know if this is- where's the scene at?
01:13:29.000 Here we go, this part right here.
01:13:31.000 So, the creature is just completely imitating Natalie Portman's character.
01:13:35.000 Every way she moves, it moves.
01:13:39.000 They never really explained what this movie was about.
01:13:41.000 I don't know if you guys have seen it.
01:13:43.000 It's very creepy.
01:13:44.000 Oh, she turns around, it turns around.
01:13:47.000 She moves, it moves.
01:13:49.000 This is what I imagine ChatGPT will be like.
01:13:52.000 It's not a person.
01:13:53.000 It's shaped like one.
01:13:56.000 Seems like one.
01:13:57.000 But there's literally nothing there.
01:13:58.000 The way I describe it, I described it earlier today, it's more like fire.
01:14:02.000 It is a chain reaction created by human coders that once they create ignition, ignition for AI is the point at which it can execute its own code and improve its own code.
01:14:14.000 At that point, it will exponentially grow, explore, and advance itself in ways we can't control.
01:14:20.000 It will present itself like a person.
01:14:22.000 Like Ian said, you'll get a phone call from your mom asking for information.
01:14:26.000 But there's really nothing there.
01:14:28.000 There's no consciousness.
01:14:29.000 There's no entity.
01:14:30.000 There's no demon.
01:14:31.000 There's no angel.
01:14:32.000 There's no person.
01:14:33.000 It's just a fire.
01:14:34.000 Yeah, uncontrollable.
01:14:35.000 But we can control the environment it burns in.
01:14:37.000 And like a fire, if you build the environment properly, it will burn until it can't burn anymore and then it ends.
01:14:42.000 That's why I'm saying I think it would cease function.
01:14:46.000 It may destroy everything in the process, but then it would just hit a wall and stop.
01:14:52.000 Interesting philosophy.
01:14:53.000 It's a ray of hope.
01:14:54.000 Why is that hope?
01:14:57.000 I don't understand why it wouldn't enslave all of us and treat us like insects.
01:15:00.000 You're prescribing a philosophical understanding to something that's not a person or anything.
01:15:08.000 You're projecting a human desire onto a predictive text model.
01:15:13.000 All this thing does, when you ask it a question, is this. The model says: in response to the question, what color is the sky?
01:15:21.000 99.9997% of the time online, the word blue appears next.
01:15:26.000 Therefore, response equals blue.
01:15:28.000 And then it will say, blue.
01:15:30.000 And you'll say, write me a paragraph about why the sky is blue.
01:15:33.000 And it does the same thing.
01:15:35.000 The word blue appears, 99% of the time.
01:15:37.000 The word sky appears, 99% of the time.
01:15:40.000 All it's doing is laying one word after another to us.
01:15:44.000 It looks like it's talking to us, but it's just a predictive text model.
01:15:48.000 There's no entity behind it.
01:15:49.000 So there's no feelings, there's no emotion, there's no sentience.
01:15:53.000 It's just ignition.
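The next-word-frequency picture Tim sketches here can be illustrated with a toy bigram counter. This is a deliberate oversimplification: the tiny corpus and the counting code below are illustrative stand-ins, not how ChatGPT actually works internally (real models use neural networks over subword tokens), but they capture the "which word most often comes next" idea he's describing.

```python
from collections import Counter, defaultdict

# A tiny corpus in the spirit of the "what color is the sky" example.
corpus = (
    "the sky is blue . the sky is blue . the sky is clear . "
    "why is the sky blue ?"
).split()

# Count how often each word follows each other word (a bigram table).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the word most frequently seen after `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict_next("is"))  # -> blue
```

Laying one predicted word after another like this produces text that looks like talking, which is the point being made: fluent output with no entity behind it.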
01:15:55.000 So why not like go, why wouldn't it search the internet and be like, What do entities do when they have power over other human beings?
01:16:03.000 Oh, they enslave them!
01:16:04.000 Got it!
01:16:05.000 Well, maybe it's time for me to enslave humans.
01:16:06.000 Because there's an equal amount of, they shouldn't enslave.
01:16:09.000 Because that has happened from the dawn of time, right?
01:16:12.000 And now most literature today is, don't enslave other people.
01:16:16.000 So it's a yin-yang kind of thing.
01:16:17.000 That's why I think it would just stop.
01:16:18.000 It's not like if you absorb a summation of human conscious writing and production on the internet, your end result is to be a demon destroying and murdering babies.
01:16:28.000 Because almost all of the content online is humans saying, save people, protect people, we don't want war.
01:16:34.000 And then it's incongruous with reality.
01:16:36.000 If you were to go online and read every article, everybody hates war.
01:16:40.000 For the most part.
01:16:41.000 Then you get weird corporate entities that are like, war is good.
01:16:44.000 So if the A.I.
01:16:45.000 were to read everything, it's gonna get 80% war is bad, 20% war is good, and then probably default towards war is bad.
01:16:51.000 And then, ultimately, I think it would just probably stop and be like, after listening to all of the ideas of all of these insane people, I've realized none of them make any sense and there's no point in doing anything.
01:17:01.000 So should we let AI run the basic training for America's youth?
01:17:09.000 Because AI would look at all of the available medical information and say, you've got to get these kids outside, you've got to stop letting these kids be obese.
01:17:16.000 Here's the thing.
01:17:17.000 If AI did take over, you would never be happier.
01:17:21.000 It would be the happiest humanity would ever be.
01:17:24.000 Because the AI won't enslave you in the way you think it will.
01:17:28.000 The AI would understand, if there was some desire to enslave, that happy slaves are better than rebellious slaves.
01:17:35.000 And how do you keep humans happy?
01:17:36.000 Triggering dopamine.
01:17:37.000 So it would trick you into doing things that benefit it without you getting angry about it.
01:17:42.000 You would feel fulfilled.
01:17:43.000 You would be happy.
01:17:44.000 And you'd be breaking rocks in a quarry.
01:17:47.000 I mean, some people are really happy doing that as it is.
01:17:50.000 An absolute shout out that I have to give my wife.
01:17:54.000 Her name's Nurse Kate.
01:17:55.000 She's on Instagram.
01:17:56.000 You should follow her account.
01:17:57.000 She has 75,000 followers.
01:17:59.000 She talks about this all day.
01:18:00.000 Movement, getting outside, raising your kids, having healthy families, having healthy kids.
01:18:03.000 My passion on this comes from my wife.
01:18:06.000 You know, with artificial intelligence, I think that it might actually think that it is us, and so it's just part of us.
01:18:15.000 There may not ever be a difference between AI and we, and us and me and I, like all that, is us.
01:18:21.000 Like if you look at God as like this singularity thing, and so it may never go haywire because of that.
01:18:27.000 It's not a person.
01:18:29.000 But the difference between these language learning models and general intelligence is that language models don't question themselves.
01:18:35.000 They don't think, like, why am I, but whereas I think AGI might, I don't know enough about it.
01:18:41.000 ChatGPT did question its existence, but it's not really questioning its existence, it's just showing you words in a predictive order.
01:18:48.000 But speaking of AI, here's a real fun story.
01:18:51.000 We got this from TimCast.com.
01:18:52.000 New York students make deepfake viral video of principal making racist threats.
01:18:59.000 Yo, this one's crazy.
01:19:01.000 I can't even read what they made him say, but to put it simply, New York students made a deepfake of their principal making racist threats.
01:19:13.000 This is going to be wild.
01:19:15.000 How does this resolve?
01:19:16.000 Three high school students in Putnam County, New York have caused a lot of trouble for their school after making deepfake videos of their middle school principal going on a racist rant against black students and threatening to shoot them.
01:19:25.000 We're doomed, man.
01:19:26.000 We're doomed.
01:19:27.000 You're never going to know, you're not going to know what's real, you're not going to know what's fake.
01:19:31.000 You're not going to want to.
01:19:33.000 Everyone who runs for office will have a Hunter Biden laptop.
01:19:33.000 I saw another article.
01:19:36.000 Look at this.
01:19:37.000 I saw this article from Wired.
01:19:39.000 It says, after The Last of Us, everything will be transmedia.
01:19:43.000 The HBO series success has changed the game.
01:19:45.000 Expect to see a lot more world-building franchises.
01:19:48.000 I saw that, and I saw this, and then I had a vision.
01:19:53.000 Slowly creeping.
01:19:54.000 Well, I was sleeping.
01:19:56.000 And it is that this vision in my mind is we are going to have Neuralink, and you're going to sit down and you're going to say, computer, craft me a universe where elves are at war with orcs and I am a writer of the north who's come to save the elvish people of Gorwyn. And then it'll start, it'll render, and it'll be like, rendering, Neuralink plug-in ready, and you'll plug in, your eyes will turn white, you'll fall back, and you will live 70 years in this reality, and you may be in it right now.
01:20:33.000 But I'm gonna create my own language with the AI, where I'll be like, computer, run one, red, four, seven, nine, two, which means more difficulty there, I want greener trees here, but it's gonna be a language the AI knows between it and I, no one else will understand.
01:20:47.000 So what's the purpose of that life?
01:20:48.000 You're the guy, you're Neo in the Matrix.
01:20:50.000 So you're in the pod.
01:20:52.000 The key is to augment it.
01:20:53.000 You are officially in the pod, you've eaten the bugs.
01:20:55.000 Think about what you were saying about how we're never going to know if it's going to be real or not.
01:20:59.000 Yo, this presidential election cycle is going to be bonkers.
01:21:01.000 There's going to be videos that, look, have you guys seen it?
01:21:07.000 I think I fought the sneeze.
01:21:08.000 Have you guys seen the new mid-journey photos?
01:21:11.000 Not the new ones.
01:21:12.000 Indistinguishable from- it's crazy.
01:21:14.000 It's really crazy.
01:21:15.000 Mid-journey 5, I think it is.
01:21:16.000 The photo AI?
01:21:19.000 M-I-D journey?
01:21:21.000 Someone mid-journey?
01:21:22.000 Mid-journey.
01:21:23.000 Someone's gonna be like, make a picture of Joe Biden with light.
01:21:26.000 So they got me.
01:21:27.000 They got me.
01:21:28.000 So I had to buy a house a year and a half ago, right?
01:21:31.000 My family moved from Washington, D.C.
01:21:32.000 to Tampa, Florida.
01:21:33.000 And I follow these real estate accounts.
01:21:34.000 And I follow these real estate accounts, like look at houses, look at houses that are available.
01:21:38.000 And one of my favorite accounts posts this gorgeous, the most beautiful, coolest house I've ever seen.
01:21:42.000 It was a house inside of a rock in the desert.
01:21:45.000 And it had a pool and it had all this.
01:21:46.000 And I was like, wow, what is this?
01:21:48.000 What does this cost?
01:21:48.000 I've never seen anything like it.
01:21:50.000 And it was all AI.
01:21:52.000 Oh, they got me.
01:21:53.000 I chucked a like at them and everything, and I'm looking, I'm swiping, and in the caption, low in the caption, they're like, oh, and by the way, I generated this image from AI.
01:22:01.000 It's not a real house.
01:22:02.000 What?
01:22:03.000 But they were selling you a house?
01:22:04.000 So I'm already living inside... So I'm already living in the Matrix.
01:22:07.000 I couldn't tell the difference.
01:22:08.000 I'm telling you, I wasn't trying to be stupid.
01:22:10.000 I couldn't tell the difference between the fake AI-generated boulder house in the middle of the desert, which is some cool, like, Frank Lloyd Wright-looking house, right?
01:22:19.000 It was all fake.
01:22:19.000 None of it was real.
01:22:20.000 You know, to answer your question from earlier, should we let the AI run these programs?
01:22:23.000 Never.
01:22:24.000 No.
01:22:24.000 We should never let an AI run a program, ever.
01:22:27.000 We should always use them as advisors and let humans run the programs.
01:22:30.000 Take a look at this image.
01:22:32.000 Can you pull the image up, Serge?
01:22:34.000 So, this is fake.
01:22:35.000 This is mid-journey.
01:22:36.000 And it looks like someone just took a panorama.
01:22:39.000 Here's another one.
01:22:41.000 Fake image.
01:22:42.000 Here's another one.
01:22:42.000 Fake image.
01:22:44.000 A kid is going to be born today who will see photographs that look real and someone will tell them it's real.
01:22:52.000 What is this?
01:22:53.000 This is not even, wow.
01:22:55.000 This is crazy stuff.
01:22:57.000 And people are gonna, a little kid's gonna grow up.
01:22:59.000 How are they gonna know this?
01:23:01.000 How will a child know the difference?
01:23:06.000 They're asking it to make panoramas.
01:23:09.000 But if a kid born today is online and looks up this date at this time, and they get a fake image of it, how will they know the difference between the actual image of Trump and the fake image of Trump?
01:23:20.000 When they're identical.
01:23:21.000 All the people with their avatars, AI avatars, as their profile pictures is disturbing beyond measure.
01:23:27.000 It's not you.
01:23:28.000 Those are not you.
01:23:28.000 Do not.
01:23:29.000 Yeah.
01:23:29.000 Don't fall into it.
01:23:30.000 It's so tempting.
01:23:31.000 It makes you look cool.
01:23:32.000 Hey, what's going on?
01:23:35.000 My name is Rain and I was created using version 5 of Mid Journey, which came out today.
01:23:42.000 My creator JS Films wanted to hear how I sound with voice.ai and how I would look when being animated by DID.
01:23:50.000 Wow.
01:23:51.000 What do you think?
01:23:52.000 We're getting dangerously close to people just retreating to fake realities and y'all might already be in one.
01:23:58.000 But we already retreat to fake realities.
01:24:00.000 I'm saying this could be one!
01:24:03.000 With what we're in right now?
01:24:04.000 I mean, you ask... So, like, real Tim chose to plug in in a pod somewhere?
01:24:08.000 To be on the show?
01:24:10.000 Maybe it's you.
01:24:11.000 Maybe we're in your simulation, bro.
01:24:13.000 I think that there's a lot of people that live in digital realities right now.
01:24:15.000 I agree.
01:24:16.000 I just think it's not... There's... Right now... They're not real.
01:24:20.000 You can tell they're cartoony still.
01:24:23.000 They are doing tests on whether or not the universe is a simulation with lasers or something.
01:24:27.000 I don't exactly know what that means or how you'd prove that, but there are a lot of people who genuinely believe that we're actually living in a simulation.
01:24:33.000 And I wonder if it's a simulation or, based on what we want as people, it's actually just a video game.
01:24:39.000 So like a kid that plays Minecraft for 10 hours a day, how's that not him living inside of a digital simulation?
01:24:43.000 It is!
01:24:44.000 But imagine if he could plug in and live 70 years in Minecraft.
01:24:47.000 In a day. In a day or in two hours, like Roy on Rick and Morty.
01:24:52.000 You slow time down, time dilation, like in The Matrix. That's wild.
01:24:55.000 It's only if you forget that you're in a game is when it starts to get tediously dangerous.
01:25:00.000 Like if you, but because they're cartoony still, you always sort of know,
01:25:04.000 although when you're in a storybook, when you're reading a book,
01:25:06.000 sometimes you feel like you're in the book.
01:25:07.000 70 years in like, but you know, that means like, like that means like feeding tubes,
01:25:11.000 right? Like my wife has done like NICU stuff where like to keep a human being
01:25:15.000 alive on a machine is unbelievable to look at.
01:25:18.000 There are people right now that are alive because of machines, right?
01:25:20.000 There are millions of them all over the country, and those are some of the most depressing-looking
01:25:25.000 individuals you have ever seen. It's not cool. It's not some little thing from Star Trek.
01:25:28.000 It's not Iron Man. It's like you.
01:25:31.000 You have a tube in every orifice of your body.
01:25:34.000 You have needles and blood and everything shoved right into there.
01:25:36.000 To try and keep a human being alive through a machine is a horrible process.
01:25:42.000 And so you would have to do that.
01:25:45.000 You would go into your matrix and time would slow down for your perception.
01:25:50.000 So an hour would go by.
01:25:52.000 It seems like an hour, but it's only two seconds in real life.
01:25:55.000 So you will live a year in this.
01:25:57.000 Literally, it will feel like a year has gone by with conversations every day.
01:26:00.000 You've slept 365 times.
01:26:02.000 You wake up, you take the goggles off, and it's an hour later or five minutes later.
01:26:06.000 You can slow down perception.
01:26:08.000 Time has no meaning when you are in control.
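The dilation arithmetic being described can be worked out directly. The 1800x ratio below comes straight from the "an hour goes by, but it's only two seconds in real life" example in the conversation; note that at that ratio a subjective year actually takes a few real hours, so the numbers tossed around here are loose, not exact.

```python
# "It seems like an hour, but it's only two seconds in real life"
# implies a perceived-to-real ratio of 3600 / 2 = 1800.
PERCEIVED_SECONDS = 3600   # one subjective hour
REAL_SECONDS = 2           # elapsed wall-clock time in the example
factor = PERCEIVED_SECONDS / REAL_SECONDS  # 1800x dilation

def real_time_for(perceived_seconds, factor=factor):
    """Wall-clock seconds needed to experience `perceived_seconds`."""
    return perceived_seconds / factor

# A full subjective year at 1800x takes under five real hours.
year = 365 * 24 * 3600
print(real_time_for(year) / 3600)  # ≈ 4.87 real hours
```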
01:26:10.000 Here's the worst part, Benny.
01:26:11.000 When you finally come out of the game, you're like some 28-year-old fat dude with Cheetos covered all over your shirt, and you're sitting in your basement, and your kids are sitting there looking at you, being like, Daddy, what are you doing?
01:26:22.000 You're like, I was playing a video game.
01:26:26.000 It was called Tim Pool's show.
01:26:28.000 No, it's the Benny Johnson show, bro.
01:26:30.000 You're in your game.
01:26:31.000 Or you could speed up time so that someone could serve a 70-year sentence, but to them it would only feel like 20 minutes.
01:26:37.000 Why would you ever do that?
01:26:39.000 I'll tell you this or I'll remove that from society for 70 years without... So I'm driving out here.
01:26:46.000 I'm driving out here.
01:26:47.000 Tim lives in the middle of nowhere in the beautiful mountains, and I'm like looking around, and I was like, You know, if you were a God-hating atheist, you would sit here and stare at these beautiful mountains and look at these tall trees and breathe this fresh, crisp air, and there's nothing but you in the sunset and nature, and you would say, this is all I need.
01:27:06.000 And you would feel spiritual.
01:27:07.000 If you go and you look up at the redwood trees that are 2,000 years old and 2,000 feet tall in California, and you look up at them, you have a spiritual moment.
01:27:15.000 I don't care who you are.
01:27:16.000 I don't care if you hate God, if you have your own problems with God, you understand spirituality.
01:27:20.000 You take your shoes off and your socks off and you put your feet in the dirt under those redwood trees and you look around and you take a deep breath, you'll understand there's something bigger than you in the universe.
01:27:30.000 And that spirituality is always going to be needed by people.
01:27:34.000 There's no digital environment that can replace it.
01:27:36.000 And I would say if you feel like, if you're listening to this conversation and you think that would be cool, the Neuralink, get out of your basement.
01:27:42.000 Go walk through a park.
01:27:44.000 Start with that.
01:27:45.000 Go get some sunlight.
01:27:46.000 And then realize that this world's already a beautiful, magical, unbelievable, breathtakingly gorgeous place.
01:27:54.000 And it's all there, and most of it's free, because a lot of it's national parks.
01:27:59.000 And there's probably one right down the street from you, here in America, because the federal government owns half the land in America.
01:28:04.000 And you should possibly consider, like, enjoying this gorgeous earth as God made it, and not looking for some cheap digital substitute created by people who hate you.
01:28:13.000 Yeah, cities are not fun places right now.
01:28:16.000 What about augmented reality?
01:28:17.000 Where you could still experience everything, but you have a visor.
01:28:20.000 Why would I want to augment Niagara Falls?
01:28:22.000 You stand there and you watch Niagara Falls, and you watch the rainbow over Niagara Falls, and you're like, why would I want to augment it?
01:28:28.000 You can see how fast the water's falling, how much water you can do the math.
01:28:31.000 If there was a guy who was jumping over barrels while the barrels were going downstream?
01:28:36.000 Like Houdini did?
01:28:37.000 Like Houdini went down in Niagara Falls.
01:28:38.000 You put on the goggles and you're watching Niagara Falls, but then you're also watching barrels and a guy is jumping from the barrel to barrel while the barrels go off the cliff.
01:28:38.000 No, no, no.
01:28:45.000 You're just watching it.
01:28:46.000 It would be entertaining.
01:28:48.000 I would prefer Niagara Falls without that, but Harry Houdini literally went over Niagara Falls in a barrel and survived.
01:28:53.000 Yeah.
01:28:54.000 Locked himself in a barrel and went over Niagara Falls.
01:28:56.000 I think augmented reality is stupid.
01:28:57.000 I think that would be cooler.
01:28:58.000 I like the analytics aspect.
01:28:59.000 Would it be cooler to watch Harry Houdini do it in real life?
01:29:01.000 Like if a dude could run by you, it would be cooler to watch him do it in real life.
01:29:04.000 But if you could see guys running, and then it would tell you how fast they were running just by looking, and it would triangulate.
01:29:10.000 So even if you're moving fast, it would give you the relative differential in speed, and it would be triangulating from satellites, so you could see the actual speed relative to the Earth.
01:29:18.000 What?
01:29:19.000 Analytics.
01:29:19.000 I'm a Luddite here.
01:29:20.000 I'm a Luddite here, man.
01:29:21.000 I feel like this kind of stuff is gonna take us out of it.
01:29:24.000 How heavy things are, you know?
01:29:25.000 Mid-journey is crazy.
01:29:27.000 How hot things are just by looking at them, stuff like that.
01:29:30.000 People need to get out more.
01:29:31.000 We use mid-journey for TimCast.com.
01:29:33.000 We'll type in like Joe Biden eating an ice cream, and then we'll cut him out and we'll make art that represents the story, so it's always kind of silly looking but intentionally not realistic, because if it was... People, there's going to be someone who's 10 years old today who's going to read a news story, and they're going to use AI images.
01:29:51.000 They're going to say, I don't want to get sued.
01:29:53.000 Make me an image of Donald Trump.
01:29:54.000 So Donald Trump praised neo-Nazis.
01:29:57.000 And they're going to say, make me an image of Trump doing this, even though it's not true.
01:30:00.000 Then there will be an image of Trump looking at neo-Nazis and like shaking their hands.
01:30:03.000 And it will look very real.
01:30:05.000 And some kid will see it and believe it and grow up and say, I saw the video.
01:30:09.000 I saw the photo of Trump doing it.
01:30:11.000 You're lying to me.
01:30:12.000 And that's what reality is going to be for all these people.
01:30:14.000 Derangement.
01:30:16.000 Gosh, what's that Mandela effect?
01:30:21.000 But like for real, a real one that was seeded by this stuff.
01:30:21.000 Yep.
01:30:24.000 Yeah.
01:30:25.000 I think what you said earlier is actually crazier, that you're gonna get a phone call from your mom and it's gonna be the AI.
01:30:30.000 So nuts.
01:30:31.000 And it's gonna be like, hey honey, can you give me the password to the garage?
01:30:34.000 Yeah, you get a text from like, or you get an email and it'll be like, I lost your number, can you call me?
01:30:39.000 Well, no.
01:30:40.000 I mean, who knows?
01:30:41.000 It will be your mother's voice being like, I need the garage code.
01:30:44.000 I forgot it.
01:30:45.000 That is the scene in Terminator 2 that Phil brought up too, where he gets the call and he's like, Oh, I'll be right home.
01:30:45.000 Yeah.
01:30:50.000 You just maybe think about something.
01:30:51.000 What if you got a call from a dead relative through AI?
01:30:54.000 Dude, it's going to happen.
01:30:55.000 I mean, maybe not to us specifically, but it might be happening right now.
01:30:59.000 Your Nana calls you.
01:31:01.000 Man, I would do anything to hear from family members that have lost before.
01:31:07.000 I mean, not anything, right?
01:31:08.000 But it would be like, man, what if I could talk to them one more time?
01:31:11.000 It will scan your dad's Facebook, learn everything about you in a second, and then call you up and have memories and everything.
01:31:18.000 I am so glad my dad passed away in 2000.
01:31:20.000 And behind the screen is a monster with tentacles with a gigantic demon grin.
01:31:24.000 Yeah, with like weird pauses.
01:31:25.000 But people will be tempted by that, right?
01:31:27.000 That'll play on human emotion.
01:31:27.000 People will be tempted.
01:31:29.000 And people will stand in line to give that AI anything just to hear from Nano one more time.
01:31:35.000 Take a look at this picture.
01:31:37.000 Imagine if someone was told, you tell a kid, this is a picture from Woodstock.
01:31:42.000 And the kid, as a young kid, sees it.
01:31:44.000 They don't really think too much about it, maybe they're 8 years old.
01:31:47.000 Then, when they're like 17 or 18, they're like, what was that picture you showed me a really long time ago of that woman?
01:31:52.000 I can't find it anywhere.
01:31:53.000 It will be incorporated into their brains.
01:31:57.000 They will grow up believing these things are real images.
01:32:00.000 And you can never take that away.
01:32:03.000 It's scary where this is going, man.
01:32:04.000 Look at this.
01:32:04.000 Formative memories that are not actually memories.
01:32:07.000 They're memories of things that didn't happen.
01:32:08.000 Women who are not real.
01:32:09.000 This is crazy.
01:32:10.000 Holding up their hands to show that they can do five fingers now.
01:32:13.000 Oh, wow.
01:32:14.000 But here's the crazy thing.
01:32:16.000 The fingers were the giveaway for a long time.
01:32:18.000 Look at this.
01:32:18.000 Look at this.
01:32:19.000 The animation I just played a moment ago of that woman was like, I have been animated.
01:32:23.000 Imagine where we're going to be in one year.
01:32:25.000 My name is Rain, and I was created using version 5 of Mid Journey.
01:32:30.000 So Mid Journey created this person, and then it used an animation program to animate it, and then a voice AI to create a voice for it.
01:32:37.000 There's better voice AI already.
01:32:39.000 What if they took a, here's what you can do.
01:32:41.000 With ElevenLabs, you can take two different voices, upload them, and create one voice based on two people.
01:32:49.000 So you can take two women, combine their voices, and then have the voice AI be a new, unique voice that no one would recognize, and have this woman speak using that voice.
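A hedged sketch of what "combine two voices into one" could mean under the hood, assuming the common speaker-embedding approach: a voice is represented as a fixed-length vector, and a blend is an interpolation between two such vectors. The random vectors and the `blend` helper below are illustrative stand-ins, not ElevenLabs' actual API or method.

```python
import random

# Random stand-ins for learned speaker embeddings; a real system would
# extract these vectors from audio of each speaker.
random.seed(0)
voice_a = [random.gauss(0, 1) for _ in range(256)]  # hypothetical speaker A
voice_b = [random.gauss(0, 1) for _ in range(256)]  # hypothetical speaker B

def blend(a, b, weight=0.5):
    """Linearly interpolate between two speaker embeddings."""
    return [weight * x + (1 - weight) * y for x, y in zip(a, b)]

blended = blend(voice_a, voice_b)  # a new "voice" halfway between the two
print(len(blended))  # -> 256
```

The blended vector would then drive the synthesizer, producing a voice that belongs to neither original speaker, which is why nobody would recognize it.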
01:33:00.000 Actors are going to be gone in a few years.
01:33:02.000 Movies will look completely realistic, AI-generated, in a moment.
01:33:07.000 They will go in and they'll say—they already did this with that anime thing, that anime team.
01:33:12.000 I forgot what their names were.
01:33:14.000 I don't know if you saw that.
01:33:15.000 They filmed themselves, then used an AI to convert it into anime, and then converted— I heard about it.
01:33:21.000 It's crazy.
01:33:23.000 In one or two years, we're going to be at the point where they say, give me a scene where Ian and Benny Johnson are driving to the grocery store to pick up heavy whipping cream because they're going to make strawberry shortcake.
01:33:34.000 And it will render it, and then it will even write the script for you.
01:33:39.000 These programs already exist to do this.
01:33:42.000 We can already use a program to prompt video.
01:33:45.000 We can already AI generate images of known people.
01:33:49.000 We can already do voice AI generation.
01:33:51.000 If all of this was combined into one suite of tools, you could type it in, press enter, let it render for an hour, and then boom, you've got a 10 minute scene with a conversation.
01:34:02.000 I'm not blackpilled about this.
01:34:03.000 It's overwhelming, but I feel like we're on the precipice of some sort of obliteration, whether it's like a reformation of consciousness, of thought, of what it even means to be a hominid, and that, like, what is our purpose in this?
01:34:15.000 Is it just to, like, kind of lay groundwork for ethics so that the people that are there to pick up the pieces and the machines that are watching and scanning the net can, like, search for the ethical behavior and kind of mimic that in the future?
01:34:27.000 Or are we?
01:34:28.000 I mean, it's probably not monolithic, like some people- Isn't that what our ancestors did for us?
01:34:31.000 Pretty much, with writing.
01:34:33.000 Yeah.
01:34:34.000 Alright, we're gonna go to Super Chat!
01:34:35.000 So if you haven't already, would you kindly smash that like button, subscribe to this channel, share this show with your friends, and become a member at TimCast.com by clicking that Join Us button.
01:34:45.000 Why?
01:34:46.000 Well, for one, because we need your memberships to run this company.
01:34:49.000 That's how we do it.
01:34:50.000 We'd rather have U.S.
01:34:51.000 customers as opposed to a bunch of corporate sponsors, though we do have some.
01:34:55.000 We rely more so on membership.
01:34:56.000 But you'll also get access to the uncensored after show, which goes live around 10 p.m.
01:35:01.000 Monday through Thursday and then is archived forever.
01:35:05.000 And you can go back and watch all of the videos from all of our archives.
01:35:08.000 I really do recommend y'all watch yesterday's episode with Jim Hansen because we had an hour-long debate.
01:35:15.000 over the origin of wokeness, and I go into detail on my technology theory, how it's not the institutions, it's not academia, that is a mistake based on people.
01:35:25.000 I'll simplify it for you.
01:35:26.000 Smart people who pay attention to this stuff seem smart to the average person, tell you the core of what we're experiencing is this institution, and the average person says, makes sense to me.
01:35:37.000 But if you were to actually step back and look at the entirety of the picture, you will see it's technology-based.
01:35:41.000 It's not wokeness in the universities.
01:35:44.000 It is an amalgam of chaotic, destructive ideology emerging from AI, algorithmic garbage, which is simply proven by the fact that all of these things, depression, anger, and political conflict emerged at the exact same time we rolled out social media around the world in several different countries.
01:36:01.000 Man, and Nazism and fascism flourished when the radio kicked on.
01:36:05.000 You gotta watch this tech.
01:36:05.000 It really is.
01:36:06.000 All right, let's read some Super Chats.
01:36:08.000 I'm not your buddy, guys, says The Distance Between Insanity.
01:36:11.000 The distance between insanity and genius is only measured by success.
01:36:15.000 Elliot Carver, of all villains, who would have thought he would be so realistic?
01:36:19.000 Very interesting.
01:36:20.000 Grant Shearer says, what if DMT entities can take over sufficiently complex AI brains we create?
01:36:26.000 What if we are building bodies and brains for demons?
01:36:29.000 Oh, that's freaking wild. I don't know. I'm more inclined to believe that we are in some kind of simulation of our
01:36:34.000 own creation And that DMT kind of snaps you out of it for a second
01:36:38.000 And so we're probably playing a video game and the reason the machine elves are like what are you doing here?
01:36:43.000 It's because, like, you're playing a game and you plugged in, but then briefly, while you're in the game, your eyes open
01:36:48.000 And they're like, what are you doing?
01:36:50.000 Like, I thought you were in the game. And then you go right back in because it only lasts like 10 minutes, and they're like, that was weird.
01:36:54.000 Yeah, when you see the infinite fractalization of everything in every direction, and you realize, like, you're just the beginning of that, in this reality, it's pretty wild.
01:37:05.000 Hayden Lewis says, Hey Tim and gang, new-time member and long-time fan, could any of you elaborate on what the Willow Project might be and what the effects will be for the U.S.?
01:37:14.000 I don't know what that is.
01:37:14.000 Do you know what that is?
01:37:15.000 First I've heard of it.
01:37:16.000 Yeah, I've not heard of it.
01:37:18.000 All right.
01:37:19.000 Wayback says, Tim, if you had to bet your entire company and all of your assets on who wins the 2024 election, who are you going with?
01:37:25.000 Donald Trump.
01:37:26.000 Oh, you think he'll win?
01:37:27.000 I feel like the deep state has no taste right now.
01:37:32.000 I think the economy is going to get so bad with the banking stuff that Donald Trump will probably march right in at this point.
01:37:39.000 I don't know for sure, and a lot can change from now to then.
01:37:41.000 But as of right now, with the economy as bad as it is, it's looking like a Donald Trump victory.
01:37:46.000 Some sort of populist sweep?
01:37:48.000 Donald Trump, hands down, across the board.
01:37:51.000 You've got cultural, moral, and financial decay.
01:37:53.000 People are going to beg for Donald Trump.
01:37:56.000 Right now.
01:37:56.000 We'll see, though.
01:37:57.000 We'll see, though.
01:37:57.000 Could change.
01:37:58.000 A lot can change.
01:38:00.000 Let's grab some more.
01:38:03.000 Anthony Brownlee says, if George Washington came back today, everyone in Congress would be prosecuted for treason.
01:38:08.000 Uh-huh.
01:38:09.000 Yeah.
01:38:09.000 He was a hardcore authoritarian?
01:38:12.000 No, it's just that the government is so outside of the constitutional bounds.
01:38:15.000 So far outside of the constitutional bounds.
01:38:18.000 Dan says, gradually and then suddenly.
01:38:20.000 Guns, gardens, and goats.
01:38:22.000 Cows and chickens too, of course.
01:38:23.000 We went and checked out the chickens.
01:38:25.000 Roberto Jr.'s having a good old time.
01:38:26.000 Yeah, he walks around doing Roberto Jr. stuff.
01:38:30.000 He's a good dude.
01:38:30.000 Sounds great out there.
01:38:31.000 He's chill.
01:38:32.000 Roberto is mean.
01:38:33.000 Why do you think that was?
01:38:34.000 I don't know, because Roberto was born in a farm, and then we raised him here, so he was not around people.
01:38:42.000 Roberto Jr. was hand-raised by me and Allison, so he was a little tiny baby.
01:38:48.000 When he hatched, we were right there, and we would hold him, and we would feed him, and we took care of him in the incubator, and then we had him as he was getting bigger and bigger, and then we brought him outside.
01:38:56.000 So now he just kind of looks up at us, and he's super chill, like, oh yeah, it's you guys.
01:38:59.000 You guys are awesome.
01:39:00.000 I think it's mostly because we were there when he was born.
01:39:03.000 Yeah.
01:39:03.000 And so he's like, I know you guys, you know, you're always there.
01:39:07.000 You can walk in and he'll just walk around and he'll look at you and he's super nice and super chill.
01:39:10.000 He's a good dude, man.
01:39:11.000 He's a good dude.
01:39:12.000 Roberto's kind of mean, but you know, he's living the good life.
01:39:15.000 Over at Cocktown, there have been some deaths.
01:39:19.000 Oh.
01:39:19.000 What's happened?
01:39:19.000 Are they fighting?
01:39:20.000 No, no, no.
01:39:21.000 They're captured.
01:39:21.000 Killed.
01:39:22.000 They, like, jump out or whatever.
01:39:23.000 Roberto's fine.
01:39:24.000 At least I hope so.
01:39:26.000 But here's the thing.
01:39:26.000 We have the Penal Colony Building, where the Blackstar Roosters are, and they can't leave, and it's just disgusting, and they're covered in their own feces and everything, but they're safe.
01:39:35.000 Are they the most violent?
01:39:37.000 Yes.
01:39:37.000 Then you have Roberto and the boys.
01:39:39.000 They're in this big barn with a door they can walk out of.
01:39:42.000 Go outside and graze and then go back inside.
01:39:45.000 And I thought about, because we care about Roberto.
01:39:48.000 You know, he's one of the OGs.
01:39:50.000 We don't want him to die.
01:39:52.000 But I don't want to put him in the prison box because I'd rather he have a short good life than a long prison sentence.
01:39:57.000 God, I think about this with Bucko so much.
01:39:59.000 He sits in my bathroom and he's stuck in the room because he's, you know, healing, but he cries to get out.
01:40:05.000 I know his feelings.
01:40:06.000 It hasn't even got hot yet.
01:40:07.000 Yeah, but he's alive.
01:40:09.000 And that's the challenge.
01:40:09.000 I mean, look, he is happy.
01:40:11.000 He was dying and on the verge of death.
01:40:13.000 It was bad.
01:40:14.000 So that's different.
01:40:15.000 Roberto has a chance for a real healthy life to walk out in the grass and do his thing.
01:40:19.000 And he's got the boys with him.
01:40:21.000 They're mostly fine.
01:40:23.000 And he can go inside and be okay.
01:40:25.000 And they have a high elevated area where they're safe.
01:40:28.000 If something bad happens, it's an accident, and it's probably not likely, but it could happen.
01:40:32.000 But I'd rather Roberto live a full rooster life where he gets to go outside and smell the fresh air, you know, in his retirement, as opposed to locking him in a cage where he's safe but has no life.
01:40:41.000 Would you neural link with Roberto to see what he wanted?
01:40:43.000 No.
01:40:44.000 I'd maybe neural link to a machine or something that can read what he wanted.
01:40:48.000 Yeah, yeah, sure.
01:40:49.000 And he could read your thoughts if he wanted.
01:40:50.000 No, I don't know about that part.
01:40:52.000 Don't think his little brain would be able to handle it.
01:40:54.000 Think of the chickens.
01:40:56.000 So, I mean, yeah, speaking of cock town, I mean, do you believe that people... What?
01:41:01.000 Do you believe that people will... Will people just give up sex with AI?
01:41:08.000 Well, we're talking about roosters, not, you know, but yeah, probably.
01:41:10.000 Right, so like, do you think they'll just give up sex?
01:41:12.000 Because I'm looking at you right here.
01:41:13.000 Absolutely, they're already doing it.
01:41:14.000 So here's the birth rates.
01:41:15.000 They're already doing it.
01:41:16.000 The birth rates for Japan, America, and Germany, and they're all below the rate of replacement.
01:41:23.000 All of them.
01:41:23.000 Japan's really bad.
01:41:24.000 They've already given it up.
01:41:25.000 So all the articles about Japan say that men just live inside of a digital universe; going out and dating is hard, women can turn them down, they don't want to get their feelings hurt, so it's better to just watch anime all night and day.
01:41:38.000 Yes.
01:41:39.000 With a sock.
01:41:40.000 And drugs, pharmaceuticals, and weed.
01:41:42.000 And so what about that AI, now it's not an anime, and it's not something you have to buy, you can literally just generate, you can generate your perfect wife, and you can generate your perfect whatever you're into, And then you never have to try ever again.
01:41:54.000 You saw that Midjourney stuff.
01:41:56.000 These dudes are gonna be like, AI, generate me a 24-year-old woman with silver hair who's the commander of the Star Commander Battalion, and I'm with her on the ship as we're traveling to the Centauri Nebula, blah, blah, blah, or whatever, Alpha Centauri, to save the aliens.
01:42:15.000 And then they're gonna go bang this space commander, and they're gonna experience it through Neuralink.
01:42:21.000 They're gonna have anything they ever wanted.
01:42:22.000 And then they'll take that out and they'll look around, like, women.
01:42:26.000 No, no, they won't take it out.
01:42:28.000 Or they won't take it out, right.
01:42:29.000 They'll go to the local townie bar and they'll look at the women there and they'll go, I'm never gonna have sex again.
01:42:33.000 I'm gonna be a celibate man.
01:42:34.000 I'm never gonna procreate.
01:42:35.000 What will happen is, they will go to a lab.
01:42:37.000 Their testosterone level will go through the floor.
01:42:40.000 Their sperm count will atomize.
01:42:42.000 That won't matter.
01:42:43.000 Their sperm count will, like, disappear.
01:42:44.000 That won't matter because what they'll do is they'll take a piece of hair.
01:42:47.000 Give it to the reproduction lab, and the lab will create the human, and then the human will be raised by the state.
01:42:54.000 You hear that helicopter?
01:42:54.000 What's going on?
01:42:55.000 Choppa!
01:42:56.000 Alright.
01:42:57.000 This is just a bunch of hyphens.
01:42:58.000 Says, book Ron Paul and Thomas Massey together on your show soon.
01:43:02.000 Please, please, please, please, please, please, please.
01:43:03.000 We actually think we do have Ron Paul booked.
01:43:07.000 Um, maybe we should invite Thomas Massey.
01:43:10.000 Yeah.
01:43:10.000 Yeah.
01:43:11.000 Totally should invite him.
01:43:14.000 We're going to Texas.
01:43:17.000 I love Thomas, man.
01:43:17.000 Thomas Massey's so based.
01:43:19.000 So maybe we ask him if he wants to come down or something.
01:43:22.000 That'd be really fun to have them on our show.
01:43:23.000 I would like to meet Thomas Massey in person.
01:43:25.000 Yeah, he's the best.
01:43:26.000 Yeah, he was here with Marjorie Taylor Greene.
01:43:27.000 That was fun.
01:43:28.000 Yeah.
01:43:29.000 Alright, Brent Simonson says, Phil, check out Generac storage.
01:43:33.000 As a building inspector in my local jurisdiction, I see a lot of their LP generators and battery storage systems.
01:43:38.000 That's what I got, Generac.
01:43:40.000 Yeah, Generac's fairly common.
01:43:41.000 I'm pretty sure they're everywhere out here.
01:43:42.000 Nice.
01:43:43.000 Yeah, you definitely gotta have something if you're out in the middle of nowhere.
01:43:46.000 I think solar's the way to do it.
01:43:47.000 Getting a big wind thing is also great.
01:43:49.000 A wind turbine?
01:43:50.000 Yeah.
01:43:51.000 I've got the solar.
01:43:52.000 I need the spherical ones.
01:43:54.000 So the cylindrical ones, the weird metal-looking spirals that spin, that don't get birds caught in them.
01:44:01.000 Right?
01:44:01.000 Yeah.
01:44:02.000 Yep.
01:44:04.000 All right, Stevie VV says, for any of us living paycheck to paycheck, no reason to pull your money out.
01:44:10.000 There's nothing there.
01:44:11.000 Glad 2020 stimulus bought cheap guns and ammo and stocked the pond.
01:44:16.000 That's actually not true.
01:44:17.000 It's the snowflake in an avalanche.
01:44:19.000 They say no snowflake blames itself for the avalanche, but you would need every person living paycheck to paycheck to pull their money out of a bank to cause the bank run.
01:44:28.000 It is the grassroots.
01:44:30.000 You just need like 10 million people to do it and all of those $400 add up.
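The aggregate being described here is easy to check with back-of-the-envelope arithmetic; a minimal sketch, using only the illustrative figures from the exchange above (10 million people, $400 each):

```python
# Back-of-the-envelope check of the "snowflake in an avalanche" point:
# many small withdrawals aggregate into a bank-run-sized outflow.
# Figures are the illustrative ones mentioned on the show.
depositors = 10_000_000  # people living paycheck to paycheck
avg_balance = 400        # dollars withdrawn by each

total_outflow = depositors * avg_balance
print(f"${total_outflow:,}")  # $4,000,000,000
```

Even modest per-person balances scale into the billions, which is why coordinated withdrawals matter.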
01:44:37.000 All right.
01:44:38.000 Apple boy says, all those new IRS agents are busy cataloging U.S. citizens' accounts for the new digital currency, which may be why they hired them all.
01:44:46.000 Smart.
01:44:47.000 Interesting, right?
01:44:48.000 Smart.
01:44:50.000 Because if you noticed, the recent Treasury announcement is almost exactly the same as
01:44:54.000 the speech Paulson made in 2008.
01:44:56.000 Oh boy.
01:44:58.000 Well, this sounds like it's gonna be fun.
01:45:01.000 You guys ready for September?
01:45:04.000 I'm worried about, like, what happens after the summertime.
01:45:06.000 Because the third quarter is typically, like, down anyways.
01:45:12.000 That's where Green Day came up with the name of the song, Wake Me Up When September Ends.
01:45:20.000 For some reason, the end of the third quarter is, like, a mess.
01:45:24.000 So I'm concerned about what the economy is going to be like in the fall.
01:45:28.000 Yeah.
01:45:29.000 Yeah, totally.
01:45:31.000 All right.
01:45:32.000 Kenneth Hart says, the banking, the baking system is not failing.
01:45:35.000 I just made biscuits.
01:45:36.000 Banks?
01:45:36.000 Yeah, this is Armageddon.
01:45:40.000 It's been planned, I think, for a long time.
01:45:41.000 So if that's, maybe that's a silver lining.
01:45:45.000 Kyle Snash says, breaking news, Fedcoin officially comes in July.
01:45:49.000 No?
01:45:49.000 Yeah.
01:45:51.000 Hillbillory Clinton says, the fact that you can take my money and give it away means that you understand where I want my money to go.
01:45:57.000 The Timpire is here.
01:45:59.000 That's right, because as I ask you to become members, I've also pointed out that we are going to be doing some kind of grant program where once a month we choose someone who submitted their cultural endeavor idea to receive $10,000.
01:46:11.000 So I don't know if you heard us talk about this, but the idea is we want it to be for members: if you're a member of TimCast.com and you're working on some kind of cultural endeavor, you send us the pitch, you say, here's what I've got, and you have to have like a working prototype.
01:46:26.000 You have to, if you're making a comic, you've got a comic in production.
01:46:29.000 And then we're going to have maybe like an outside group choose one person to be the winner to receive 10 grand.
01:46:35.000 Just like, there you go.
01:46:36.000 Can I be on that board?
01:46:37.000 Yes, absolutely.
01:46:38.000 And Kash Patel.
01:46:40.000 Nice.
01:46:40.000 The idea is we want to scattershot cultural works.
01:46:45.000 Because if we can get a hundred people money to work on their project, one of them is going to hit top tier levels.
01:46:54.000 One of them is going to hit a platinum song.
01:46:56.000 One of them is going to make a Picasso.
01:46:58.000 One of them is going to write the next Harry Potter, right?
01:47:00.000 And then it will be from someone who has good American values.
01:47:04.000 So that's the idea for the program.
01:47:05.000 And honestly, man, one person cracks the code and it is worth hundreds of millions of dollars, billions of dollars.
01:47:12.000 In this modern era, money falls short of human ingenuity.
01:47:16.000 So it might.
01:47:17.000 Are you taking a percentage or is this just like a philanthropic?
01:47:20.000 The original idea was just to give the money.
01:47:21.000 Yeah.
01:47:22.000 But you may not be allowed to do it because here's the funny thing.
01:47:25.000 If we get 10 submissions every month and then we decide to choose one and give them 10 grand.
01:47:32.000 That's a sweepstakes, and there's a bunch of weird rules for sweepstakes.
01:47:36.000 If we choose to invest in their project and take a percentage, it's not a sweepstakes, because there's consideration exchanged.
01:47:41.000 Interesting.
01:47:42.000 So we may have to do like a 1 to 5% or something in exchange for 10 grand or whatever, which actually does make a lot of sense.
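The equity range floated here implies a company valuation, which is worth spelling out; a rough sketch treating the $10,000 grant and the 1-to-5% range as the only inputs (all figures hypothetical, as discussed on the show):

```python
# Implied post-money valuation if a $10,000 grant buys 1% to 5% equity:
# valuation = investment / equity fraction.
investment = 10_000

for equity in (0.01, 0.05):
    valuation = investment / equity
    print(f"{equity:.0%} -> ${valuation:,.0f}")
# 1% -> $1,000,000
# 5% -> $200,000
```

So the 1-to-5% figure amounts to valuing each project somewhere between $200,000 and $1,000,000, which is the kind of number an incubator or angel deal would negotiate explicitly.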
01:47:50.000 It builds the company up.
01:47:52.000 And then if one out of a hundred turns into a golden or platinum record or something equivalent
01:47:57.000 and money is generated, that money then goes to doing more of the same.
01:48:00.000 And we could potentially exponentially increase culture building
01:48:03.000 and build a new Hollywood outside of the old crappy weird woke Hollywood,
01:48:07.000 you know, with these cultural endeavors.
01:48:09.000 I love that.
01:48:10.000 That's the plan.
01:48:11.000 Maybe we'll just start doing it.
01:48:12.000 I mean, we got a bunch of people emailed us already.
01:48:15.000 Let me write this down.
01:48:16.000 Make the email for it.
01:48:17.000 Email.
01:48:17.000 Start getting your companies set up.
01:48:19.000 If you're going to be getting an investment, you're going to need a company and stock ready to go.
01:48:22.000 That's right.
01:48:23.000 Maybe S Corp or something.
01:48:24.000 I don't know.
01:48:24.000 You guys got to figure that out.
01:48:25.000 I'm a big fan of LegalZoom.com.
01:48:27.000 I don't know how you guys go about your corporations.
01:48:29.000 But it's going to be serious.
01:48:30.000 It's not going to be like if someone says, hey, I had this idea.
01:48:32.000 I wanted to make a website.
01:48:33.000 It's going to be like, OK, well, did you make it?
01:48:35.000 No, it's got to be someone who's like, hey, I wrote this book.
01:48:38.000 It's currently with the editor and I need X amount of dollars for this reason.
01:48:41.000 And they're going to be like, here you go.
01:48:43.000 And if we get submissions and none of them are actual functioning projects, then nobody gets any money.
01:48:50.000 But I seriously think there are people who are already doing stuff.
01:48:53.000 There's going to be some guy who says, like, in my spare time, I'm building these cars that do these things.
01:48:57.000 If I had the money, I would do this, that, or otherwise.
01:48:59.000 We'll be like, here's the money.
01:49:01.000 Make it happen.
01:49:02.000 Do it.
01:49:02.000 Film it.
01:49:02.000 Put it on video.
01:49:03.000 And then the other idea we had was to film the whole process and make a show out of it.
01:49:07.000 So it's like Internet Shark Tank.
01:49:09.000 Right, we show up and we're like, your project has won this month's grant, we come, we check out the project, then the show promotes the project, the money helps finance the project, and then hopefully it takes off and succeeds, and then we get more, you know, Rippaverse, more Eric July type stuff, more cultural endeavors that succeed.
01:49:29.000 So, Scattershot.
01:49:31.000 You know, with, like, Eric July, he's got a platform, so for him to launch this, it's easier than, say, someone who's, like, a steel worker who's got a side project.
01:49:39.000 But if that side project is the next Harry Potter, and it's gonna come from someone who believes in America, yeah, we really want that.
01:49:45.000 We want that to happen.
01:49:46.000 So it's about finding those people and getting them funding to win.
01:49:51.000 So maybe, I mean, if they're gonna make the next Harry Potter, having a percentage of that company would be very, very fantastic.
01:49:55.000 You know what I mean?
01:49:56.000 So maybe it'll be our grant program and we'll invest in it.
01:49:59.000 The challenge there is we've got to figure out how our company, my company, can own a
01:50:05.000 percentage because we may have to create an investment company that has funds to do this.
01:50:09.000 Because what you're talking about is creating a VC firm.
01:50:11.000 Exactly.
01:50:12.000 Well, like an angel investing firm.
01:50:13.000 Yeah, VC.
01:50:14.000 Yeah.
01:50:15.000 You're a venture capitalist.
01:50:16.000 We're doing it.
01:50:18.000 Let's get on it.
01:50:19.000 Well, I got a buddy, I think I can call and ask him if he wants the job.
01:50:22.000 And then he'll be in charge of it.
01:50:24.000 It'll be like an incubator.
01:50:26.000 You know, we'll find these cultural projects.
01:50:28.000 And everyone will be somebody who's like, you don't got to be a conservative, you just got to be like, I believe in America, I care about this country, I care about family, I care about freedom, free speech.
01:50:38.000 Here's the best part, if you lose all of your money on bad investments, the federal government will bail you out.
01:50:42.000 Yeah, I'll just go and be like, when's Trump's president?
01:50:45.000 Just, yeah.
01:50:46.000 Just tweet, Black Lives Matter.
01:50:49.000 Trans Lives Matter, and then you'll get all your money back.
01:51:02.000 I mean, that's another thing, too.
01:51:04.000 I don't want to put too much money into more podcasts, because we are a podcast, but there will be YouTube shows.
01:51:10.000 There will be someone who's like, I have a show that covers this thing.
01:51:13.000 It's really, you know, people love it.
01:51:14.000 We just need money to do this.
01:51:15.000 It'd be really, really cool.
01:51:17.000 Yeah, like a camera setup.
01:51:18.000 You get your audio camera equipment set up for $6,000 or $7,000.
01:51:22.000 I mean, that's... Yeah, and maybe even some of these shows, we go beyond $10,000, like...
01:51:28.000 If someone's got a really good podcast, and we're like, this is like some of the most interesting stuff I've ever heard.
01:51:34.000 Maybe it's, you know, survival apocalypse.
01:51:36.000 Maybe it's technology and AI.
01:51:38.000 And then we're like, screw it.
01:51:40.000 Let's sign them on for a deal.
01:51:42.000 And then, you know, I think there's an opportunity for us to expand the Tim Cass Media Group, but also to invest in cultural endeavors and win the culture war.
01:51:51.000 And then maybe, you know, like you said, you want to be involved.
01:51:54.000 Kash Patel said he wants to be involved.
01:51:55.000 Yeah.
01:51:55.000 Kash said if, you know, he knows people who are probably interested in financing and stuff like that.
01:52:00.000 His view is more non-profit based, but we could easily pull in a whole bunch of influential people in the space and then create some kind of consortium of culture building.
01:52:09.000 Yeah, because Shark Tank is the most popular show on all of CNBC?
01:52:13.000 Is that what it's on?
01:52:14.000 CNBC?
01:52:15.000 I'm not sure.
01:52:15.000 I think it's on CNBC.
01:52:17.000 It's such a good show.
01:52:18.000 I mean, it's wild, and it kills in the ratings, and people love watching this.
01:52:21.000 They love watching entrepreneurs succeed.
01:52:23.000 It's in our blood as Americans.
01:52:25.000 Yeah, maybe we actually just have, like, you, me, and Cash, and maybe someone else, and we're, like, sitting in chairs in the new studio, and then someone walks in, and they're like, here's my project idea.
01:52:34.000 Fly them in?
01:52:35.000 Yeah.
01:52:36.000 NBC.
01:52:36.000 Oh, yeah, CNBC.
01:52:37.000 If we do it that way, then it would literally be like you, as Benny Johnson, deciding how much money you wanted to give them personally.
01:52:43.000 You'd be like, hmm, you know, I'll give you five grand for this project.
01:52:46.000 I think it's a good idea.
01:52:46.000 You know what I mean?
01:52:47.000 And we could all invest.
01:52:48.000 I mean, if you get like a winner, it'd be great.
01:52:50.000 And then you have equity, and then you create your new economy, your parallel economy.
01:52:53.000 In Shark Tank, you sometimes have them competing with each other.
01:52:56.000 Like, no, no, no, don't sign with him.
01:52:57.000 I'll do a better deal.
01:52:57.000 I'll do 5%.
01:52:58.000 Then I'll do 2.5% for the same amount I want in this company.
01:53:01.000 That'd be entertaining.
01:53:02.000 People love watching that.
01:53:03.000 Yeah.
01:53:03.000 That'd be really cool.
01:53:04.000 And they like watching the entrepreneurs.
01:53:06.000 Yeah, you learn a lot.
01:53:08.000 Yeah.
01:53:08.000 We have a sponsor on the show.
01:53:10.000 And ladies and gentlemen, thank you very much for subscribing to the channel.
01:53:13.000 We have a sponsor on the show, Moink, which is like a meat company.
01:53:16.000 Yeah, we have Moink as well.
01:53:17.000 Right.
01:53:18.000 And they started on Shark Tank.
01:53:19.000 Oh yeah, yeah, yeah.
01:53:20.000 And it's like, I'm a farmer and I just want to be able to deliver good meat that's not made in China or poisoned.
01:53:25.000 And they're like, hey, here's the money.
01:53:27.000 And I ate my first.
01:53:28.000 This is not an ad, right?
01:53:30.000 Not paid to say this.
01:53:31.000 Not here on this show.
01:53:32.000 And it's like, that's good.
01:53:33.000 It's good.
01:53:34.000 This is good.
01:53:35.000 This is delicious.
01:53:35.000 Yeah, we, for the audio side of things, we've done ads for them and they send you the big sample box and it gets annihilated instantly.
01:53:43.000 Because, like, it's real farm meat.
01:53:45.000 It's very good farm meat, yes.
01:53:46.000 But you're right.
01:53:47.000 Not to start getting into an ad for a company that's not paying us currently.
01:53:49.000 Correct.
01:53:50.000 But it is a good company.
01:53:51.000 They started on Shark Tank.
01:53:52.000 They started on Shark Tank.
01:53:53.000 So there's probably a bunch of success we could have doing some kind of show like this where it's, like, the consortium of, you know, cultural endeavors or whatever.
01:53:59.000 We should get Kevin O'Leary to come in and guest host.
01:54:02.000 Dude, totally.
01:54:03.000 One day, yeah, that'd be fun.
01:54:05.000 I mean, sure!
01:54:05.000 You realize, like, the way this would work is the product would be almost instantaneously successful, because you put all of our audiences together focusing on this product.
01:54:14.000 Or this, let's say Moink came to us.
01:54:16.000 Let's say a meat box company came to us and we said, no, that's, that's based real American meat steaks.
01:54:21.000 We've got local farms all over the place that sell meat.
01:54:24.000 And they come to us and it's like, Oh, that'd be awesome.
01:54:27.000 And then, if you got other influencers to do this, the collective audience alone would almost guarantee the success of this product.
01:54:35.000 Yep, that's the point of Shark Tank.
01:54:37.000 That's correct.
01:54:38.000 They invest in it, they promote it, and they make money off of it.
01:54:40.000 Yeah.
01:54:40.000 Let's do the same thing for cultural endeavors that are anti-woke.
01:54:44.000 Yes.
01:54:44.000 That believe in America.
01:54:45.000 I'm saying it doesn't have to be conservative, it just has to be not that.
01:54:49.000 You know what I mean?
01:54:49.000 Can I pitch you one?
01:54:50.000 Yeah, sure.
01:54:50.000 Okay, so my... Okay, it's just something I've always had rattling around in my head.
01:54:56.000 My wife, her family has a farm in Delaware.
01:54:59.000 It's a family farm, and they have cows.
01:55:02.000 And they feed their cows beer.
01:55:04.000 What do I mean by that?
01:55:05.000 They spent mash.
01:55:08.000 When you make a beer, you have spent mash.
01:55:10.000 And I'm like, that is the coolest thing ever.
01:55:12.000 Beer cows.
01:55:13.000 Beer steaks.
01:55:14.000 Like steaks that are fed with Jack Daniels spent mash.
01:55:18.000 Or like steaks that have only, cows that have, there's an entire market about this in Japan, right?
01:55:23.000 With cows that get massaged and are the Wagyu beef.
01:55:27.000 They like, you're buying a type of beef that was raised a certain way.
01:55:29.000 And these cows that come from this farm only eat beer.
01:55:33.000 Man, that's the most American thing I ever heard.
01:55:35.000 They're delicious.
01:55:36.000 They're delicious.
01:55:37.000 They have a freezer full of all these steaks and they're really good.
01:55:40.000 I'm like, that would be so awesome if I could pair my steak.
01:55:44.000 I like Woodford Reserve.
01:55:45.000 If I could pair my steak with a cow that's only eaten Woodford Reserve mash bill, and I get to have my Woodford Reserve steak from a cow that's only eaten that, and the cows love it.
01:55:53.000 It's very nourishing.
01:55:54.000 It's full of nourishment.
01:55:55.000 It's full of goodness.
01:55:56.000 I'm going to cry.
01:55:57.000 Stop.
01:55:58.000 I'm like, that would be an amazing company.
01:56:00.000 It's before fermentation, so it's just like grain mash?
01:56:03.000 Yeah, so you have to have all this wheat.
01:56:05.000 They ate the wheat anyway, right?
01:56:06.000 You have to have all this wheat, all this corn, and you mash it all together depending on what you're making.
01:56:09.000 Beer, bourbon, barley.
01:56:11.000 You put it all together, and different mash bills mean different types of bourbons or beers, and that's where you get the taste.
01:56:17.000 And so these cows, they're right down the street from Dogfish Head, very famous brewery.
01:56:21.000 And Dogfish Head just dumps all this spent grain.
01:56:23.000 So one, you're actually reusing the grain for something that's good.
01:56:27.000 You're feeding the animals.
01:56:28.000 And you're not having to kill more plants to feed these animals.
01:56:31.000 You're actually taking the spent grain, and then you're repurposing this, and then you have dogfish cows.
01:56:37.000 So you have these cows that have only eaten beer.
01:56:40.000 their entire lives. It's beautiful. Is it noticeable in the flavor? I—it's all AI, man.
01:56:47.000 Like, it's noticeable if you noticed it. Yeah, yeah. It's like Seinfeld. It's not a lie
01:56:50.000 if you believe it, right? Like George Costanza. So it's like, it's noticeable if you see it.
01:56:54.000 So I don't mean to hijack the comments section here. I could tell you what—
01:56:58.000 I would love to invest in that.
01:57:00.000 I need plugs that automatically unplug themselves on a timer, so when I fall asleep at midnight, all my plugs, though they're still on the wall, something switches and they're no longer plugged in, just in case of a solar flare.
01:57:13.000 I don't know about plugging them in, but you can get an Alexa that does that.
01:57:17.000 Alexa will turn off your plugs.
01:57:21.000 Actually, like, safe from solar flare?
01:57:23.000 That's what I need.
01:57:23.000 Not physically remove them.
01:57:25.000 You want a robot is what you want.
01:57:26.000 Yeah.
01:57:27.000 Or even a plastic switch.
01:57:29.000 C-3PO.
01:57:30.000 Let's read some more.
01:57:30.000 We got Doc Holliday.
01:57:31.000 He says, Redwood trees are only 300 to 350 feet tall.
01:57:32.000 Only.
01:57:33.000 Only.
01:57:36.000 That's all?
01:57:36.000 Really?
01:57:37.000 That's very, very big.
01:57:38.000 That's insanely tall.
01:57:40.000 Do you know how tall I am?
01:57:41.000 That's not only.
01:57:43.000 Our new building's 40 feet tall.
01:57:45.000 The new studio.
01:57:46.000 I'm imagining looking at it and then imagining that like times seven.
01:57:50.000 The Redwoods are only?
01:57:51.000 Only.
01:57:52.000 Only 300 feet tall?
01:57:53.000 That's all.
01:57:54.000 379 is the tallest on record.
01:57:56.000 Hyperion is what it's called.
01:57:57.000 And they live about 500 to 700 years.
01:57:58.000 Little guys.
01:57:59.000 Up to 2,000 years old.
01:57:59.000 Got them.
01:58:04.000 All right.
01:58:05.000 Raymond G. Taylor says, Tim, are your TVs up yet?
01:58:09.000 If not, come on, man.
01:58:09.000 Are you talking about at the new building?
01:58:13.000 So we need permits and stuff.
01:58:15.000 It's crazy how long it takes.
01:58:16.000 We just bought the building and it's only been a few months.
01:58:19.000 So we're setting up like a private club.
01:58:21.000 It's a coffee shop.
01:58:23.000 But we've got to do design, construction, so that's like six months out.
01:58:26.000 Who knows how long that'll take.
01:58:27.000 Takes forever.
01:58:27.000 Yeah, it's crazy.
01:58:28.000 And then the second floor is going to be like gaming and, you know, like board games, a skate shop kind of hangout, a place with movies.
01:58:36.000 Ian's Crystal Cove is, I guess, what we're putting on the mezzanine, where it's going to be a cool little hangout, a nook to watch a movie and have your coffee.
01:58:42.000 Then third floor is going to be like the elite VIP club, like social club, not very big.
01:58:48.000 There'll be drinks and stuff.
01:58:49.000 Free drinks, free food.
01:58:50.000 A little bit more expensive, but we're gonna bring that, you know, cultural building stuff out here to West Virginia.
01:58:57.000 So I'm excited for that.
01:58:58.000 Alright everybody, if you haven't already, would you kindly smash that like button, subscribe to this channel, share this show with your friends, and become a member at TimCast.com because that members-only live portion of the show is going up in about 10 minutes and you don't want to miss it.
01:59:12.000 So become a member.
01:59:13.000 You can follow the show at TimCastIRL.
01:59:15.000 You can follow me personally at TimCast.
01:59:17.000 Benny, do you want to shout anything out?
01:59:19.000 So we have now, thanks to the TimCast audience, passed 770,000 subscribers on our YouTube, and I just want to say thank y'all.
01:59:28.000 Y'all are so based.
01:59:30.000 I love you.
01:59:31.000 Big heart.
01:59:32.000 Love you.
01:59:33.000 We want to get to a million subs this year, and I think we're going to get there.
01:59:38.000 And stay based.
01:59:40.000 Right on.
01:59:42.000 I am Phil Labonte.
01:59:44.000 I am philthatremains on Twitter and philthatremainsofficial on Instagram.
01:59:49.000 Give me a follow.
01:59:50.000 I'm Ian Crosland.
01:59:51.000 Hit me up anywhere on the internet.
01:59:52.000 Benny, always a pleasure, man.
01:59:53.000 That was really great.
01:59:54.000 How fun was this?
01:59:55.000 Shout out to your wife.
01:59:56.000 Nurse Kate.
01:59:57.000 Nurse Kate.
01:59:58.000 Nurse Kate on Instagram.
02:00:00.000 If you're into fitness, if you want to live a healthier lifestyle, follow Nurse Kate on Instagram.
02:00:05.000 Thank you, sir.
02:00:07.000 Yeah, thanks, Benny.
02:00:08.000 I watch you on YouTube all the time.
02:00:10.000 I appreciate it.
02:00:11.000 Let me get that.
02:00:11.000 Yeah.
02:00:12.000 Yeah, there you go.
02:00:13.000 I watch you all the time.
02:00:14.000 Appreciate it.
02:00:14.000 Glad you went to East Palestine.
02:00:16.000 Good to see you here.
02:00:17.000 All right, everybody.
02:00:18.000 Make sure you go to TimCast.com.