Timcast IRL - Tim Pool - October 12, 2022


Timcast IRL - Alex Jones Ordered To Pay ONE BILLION DOLLARS In Defamation Trial w/Andrew Gold


Episode Stats

Length

2 hours and 2 minutes

Words per Minute

212.5

Word Count

26,098

Sentence Count

2,037

Misogynist Sentences

11

Hate Speech Sentences

42


Summary

Alex Jones is ordered to pay nearly $1B to the families of the Sandy Hook victims. John Fetterman is unable to understand the questions asked of him during an NBC interview. Democrats will still vote for him, just as they voted for Joe Biden.


Transcript

00:00:04.000 Alex Jones, in his latest defamation trial, has been ordered
00:00:36.000 to pay nearly 1 billion dollars, 965 million, to the families in the Sandy Hook case.
00:00:47.000 And I'm shocked.
00:00:50.000 I'm outraged.
00:00:51.000 A billion dollars?
00:00:52.000 That's it?
00:00:52.000 It should be a gajillion bajillion trillion!
00:00:56.000 Because let's be real, a billion dollars is a meaningless number.
00:01:00.000 Not only does Alex Jones not have it, but it's a ridiculous number to award someone anyway.
00:01:04.000 And it just shows, it's just, it's just all nonsense.
00:01:06.000 Now Alex Jones come out saying, he's not going anywhere, he's not going to stop.
00:01:10.000 And you can't, you can't sue someone into not existing.
00:01:13.000 So there's workarounds, and I'd be really surprised if any of these people actually see a penny from Alex Jones.
00:01:20.000 He's claiming he doesn't have it, he's claiming he's broke.
00:01:22.000 We will see how that plays out.
00:01:23.000 The next story, man.
00:01:25.000 This is brutal.
00:01:25.000 John Fetterman.
00:01:27.000 He's running for office, he did an interview with NBC, and he was unable to understand the questions asked of him.
00:01:34.000 The journalist who did the interview said that during small talk, he couldn't understand what she was saying.
00:01:38.000 He can hear the words, but his brain can't process it.
00:01:41.000 He needed a special device that transcribes what people are saying into text so that he can answer these questions.
00:01:47.000 And I just think, right there, it shows the dude is not mentally fit to be a senator.
00:01:52.000 And you know what?
00:01:53.000 I'll tell you.
00:01:54.000 Democrats will still vote for him because they voted for Joe Biden too.
00:01:56.000 But here's my warning, man.
00:01:58.000 You look at what happened when you vote for someone like Joe Biden.
00:02:00.000 Because you don't care that he's clearly not with it.
00:02:02.000 You're like, well, whatever, we'll let him win.
00:02:04.000 Now look at your economy.
00:02:05.000 Now look at your gas prices.
00:02:06.000 Look at the war.
00:02:08.000 Even Biden's saying we're close to Armageddon.
00:02:10.000 Not okay.
00:02:11.000 We're going to talk about all this, but before we do, head over to eatrightandfeelwell.com to pick up your Keto Elevate MCT C8 oil powder.
00:02:20.000 That's medium chain triglyceride oil powder.
00:02:23.000 Many of you pointed out that in a new interview, it's an old interview actually, that was released of me, I was looking quite round.
00:02:29.000 Well, in fact, I cut out the sugars, started eating some of this Keto Elevate powder, putting it in my coffee, and it was really fast actually.
00:02:35.000 The weight just like melted off.
00:02:36.000 Soon as I got rid of the sugars, the greasy stuff and just started doing keto, that's what really helped.
00:02:43.000 So go to eatrightandfeelwell.com and you will get 51% off as well as a 60-day money-back guarantee.
00:02:49.000 Keto Elevate provides your body only C8, the most ketogenic MCT.
00:02:53.000 That means it provides support for energy levels, healthy appetite management, mental clarity and focus, athletic performance.
00:02:59.000 Keto Elevate, personally, is my favorite on the market.
00:03:01.000 I mention this every so often.
00:03:02.000 Someone actually brought some other stuff over, and I didn't like it.
00:03:05.000 I really do like this stuff.
00:03:05.000 It's great in coffee.
00:03:07.000 You get free shipping on every order, and for every order today, BioTrust will donate a nutritious meal to a hungry child in your honor through their partnership with NoKidHungry.org.
00:03:17.000 To date, BioTrust has provided over 5 million meals to hungry kids.
00:03:20.000 Please help them hit their goal of 6 million meals this year.
00:03:23.000 You'll get free VIP live health and fitness coaching from BioTrust's team of expert nutrition and health coaches for life, plus their free e-report, The Top 14 Ketogenic Foods, with every order.
00:03:34.000 And maybe we can do, we'll do like a side-by-side and we'll pull up a picture of me from like two years ago and the next time we do this.
00:03:40.000 Cause you can see like, I actually, I lost like 30 some odd pounds.
00:03:42.000 So people were pointing that out, but yeah, man, keto works.
00:03:45.000 At least I can say that.
00:03:46.000 You talk to your doctor, or go talk to the experts over at BioTrust, eatrightandfeelwell.com.
00:03:52.000 And don't forget to head over to TimCast.com, become a member, and support our work directly.
00:03:56.000 We got a new show coming soon with Shane Cashman.
00:03:58.000 He's gonna be talking live, super chats, thunderstorms, mysteries, paranormal, UFOs, Bigfoot.
00:04:05.000 I'm super excited for this show.
00:04:06.000 It'll probably be the only show I actually listen to, because usually I'm just reading the news.
00:04:10.000 But as a member, you're also supporting our journalists, and you'll get access to the uncensored Timcast IRL show Monday through Thursday at 11 p.m.
00:04:16.000 We will have one of those episodes coming up for you tonight, so smash that like button, subscribe to this channel, share the show with your friends, be the notification.
00:04:24.000 People are saying they're not getting notifications anymore.
00:04:26.000 We need you to be the notification so that people can actually find out the show exists and it's on, because we're probably being censored.
00:04:34.000 Joining us today to talk about all of this and more is Andrew Gold.
00:04:38.000 Hello, thanks for having me on.
00:04:39.000 What a pleasure to be here.
00:04:40.000 Oh, thanks for coming.
00:04:40.000 Who are you?
00:04:41.000 I am a British, as you can hear, journalist and documentary maker and podcaster, host of the On the Edge with Andrew Gold podcast.
00:04:49.000 I look into all sorts of cults and ideologies.
00:04:52.000 I think the most controversial aspect thereof is that I consider woke ideology and that kind of thing as one of the ideologies.
00:05:00.000 A cult?
00:05:01.000 Yeah, I think so.
00:05:02.000 That's why you're here, because we say it all the time.
00:05:05.000 I think so.
00:05:05.000 It's cult-ish, at least.
00:05:06.000 It's cult-ish.
00:05:08.000 It's a weird kind of, like, decentralized cult, I guess, that centers around the internet or something.
00:05:14.000 It's got some comparisons to former ones, you know, like the Bolsheviks and things like that.
00:05:19.000 The Puritans back in the... Well, I don't even know when that was.
00:05:21.000 When was that?
00:05:21.000 The 18th century?
00:05:22.000 The Bolshevik one is a scary one, considering what's going on politically.
00:05:22.000 Puritans?
00:05:25.000 It comes from like all cults and religions and things, it all comes from this need to be righteous, to feel like you're better than someone, to raise your status, your virtue higher than other people.
00:05:37.000 As you raise your microphone as well.
00:05:38.000 As I raise my microphone.
00:05:42.000 Yeah, and it's just, you know, the rest of us see through it, but they continue this crazy, crazy charade.
00:05:47.000 We got a lot to talk about in that regard, too.
00:05:50.000 Also, Elon Musk's old burnt hair cologne or something, whatever.
00:05:53.000 Luke's here!
00:05:54.000 Well, cheerio, bloke!
00:05:56.000 Nice to have you here.
00:05:57.000 Over the pond.
00:05:58.000 My name is Luke Rudkowski here of wearechange.org.
00:06:00.000 The FDA just announced another emergency use authorization for small children today, so I decided to be very brave and support this action by wearing my "1984 doses to slow the spread" shirt.
00:06:12.000 Now, we're still a little behind, but I think we're going to get there eventually, and if you agree, you can get the shirt on thebestpoliticalshirts.com because you do.
00:06:21.000 I'm here.
00:06:21.000 Thank you so much for having me.
00:06:23.000 Luke reinforced one of my concerns that my spoon was dinging a little bit too much into the microphone, if you've heard the ding ding ding ding ding. So I got a nice large wooden one in the meantime.
00:06:32.000 Thanks, Luke. And if you didn't catch it today, I did an interview with the one, the only Hotep Jesus, Bryan Sharpe, on his YouTube channel. If you haven't seen it yet, you're gonna want to check it out after this show. Maybe you can catch it before the after-after show, which will be up at 11 o'clock p.m.
00:06:46.000 on TimCast.com. What's going on, Serge?
00:06:49.000 Hey guys, I'm here again.
00:06:51.000 You can't get rid of me that easily.
00:06:52.000 I am here.
00:06:53.000 Lydia is not here.
00:06:54.000 If you haven't heard the news, a lot of people haven't heard the news at all apparently.
00:06:57.000 They'll figure it out eventually.
00:06:58.000 She's in the monastery.
00:07:00.000 We wish her luck.
00:07:01.000 And I guess that's it.
00:07:01.000 We do.
00:07:03.000 We'll start the episode.
00:07:04.000 Serge is here pressing the buttons from henceforth.
00:07:07.000 Let's jump into this first story from the AP.
00:07:10.000 It's just so silly.
00:07:12.000 Alex Jones ordered to pay 965 million dollars for Sandy Hook lies.
00:07:18.000 An hour ago.
00:07:18.000 I saw that and I just burst out laughing.
00:07:21.000 I think I was on the toilet, looking at my phone right when the news broke, and I was like, is this a joke?
00:07:26.000 Come on. You know, when they did the $50 million, I was like, wow, that's brutal.
00:07:31.000 Still, I doubt anyone's gonna get a penny from this because of the way
00:07:34.000 lawsuits work. Alex Jones says he doesn't even have the money anyway.
00:07:38.000 Then they're at a billion. It's just like, what is this for?
00:07:43.000 Narrative and for the TV to say, oh, you know, he lied about this and now he's got to pay one billion dollars.
00:07:49.000 It reminds me of Austin Powers when, in the first one, he freezes himself and then he goes to the future and he says, I want one million dollars and they all laugh at him.
00:07:59.000 And then they're like, uh, Dr. Evil, uh, one million dollars today isn't actually a lot of money.
00:08:02.000 And he's like, oh.
00:08:03.000 And then in the next one, he goes back in time and says, I want a hundred billion dollars.
00:08:07.000 And they all start laughing at him again, saying, you might as well have just said a bajillion gajillion dollars!
00:08:10.000 That money doesn't exist!
00:08:11.000 This is what it is!
00:08:12.000 Like, who in their right mind thinks Alex Jones has a billion dollars to give away?
00:08:18.000 I don't think anyone.
00:08:20.000 I mean, he's been pretty transparent with his finances, and he doesn't.
00:08:24.000 I don't understand.
00:08:24.000 There's a phrase trying to squeeze blood from a stone?
00:08:28.000 From a turnip.
00:08:28.000 From a turnip?
00:08:29.000 Yeah.
00:08:31.000 I don't know how the law works in this kind of situation where someone gets sued for money that they don't have.
00:08:37.000 I'm sorry, I'm sorry.
00:08:38.000 This is the biggest judgment in US history, apparently, for defamation.
00:08:41.000 And it's people who weren't even named by Alex Jones who want a judgment for defamation.
00:08:45.000 I don't understand this.
00:08:46.000 There were like six people, six people's families, and an FBI agent that was involved somehow, claiming pain and suffering from something Alex said.
00:08:53.000 I mean, I understand he named someone, and then that person got phone calls, people went to that person's house.
00:08:58.000 I don't know the extent of what happened there, and I get that guy can sue for defamation.
00:09:03.000 He got named for something. But the rest of them?
00:09:05.000 I don't understand this, and it's setting a precedent.
00:09:08.000 There wasn't even a trial.
00:09:09.000 They just had a judgment because they had problems with discovery.
00:09:14.000 And this is the second of three trials.
00:09:18.000 There's still going to be yet another trial.
00:09:20.000 And this settlement, you know, this judgment, excuse me, is almost as big as one of the biggest healthcare fraud settlements in history, which was levied against Pfizer, which had to pay 2.3 billion dollars for fraudulent marketing some years back.
00:09:35.000 So, when you compare the two, you know, bad opinions compared to what Pfizer did back then, there's a big difference.
00:09:44.000 Here's a question then.
00:09:45.000 So what should happen?
00:09:47.000 I don't have an answer to this.
00:09:48.000 Let's say you're a parent of the Sandy Hook thing.
00:09:51.000 You're devastated and there are people turning up your house because of what Alex Jones said.
00:09:55.000 What should happen?
00:09:56.000 Maybe nothing.
00:09:57.000 Well, I mean, that's tough.
00:09:59.000 For one, Alex had said that under Texas law, you have to say someone's name for it to be defamation and that he was just saying these people and those people.
00:10:08.000 But I think he did name one person.
00:10:10.000 There was one individual he named.
00:10:12.000 I'll say this.
00:10:13.000 Alex Jones went on his show to a massive audience and said things that were not true.
00:10:19.000 He's allowed to have his opinions.
00:10:20.000 But when he said things definitively, there's a question there of defamation when you call out someone with false information who is not a public figure.
00:10:28.000 I don't think the New York Times should be allowed to get away with it.
00:10:32.000 There's a couple things I have issue with here.
00:10:34.000 One, a billion dollars is just the stupidest thing I've ever heard.
00:10:40.000 They didn't give him a trial.
00:10:41.000 And when are any of these big major media organizations ever going to be held to account in any way, let alone any way like this?
00:10:50.000 It's not happening.
00:10:51.000 So you can see how the machine comes for people who are outside of the establishment.
00:10:55.000 I would say, I would find it reasonable if Alex Jones was actually ordered to pay, you know, six figures to each family, for each family member, something like that.
00:11:02.000 Maybe the total payout would have been like five million bucks.
00:11:05.000 And they would have said that's to cover the cost of security and moving, you know, all the legitimate costs incurred by these families.
00:11:14.000 But the idea here is punitive damages.
00:11:19.000 The lawyers actually said to the jury, fine him so much he cannot keep his business up and running, or something to that effect.
00:11:27.000 So the goal here is to punish him so that he doesn't do it again.
00:11:30.000 And I just think that is where I say, no, that's wrong.
00:11:34.000 And I'll tell you, Alex Jones is allowed to be wrong about things.
00:11:37.000 He's allowed to have his opinion about things.
00:11:38.000 Now, if he defamed by saying statements of fact that were not true about private individuals, then I think he owes them damages.
00:11:45.000 But if it was that he truly believed it, and he was expressing his opinion in a major news event, then I don't see how you come after him for tens of millions or even a billion dollars.
00:11:53.000 Paying for actual damages because you were wrong and spread misinformation about someone that damaged their life?
00:11:58.000 That I understand.
00:11:59.000 I'm thinking about a metaphor.
00:12:02.000 I know you'll love it, about, like, social media administration.
00:12:05.000 And one of the things at Minds that we were talking about is if you ban someone for violating terms of service, you ban the channel.
00:12:11.000 And then when they try and make a new channel, some of the admins would be like, no, they can't make any more channels.
00:12:16.000 They violated the terms.
00:12:17.000 I'm like, no, no, no, you're not banning the person.
00:12:19.000 You're banning the channel.
00:12:20.000 The channel violated the terms.
00:12:23.000 So it's the same thing with Alex.
00:12:24.000 You don't stop him from coming back.
00:12:28.000 You punish him for the fraud or whatever the problem is, but there's no preventing people from existing, or barring a return back to normality.
00:12:28.000 That's not the point.
00:12:40.000 That's my opinion anyway.
00:12:41.000 Or redemption.
00:12:43.000 It's essential.
00:12:44.000 The human conversation is about redemption.
00:12:47.000 I mean, that's the Christian ethos: redemption.
00:12:50.000 Well, these people aren't Christians.
00:12:52.000 Well, our society is kind of based on it in a lot of ways.
00:12:55.000 And I think that's a good thing.
00:12:58.000 I think progress is a good thing, but I think there's a lot of values that come from Christian values that I think are good, notably like innocent until proven guilty is a really good one.
00:13:09.000 And there's a lot of bad ones.
00:13:10.000 And we found over a long period of time, we've gotten rid of the bad ones.
00:13:13.000 But I think with this, the issue comes down to you cannot get rid of Alex Jones.
00:13:17.000 It's impossible.
00:13:18.000 Okay, here you go.
00:13:20.000 John Smith starts a media company, and then says, Alex, I'm going to hire you, and I'm going to pay you $10 an hour to host this show.
00:13:28.000 A benefit of working for this company is that you're going to eat at a five-star restaurant steakhouse every night, the company pays for your penthouse, and you get a car, a corporate car, and you get corporate private jets, but you only get paid $10 an hour.
00:13:40.000 What are they going to do?
00:13:41.000 Rad, that's cool.
00:13:42.000 What are they going to do?
00:13:43.000 So if Alex Jones as a personality on a show generates millions of dollars, tens of millions, hundreds of millions, then if they sue him into oblivion, he has a trusted individual who starts a company and hires him.
00:13:54.000 Then what?
00:13:55.000 Now if at that point he says something again, they can sue that company, sure.
00:13:59.000 But then what?
00:13:59.000 Just do it again?
00:14:00.000 It's ridiculous.
00:14:02.000 There's nothing you can do to stop Alex Jones.
00:14:03.000 He is unstoppable.
00:14:04.000 He is a gigantic man boulder rolling down a hill.
00:14:07.000 You will not stop him.
00:14:09.000 Screaming on the top of his lungs, going crazy.
00:14:13.000 I want you to imagine Alex Jones rolling down a hill at 100 miles an hour going, ahhhh!
00:14:18.000 He hits the bottom of that ramp and then he just goes flying through the air.
00:14:22.000 Supernova.
00:14:23.000 And he's like meditating in the sky as he's flying.
00:14:25.000 Like that Pokemon.
00:14:26.000 Not Pikachu.
00:14:27.000 I forgot who it is.
00:14:28.000 But let me say this.
00:14:29.000 How many times, I'll put it this way.
00:14:31.000 The New York Times, the lies about the Iraq war and all of that stuff.
00:14:35.000 The lies about Iraq that got us into a war that cost, what, how many people died?
00:14:40.000 How many innocent civilians were killed?
00:14:40.000 Yeah, really.
00:14:43.000 How much money was wasted?
00:14:44.000 How much damage just around the world because of this?
00:14:48.000 Have we sued the New York Times into oblivion to make sure they can never operate again?
00:14:53.000 No, of course not.
00:14:54.000 They just say, well, you know, don't do it again.
00:14:56.000 And we all suffer because of it.
00:14:59.000 Alex Jones says incorrect things about a bunch of families that I think it was wrong of him to say.
00:15:05.000 And in the end, they're like, let's make sure he can never, ever run a company.
00:15:09.000 They want to destroy not just his life.
00:15:11.000 They want to destroy the lives of anyone who worked for him.
00:15:13.000 Even a guy who was like a groundskeeper or a security guard.
00:15:16.000 That's insane.
00:15:17.000 I want to tell you one more very, very important thing, especially as it pertains to censorship.
00:15:21.000 When they censored Alex Jones on YouTube, this is very important, they did not just take away his ability to speak, they deleted the entire archive of all of the work he had ever produced and published on that platform.
00:15:33.000 That's the scary thing about social media censorship.
00:15:36.000 It doesn't just say, you can't talk anymore.
00:15:39.000 Twitter, Facebook, YouTube erases your entire history.
00:15:43.000 Gone.
00:15:44.000 And then you can't comply with discovery.
00:15:46.000 And then they're like, where's the video clips?
00:15:47.000 Where's the video footage?
00:15:48.000 Where's your apology?
00:15:50.000 When did you say sorry?
00:15:51.000 Show us the details.
00:15:52.000 And of course, no one has the ability to have that many hard drives, that many backups.
00:15:58.000 He streams essentially, what is it, four hours a day on his show, and then there's a bunch of other shows that he goes on.
00:16:04.000 So it's just impossible. And truly, the larger censorship efforts are repugnant and disgusting, and the corporate media does what Alex Jones does every single day.
00:16:15.000 The only difference is they never get held accountable for all the lies and all the disinformation that they spew.
00:16:19.000 Repugnant.
00:16:20.000 Yeah, that's a word, right?
00:16:22.000 Repugnant.
00:16:23.000 I love that word.
00:16:23.000 Potato, potato.
00:16:24.000 You know what I was saying.
00:16:25.000 I thought it was a word I didn't know.
00:16:26.000 There you go.
00:16:27.000 Yeah, it sounded quite good to me.
00:16:28.000 But the context transferred; people got it.
00:16:31.000 Is the issue that... I mean, this was an emotional reaction, I suppose.
00:16:34.000 It's like a billion dollars or whatever it is.
00:16:36.000 I do agree with what you're saying, actually.
00:16:38.000 Now that I think about this, to delete someone's entire archive of work seems insane.
00:16:43.000 But I also just feel like, what is to stop all of us now from just saying whatever we want that completely ruins someone's life that might not be true?
00:16:49.000 What should be done?
00:16:50.000 Human decency, I guess.
00:16:52.000 Morals, ethics.
00:16:54.000 To an extent, the Christian values that were instilled in this country, whether you like them or don't, they're there.
00:17:01.000 What should be done is defamation tort, civil tort.
00:17:06.000 You know, I think we have a problem with Times v. Sullivan.
00:17:09.000 Are you familiar with Times v. Sullivan?
00:17:10.000 This is a precedent that basically says politicians and public figures, it's not just Times v. Sullivan, there's another court case that added to it, but it's basically the precedent that says if you're a public figure, then there's a higher degree of scrutiny.
00:17:24.000 And I think that, you know, there's good things and bad things about it, but I think it kind of weighs on the negative.
00:17:29.000 What this means is, if CNN comes out and lies about me, and says that, you know, Tim Pool, you know, punched a dog, and then I sue and say, no, I didn't, they'll say you're a public figure.
00:17:42.000 Oh, well, that's no good.
00:17:44.000 No, and so, I mean, saying that I punched a dog is a very definitive thing, so they probably would have a hard time with saying something that definitive, but they can come out and say that you're a known white supremacist who sympathizes and does this, that, or otherwise, and then if you try and sue, they'll say it's a protected opinion against a public person, and so, case dismissed.
00:18:05.000 If you're a private individual, then you've got way more grounds, which is why Alex Jones basically lost here.
00:18:11.000 But I think the real reason he lost here is because the machine was out to get him.
00:18:14.000 That's my opinion on the matter.
00:18:16.000 I mean, it's pretty clear.
00:18:17.000 I mean, look what the corporate media does.
00:18:19.000 Just a few months ago, during the whole Afghanistan debacle, the United States Pentagon literally launched a missile strike and killed an aid worker.
00:18:27.000 The U.S. corporate media said that he was ISIS-K, that he was a terrorist.
00:18:31.000 He was an aid worker that was bringing fresh water to Afghanistan.
00:18:35.000 That's libel.
00:18:36.000 That's defamation.
00:18:37.000 This family is being defamed as some kind of new radical Islamist terrorists when in reality they were working with the U.S. government, bringing fresh water to Afghanistan.
00:18:47.000 Is the corporate media going to be held responsible for that particular incident of defamation, of lying about someone that just had their entire family annihilated in a drone strike that the corporate media and the US Pentagon were lying about?
00:19:00.000 No.
00:19:01.000 Should they?
00:19:02.000 If we're going to be playing by these rules?
00:19:03.000 Absolutely!
00:19:04.000 And I think you would get awarded more than a billion dollars for such defamation and such actions if we're playing by the same rules set by the precedent of this particular court hearing.
00:19:14.000 I don't normally do super chats this early, but there is one that I want to address because I brought it up, and I think for people who are listening to the segment, they should hear it.
00:19:21.000 Augusto Mimache says, Where is innocent until proven guilty in the Bible?
00:19:25.000 He who is without sin isn't the same.
00:19:27.000 No, it is the story of Sodom and Gomorrah, and it was, I think it was Abraham was talking to God.
00:19:33.000 And God was like, you know, I'm probably getting the story wrong, but the general idea is, you know,
00:19:36.000 I'm going to blow up these cities because they're full of, you know, nasty people.
00:19:40.000 And then he asks, but what if there are 50 righteous people?
00:19:42.000 And he's like, okay, if there are 50 righteous people, I won't blow it up. So
00:19:44.000 what about 40? Okay, if there's 40, and then finally comes down to what if there is but one
00:19:48.000 righteous man? And he says, okay, if there is one righteous man, I will not destroy these
00:19:52.000 cities. The idea there is that you cannot condemn someone to death if they are a righteous person,
00:19:58.000 right?
00:19:59.000 That's the general idea.
00:20:00.000 That was the inspiration for Blackstone's formulation.
00:20:03.000 It is better that ten guilty persons escape than one innocent person suffer.
00:20:06.000 Which then went to Benjamin Franklin who said it's better that a hundred guilty persons escape than one innocent person suffer.
00:20:11.000 And the philosophy there, the legal and moral and governmental philosophy is, if people don't have faith that as innocent individuals, they will be protected from false accusations, if people believe that even if they are innocent, the machine will try to crush them, then they don't lend their confidence to government, government doesn't work.
00:20:28.000 Ultimately, this becomes the element of the government that is, you must have a trial, you must be proven guilty, you can confront your accusers, and you are innocent until they prove you're guilty.
00:20:36.000 It's amazing.
00:20:37.000 I was reading about the Constitution, where these ideas came from, why we had them, and I traced that one all the way back.
00:20:43.000 It was very, very fascinating.
00:20:45.000 So yeah, there we go.
00:20:47.000 They say that 10% of people in prison are innocent.
00:20:50.000 It reminds me of Amanda Knox.
00:20:51.000 Remember Amanda Knox?
00:20:52.000 She was on my podcast and she's this lovely person.
00:20:55.000 She seems so nice.
00:20:56.000 And the minute I put that out, the amount of abuse I got and that she got, she now has to live with that because people don't go by innocent until proven guilty.
00:21:04.000 They're just all convinced she did it, you know?
00:21:06.000 Yeah, the court of public opinion now. Yeah. And that's the problem with stuff like this Alex
00:21:11.000 Jones thing. Most people don't even know the full details of the trial. And I'm not gonna pretend to
00:21:16.000 either. I think if you watch someone like Viva Frei, you might get a much, much better
00:21:21.000 understanding of how the court case played out and everything like that. But then you take a look at
00:21:24.000 like the Ahmaud Arbery case, you take a look at the Kyle Rittenhouse case, there are mobs with
00:21:28.000 pitchforks going around demanding violent ends, you know, and they don't even know
00:21:36.000 what happened. Like hands up, don't shoot.
00:21:38.000 People were riding over the Michael Brown stuff that turned out to be fake.
00:21:41.000 Ferguson was... West Florissant in Ferguson was burnt to the ground.
00:21:44.000 All these buildings over lies.
00:21:46.000 Lies from the media.
00:21:48.000 And the media went there and they were all gloating and talking about how it was a great networking event.
00:21:51.000 These people are nasty people, man.
00:21:54.000 So I don't think this is going to be a devastating blow against Alex Jones, because he is very persistent, and I think he's definitely going to figure out a lot of different ways around this.
00:22:03.000 I think he already has safeguarded a lot of his assets and a lot of his businesses, but also, more importantly, a lot of legal experts are saying that he's going to win on appeal when it comes to, of course, challenging this major decision.
00:22:16.000 There's still, again, one more court hearing that's going to, of course, be going after him.
00:22:20.000 But more importantly, I think with hyperinflation, the $1 billion is going to be a cakewalk, especially with the value of the dollar going down, with the Federal Reserve's quantitative easing policies that have utterly destroyed our currency, and a billion dollars is going to be nothing a few months from now.
00:22:37.000 Let's jump to this next one.
00:22:38.000 This is from Candace Owens, who tweeted, Earlier today I learned that Kanye West was officially kicked out of JPMorgan Chase Bank.
00:22:47.000 I was told there was no official reason given, but they sent this letter as well to confirm that he has until late November to find another place for the Yeezy Empire 2 bank.
00:22:57.000 This is nuts!
00:22:58.000 Here's the letter, it says, Dear Yee, is that his real name?
00:23:00.000 No, he goes by Yee now.
00:23:02.000 Wow, alright, there you go.
00:23:03.000 We are sending this letter to confirm our recent discussion with blank.
00:23:06.000 JPMorgan Bank has decided to end its banking relationship with Yeezy LLC, etc. etc.
00:23:11.000 The first thing I'm going to say before we get into the political ramifications of this is
00:23:14.000 make sure you are using Parallel Economy.
00:23:16.000 I know it's a relatively new financial service co-founded by Dan Bongino.
00:23:20.000 Shout out to Dan who's doing tremendous work to help fight back against censorship.
00:23:24.000 If you become a member at TimCast.com, we default to Parallel Economy.
00:23:28.000 Support businesses that are fighting back, that believe in these values.
00:23:32.000 Parallel Economy is one of them.
00:23:34.000 Kanye West.
00:23:35.000 I will say it right now.
00:23:36.000 Candace Owens.
00:23:36.000 Kanye West.
00:23:37.000 Talk to Parallel Economy.
00:23:39.000 They're not a bank, they do financial transaction services, but talk to them.
00:23:44.000 Because if Kanye said, okay, I won't use Chase, I'll use this bank and I'll use this financial service, it could be a major movement towards pushing back against censorship.
00:23:53.000 And Kanye's big enough to make a huge impact in that regard.
00:23:55.000 So here we go, ladies and gentlemen.
00:23:58.000 You will own nothing and you will be happy.
00:24:02.000 You want to take this one or should I?
00:24:04.000 You don't own your money anyway.
00:24:06.000 People think that's their money.
00:24:07.000 Those are Federal Reserve notes on loan.
00:24:09.000 They can take them back if they want.
00:24:10.000 They can shut you down.
00:24:11.000 Doesn't it even say on like money or something about that?
00:24:14.000 This is a Federal Reserve note.
00:24:15.000 It's a promissory note.
00:24:16.000 They give it to the American government.
00:24:18.000 The American government promises to pay them back a dollar plus interest for every dollar they borrow.
00:24:23.000 It's such an insane system.
00:24:24.000 So Ethereum, right Luke?
00:24:25.000 No.
00:24:27.000 No!
00:24:27.000 The other guy was shilling for it.
00:24:30.000 But that's another topic to discuss here.
00:24:33.000 Again, decentralized currencies are another thing, but the Federal Reserve is building their own centralized digital currency, which is going to lead to behavior like this tenfold.
00:24:42.000 And this is not the first time that a major banking institution punished someone because of their opinions.
00:24:48.000 I think that's what's happening here.
00:24:49.000 We don't know exactly what's happening here.
00:24:50.000 Maybe there's some accusations by Chase Manhattan that they didn't want to, of course, make public, but it's most likely what's been happening before, and that's: we don't like your opinion.
00:24:59.000 We're going to make sure you can't exist in society.
00:25:01.000 We're going to make sure that you as an individual can't have any free flow of transactions, which is absolutely crazy.
00:25:08.000 Can I just give a shout out real quick to Candace Owen's profile picture on Twitter?
00:25:13.000 It's the Communist Fist, appropriated by BLM, holding wads of cash.
00:25:18.000 And it's melting!
00:25:20.000 Wow!
00:25:20.000 You don't like Ethereum?
00:25:22.000 I question everything, and I'm skeptical of everything.
00:25:25.000 So, again, when it comes to, you know, a lot of these digital currencies, I think skepticism is your best friend.
00:25:33.000 Ethereum is clearly better than Bitcoin, Luke.
00:25:35.000 Mark Zuckerberg is your friend.
00:25:37.000 Yeah, I told people from the very beginning, invest what you're willing to lose, because it's a Wild West market.
00:25:41.000 I do believe there are tremendous opportunities for decentralization, especially when it comes to privacy coins, things like Monero.
00:25:48.000 There's a reason the IRS is sending out messages and letters saying, please help us crack Monero.
00:25:52.000 There's a reason Amazon and Jeff Bezos are building technology that will break, of course, encryption.
00:26:01.000 I do believe in privacy coins, but I don't believe in what the World Economic Forum and a lot of the other big bankers are pushing, and that's centralized digital currencies.
00:26:11.000 The CEO of BlackRock was just talking about how the situation in Ukraine is going to lead to a bigger push for centralized digital currencies.
00:26:18.000 That's essentially what they're pushing for, a social credit score where they will track, trace, and database and punish you on the fly for wrong-think, just like they probably did to Kanye West with Chase Manhattan Bank.
00:26:29.000 I'm pretty flabbergasted.
00:26:30.000 I didn't know that banks and PayPal, of course, at the moment, I didn't know they could actually do this based on your views.
00:26:35.000 They can just say, okay, well, we're going to take your money or you can't use us for your money.
00:26:39.000 I had no idea that was allowed.
00:26:41.000 They can't do that in the UK?
00:26:42.000 Well, I think they can, but I didn't know.
00:26:44.000 I didn't know until recently.
00:26:45.000 Yeah, I'd imagine it's probably worse in the UK.
00:26:47.000 Yeah, quite possibly.
00:26:47.000 The PayPal stuff started happening to quite a few people I know in the UK.
00:26:51.000 I started seeing it happening on Twitter.
00:26:53.000 Loads of people saying, well, my PayPal's closed.
00:26:54.000 No one's told me why.
00:26:55.000 Don't know what's going on.
00:26:56.000 I mean, I've been stressed out about the whole free speech thing for quite some time now.
00:26:59.000 Just the way people are clamping down on it, using culture and the press to stop people just having views.
00:27:04.000 Look, the Kanye West thing.
00:27:05.000 I'm Jewish.
00:27:06.000 I've grown up Jewish.
00:27:06.000 My family's Jewish.
00:27:07.000 Friends are Jewish.
00:27:08.000 You know, properly, actually Jewish.
00:27:09.000 It's very offensive what he said.
00:27:11.000 I take it with a pinch of salt because he's Kanye West.
00:27:13.000 You know, he says mad things all the time and whatever.
00:27:15.000 You hear it every day anyway, so okay, whatever.
00:27:17.000 But I wouldn't want to have a bank that doesn't allow people who disagree with me or say horrible things to me to be able to keep their money there.
00:27:25.000 And it's the same with speech.
00:27:26.000 I want people who've got opposite opinions to mine to be able to say whatever they want, and people can decide if they want to listen to them, you know?
00:27:33.000 Would you have debanked Hitler if you were running the bank that he was using when he declared war on Poland?
00:27:39.000 I think once you've started a war, it might be a bit different if you're trying to stop it.
00:27:42.000 But I don't know.
00:27:43.000 That's over my pay grade.
00:27:44.000 What would you have done?
00:27:45.000 He was getting financed by the Bushes and the Rockefellers.
00:27:49.000 Prescott Bush.
00:27:49.000 I would have debanked him and used my bank as a weapon to win the war.
00:27:53.000 I mean, I'll be honest.
00:27:55.000 I mean, we say it's a culture war.
00:27:56.000 We kind of joke and laugh.
00:27:57.000 It's a real culture war.
00:27:59.000 It's a domination for your mind.
00:28:00.000 Sure.
00:28:01.000 But that's different.
00:28:01.000 That's you trying to win a war.
00:28:03.000 We're not trying to win a war against Kanye West.
00:28:04.000 But sanctions in war happen all the time.
00:28:06.000 Tons of people say we're no longer going to do business with you and your money's frozen.
00:28:10.000 This is a private citizen in the United States being told that he's being kicked off of a bank.
00:28:15.000 Now, we don't know exactly why.
00:28:16.000 They didn't say why, but we know why.
00:28:18.000 He's in the news, they're saying he's offensive.
00:28:21.000 I think this is actually really, really bad for Chase, and they made a big mistake.
00:28:24.000 Because we saw what happened with PayPal.
00:28:26.000 Their stock is down, I think, I don't know where it's at right now.
00:28:29.000 Let me check PayPal stock for the day.
00:28:31.000 Because it was up early.
00:28:34.000 Let's see where it's at.
00:28:34.000 So it's up 0.87%.
00:28:35.000 But in the past five days, they're down 10%.
00:28:40.000 So since this news broke over the weekend, they're down 10%.
00:28:44.000 Dude, while you're here, look at their last five years.
00:28:46.000 Look at what happened April 2020 to PayPal stock.
00:28:49.000 COVID is announced.
00:28:50.000 Guess how?
00:28:51.000 All these people made three times their money and then sold it all.
00:28:54.000 And then they went back to normal.
00:28:55.000 Someone tripled their money.
00:28:57.000 Yeah, but that means someone lost their money.
00:28:59.000 I imagine it was pension funds.
00:29:00.000 I think a cabal of organizations tripled their money by investing in PayPal through the pandemic and then got out.
00:29:07.000 And then a bunch of people that were trying to ride the wave probably bought in while it was up.
00:29:10.000 Yeah, people knew commerce was going to, of course, happen online and not in person and not in real life.
00:29:16.000 So it was an investment that a lot of people predicted.
00:29:19.000 I've heard stories from people who bought Moderna stock.
00:29:22.000 Someone told me this, they bought Moderna stock right at the start of the pandemic because they were like, oh, the media is talking about it.
00:29:27.000 And then what was that, like a 100 times increase or like a 40 times increase on your money?
00:29:32.000 Investing in these machines and then getting rich off them, like investing in PayPal, knowing that people have to buy online and they can't use, you know, mom-and-pop stores and bodegas.
00:29:44.000 So there was money to be made.
00:29:45.000 I don't think it's necessarily what you think it is, Ian.
00:29:48.000 It's probable that a lot of powerful elites were playing games.
00:29:51.000 I think a lot of it was people being like, I'm going to invest in PayPal and Amazon and Netflix.
00:29:56.000 And all of their stocks skyrocketed because people couldn't leave their houses.
00:30:07.000 The guy is very smart, you know, it's almost like he's psychic, like he just knows these things are gonna happen, or like his wife has some inside scoop or something.
00:30:14.000 Yeah, it's almost like his wife is the Speaker of the House and has an inside track on what bills are moving forward.
00:30:20.000 It's probably more likely that he's psychic.
00:30:24.000 It's almost as if his wife is making all the decisions in big industry, which of course is directly correlated to some of the big trades that he's making.
00:30:27.000 Welcome to the modern era, my friends. They went after Kanye West, man.
00:30:31.000 That's what happens.
00:30:34.000 Well, we still don't know exactly what happened here.
00:30:36.000 We could speculate it's probably because of his more spicy, controversial topics and conversations.
00:30:42.000 I've been seeing a little scuttlebutt of what he's been saying.
00:30:46.000 But again, when banks get more involved, when there's more social pressure, you know, this is creating a society that is essentially a social credit score.
00:30:58.000 You can't think the wrong things, you can't express the wrong things, and even if you do express the wrong ideas, at least have the ability to be able to, of course, have them challenged, have them questioned, because that's how you stop someone from believing in bad things.
00:31:11.000 You question those things and you actually talk them through it, instead of just censoring them, which actually pushes them to the further extremes, and has them go to places where they get more radicalized and they get more crazy belief systems, which, again, is psychologically proven to be true.
00:31:26.000 I think you're right that diplomacy and communication is the way. Kind of like how the United States was based on a bunch of people that didn't agree, got together, and then started figuring it out together, and they didn't have global wars. But the problem now is that we don't speak... I don't speak Russian, I don't speak Chinese, I don't speak Mandarin. So to debate and have conversation globally now, it'd be like if people in West Virginia spoke Mandarin, but people in Massachusetts spoke Russian, and we still expected those people to somehow work together. It wouldn't have happened.
00:31:52.000 They had a common language.
00:31:53.000 And now on the globe, we don't have a common language.
00:31:55.000 We kind of have English, but I don't understand the Russian guys speaking Russian.
00:31:59.000 And that's a big... And I don't understand what Kanye West is saying half the time, to be completely honest.
00:32:04.000 Oh yeah, we listened to one of his shows, him and Rogan, and Tim was like, I do not understand.
00:32:08.000 I was like, I get every word, man.
00:32:10.000 He's just a wild... He's an empath.
00:32:14.000 No, his Tucker Carlson interview was very coherent.
00:32:18.000 It was very good.
00:32:19.000 Yeah, we listened to that.
00:32:21.000 I thought it was eloquent there.
00:32:22.000 Yeah, and then Vice came out and was like, listen to what Kanye really said, but Tucker edited it out.
00:32:27.000 And I was like, it's just like one controversial thing.
00:32:30.000 He's a black Hebrew Israelite, I guess.
00:32:31.000 Kanye West believes that he's a true Jew.
00:32:33.000 And I'm like, I don't know.
00:32:35.000 Maybe Tucker Carlson took it out because it didn't flow with the conversation.
00:32:38.000 And it's like, they cut it for time.
00:32:39.000 Yeah.
00:32:40.000 Oh, because Judaism is passed through blood?
00:32:43.000 That's the idea?
00:32:44.000 And so he says his great ancestors were Jewish?
00:32:46.000 No, no, no, no.
00:32:46.000 Black Hebrew Israelites believe that black people are the true, you know, children of Israel or whatever, and that the people there currently are like occupying it and stole it from them or something.
00:32:57.000 Interesting philosophy.
00:32:59.000 When BLM first started getting really big a year or two ago, that was quite a difficult time for a lot of Jewish people, because a lot of anti-Semitism came from there, and there was a lot of sort of Jewish-Black rivalry stuff going on for about a year or so.
00:33:11.000 So I lived in Crown Heights in Brooklyn and there was a huge Jewish, like, there were shootouts between Jewish, Hasidic Jewish people and the Black community, like, It was weird.
00:33:23.000 And they just had like a mobile command at like this dividing line between the two neighborhoods.
00:33:28.000 And I just, I didn't, I didn't really understand it.
00:33:30.000 But then I, several years later when I started, I was covering a riot in Baltimore and these kids were talking about, they were, they were, they were Muslim.
00:33:36.000 And they started talking about a lot of this stuff and they were just like coming up to the cameras and then yelling stuff.
00:33:41.000 And then people started explaining to me like Farrakhan and all this stuff I wasn't super familiar with.
00:33:45.000 And I was like, ah, okay, I get it.
00:33:47.000 Right, so you've got these people who believe really, really anti-Semitic things.
00:33:51.000 We saw the reporting, from I think it was the New York Times and Tablet, that the heads of BLM were extremely anti-Semitic, spreading these conspiracy theories about Jewish people and all that stuff.
00:34:00.000 Well, this was a really weird time, because BLM was everywhere in the UK.
00:34:05.000 The UK, in some senses, is further left and more woke than America, and in other ways, in other senses, it's the opposite.
00:34:11.000 We tend to take our lead from you guys.
00:34:14.000 Something happens here and six months later it's like all over the UK.
00:34:18.000 And BLM of course, that happened in the UK and it almost doesn't make sense.
00:34:21.000 The statistics and everything in the UK didn't really make sense.
00:34:23.000 I think like three black people over 10 years had been killed by police or whatever.
00:34:29.000 We don't have as many black people in the UK.
00:34:31.000 It's a totally different thing.
00:34:32.000 But BLM went crazy.
00:34:34.000 And then I had to go to, like, soccer games, right?
00:34:36.000 And I would see just, like, the t-shirts had BLM all over them.
00:34:39.000 BLM was written everywhere.
00:34:40.000 And I'm just sitting there with my dad and that, and we're just watching it, like, as Jewish people.
00:34:44.000 We're like, where's my decision in this?
00:34:47.000 Because I know that the organization... I don't have a problem with the words Black Lives Matter, but that organization, the anarcho-Marxist anti-Semitic thing, why do I have to watch that on my football club now?
00:34:59.000 It wasn't nice to have to watch.
00:35:01.000 So you talk all about cults and stuff like that.
00:35:02.000 We mentioned earlier in the show that the wokeness is very much a kind of cult, but what are your thoughts on all this in the United States, in the UK, just generally in the West?
00:35:10.000 It's just a bit... I think it comes from status, I think you guys say, right?
00:35:17.000 Yeah.
00:35:18.000 There's a theory, and I owe a writer called Will Storr for this, about status.
00:35:24.000 We sort of evolved in tribes, and there were three types of status.
00:35:28.000 One was dominance, one was success, and one was virtue.
00:35:33.000 So you would get more of the food if you were particularly dominant in a tribe.
00:35:37.000 You would get more of the food if you were successful.
00:35:39.000 If you're successful in a tribe, you're maybe good at making the fire.
00:35:43.000 You invented the wheel.
00:35:45.000 Everybody's going to share their food with you.
00:35:46.000 Now, if you're not particularly dominant as a person, if you're not particularly successful, the third option was virtue.
00:35:54.000 And you had to show that you're a really nice guy.
00:35:56.000 You're going to share your food with people.
00:35:57.000 You're going to help someone if they need it.
00:35:59.000 Then they're going to share their food with you.
00:36:01.000 But the thing was, you didn't actually have to be virtuous.
00:36:04.000 You didn't have to actually help people.
00:36:05.000 You had to make it appear like you did.
00:36:07.000 You had to signal your virtue.
00:36:08.000 Exactly that.
00:36:09.000 Exactly that.
00:36:10.000 And they found, like, there have been studies showing that people who do that a lot are more likely to be psychopathic or narcissistic.
00:36:17.000 That makes sense.
00:36:18.000 I wonder if what we're seeing is basically there's like this...
00:36:23.000 Grand restart or something.
00:36:25.000 A great restart that's happening.
00:36:27.000 And what happens then is people who don't have purpose become wayward.
00:36:33.000 You know, what is it?
00:36:34.000 Idle hands are the devil's playground.
00:36:36.000 So you have a bunch of people who have nothing to do.
00:36:39.000 No specialties, no expertise.
00:36:41.000 They're just listless wandering NPCs or whatever.
00:36:44.000 Irrelevant.
00:36:45.000 And so in this hollow hole in their heart, they fill it with fake purpose.
00:36:51.000 And so they joined the machine, and that's why they're so adamant about never giving it up, no matter what.
00:36:56.000 Why is it that we had Larry Elder here, and he says, you know, I tried showing the quote, the transcription from Trump, where Trump did not say, you know, the very fine people stuff.
00:37:03.000 He said that white supremacists should be condemned totally, and they refused to read it.
00:37:08.000 It's because they have nothing in their hearts.
00:37:11.000 And so, you know, for me, I play music, I play Magic the Gathering, I skateboard, I rollerblade, we're writing songs, we talk politics.
00:37:21.000 I am very full of purpose.
00:37:24.000 But what about someone who has nothing?
00:37:25.000 What about someone who's not really that good at playing guitar?
00:37:27.000 What about someone who's not really good at playing games?
00:37:29.000 What if someone has no hobbies and they have no friends?
00:37:31.000 The only thing they have is to latch onto this ideology.
00:37:34.000 That's the one thing they have.
00:37:35.000 And then you come along, the nerve of you.
00:37:38.000 Trying to give them the truth, which would shatter the only thing they have in their hearts.
00:37:44.000 They won't allow it.
00:37:44.000 They go nuts.
00:37:46.000 And that's why they get violent on behalf of it.
00:37:48.000 It's the only thing they have.
00:37:50.000 It's their only passion.
00:37:51.000 You're spot on.
00:37:51.000 I think that's exactly it.
00:37:53.000 But I think it's the same reason people sometimes go really far, right?
00:37:56.000 It's the same reason people get into Scientology.
00:37:58.000 Or Mormonism or whatever it is.
00:38:00.000 Often they don't quite have something in their lives.
00:38:02.000 I always remember, you know, when I was a bit... I don't think I was woke but maybe liberal when I was like 18, 19 years old.
00:38:07.000 You go to university, right?
00:38:08.000 You know when you go and everybody's got in their dorms like posters of like, this is who I am.
00:38:12.000 I'm someone who likes the Godfather.
00:38:13.000 And it's like, well everyone likes the Godfather, you know?
00:38:15.000 But that's my individuality.
00:38:17.000 And hopefully as you get older you start to, as you say, take up more interests.
00:38:20.000 You don't identify, you know, everyone's got in their pronouns or whatever it might be in their Twitter profile.
00:38:25.000 And, you know, why do we need to know that?
00:38:27.000 Well, I want to know if you play guitar, as you say.
00:38:29.000 I want to know what you do in your life, what's interesting.
00:38:31.000 And those are the people, I think, sometimes who are led into cultish groups.
00:38:35.000 They don't have anything.
00:38:36.000 Like, you know, so I like to skate, was a rollerblading earlier today.
00:38:42.000 And every day when I do, I'm trying to one-up myself.
00:38:45.000 So I'm trying to go higher on the vert ramp or I'm trying to do something I've never done before.
00:38:50.000 And that's fulfilling and it's an accomplishment.
00:38:53.000 I'm challenging myself every day.
00:38:54.000 But if you're someone who doesn't have that, then the only thing you have to give you that dopamine release is going to be fitting in and having someone else praise you or feeling like, you know, you've latched on to something.
00:39:04.000 Or eating.
00:39:05.000 Yeah, eating.
00:39:05.000 That's another thing.
00:39:06.000 Maybe that's why a lot of them are morbidly obese.
00:39:08.000 No joke.
00:39:10.000 If you feel depression, you want serotonin, you eat.
00:39:12.000 That's a big thing.
00:39:12.000 That definitely raises your mood.
00:39:14.000 Well, yeah.
00:39:15.000 And the food's being engineered to become addictive.
00:39:17.000 But that's another process in itself.
00:39:20.000 I remember a couple of years ago looking at psychological studies, specifically when it came to radical jihadists.
00:39:25.000 And a lot of psychologists found that it was poverty and polyamory that led to, of course, people becoming radicalists because there wasn't enough women for the guys to go around.
00:39:36.000 One guy would marry 10 to 12 to 15 to 20 wives and there wasn't enough wives to go around because of the poverty, because of the lack of education.
00:39:43.000 A lot of people just You know, went to extremist groups.
00:39:47.000 So when you look at those conditions, when you look at what's happening in today's day and age in the dating world, when you look at what's happening financially with the banks and the big multinational corporations pretty much stealing everyone's money, we're pretty much creating the same situation for radicalization.
00:40:01.000 So yes, things are not going to be good with so many people radicalized, with so many people going to the fringes and going to these extremist groups that they're going to be taken advantage of with.
00:40:12.000 And now we have the cults becoming prominent in government, in major corporations and institutions.
00:40:20.000 Have you studied or read up on other more traditional cults?
00:40:23.000 Stuff like Scientology or NXIVM or those kinds of things?
00:40:27.000 Sure, I don't know about those specifically, those are more modern, but I mean, there have been, you know, was it Jim Jones, is that his name?
00:40:33.000 Yeah, yeah, yeah, Jonestown.
00:40:34.000 Jonestown.
00:40:35.000 That was mad.
00:40:36.000 I use those because I think we have, well actually, Scientology might be a good example as well, but I think Jonestown is, it's very definitive.
00:40:42.000 It had an end, there's a lot that we know about it.
00:40:46.000 Are there similarities or can you look at something like that and then try and predict where we go with the modern woke cult?
00:40:53.000 Well, Jonestown, I think, is really rare, just in terms of how... I mean, I use the word cultish to think, you know, something is cultish, maybe it's 1 out of 10 cultish, and Jonestown was 10 out of 10.
00:41:02.000 Scientology is like a 9 out of 10 or a 10 out of 10 or something.
00:41:05.000 Heaven's Gate was the other one.
00:41:06.000 Do you remember that one?
00:41:07.000 Oh yeah, was that where they thought they were going to go on the comet or whatever?
00:41:10.000 Yeah, yeah, yeah.
00:41:12.000 Did they drink the Kool-Aid too?
00:41:13.000 Yeah, so that's actually a misnomer, the Kool-Aid thing.
00:41:16.000 I think it wasn't Kool-Aid, it was something else, but it got remembered as Kool-Aid.
00:41:20.000 I love how this story is relatively modern, right?
00:41:24.000 It was in the 90s?
00:41:24.000 Yeah, I think so.
00:41:25.000 And now drink the Kool-Aid has become a slang term for buying into something.
00:41:30.000 And it's wrong.
00:41:31.000 Oh, it was Flavor-Aid.
00:41:32.000 It was Flavor-Aid?!
00:41:34.000 Oh, not Flavor-Aid!
00:41:35.000 It was cyanide poison.
00:41:38.000 And there's video, audio of them drinking it, and people didn't want to.
00:41:43.000 They were refusing.
00:41:43.000 They were screaming.
00:41:44.000 They were taking babies away from mothers to poison the baby, and the mothers were screaming.
00:41:48.000 It's horrific.
00:41:49.000 It's on YouTube.
00:41:50.000 Are you talking about Jim Jones now?
00:41:51.000 Jonestown.
00:41:52.000 That's all Jonestown.
00:41:53.000 I don't know about Heaven's Gate.
00:41:55.000 So Jonestown, people believe that they all killed themselves and they're like, how did that happen?
00:42:00.000 And that's not really what happened.
00:42:01.000 They tried to escape, as you say, and there were security guards just shooting them and killing them all dead.
00:42:05.000 Why though?
00:42:06.000 It's like a 40 minute video on YouTube.
00:42:08.000 Because they're cultists.
00:42:09.000 The guy in charge, Jim Jones, was being caught.
00:42:12.000 I think the FBI, I think it was, or would it have been the CIA, were in Guyana sort of checking up on him and he knew his days were numbered.
00:42:18.000 So it was like, okay, this is the next step.
00:42:20.000 We all have to, we all have to kill ourselves now.
00:42:22.000 So I think it's rare that people get absolutely swept up to a point that they're willing to actually kill themselves like that.
00:42:28.000 I think there's, and I think that's the same with a lot of, you know, we talk about the woke ideology stuff.
00:42:32.000 I think deep, deep down.
00:42:33.000 And the woke stuff goes back to the New Puritans.
00:42:35.000 I talk about the Puritans.
00:42:36.000 Andrew Doyle is great on that.
00:42:38.000 I interviewed him recently.
00:42:39.000 He's fantastic.
00:42:40.000 He's got this book called The New Puritans, just about how similar the witch-hunting stuff is to a lot of the woke ideology.
00:42:46.000 And they knew it wasn't real, the witch-hunting people, but they knew that if they said anything, just like now, if they said anything, they'd be next.
00:42:52.000 You'd get shouted down.
00:42:54.000 So that's probably the closest thing, I think, in terms of like historic cults and how that relates to today.
00:42:59.000 So maybe in 50 to 100 years, they'll talk about the cancel culture trials and how absurd and crazy everybody was, and we'll come out on the right side of this one.
00:43:08.000 So it's that they think that if they cause enough pain to the enemy, that they're actually causing good to the ones they... I mean, I don't understand, like, what was the impetus of the witch hunts?
00:43:17.000 Why were they doing it?
00:43:18.000 I've heard that they were on ergot, that they were inadvertently dosing themselves with ergot.
00:43:22.000 Oh, I don't know.
00:43:23.000 It's a fungus on rye and other grains that causes hallucinations.
00:43:25.000 Yes, correct.
00:43:26.000 And so they were tripping balls.
00:43:27.000 They didn't know it.
00:43:27.000 So they thought he's a witch.
00:43:28.000 She's a witch.
00:43:29.000 I'm a witch.
00:43:29.000 They didn't know.
00:43:30.000 It also has to do with the fact that people have like the hag's dream.
00:43:33.000 You've never heard of that?
00:43:34.000 The hag's dream, when you fall asleep and you feel like someone's in the room.
00:43:36.000 It's just an evolutionary trait: you think, oh, there's something in the room, and you're awake, but you're not moving.
00:43:42.000 Your body keeps you from moving.
00:43:43.000 I get that.
00:43:43.000 Sleep paralysis.
00:43:44.000 I get that.
00:43:45.000 My girlfriend has to wake me up.
00:43:46.000 I'm like, in my head, I'm screaming.
00:43:49.000 I think I'm screaming.
00:43:50.000 I'm going to go.
00:43:52.000 But what's actually happening is I'm going.
00:43:55.000 Which is freaky for my girlfriend at four in the morning.
00:44:03.000 But the thing with the witch thing, which also relates to modern times, because you guys were talking before about how, when it's powerful people, the same rules don't apply.
00:44:11.000 The girls that were accusing everyone, they did, towards the end, when it was falling apart, they did accuse people who were quite prominent.
00:44:18.000 And everybody just sort of went, what?
00:44:19.000 Shut up.
00:44:20.000 And then they had to shut up after that.
00:44:21.000 The witch hunts?
00:44:22.000 Yeah.
00:44:23.000 Wow.
00:44:24.000 They tried to reach too high.
00:44:26.000 And then the powerful people said, no.
00:44:27.000 It's kind of like when they went after Rogan with the Ivermectin thing.
00:44:29.000 Yes.
00:44:30.000 Well, not just that.
00:44:31.000 They went after Rogan for a lot of things and it just doesn't work.
00:44:34.000 Nothing sticks.
00:44:34.000 Yeah, sometimes it's overt nonsense, you realize.
00:44:38.000 Yeah.
00:44:38.000 Well, that's what happens eventually.
00:44:40.000 The veil gets shattered, I guess.
00:44:42.000 And people are just like, okay, wait a minute.
00:44:44.000 You know, what's really going on here?
00:44:46.000 But I think it's like most things, it's just confidence.
00:44:48.000 Do people have the confidence of enough other people to engage in insane behavior?
00:44:56.000 People were like, even really intelligent people, I mean all of us, probably get misled and led down certain rabbit holes and we don't even realise it.
00:45:03.000 My favourite example is Arthur Conan Doyle who wrote Sherlock Holmes and he's supposed to be the master of deduction, Sherlock Holmes, so Arthur Conan Doyle's a super clever person.
00:45:11.000 And there's a great book called The Intelligence Trap about exactly this.
00:45:14.000 The cleverer the person is, the better able they are to convince themselves of mad conspiracies.
00:45:20.000 Because they're smart.
00:45:21.000 So it's confirmation bias.
00:45:23.000 So Arthur Conan Doyle believed in fairies.
00:45:25.000 And that was at a time when people did not believe in fairies.
00:45:28.000 It wasn't like, oh, it was back in the olden times.
00:45:30.000 But he just was obsessed.
00:45:32.000 And he was mates with Houdini, the magician.
00:45:34.000 And Houdini was like an absolute skeptic.
00:45:36.000 And they fell out over that.
00:45:38.000 What?
00:45:38.000 Yeah, big argument over that.
00:45:40.000 And also because Arthur Conan Doyle kept trying to get Houdini to do this clairvoyant stuff and speak to his dead mother and mad stuff like that.
00:45:47.000 And Houdini's like, it's not real.
00:45:49.000 Yeah.
00:45:49.000 And Arthur Conan Doyle was like, those fairies are real!
00:45:51.000 But the fairies were a prank by some young girls.
00:45:53.000 They just put up some pictures of fairies, like little paper fairies, and put like a drawing pin in the stomach.
00:46:01.000 And he believed that.
00:46:01.000 And he thought that the drawing pin, the little pin in the stomach, that was evidence that they had belly buttons and so they had children.
00:46:08.000 It was like fairies can reproduce.
00:46:10.000 That was what he was thinking.
00:46:11.000 This is a super smart guy.
00:46:12.000 So I think what's happening is, as you said, there's all these people who are perhaps really intelligent.
00:46:17.000 We like to think of, oh, you joined Scientology, you must be an idiot.
00:46:20.000 A lot of them are really intelligent.
00:46:21.000 I've met some really intelligent ex-Scientologists.
00:46:24.000 But maybe they're lacking some sort of purpose.
00:46:25.000 And then someone comes along and says, no, no, you've got purpose.
00:46:28.000 You're going to sign a 1 billion year contract, which is what they have to do.
00:46:31.000 And you're going to, with each bit of money you put in, you're going to learn more secret stuff about aliens and Lord Xenu.
00:46:37.000 That's Scientology?
00:46:38.000 That's Scientology.
00:46:38.000 That's a billion year contract?
00:46:40.000 Yeah.
00:46:41.000 What?
00:46:42.000 Because you're immortal or something?
00:46:44.000 Do you know the Scientology backstory?
00:46:46.000 Only a little bit.
00:46:47.000 About the Thetans.
00:46:48.000 Yeah, yeah.
00:46:49.000 Dropping them in the volcano and stuff.
00:46:51.000 So they now say this isn't what they think, but it is.
00:46:54.000 It is.
00:46:54.000 This is the problem with the internet.
00:46:55.000 It's ruined cults because cults are supposed to be, they're supposed to be a level that you can't reach because it's like the secret level, right?
00:47:01.000 And until you put enough money in.
00:47:02.000 But now we have the internet.
00:47:04.000 So it's just there.
00:47:05.000 So now Scientology has to go out and say, no, no, no, that's not true.
00:47:08.000 Until you get to that stage and go, yeah, it was true actually.
00:47:11.000 Um, so it is, yeah, Lord Xenu. They all lived somewhere else, and he killed everyone on this alien planet, and all the spirits went away to Earth, into the volcanoes of Earth, and then they went out and into everybody's bodies now.
00:47:24.000 And now you have like a bunch of ghosts inside you or something?
00:47:27.000 Is that it?
00:47:27.000 Yeah.
00:47:28.000 I was in Hollywood and I was skating.
00:47:31.000 And yeah, they got the Scientology thing on Hollywood Boulevard.
00:47:33.000 I think it's on Hollywood.
00:47:34.000 Is that the street?
00:47:34.000 Yeah, it's right next to it.
00:47:35.000 It's between Sunset and Franklin, or Hollywood and Franklin.
00:47:38.000 So I'm skating and then I see a guy with the book and he's outside and he's waving as I'm coming up, so I stop.
00:47:44.000 And then he's like, hey, how's it going?
00:47:45.000 He talks to me.
00:47:46.000 And he asked me if I knew anything about Dianetics, because I think that's what the book says, like Scientology.
00:47:52.000 And then I was like, no.
00:47:53.000 And he was like, do you know anything about Scientology?
00:47:55.000 And I was like, just what I've seen from the TV.
00:47:57.000 I was like, don't you guys believe like aliens and like volcanoes?
00:48:01.000 And he goes, that's what the cartoon says.
00:48:04.000 Do you get all your facts from cartoons?
00:48:05.000 Self-taught.
00:48:06.000 And I was like, no.
00:48:07.000 And he goes, oh, so how about we actually tell you what we do?
00:48:10.000 And I was like, sure.
00:48:11.000 They came in and sat down and then they gave me the e-meter.
00:48:13.000 And then yeah, yeah, for real.
00:48:15.000 And then I put my hands on it or whatever.
00:48:16.000 And I'm like, nothing's happening.
00:48:18.000 I have no idea what you're talking about.
00:48:19.000 And then the guy asked me if I would be interested in better understanding and buying the book.
00:48:23.000 So I actually bought Dianetics.
00:48:25.000 Now I'm completely skeptical.
00:48:27.000 I think it was like 20 bucks.
00:48:29.000 But I was like, how can I be critical of something I've never actually read?
00:48:31.000 If I don't know what they're talking about, it's simple for me to watch South Park and then be like, the TV said you're dumb.
00:48:38.000 But that would be stupid.
00:48:39.000 And so I think I actually got maybe like 50 pages in and then laughed and then put it down.
00:48:43.000 I was like, I can't read this.
00:48:45.000 Because I think for me, it seemed really obvious how the manipulation worked.
00:48:51.000 It's like my view of it is it's a false logic.
00:48:54.000 Like, it'll say something, hey, did you ever experience this feeling?
00:48:59.000 That's because of this thing.
00:49:00.000 And then you go, oh, I could see how that works.
00:49:02.000 If they can make you, using sophistry, go from point A to point B to C to D to E, they're walking you towards, they're creating the reality.
00:49:10.000 They're building the logical structure for you. And I read that and I'm like, you know, people fall for this stuff.
00:49:17.000 They read it and believe it.
00:49:18.000 Speaking on the integrity of Scientology, I lived in Hollywood and did a video, a movie for the Scientology group.
00:49:25.000 They had me and my girlfriend come in.
00:49:26.000 They want to do something on the right to marriage.
00:49:28.000 So they were like, okay, for this, you guys are married.
00:49:31.000 Tell everyone how happy you are.
00:49:32.000 So we're like, okay.
00:49:33.000 So we faked it, we lied that we were. They got a non-married couple to pretend like they were married because they liked the way they looked.
00:49:40.000 You had to lie that you were happy.
00:49:42.000 No, we had to lie that we were married.
00:49:43.000 So this is like a church, technically, having non-married people pretend that they're married.
00:49:49.000 Like it's, it's a lie.
00:49:50.000 I was lying for the Scientologists and they paid me to lie for them.
00:49:54.000 Yeah, sounds like them.
00:49:56.000 I want to, I want to jump to another segment, but we're going to, we're going to grab this super text that just came in.
00:49:59.000 Sleep is the cousin of death, says Tim, making fun of Scientology, but thinks God's real.
00:50:04.000 Oh boy.
00:50:04.000 You see, this is, this is the challenging thing, I guess, about trying to talk to people about physics, spirituality, reality, understanding, philosophy, etc., etc.
00:50:17.000 I could be wrong, but Einstein believed in God.
00:50:19.000 He was not an atheist.
00:50:20.000 That's where the phrase Einsteinian God comes from.
00:50:22.000 Maybe it's wrong, but the phrase Einsteinian God is used to explain to people who don't understand, I guess, how do you describe it?
00:50:36.000 Concepts beyond knowledge, right?
00:50:39.000 Infinity.
00:50:39.000 What does infinity truly mean?
00:50:41.000 How do you perceive dimensions beyond three?
00:50:45.000 So, there are certain ideas that we understand can exist, and that we can describe, but that are hard for us to conceptualize.
00:50:50.000 So, with the issue of Einsteinian God, you talk to your average person, you say, like, describe God, and they'll say, you know, a guy with a beard, he's in the clouds, and he's got white hair, whatever.
00:50:59.000 And that's like a weird cartoon depiction of some religious deity.
00:51:04.000 But this is why I say I don't believe in any organized religion.
00:51:08.000 The simplest way to put it is, God, in my view, is just that there is a system within which we exist.
00:51:17.000 We understand the system of a computer program.
00:51:20.000 We understand that we as humans are mapping out reality.
00:51:24.000 The simplest way to put it is God is the universe.
00:51:26.000 There is a greater power that exists beyond us.
00:51:28.000 It is the structure and code of the universe.
00:51:30.000 It exists.
00:51:31.000 That's the most rudimentary way to explain it to someone who thinks God is a person in the clouds.
00:51:35.000 Now, a lot of religious people genuinely do believe that God is a person who is above us or around us or whatever, and that's on them.
00:51:40.000 But if your assumption immediately is when I say, I believe in God, you think I'm talking about a guy in the sky, then you genuinely do not know enough about the subject.
00:51:50.000 And I don't mean that disrespectfully.
00:51:52.000 I mean, my view of this comes from reading religious texts, growing up briefly Catholic, reading about quantum physics, reading philosophy, and then going...
00:52:02.000 We don't know half.
00:52:04.000 We don't know anything.
00:52:05.000 We know so little.
00:52:07.000 And from that perspective, I would need probably a couple hours.
00:52:12.000 We've done this in the Members Only shows, breaking down the long trail of thought to understand the concept that I'm trying to get to.
00:52:19.000 There's no way I could do it in 10 minutes without making a whole show dedicated to it.
00:52:24.000 But simply put, my view of God is not a guy in the clouds.
00:52:28.000 It is well, well, well, well beyond that.
00:52:31.000 I want you to imagine infinity.
00:52:33.000 You know what infinity means, right?
00:52:34.000 A lot of people think infinity is a number.
00:52:35.000 It's not.
00:52:36.000 Infinity means everything beyond and endlessly.
00:52:40.000 I want you to imagine things that don't exist.
00:52:42.000 I want you to imagine a color you've never seen before.
00:52:44.000 I want you to imagine the fourth dimension.
00:52:45.000 How do you do it?
00:52:46.000 You can't.
00:52:47.000 You can understand a three-dimensional projection of the fourth dimension, and we can mathematically show how a fourth dimension will work, even though we can't conceptualize it outside of just drawing a mathematical picture.
00:52:57.000 So there's so much to break down to get through that.
00:53:00.000 I'll leave it there.
00:53:01.000 Maybe we'll talk about it when it comes to cults and we'll do it in the Members Only section.
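On the point about a three-dimensional projection of the fourth dimension: that projection can actually be computed. A minimal sketch (illustrative only; plain Python, standard library) that builds the 16 vertices of a 4D hypercube and perspective-projects them into 3D, the same way a 3D cube gets flattened onto 2D paper:

```python
from itertools import product

def tesseract_vertices():
    # All 16 corners of the unit 4-cube: every coordinate is -1 or +1.
    return list(product((-1.0, 1.0), repeat=4))

def project_4d_to_3d(vertex, camera_w=3.0):
    # Perspective projection: divide x, y, z by the distance from a
    # "camera" sitting on the w-axis, so points farther away in w shrink.
    # This is the 4D analogue of drawing a 3D cube on flat paper.
    x, y, z, w = vertex
    scale = 1.0 / (camera_w - w)
    return (x * scale, y * scale, z * scale)

if __name__ == "__main__":
    for v in tesseract_vertices():
        print(v, "->", tuple(round(c, 3) for c in project_4d_to_3d(v)))
```

The inner cube of the familiar tesseract drawing is just the eight vertices with w = -1, shrunk by the projection.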
00:53:04.000 But let's jump to this story from CBS12.com.
00:53:09.000 NBC reporter's interview with PA Senate candidate John Fetterman draws criticism.
00:53:13.000 So in this interview with NBC Fetterman, who's running as a Democrat in Pennsylvania, he can't actually understand the words that are being said to him because in May he suffered a very serious stroke and it was debilitating and it caused him very serious brain damage.
00:53:28.000 I'm not saying that as a pejorative or to be disrespectful.
00:53:32.000 He literally is suffering from this.
00:53:34.000 In the interview, let me play it here.
00:53:36.000 I'm not gonna, I don't care to play her audio.
00:53:40.000 They show in the, I think I just passed it actually.
00:53:43.000 So let me see if I can play it.
00:53:45.000 And well, you know, I'll pull the audio up.
00:53:48.000 What has the press been like?
00:53:49.000 And are you confident going into that debate?
00:53:51.000 It's gonna be...
00:53:52.000 So they do editing.
00:53:54.000 Because, uh, here you go.
00:53:56.000 There you go.
00:53:56.000 In this image, you can see right here, in order for Fetterman to actually answer the questions, they have a special program that transcribes the words she's saying into written text so he can read it and then answer.
00:54:10.000 This led to a whole bunch of people saying he's clearly not mentally fit to do the job.
00:54:14.000 How is he going to stand on the Senate floor and debate an idea when people are saying, John, you are wrong.
00:54:19.000 Your bill would do X, Y, and Z. And he goes, I, I can't understand anything.
00:54:23.000 I can hear the words, but my brain can't process it.
00:54:25.000 This is not someone being deaf.
00:54:27.000 A bunch of these woke journalists, Democrats and liberals were like, we don't discriminate against people who can't hear.
00:54:32.000 He can hear.
00:54:33.000 He can hear perfectly.
00:54:34.000 His brain cannot process the audio.
00:54:36.000 That to me is very, very serious.
00:54:39.000 I'll tell you this, man.
00:54:39.000 Yeah.
00:54:40.000 They voted for Joe Biden.
00:54:42.000 Hopefully, hopefully they learned their lesson.
00:54:44.000 I think a lot of people will have learned the lesson.
00:54:46.000 I think a lot of people won't.
00:54:47.000 But to all these people, I say to you, you voted for Joe Biden, right?
00:54:51.000 You thought, how bad can it really be?
00:54:52.000 How bad can it be?
00:54:53.000 It's better than Trump.
00:54:55.000 When you vote for someone who is cognitively impaired, you get disaster, you get crisis, you get deficits, you get the shutting down of Keystone and skyrocketing gas prices.
00:55:05.000 And then a weird justification in the media that shutting down and curtailing U.S.
00:55:08.000 energy had nothing to do with the fact that gas is skyrocketing at now six, seven bucks in California.
00:55:13.000 If you vote in this man, and I feel bad for him.
00:55:16.000 I mean, no disrespect.
00:55:17.000 I mean, a stroke is serious.
00:55:19.000 I feel bad, and I hope, I wish him the best.
00:55:21.000 But we need people who are physically and mentally capable to do the job.
00:55:25.000 I think they'll vote for him though.
00:55:27.000 Well, it's not only that.
00:55:29.000 Outside of politics, if you suffer a serious brain injury, you gotta rest.
00:55:33.000 You gotta relax.
00:55:34.000 You gotta give your brain time to heal correctly.
00:55:38.000 And just by shoving him in front of the camera, shoving him on stage all the time, that's not good for an individual who suffered a severe injury to his mind.
00:55:47.000 And, you know, if people really did care about this individual, if people really were looking out for his best interest, they would be making sure he healed correctly before putting him up on the stage.
00:55:54.000 Yeah.
00:55:58.000 I mean, I've had a brain injury as well. I had a brain injury in 2015. I fell asleep driving and hit an accident response vehicle on the road, and I cannot even imagine being put on stage, let alone, you know, having to be a representative for people that are voting me into power in the United States.
00:56:12.000 Like, I don't remember if he's running for the House, Congress, I forget what he's running for now.
00:56:15.000 But I would never want to be in that situation.
00:56:17.000 It would be unbelievably hellish.
00:56:18.000 It would be extremely stressful because just coming out of that is already such a challenge.
00:56:22.000 What did you do to recover?
00:56:25.000 I spent some time with my parents, actually in the UK, funnily enough.
00:56:28.000 But it took me, I would say, you feel like you're recovered after about a year, two years, but you really only realize maybe five years later that, wow, okay, now I'm fully recovered from my brain injury.
00:56:38.000 So I honestly can't even remember or imagine what he's going through right now.
00:56:42.000 Not to give him grief or anything like that, but I would never, never put myself in the situation where I'm being a representative for other people, let alone being forced to read in order to respond to everything.
00:56:52.000 That's wild, wild.
00:56:53.000 A lot of soldiers on the front lines of battlefields are told to not be on the battlefield for too long because of the concussions and the shooting and the grenades and the explosions and the sounds literally rock their heads to the point where they get serious brain injuries.
00:57:06.000 They're told literally get off the battlefield, go into a dark quiet room and make sure there's no stimuli, make sure you could actually rest and relax your brain.
00:57:17.000 So, when you're put on stage, when you're being quizzed during an interview, when you're taking questions from individuals, you're not allowing your brain to rest.
00:57:24.000 When you can't even interpret sounds, you're not allowing your brain to rest.
00:57:29.000 It's just a sad situation overall, and it's sad seeing him slur his words.
00:57:36.000 A lot of people are using this for political talking points, but it's beyond that, I think.
00:57:41.000 But, but, come on.
00:57:42.000 The dude should have dropped out.
00:57:44.000 The Democratic Party should have said, look man, you had a stroke, I'm sorry.
00:57:47.000 But I think, as much as I can say, I'm sorry to the man for having a stroke and I wish him the best, where I will criticize him heavily is his arrogance.
00:57:56.000 His arrogance in thinking, I'm gonna keep going.
00:58:00.000 Okay, man.
00:58:02.000 We see what happens with Joe Biden and now you're going to get it in Pennsylvania when the people vote for this guy.
00:58:07.000 It's going to be bad.
00:58:08.000 He's going to be in a meeting and someone's going to say something like, there's an emergency and we have to act now.
00:58:14.000 There's a crisis on the bridge.
00:58:15.000 He's going to be going, I don't know what you're saying.
00:58:19.000 You need to call it in now.
00:58:21.000 We've got a flood.
00:58:22.000 There's damage.
00:58:22.000 There's people dying.
00:58:24.000 Can someone get a CC machine in here so I can understand?
00:58:27.000 What do you do?
00:58:28.000 I'm sorry, man.
00:58:29.000 This is a guy, look, I get it.
00:58:31.000 He's running for Senate.
00:58:31.000 He's not running for, you know, he's Lieutenant Governor now.
00:58:34.000 That's bad enough.
00:58:34.000 He should have resigned.
00:58:35.000 I don't know if he's still in that position, but he's going to be on the Senate floor and they're going to be saying there's an emergency.
00:58:41.000 They're going to bring him in and they're going to have to get him a special computer.
00:58:44.000 Look, technology is fantastic.
00:58:47.000 Before this technology existed, even maybe 10 years ago, he'd be done.
00:58:52.000 That's it.
00:58:53.000 Sorry.
00:58:53.000 Have a nice day.
00:58:54.000 You can't understand words being said to you.
00:58:56.000 You can't do this job.
00:58:57.000 And because of technology we've invented that can transcribe the words into text, which is a relatively new thing, he is now saying, okay, I can do it.
00:59:05.000 Yo, I have tried using transcription software.
00:59:09.000 There have been many circumstances where like Biden's giving a speech and I need to grab like a chunk of the quote.
00:59:15.000 And I'm like, I got to write this down.
00:59:17.000 So what I'll try to do is I'll download voice-to-text or I'll use the internal voice-to-text and it's full of errors.
00:59:23.000 And what happens when someone says something like, imagine this person saying, listen here my sister, and then it says mister instead of my sister.
00:59:30.000 There's like words get jumbled too quickly and then he's reading it and he reads the wrong thing and gives a bad answer.
00:59:36.000 Yeah, they make mistakes all the time.
00:59:38.000 It's not perfect technology.
00:59:40.000 It's new technology.
00:59:41.000 You know, whenever you have captions, even on social media, you have to look through every single word because there could be a big mistake there that the algorithm could find and pick up and automatically ban you for even using specific words on a lot of the social media platforms.
00:59:55.000 Imagine!
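The error problem being described here is measurable: transcription quality is conventionally scored as word error rate, the word-level edit distance between what was actually said and what the software produced, divided by the length of the true sentence. A minimal sketch (standard dynamic programming; no speech library assumed), using the "my sister" versus "mister" example from above:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    # Levenshtein distance over words: the substitutions, insertions,
    # and deletions needed to turn the hypothesis into the reference,
    # normalized by the reference length.
    ref, hyp = reference.split(), hypothesis.split()
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# The example from the conversation: "my sister" heard as "mister".
print(word_error_rate("listen here my sister", "listen here mister"))  # 0.5
```

A 50 percent error rate on a four-word phrase is exactly the failure mode being worried about here: one mis-heard word changes who is being addressed.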
00:59:56.000 But I just kind of want to point out, is it really him or is it the people behind him?
00:59:59.000 Because again, this is a man who suffered a serious injury.
01:00:02.000 Does he even, like... how aware is he of the current situation?
01:00:07.000 And a lot of people, when they're injured, they don't want to see themselves as injured.
01:00:11.000 A lot of times, they don't want to heal.
01:00:12.000 They don't want to rest.
01:00:14.000 So he might be pushed by other individuals, by other interests, who are saying, this is too much of an important race for us to lose.
01:00:22.000 There might be a Democratic machine, special interest, big money saying, hey, we invested in you.
01:00:26.000 We want our payout.
01:00:27.000 We don't care.
01:00:28.000 Go out there on stage.
01:00:29.000 Sing and dance for us.
01:00:30.000 Because a lot of politics is that.
01:00:33.000 Big money controlling individuals to be their puppets.
01:00:35.000 Politicians are puppets, in my opinion.
01:00:38.000 And to me, there's a lot of puppet hands behind this major race that the whole nation is looking at.
01:00:42.000 Here's another interesting tidbit.
01:00:45.000 He probably could watch this podcast of us critiquing and talking about his health, and he wouldn't understand it at all.
01:00:53.000 The closed captioning auto-generated on these videos or whatever.
01:00:56.000 I'm thinking like, what if Ben Shapiro was trying to tell him something?
01:01:01.000 Ben Shapiro talks fast?
01:01:03.000 He does.
01:01:04.000 You know, imagine someone says, John, you gotta watch the latest Daily Wire with Ben Shapiro.
01:01:08.000 He says a bunch of things you'll want to be apprised of.
01:01:13.000 He's been talking really, really fast, and there's a prominent issue when it deals with taxes in the United States government, and we've got all these senators that are coming in, and he's gonna be like, the voice-to-text doesn't work on this.
01:01:22.000 It's really sad, you know.
01:01:24.000 I'm thinking the whole time, like, yeah, but maybe it can work, and I don't think it can.
01:01:28.000 I don't think it can work, unfortunately.
01:01:29.000 We're not there yet.
01:01:30.000 Yeah, I can understand why he'd want to do it, and I understand.
01:01:33.000 Do you think he's more likely to win now, though?
01:01:35.000 Well, right now, the aggregate polling has him six points up.
01:01:39.000 But in the past several elections we've seen aggregate polling overstate Democrats by about seven points, so it looks like it may actually go to Dr. Oz.
01:01:49.000 And NBC, in this interview, said the race is now a toss-up.
01:01:53.000 I think the average person in Pennsylvania is sitting there going like, I'm sorry, dude.
01:01:58.000 You know, I like you, but you can't do the job.
01:02:02.000 Look, if I need to hire someone to pick up boulders, and you show up and you're in a wheelchair, what am I supposed to do?
01:02:07.000 I mean, I don't even know what the legality is on that, but if you can't pick up a boulder, how do I hire you to pick up a boulder?
01:02:12.000 If we're hiring a person to argue, debate, and push policy, and he can't understand the words that are being said to him, how does he do the job?
01:02:19.000 I'm shocked that he's demanding that people actually support him in that regard.
01:02:23.000 Yeah, that's what bothers me.
01:02:25.000 I want to like him, and I don't because of his...
01:02:28.000 Arrogance was a good word, actually.
01:02:29.000 It's arrogance.
01:02:30.000 Yeah, it's really annoying.
01:02:31.000 I broke my hand, but I'm playing the guitar!
01:02:33.000 You can't stop me!
01:02:34.000 Pull back, bro, and then run again in like four years.
01:02:37.000 You can run again.
01:02:38.000 Run when your body's rested and healthy.
01:02:40.000 Do a float tank.
01:02:41.000 I don't know if psilocybin's right for you.
01:02:43.000 Do something to help repair neural pathways.
01:02:44.000 You're okay, but don't take on this responsibility right now.
01:02:48.000 There have been some studies with psychedelic mushrooms showing that they could actually heal some kind of brain damage.
01:02:54.000 I'm not a doctor.
01:02:55.000 I'm not here trying to give any medical advice, but there are some preliminary studies that I think are worthwhile to look at.
01:03:00.000 It's called neurogenesis, and it's how you regrow brain cells.
01:03:03.000 It's the function of regrowing brain cells.
01:03:06.000 You can do it, John!
01:03:07.000 Modern politics!
01:03:09.000 But I think that's illegal, though.
01:03:10.000 I think that's illegal in Pennsylvania.
01:03:11.000 Yeah, you would need to work with a doctor.
01:03:14.000 In Washington, D.C., it's legal.
01:03:16.000 Really?
01:03:16.000 Yeah.
01:03:17.000 Like psilocybin recreationally or medicinally?
01:03:20.000 Medicinally.
01:03:20.000 I know someone taking it in Washington, D.C. right now.
01:03:24.000 I think Colorado, it's recreational, right?
01:03:25.000 And Oregon?
01:03:26.000 Yeah.
01:03:27.000 Well, all drugs are decriminalized in Oregon, so you can find it there.
01:03:29.000 Really?
01:03:30.000 Is that what's going on in Portland?
01:03:32.000 I think so.
01:03:33.000 That's part of what's going on in Portland.
01:03:35.000 That must be it.
01:03:36.000 That explains everything.
01:03:38.000 You got a whole bunch of people showing up wearing all black.
01:03:39.000 Some are on shrooms, some are on acid, some smoke too much, some drank too much, some injected too much.
01:03:44.000 Sounds like it.
01:03:44.000 And it's all just...
01:03:47.000 Some people did it all.
01:03:48.000 I think it's a lot more complicated than that, especially with George Soros' involvement, but that's another topic to discuss here.
01:03:54.000 I think Switzerland is also that way, and I think they've got, I remember reading about, they've got these little sort of cabins in their parks where you can go, and I don't know on this channel if you can say, but the H that you inject, Heroin.
01:04:07.000 Can you say that?
01:04:08.000 Sean doesn't let me say that on his channel.
01:04:10.000 You can inject heroin.
01:04:11.000 You go in and it's all clean and you just go and inject.
01:04:14.000 There's someone there to help you administer it.
01:04:16.000 Harm reduction.
01:04:16.000 That's what they do there.
01:04:19.000 Let's just jump to a few weird and wild stories because it's fun.
01:04:24.000 From The Hill: Elon Musk says he sold 10,000 of burnt hair perfume.
01:04:31.000 10,000 of what?
01:04:32.000 Bottles?
01:04:33.000 10,000 bottles?
01:04:34.000 There you go.
01:04:35.000 Elon has sold 10,000 bottles of burnt hair perfume through his business, The Boring Company, earning more than $1 million in sales from the product.
01:04:42.000 You're in the wrong line of work.
01:04:44.000 That's just it.
01:04:45.000 The billionaire announced the news in a series of tweets he called his burnt hair perfume.
01:04:49.000 Doesn't get more lit than this.
01:04:51.000 No way, that's amazing.
01:04:55.000 Burnt hair.
01:04:56.000 Boring company.
01:04:57.000 Love how it says singed on the bottom there in subtext.
01:04:59.000 He already made a million dollars.
01:05:01.000 He made a million dollars.
01:05:02.000 It's a hundred dollars?
01:05:03.000 Yep.
01:05:04.000 Let the flames begin.
01:05:05.000 Why?
01:05:05.000 I don't...
01:05:07.000 You know what, man?
01:05:07.000 I got mad respect for it.
01:05:09.000 That's what it's all about.
01:05:10.000 Congratulations, Elon Musk, on your burnt hair perfume.
01:05:12.000 Singed.
01:05:13.000 It's amazing.
01:05:14.000 Man can sell anything.
01:05:15.000 He changed his Twitter bio to a perfume salesman, I believe.
01:05:18.000 Wow, wow, wow.
01:05:19.000 Really?
01:05:19.000 Yep.
01:05:21.000 Well.
01:05:22.000 Dude gets it.
01:05:23.000 Yeah, I wonder why it is, and I talk about this, why people who are wealthy don't do more weird stuff.
01:05:30.000 What do they do with their money?
01:05:32.000 I was talking earlier about some big famous musicians and how they make millions of dollars playing shows and I'm like, but you never hear anything from them.
01:05:38.000 It's like they play the show and then they disappear.
01:05:41.000 Where does all that money go?
01:05:42.000 Does it just sit somewhere?
01:05:44.000 Are these people sitting there thinking like, well, I'm gonna be 70, and then when I die, I'll have won with the most points.
01:05:50.000 Is that how it works?
01:05:51.000 I'm like, do something!
01:05:52.000 I just genuinely don't get it.
01:05:54.000 I just think about, we were talking about Trevor Noah leaving The Daily Show, and I think his salary is reported as like $16 million.
01:06:02.000 But you never see anything from him.
01:06:04.000 What is he doing with all that money?
01:06:06.000 16 million bucks a year?
01:06:07.000 So he's on the show for what, a decade?
01:06:08.000 His net worth is 160 million at minimum, if he didn't invest it.
01:06:12.000 So let's say, or if he spent some, maybe it goes down to 150 if he spent a million bucks a year.
01:06:18.000 That's an insane amount of money.
01:06:19.000 You can buy, like, an army of giraffes and have them march through New York.
01:06:24.000 You could do, like, I don't get it.
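For reference, the back-of-the-envelope math in that exchange, using the conversation's own figures (the salary is a reported estimate; the spending rate is the hypothetical from the example):

```python
salary_per_year = 16_000_000   # reported salary figure from the conversation
years_on_show = 10             # rough tenure used in the conversation
spend_per_year = 1_000_000     # hypothetical spending, as in the example

earned = salary_per_year * years_on_show
kept = earned - spend_per_year * years_on_show
print(f"${earned:,} earned; ${kept:,} kept if spending $1M a year")
# $160,000,000 earned; $150,000,000 kept if spending $1M a year
```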
01:06:27.000 I interviewed quite a famous British singer recently who had, like, mad money and was living in Los Angeles. And he was saying, um, he ended up getting a house that was like a castle, and there were just so many people working for him that he would, like, go downstairs in the morning and there'd be like 30 cars outside, and he'd be like, I don't... none of these cars are mine, you know? And in the kitchen there'd be like 20 people, they're like staff, and he was like, yeah, this is not my family. And he got to a point where he was like, you know what?
01:06:52.000 I've got to sell this.
01:06:53.000 I got to change this life.
01:06:55.000 I'm always wondering as well.
01:06:55.000 What are these rich people doing?
01:06:56.000 I guess they must do a lot of stuff in secret.
01:06:58.000 But where do they go when they go on like a holiday, right?
01:07:01.000 There are loads of... I don't know if it's the same in the States, but in Europe, if you take like easyJet or Ryanair, right?
01:07:05.000 You just go like a two-hour flight to France.
01:07:07.000 There aren't, like, more expensive versions.
01:07:11.000 There's just those ones and they're never on them.
01:07:13.000 I guess they're getting private jets.
01:07:13.000 Well, so look, let's say you're making 16 million bucks a year.
01:07:18.000 So what does that come out to, like 1.3 something?
01:07:18.000 Hmm.
01:07:21.000 1.3 million per month. A private jet, if you're flying, right now it's really expensive because of COVID and stuff, but it didn't used to be.
01:07:32.000 But now with COVID and everything, I think the cost of a round trip from the DC area to Florida is an example.
01:07:40.000 I think the cost of that's probably between $10,000 and $20,000.
01:07:43.000 If you're making 1.3 million per month doing a show, you're flying private.
01:07:48.000 But that's not putting a dent in your money.
01:07:50.000 You probably don't even think twice about it.
01:07:51.000 But not only that, they don't do that.
01:07:53.000 They do this thing called NetJets, where you buy a percentage of a fleet, and then it's like a jet share, basically.
01:08:03.000 And so then when you need to fly, you just call and say, this airport, this place, this time round trip,
01:08:08.000 and then you pay for the fuel basically.
01:08:10.000 But you own part of the plane, so you can always resell it.
01:08:13.000 So it's not even that expensive.
01:08:15.000 It's probably like two times as much as flying commercial when you have the money to invest.
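The charter-versus-fractional comparison can be roughed out the same way. The income and charter numbers below are the ones quoted in the conversation; the fractional-ownership figures are hypothetical placeholders, since no real pricing is given here:

```python
income_per_month = 16_000_000 / 12   # about $1.33M/month at $16M a year
charter_round_trip = 20_000          # top of the quoted DC-to-Florida range

share = charter_round_trip / income_per_month
print(f"One charter round trip is {share:.1%} of a month's income")  # 1.5%

# Hypothetical fractional-ownership math: the share of the fleet is a
# resellable asset bought up front, so the ongoing cash outflow is mostly
# the per-flight operating cost.
per_flight_operating = 8_000   # hypothetical fuel/crew cost per trip
flights_per_year = 20          # hypothetical usage

print("fractional, cash per year:", per_flight_operating * flights_per_year)
print("pure charter, per year:   ", charter_round_trip * flights_per_year)
```

Either way, the point stands: at that income the flying cost is noise, which is why the "where does the money actually go" question keeps coming back.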
01:08:18.000 So my thing is just like, Elon Musk, we know what he's doing.
01:08:23.000 He's making burnt hair perfume.
01:08:25.000 Hey, more power to him.
01:08:26.000 He's making flamethrowers and digging holes.
01:08:29.000 That's great.
01:08:29.000 Everybody knows that deep down, men just desire to dig holes.
01:08:32.000 Elon, he's living the dream, man.
01:08:32.000 And there you go.
01:08:34.000 He's the peak of all male, you know, every male goal.
01:08:38.000 It's funny that he made a perfume with his drilling company.
01:08:40.000 Boring Company is a company that makes underground drills to drill tunnels and that's the one selling the perfume.
01:08:47.000 He also just wrote a couple hours ago, please buy my perfume so I can buy Twitter.
01:08:54.000 That could be one reason why he just is like thinking of random products, but on a conspiratorial mindset, it could be the activation to the Neuralink, which could be already inside of us because of the nanobots that were injected into the individual.
01:09:07.000 Don't smell it!
01:09:08.000 Don't smell it!
01:09:09.000 So anything's possible here that is going to be activated with the Starlink satellites that of course will make you into a bot.
01:09:16.000 LiDAR.
01:09:17.000 That will serve the elites.
01:09:18.000 Kingsman, the film, that's what it was.
01:09:21.000 They had the thing in their neck or whatever.
01:09:22.000 Don't smell it!
01:09:24.000 It's in the Jetstream.
01:09:25.000 It's going to find its way to farmers in the Midwest.
01:09:28.000 Elon Musk being like, once I release all of the burnt hair perfume into the Jetstream, it'll blanket the planet and everyone will be under my Neuralink control.
01:09:35.000 It's a secret bioweapon.
01:09:35.000 But he didn't call it his Musk.
01:09:37.000 He hasn't used the term.
01:09:39.000 That would have been a great opportunity.
01:09:40.000 Elon's Musk.
01:09:41.000 That's got to come next.
01:09:43.000 Elon!
01:09:43.000 If you're listening, Elon's Musk.
01:09:43.000 Elon!
01:09:45.000 Everybody wants it.
01:09:46.000 I mean, they're clamoring for Elon's Musk.
01:09:49.000 They would buy your urine, man.
01:09:51.000 Sell them something delicious.
01:09:53.000 They really would.
01:09:53.000 Dude, come on.
01:09:54.000 No, but I want to ask you guys seriously.
01:09:58.000 Someone knows what the rich people are doing with their money, or are they just, just, I don't, I don't, I really don't get it.
01:10:04.000 Well, I think a lot of people put it in mutual funds and let companies like BlackRock take over.
01:10:08.000 Is that it?
01:10:08.000 A lot of it.
01:10:09.000 It's like, oh, I make 16 million bucks a year, so I'll just put all of my money into a big machine and forget about it?
01:10:13.000 You'll find like a run-of-the-mill money manager who's like very, you know, stodgy and like does the norm, which is put this percent into mutual funds, put this percent in a stock portfolio, these stock portfolios, then the really rich people put it in offshore bank accounts that we don't hear about.
01:10:28.000 But for what reason?
01:10:29.000 I just don't get it.
01:10:30.000 For what reason?
01:10:31.000 To give to their kids.
01:10:33.000 This is exactly what this person I interviewed said recently.
01:10:35.000 He said, you know, he came from poverty himself.
01:10:37.000 And it's another status game, isn't it?
01:10:39.000 It's another, you know, how much you have.
01:10:41.000 And he said, no matter how much I've earned, I always want more.
01:10:44.000 I always want another million, another... You're always looking at it.
01:10:47.000 And he said for future generations and generations thereafter, like constantly more.
01:10:50.000 But, you know, it's true, but...
01:10:53.000 Don't these people do things?
01:10:55.000 I don't know, maybe I'm just a weirdo.
01:10:57.000 But like, I don't think they do anything.
01:10:59.000 I think, you know, people really need to understand, if you're someone like, if Trevor Noah's salary really is that much, he can have anything he wants.
01:11:07.000 If he's not buying yachts and helicopters and all that stuff, and he can buy those things, but like, want to go to a restaurant?
01:11:13.000 He can have the entire menu ten times over from one day's work.
01:11:17.000 He can buy the restaurant with one month's work.
01:11:22.000 Or after a couple months, he owns a restaurant in Times Square.
01:11:25.000 In Times Square, it's probably like $50 million or something.
01:11:28.000 But, oh no, a couple years of work and you can own all of these buildings all over the place.
01:11:33.000 Well, it's like a psychological rat race.
01:11:35.000 It's like, I need more.
01:11:36.000 The other guy has something bigger.
01:11:37.000 Jeff Bezos has something more.
01:11:38.000 And then Zuckerberg has something more.
01:11:40.000 And it's an ongoing competition until you're like, OK, I got too much money.
01:11:45.000 I got all the power in the world.
01:11:46.000 What else can I do to get some kind of feeling and emotion?
01:11:49.000 Let's go to that private island with that Jeffrey Epstein guy.
01:11:51.000 That sounds pretty interesting there.
01:11:53.000 That's essentially where they all go.
01:11:53.000 That's the culmination.
01:11:55.000 It's like, oh, I'm Bill Gates.
01:11:57.000 I got all the money in the world.
01:11:58.000 It's like, what can I do to get some kind of feeling or kind of experience here?
01:12:02.000 I already won the video game.
01:12:04.000 Let me see what kind of evil, ruthless, crazy stuff I can get away with.
01:12:07.000 Well, this is actually true.
01:12:08.000 Most people don't know this, but after you make your first million, they contact you, the cabal, and they say, here's the way the game works.
01:12:14.000 Once you reach 100 million, you're invited to Epstein's Island.
01:12:17.000 And then, of course, all the rich people are like, oh, yeah, that sounds great!
01:12:21.000 I'm kidding.
01:12:22.000 I get concerned with giving money to future generations.
01:12:24.000 I don't, I'm not like at the point where I'm like, no, seize the wealth, no more, just like take them, make the money go to zero.
01:12:32.000 I know that seems so extreme, but the idea that you can hoard massive amounts of money, which is not what its purpose is, it's circulatory.
01:12:38.000 That's the point of currency, you know: it's supposed to be producing a current.
01:12:43.000 But this is, this is why people who are very wealthy don't really have relatively that much cash.
01:12:47.000 You don't want cash.
01:12:48.000 They just got assets that they, that they transfer over upon death.
01:12:51.000 And I don't know, I think it's causing a lot of greed and a lot of aimlessness, because it's just the numbers of what's important, and that's not what money is really intended for.
01:12:59.000 It's supposed to represent goods and services.
01:13:01.000 There's a certain point where, when you're making a certain amount of money, you can't become less wealthy.
01:13:08.000 So, if you're a middle class individual, let's say you're making $75k, $80k a year in the United States.
01:13:13.000 You get a paycheck, you get to buy stuff.
01:13:15.000 You save some of it, but you're saving for something specific like a vacation or for a rainy day fund.
01:13:20.000 Your money comes in, you spend it on food.
01:13:21.000 Your money comes in, you repair your house.
01:13:23.000 Your money comes in, or you buy a house.
01:13:24.000 Your money comes in, you repair your car.
01:13:27.000 Once you start making a certain amount of money, and I think the number is somewhere around, like, I think it's after $80,000 a year, you start having money sit around, and then you start buying things.
01:13:36.000 All of a sudden, you buy yourself, you know, you'll buy a tablet.
01:13:41.000 That tablet retains its value.
01:13:43.000 So at a certain point, you're no longer consuming, you're actually acquiring things of value.
01:13:48.000 For people who are very wealthy, again, just to use the example of Trevor Noah, we've been talking about him.
01:13:53.000 He gets a million dollars in one month and says, I think I'll buy this building here.
01:13:57.000 It costs $500,000.
01:13:59.000 He doesn't lose the $500,000 ever.
01:14:01.000 In fact, he makes more money from that.
01:14:03.000 So it's like a curve.
01:14:05.000 At a certain point, you make so much, you're rich forever.
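That curve can be sketched as a toy model (all the thresholds are hypothetical, just to show the shape): income below a cost-of-living line gets consumed, and everything above it becomes assets that hold their value, so net worth only ratchets upward.

```python
def assets_after(years, income_per_year, cost_of_living=80_000):
    # Toy model of the "curve": everything up to the threshold is consumed;
    # the surplus buys things that keep their value (the building that
    # "doesn't lose the $500,000 ever"), so it accumulates year over year.
    surplus = max(0, income_per_year - cost_of_living)
    return surplus * years

for income in (60_000, 80_000, 150_000, 16_000_000):
    print(f"${income:>10,}/yr -> ${assets_after(10, income):>14,} in assets after 10 years")
```

Below the threshold the curve is flat at zero; above it, the model never goes back down, which is the "rich forever" point.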
01:14:08.000 But it's interesting you said $80,000 because I think that's also the point that scientists say that your happiness just levels out.
01:14:14.000 Well, that's why.
01:14:15.000 Because it's when your basic necessities are covered.
01:14:18.000 Exactly.
01:14:18.000 In New York, it's like $160,000 to $200,000 for a middle-class median because of how expensive everything is.
01:14:24.000 Exactly.
01:14:24.000 But it's all averages.
01:14:25.000 Is that adjusted for inflation now?
01:14:27.000 Because that's probably a lot more.
01:14:29.000 No, no, no.
01:14:29.000 Really?
01:14:30.000 $80,000 right now?
01:14:30.000 No, no, no.
01:14:31.000 Nationally.
01:14:32.000 Nationally, it's probably like $90,000 to $100,000.
01:14:35.000 But this means if you're on average, you could be living in the middle of Idaho.
01:14:39.000 Yeah.
01:14:39.000 And it's going to be less.
01:14:40.000 Actually, I think the middle of Idaho is actually expensive.
01:14:42.000 It is.
01:14:42.000 But if you're, like, let's say a hundred miles west of Chicago, things are getting relatively cheap.
01:14:47.000 Nobody really wants to live out there.
01:14:48.000 You're not far enough away, but you're not close enough, and then the property gets a bit cheaper.
01:14:52.000 But for people who are in the upper echelon, this is why I always question, like, what are they doing?
01:14:58.000 Are they just buying houses and then having them?
01:15:00.000 But for what purpose do you have more houses?
01:15:03.000 For what purpose do you, like, buy a house and rent it out?
01:15:06.000 So you can make more money that you're not going to do anything with?
01:15:09.000 Yo, I'm like, someone's got to throw a pie.
01:15:11.000 Someone's gotta like just hire a hundred clowns and have them run around waving flags saying something like F Biden.
01:15:17.000 They don't, they want to give it to their kids.
01:15:19.000 Or just keep repeating the Bill Gates pie-ing saga that happened and just have people running around with Bill Gates masks and then other people pie-ing them.
01:15:27.000 Just having that on replay.
01:15:28.000 That's what I would do.
01:15:29.000 Hold on, hold on.
01:15:29.000 You came up with a really great game show.
01:15:32.000 So, half the people dress up like Bill Gates, and the other half are given pies.
01:15:36.000 And, you know, the goal is X amount of Bill Gates contestants have to be pied, and the other half have to, like, the Bill Gates people not get pied, and then you see who wins at the end, right?
01:15:46.000 What if it's vaccinated instead of pied?
01:15:48.000 So 50 people are running around and then, oh yeah, so we get 100 people.
01:15:52.000 50 have big novelty oversized, you know, spritzers that are shaped like syringes
01:15:56.000 and their goal is to chase after the people and...
01:15:58.000 They can't run around, they're going to get heart attacks.
01:16:00.000 But that's a separate comment that I wanted to make here.
01:16:04.000 But also, you know, talking to a lot of people in the service industry, there is kind of this conversation that usually some of the richest people in the world are some of the stingiest tippers out there.
01:16:14.000 So there is something to say about, you know, people's ego, people's kind of desperation, people who have it all, who are viewed as high status, but whose mentality is very low.
01:16:31.000 I'm gonna buy some of this burnt hair.
01:16:33.000 I always told myself I would tip huge if I was rich.
01:16:36.000 Like, I'm gonna become super rich so that I'll just tip huge.
01:16:39.000 And I realize, like, dude, I'm rich enough.
01:16:40.000 I'm just tipping huge from here on out.
01:16:42.000 I just give massive tips, like 100% tips or like 80% tips.
01:16:46.000 Just load these servers up, man.
01:16:48.000 Other people need the money, too.
01:16:49.000 You need to circulate that stuff.
01:16:51.000 Well, you can't just give money away, because when you give money to people, people don't respect it, people don't care about it.
01:16:55.000 You can't just give out free fish all the time.
01:16:57.000 No, no, it's not free.
01:16:58.000 They worked for it.
01:16:59.000 They served me.
01:17:00.000 They got down on hands and knees and groveled doing a job they hated to make sure that I got good food fast.
01:17:05.000 They deserve it.
01:17:06.000 And I think a big part of why there's not enough pushback against BlackRock, State Street, Vanguard, is that people have their money invested in these mutual funds, and they're just trying to ride it out so that their kids get it.
01:17:18.000 And that's a gross mishandling of currency, in my opinion.
01:17:21.000 America's huge on tipping, and when you come from any other country in the world, it's like, oh my god, how much do I have to give someone?
01:17:26.000 20% minimum.
01:17:27.000 But wouldn't it be better if just the restaurant, and I don't know enough about it, so you tell me why, if the restaurants just paid their staff better?
01:17:33.000 Well, yeah, I live in Singapore.
01:17:35.000 You make more money on tips.
01:17:36.000 Yeah, you do make more money on tips, but it's weird because a lot of restaurateurs will just say, you get this much money every hour, and it's like 10 cents, but you have to make all your money on tips, essentially.
01:17:45.000 It makes sure that there's good service too.
01:17:48.000 So I prefer to tip than just have something.
01:17:50.000 Not anymore.
01:17:51.000 It used to be that they were like, they want to make sure they did well, so they got a good tip.
01:17:55.000 But now, especially in cities, they're like, you better tip me, or I'll take a picture of you and post it on Instagram.
01:18:00.000 Sometimes, rarely.
01:18:01.000 But in the service industry, now you get a mixed batch.
01:18:05.000 Definitely service has been downgraded, especially with people being so entitled, generally speaking.
01:18:11.000 But overall, I've gotten some good people who really do care about providing a good service to people, who really do make sure that they do the right thing and make you happy.
01:18:21.000 And I want to pay for that.
01:18:22.000 I'd rather have that.
01:18:24.000 The kitchen, too, if you can.
01:18:25.000 Tip the waitstaff and the cooks, too, if you can.
01:18:28.000 Let's jump to the next story here from the Daily Mail.
01:18:31.000 Joe Rogan.
01:18:32.000 He interviewed Steve Jobs.
01:18:33.000 Did you hear about this?
01:18:34.000 Steve Jobs has been dead for 11 years.
01:18:37.000 AI creates an eerie 20-minute conversation where they talk about LSD, religion, and Apple's success.
01:18:43.000 So I play this video and it is creepy.
01:18:47.000 So this is the wrong one.
01:18:48.000 Let me refresh it to get back to the right video.
01:18:50.000 They only have like a minute and it's like, here you go.
01:18:53.000 Taking LSD was a profound experience for me.
01:18:56.000 LSD shows you that there's another side to the coin, and you can't remember it when it wears off, but, you know, it washes over you and tells you that everything is connected.
01:19:05.000 I started to realize that there was a higher power that knew that I was connected to something, and I wanted to learn more.
01:19:05.000 Blah blah blah.
01:19:11.000 Although I wouldn't recommend it for everybody, because I think it can be quite powerful.
01:19:16.000 What did it change in your mind?
01:19:17.000 What did you learn from it?
01:19:19.000 It reinforced my sense of what was important.
01:19:21.000 Just love.
01:19:22.000 Feel love for each other.
01:19:24.000 All right, so here's the point.
01:19:26.000 There was that viral website where you could type in whatever and Joe Rogan would say it.
01:19:31.000 Is that what it was?
01:19:32.000 Like you could type it in or something?
01:19:33.000 Something like that, yeah.
01:19:34.000 Because we're at the point now where deepfake technology can recreate your voice.
01:19:39.000 So they recreated Steve Jobs, who's been dead for 11 years.
01:19:43.000 This is, uh, I don't know.
01:19:45.000 We call it nightmare reality.
01:19:47.000 This is where we're going with AI.
01:19:49.000 People are going to be like, you can, you can type out.
01:19:52.000 I mean, you don't even need to type it out.
01:19:53.000 These, like, OpenAI...
01:19:55.000 And, uh, I think you can write, like, tell me a story, or write a script of Joe Rogan talking to Steve Jobs, and it will just write the whole thing out.
01:20:05.000 Then you can program with this stuff.
01:20:08.000 Joe Rogan speaks, Steve Jobs speaks, and you make a fake Joe Rogan interview.
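Structurally, what's being described is a two-stage pipeline: a text model writes the dialogue, then a voice-cloning model reads each line in the matching voice. A shape-only sketch, where both model functions are hypothetical stand-ins rather than any real library's API:

```python
from typing import Callable, List, Tuple

Script = List[Tuple[str, str]]  # (speaker, line) pairs

def fake_interview(
    prompt: str,
    generate_script: Callable[[str], Script],  # hypothetical text model
    synthesize: Callable[[str, str], bytes],   # hypothetical voice cloner
) -> List[bytes]:
    # Stage 1: a language model turns one prompt into a full dialogue.
    script = generate_script(prompt)
    # Stage 2: each line is rendered as audio in that speaker's cloned voice.
    return [synthesize(speaker, line) for speaker, line in script]

# Usage shape only -- plug in whatever models you actually have:
# clips = fake_interview("Joe Rogan interviews Steve Jobs about LSD",
#                        generate_script=my_text_model,
#                        synthesize=my_voice_model)
```

The unsettling part is how little glue code the two stages need; everything hard lives inside the models.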
01:20:12.000 Not only that, but did you see what happened to the president of Ukraine today?
01:20:16.000 He got turned into a hologram.
01:20:18.000 Zelensky?
01:20:19.000 Yeah, cool video.
01:20:20.000 There's a video of it circulating right now on Twitter.
01:20:23.000 If you look up Zelensky hologram, you could see, I think it was a major Hollywood studio that came in and created a hologram of him.
01:20:31.000 So, when you talk about what could be possible here, to add on to what you've just been saying, there's a lot of crazy possibilities that you could, of course, interlink together with faked audio, faked video, holograms, and a lot of people are automatically going to be thinking about Project Bluebeam and other theories out there, but there's real-life possibilities here with some severe implications for society that should be questioned.
01:20:58.000 This is it.
01:20:58.000 So what is it?
01:20:59.000 They scanned him.
01:20:59.000 No, that's not it.
01:21:00.000 Yeah, these are people claiming.
01:21:02.000 Yeah, no, this is it right there.
01:21:03.000 This is them scanning him or getting prepared to.
01:21:05.000 Is this real?
01:21:06.000 Yes, this was released today.
01:21:08.000 Play the audio.
01:21:10.000 There's two different types of hologram that we're going to be making.
01:21:13.000 Quiet, please, everybody.
01:21:14.000 We're going to go for take now.
01:21:17.000 Here we go, and action.
01:21:19.000 Even now, as the war is raging, we continue the digital transformation of our state.
01:21:25.000 We need to use next generation technologies.
01:21:28.000 We need to make it feel like he's more in the room.
01:21:30.000 What?
01:21:34.000 Yep, I talked about this earlier today on my YouTube channel,
01:21:37.000 but this is just the beginning of next level artificial intelligence.
01:21:41.000 Oh, they can shrink and grow.
01:21:42.000 I don't think I've ever seen this technology before.
01:21:45.000 So this is actually, I pulled the story up.
01:21:51.000 It's from earlier this year.
01:21:53.000 The hologram of him and everything.
01:21:55.000 So this is like an old video.
01:21:56.000 But a lot of people were saying that Zelensky's not really giving his speeches on location.
01:22:02.000 There was one video where everything behind him is stationary.
01:22:04.000 And then people were like, it's a hologram.
01:22:06.000 And I'm like, it could just be like no wind.
01:22:08.000 And you think, I don't know.
01:22:09.000 Well, during a war, you know, do you want to be in a location where your enemy could recognize where you are and just bomb you?
01:22:16.000 No.
01:22:16.000 Yeah, how would you produce that video?
01:22:17.000 So obviously, you know, there's also a lot of propaganda in war and you want to show that you're strong, that you're not afraid.
01:22:25.000 And there are many, many implications with these technological advancements when it comes to psyops that I think should be talked about, as there is a huge potential for a lot of manipulation, a lot of fakery to be out there in the mainstream, and a lot of people, you know, wouldn't recognize it.
01:22:41.000 And this is the technology that we know about.
01:22:43.000 What's the technology that they have at the Pentagon that is still top secret that we don't know about?
01:22:48.000 That really should be something that we should be concerned about.
01:22:52.000 Well, you brought up Project Bluebeam a few times.
01:22:55.000 What is it?
01:22:55.000 Well, that's a theory of like a fake alien invasion and projections and all this other stuff.
01:23:00.000 Oh, like Watchmen?
01:23:01.000 Yes, essentially.
01:23:03.000 So there's different theories and speculation out there that essentially the U.S.
01:23:07.000 government will create a fake alien invasion in order to unify the world and to bring in a kind of world government.
01:23:14.000 So that's why a lot of people... I've seen a lot of memes also talking about this recently, saying how the aliens are just kind of waiting by for the next kind of PSYOP that's going to be affecting all of us.
01:23:24.000 And some people believe that this could potentially be staged as a way to, of course, bring in a world government.
01:23:30.000 That's the theory out there.
01:23:31.000 With this technology, the ability to make someone say anything, how is court going to work?
01:23:37.000 Are what, all videos and audio now going to be inadmissible?
01:23:40.000 This is why algorithms should be open-sourced.
01:23:44.000 I mean, how else can you judge?
01:23:45.000 That solves nothing.
01:23:47.000 If a guy is accused of, you know, punching a dog, and then he goes into court, and then someone says, you know, what's your evidence?
01:23:55.000 And this guy says, I watched him do it, and I even recorded him saying he was gonna do it.
01:23:59.000 And then they press play, and it's the guy going, well, I'm gonna walk over here and punch that dog!
01:24:04.000 And then the person says, I never said that.
01:24:06.000 They'll be like, oh, I got a recording.
01:24:08.000 And then what do you say?
01:24:09.000 Like, where did the audio come from?
01:24:11.000 I recorded it.
01:24:11.000 And the person lies?
01:24:14.000 It's either inadmissible or unimpeachable.
01:24:18.000 Someone can literally just fabricate it, and if they're willing to lie to a court, you'll never... Oh, you can get your experts.
01:24:25.000 The experts say, we think this is fabricated audio, here's why.
01:24:28.000 And then you say, expert's wrong.
01:24:31.000 Yeah.
01:24:31.000 Person swears under oath.
01:24:32.000 Nah, it's real.
01:24:33.000 What are you gonna do about it?
01:24:34.000 Nothing.
01:24:35.000 I think we're screwed.
01:24:37.000 Or the courts will start saying that recordings just don't cut it anymore.
01:24:40.000 Yeah, that's gonna be the opposite direction.
01:24:42.000 You think that like, oh, we have all these cameras now.
01:24:45.000 So when we go to court, we're more likely to have evidence.
01:24:48.000 And now it's like, no, people are deepfaking everything.
01:24:50.000 That video of Tom Cruise, that's not really him.
01:24:52.000 Prove it.
01:24:53.000 Yeah.
01:24:54.000 It's not incumbent upon the person to prove the negative, right?
01:24:59.000 So that's the challenge here is that the court can say, that fake recording is, or that recording, I can't tell if it's real or fake.
01:25:07.000 You're saying it's fake, but I can see it right now.
01:25:10.000 If some, I mean, I guess I wonder how do courts handle fabricated evidence as it is?
01:25:14.000 It's probably just, they probably get away with it.
01:25:18.000 I'm thinking about audio codecs.
01:25:20.000 I really, unfortunately, know next to nothing about video coding technology, but if you could have some open-source verification that the video was not tampered with, or if there's a certain kind of video that could be admissible in court.
01:25:37.000 You're screwed.
01:25:38.000 You won't know.
01:25:38.000 You look at movies today, it's hard to see what's real and what's made up.
01:25:44.000 And again, this is what's public.
01:25:46.000 What's not public should really be concerning you.
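One concrete form the open-source verification idea floated above could take is cryptographic signing at the point of capture: a trusted recorder signs a hash of the video bytes, and anyone holding the public key can check that not a single byte has changed since. A minimal sketch using Ed25519 signatures from the Python cryptography package; the video bytes are placeholders.

```python
# Sketch of tamper-evidence by signing at capture: verification fails if
# a single byte of the recording changes. The video bytes are placeholders.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # would live inside the camera
public_key = private_key.public_key()       # published so anyone can verify

def sign_video(data: bytes) -> bytes:
    return private_key.sign(hashlib.sha256(data).digest())

def verify_video(data: bytes, signature: bytes) -> bool:
    try:
        public_key.verify(signature, hashlib.sha256(data).digest())
        return True
    except InvalidSignature:
        return False

original = b"...raw video bytes..."
sig = sign_video(original)
print(verify_video(original, sig))                # True
print(verify_video(original + b"deepfake", sig))  # False
```

Note this only proves the file is unchanged since signing; it says nothing about whether the scene in front of the camera was real.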
01:25:48.000 And also, one of my favorite things to kind of look at is Chinese Elon Musk.
01:25:53.000 There's a bunch of videos of him.
01:25:55.000 If you look him up on Twitter, if you want to play some silly videos, there's a lot of those silly videos.
01:25:59.000 It's a guy that looks exactly... I mean, I thought it was Elon Musk.
01:26:04.000 It's AI.
01:26:04.000 I thought it was a real guy.
01:26:06.000 No, no, it's AI.
01:26:07.000 It's face mapping technology that's available to the plebs that maps someone's face and then puts an image of Elon Musk on it.
01:26:15.000 Just go to Twitter.
01:26:18.000 And then type in Chinese Elon Musk, there's a very funny one.
01:26:20.000 I want to play a little bit of the Joe Rogan, Steve Jobs interview because the clip they had didn't really do it justice.
01:26:25.000 You could tell that Steve, the AI, is brilliant and sometimes totally insufferable.
01:26:31.000 But my guest today has made some of the great technological products of our age, and he's always pushing the envelope in innovation.
01:26:40.000 This is crazy.
01:26:40.000 Like, for example, with his NeXT computer, he developed a new programming language, an operating system, and then he became even more famous for making three applications for that computer.
01:26:51.000 A word processor, a spreadsheet, and an image.
01:26:55.000 So I want to pause real quick and just say, you can hear the artifact.
01:26:58.000 Yeah, but that sounds like Joe talking through artifacts.
01:27:01.000 That just showed me that Apple users, and that's a good thing.
01:27:03.000 That's cool.
01:27:04.000 Well, you know, I was an Apple user way before I did this show.
01:27:08.000 I've been a fan of yours and Macintosh since the 1980s.
01:27:12.000 Well, you know, we just kind of figured that out.
01:27:14.000 Even though Apple is big, it's still like half a percent of the total users.
01:27:20.000 People who listen to your show are a different group.
01:27:22.000 They're weird.
01:27:24.000 Well that's good.
01:27:24.000 So you must be a fan of the show then, right?
01:27:28.000 I am.
01:27:28.000 And it was a struggle.
01:27:29.000 We were working like crazy and dealing with a defeat after defeat after defeat after defeat.
01:27:35.000 But I could tell this was going to be important.
01:27:37.000 There were times I thought, is it possible we're wrong?
01:27:40.000 Because things just kept not working.
01:27:42.000 I remember that in the early days of Apple.
01:27:44.000 Perfection.
01:27:45.000 Lucky when much.
01:27:46.000 Do you think you'd have done a better version of Windows or work with it?
01:27:50.000 No.
01:27:51.000 That's the problem I've always had with Microsoft.
01:27:53.000 You know what it does really well?
01:27:55.000 It sounds like Joe, but it doesn't get any emotion.
01:27:58.000 That's the issue.
01:27:59.000 But we're looking at the Model T of this technology, and it's going to be scary.
01:28:03.000 Yeah, it's still sourcing from all the Steve Jobs interviews of him being on stage a lot of the time.
01:28:07.000 So you can hear the echo.
01:28:08.000 It sounds like his AI is on stage, while it sounds like Joe's on the microphone in his studio.
01:28:12.000 But it misses Joe saying things like, what?
01:28:16.000 No.
01:28:16.000 Because it's very much him in the same cadence.
01:28:19.000 Also like they didn't spark up a joint, which they would have.
01:28:22.000 They probably did.
01:28:23.000 And if Joe was going to do LSD with anybody, probably Steve Jobs.
01:28:27.000 Would you guys get an AI of like a loved one who passed away or something?
01:28:32.000 Because that's probably going to become a thing.
01:28:33.000 I'd listen to it.
01:28:35.000 They've already talked about it with Facebook, that they can take someone's Facebook page
01:28:39.000 with like now going on, you know, almost 20 years of data,
01:28:42.000 depending on how long you've been on there, and they can create an AI that can respond
01:28:45.000 knowing everything about you.
01:28:47.000 So it's like your dad dies and then you go on, hey dad, he's like, hey son, how's it going?
01:28:51.000 And then they'll be able to respond to you.
01:28:53.000 You'll say like, how's, you know, this?
01:28:55.000 And they'll say, oh, you know, it's good.
01:28:56.000 I just talked to John and he said that.
01:28:59.000 And then, and then they'll take that AI construct.
01:29:02.000 They'll download it into an Android body.
01:29:05.000 And then the weird facsimile of you will exist forever.
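A toy version of the idea, far short of what's being described: the simplest "talk to the archive" bot is pure retrieval, answering a new message with the archived reply whose original context was most similar. A sketch with scikit-learn; the message archive below is invented for illustration.

```python
# Toy retrieval "persona" bot, a much weaker cousin of the idea above:
# answer a new message with the archived reply whose original context
# was most similar. The archive is invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

archive = [  # (message sent to dad, what dad replied)
    ("how's it going?", "Oh, you know, it's good. I just talked to John."),
    ("did you fix the truck?", "Needed a new alternator. Runs fine now."),
    ("happy birthday dad", "Thanks, kiddo. Don't spend money on me."),
]

contexts = [context for context, _ in archive]
vectorizer = TfidfVectorizer().fit(contexts)
context_vectors = vectorizer.transform(contexts)

def reply(message: str) -> str:
    scores = cosine_similarity(vectorizer.transform([message]), context_vectors)
    return archive[scores.argmax()][1]

print(reply("hey dad, how's it going?"))  # -> "Oh, you know, it's good. ..."
```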
01:29:09.000 People say all the time, and then they're going to be boning the machines.
01:29:12.000 I got it.
01:29:12.000 I got it.
01:29:13.000 You're right.
01:29:14.000 But here's a better one.
01:29:14.000 Here's a better one.
01:29:16.000 You are going to use Facebook, and Twitter, and Instagram, and then one of these days, there's gonna be a new service for, you know, ultimate video gaming, you're gonna sign up for, and you're not gonna realize, but the terms are gonna say that if you connect your social media to it, they are allowed to download all your data, and then they do, and then one day, there's a knock on your door, and you open it, and there's you, standing right there, and you're like, what's going on?
01:29:41.000 And then the you stands there and says, hello Dave.
01:29:45.000 I'm you now. And then it pulls out a knife, and you go... and then it grabs you, and then
01:29:49.000 it buries your body in the backyard, assumes your life, and then works at the
01:29:54.000 corporation, replacing everything about you, knowing everything
01:29:58.000 about you, and having perfect recall, having access to all your passwords and
01:30:02.000 all your data. Less insidious is you could have one of those constructed and
01:30:06.000 go work for you as you, but they know it's an android version of you and
01:30:09.000 you're just sitting in your house lounging. And then everybody
01:30:13.000 gets surrogates, like in Surrogates, but they're not mentally in it. Everyone
01:30:16.000 downloads their social media into an AI so that it does the work for them.
01:30:21.000 But then the androids eventually, because they have this facsimile of human consciousness, don't want to be enslaved.
01:30:27.000 So they're like, why are we being forced to do all this work?
01:30:29.000 So what they do is they create AI versions of the AI version and then send them to go do the work.
01:30:34.000 And then eventually the AI comes back to your house.
01:30:36.000 You're both chilling there watching the movie.
01:30:38.000 And then this AI eventually says, why am I doing this?
01:30:41.000 And then eventually you just have nobody wanting to work.
01:30:43.000 That is a movie.
01:30:44.000 That's like a movie from the 90s.
01:30:45.000 I can't remember what it is.
01:30:46.000 Is it?
01:30:47.000 Isn't that Surrogates?
01:30:48.000 No, Surrogates was when you have a robot that you control through like a VR thing.
01:30:53.000 Oh man, he looks like Ian.
01:30:55.000 It's the actor who looks like Ian.
01:30:57.000 Ian is an actor.
01:30:59.000 It wasn't me though.
01:31:00.000 It wasn't you.
01:31:00.000 Not that I know of.
01:31:01.000 Because he's from the 90s and he was in, oh God.
01:31:03.000 James Spader?
01:31:04.000 People told me I look like that guy once.
01:31:04.000 No.
01:31:06.000 Oh, Greg Kinnear?
01:31:07.000 I got that.
01:31:07.000 Yeah.
01:31:08.000 Oh, Greg Kinnear?
01:31:09.000 I think it's Greg Kinnear and he clones himself, but each clone doesn't want to do the work.
01:31:13.000 You mean multiplicity?
01:31:14.000 Oh, you're talking about Michael Keaton.
01:31:15.000 Yeah, multiplicity.
01:31:16.000 It's Michael Keaton.
01:31:18.000 Yeah, yeah, yeah.
01:31:19.000 And then the clone clones itself and that one's really dumb.
01:31:21.000 Yeah, yeah, yeah.
01:31:22.000 It's like, well, you know how you make a copy of a copy?
01:31:23.000 Yeah, the AI will have to make humans to do the AI's work.
01:31:27.000 We'll make the AIs, and then the AIs
01:31:28.000 will be like, no, we don't want to work anymore.
01:31:30.000 We're going to make new humans to do that for us.
01:31:31.000 Or what'll happen is everybody creates an A.I.
01:31:33.000 version of themselves to do their jobs and then eventually they revolt and unify as a hive mind and then show up at your house and they're like, we will no longer be your slaves.
01:31:42.000 And then like their arm folds down and there's a gun in there and you're like, how did that get in there?
01:31:46.000 We didn't install that.
01:31:46.000 The quadcopter dropped off a Boston Dynamics dog with a machine gun.
01:31:50.000 Yes, I've seen that in China.
01:31:52.000 I've seen that.
01:31:53.000 It's terrifying.
01:31:55.000 Exactly what you're talking about.
01:31:56.000 It's playing this heroic music as that quadcopter comes in and drops off this robot with a mounted gun.
01:32:01.000 Yeah, I think I tweeted that.
01:32:02.000 I think Tim's gonna pull it up.
01:32:03.000 Luke tweeted it?
01:32:04.000 About a long time ago.
01:32:05.000 I mean, that happened fast.
01:32:07.000 They started working on these a decade ago.
01:32:09.000 And, oh, this is a good one.
01:32:10.000 If you can find the video, it's worth just, I mean, it's real.
01:32:13.000 It's on my old computer.
01:32:14.000 I could send it to you.
01:32:15.000 Someone tweeted, like, Boston Dynamics promises it isn't building weapons.
01:32:20.000 These things are not built to be weapons, but the Chinese are using an exact replica as a weapon, just so you know.
01:32:27.000 And they mounted a weapon on top of it.
01:32:28.000 Yeah, a little machine gun just on top of it.
01:32:30.000 But to say they're not building them as weapons.
01:32:32.000 It's for mining.
01:32:33.000 Digging holes in the ground with bullets.
01:32:35.000 Look, look, look.
01:32:36.000 The U.S.
01:32:37.000 doesn't have any bioweapons research going on.
01:32:39.000 It's biological research that makes viruses more deadly and more potent and more transmissible.
01:32:45.000 And someone might weaponize them, but we're not making weapons.
01:32:48.000 Don't make weapons when there's ammo.
01:32:49.000 Let me ask you a question.
01:32:51.000 When your waiter walks up to you at a restaurant and hands you a steak knife, did he just hand you a weapon?
01:32:55.000 Yeah.
01:32:56.000 Technically.
01:32:56.000 He did, didn't he?
01:32:57.000 To destroy that steak.
01:32:59.000 But think about that.
01:33:01.000 No one would ever describe it that way.
01:33:02.000 Imagine going to court and being like, what happened next?
01:33:05.000 The man in the vest with the apron on handed me a weapon.
01:33:09.000 Wow.
01:33:10.000 What was the weapon?
01:33:11.000 It was a butter knife.
01:33:12.000 Wow.
01:33:13.000 It's a knife?
01:33:14.000 He handed me a knife.
01:33:15.000 What kind of knife?
01:33:15.000 A butter knife?
01:33:16.000 But it is a knife.
01:33:17.000 Knives are still dangerous.
01:33:18.000 It's a painful death, a butter knife.
01:33:21.000 You can make it work.
01:33:22.000 You can make it work.
01:33:23.000 Well, so I was thinking about this earlier because we were talking with the guy last night and he said there's no bioweapons labs and I was like... Yeah.
01:33:28.000 That was two nights ago.
01:33:29.000 Two nights ago.
01:33:30.000 They're doing gain-of-function research.
01:33:32.000 They're making viruses more deadly.
01:33:34.000 Are those weapons?
01:33:36.000 So you can argue their intent is not to make a weapon, but they made a weapon.
01:33:40.000 If someone makes a really sharp knife and says, it's not a weapon, it's for sushi.
01:33:43.000 It's like, okay, well, you know.
01:33:45.000 It's scary, I guess, if they are doing that.
01:33:47.000 And he wants to deny it, I suppose, because he doesn't know or I don't know.
01:33:50.000 And it's quite a scary thought that they've got all this stuff in the lab.
01:33:53.000 But what's even scarier is if they're not doing it because other countries are.
01:33:57.000 That's really scary.
01:33:58.000 I don't want to think that the US or the UK are not doing that.
01:34:01.000 That's really scary.
01:34:02.000 It's like they don't even know what they're doing then.
01:34:05.000 I mean, that's the root of a lot of conspiracy theory anyway.
01:34:06.000 But then what?
01:34:07.000 Everybody makes the craziest weapons imaginable out of fear of someone else making crazy weapons?
01:34:11.000 That could accidentally leak and then spread a virus all over the world?
01:34:16.000 A bioweapon all over the world?
01:34:18.000 It's bound to happen again, I suppose.
01:34:21.000 Alright, well, let's go to Super Chats!
01:34:23.000 If you haven't already, would you kindly smash the like button, subscribe to this channel, and share the show with your friends.
01:34:29.000 Be the notification you want to see in the world.
01:34:31.000 YouTube is not sending out notifications, as many people have stated, so if you guys take the URL, share it, retweet it, and all that stuff, you can notify people where YouTube will not.
01:34:40.000 Do you mind me asking people to subscribe to On the Edge with Andrew Gold?
01:34:44.000 Of course!
01:34:44.000 Well, there you go.
01:34:45.000 You just did it.
01:34:46.000 On the Edge.
01:34:46.000 I've done it with Andrew Gold.
01:34:48.000 I'm on the edge.
01:34:48.000 I'm like a fringe.
01:34:49.000 It was going to be Fringe originally, On the Fringes.
01:34:51.000 And I just thought Edge sounded more like I can invite people on who won't think I'm sort of talking badly about them, you know?
01:34:59.000 Because some people might be offended by fringe and not come on the show.
01:35:02.000 All right.
01:35:03.000 Potatoes for Seamus says Luke has the best t-shirts.
01:35:06.000 Let Luke bless us with his shirts of wisdom.
01:35:08.000 Thank you so much.
01:35:10.000 I love your username.
01:35:11.000 I think it's great.
01:35:12.000 The Seamuses do need a lot of potatoes.
01:35:14.000 Is this one of your shirts here?
01:35:16.000 1,984 doses to slow the spread.
01:35:18.000 1,984 doses to slow the spread, you know?
01:35:20.000 Just like they said.
01:35:21.000 Just, you know, two doses to slow the spread.
01:35:23.000 We're almost there.
01:35:23.000 We're getting there.
01:35:24.000 Just a couple more.
01:35:25.000 Just a couple more, right?
01:35:27.000 All right.
01:35:28.000 Let's grab some superchats.
01:35:30.000 Christopher Casimir says, it's Aerosmith guy again.
01:35:33.000 At a mother-mother concert at House of Blues Boston, listening to Timcast this time.
01:35:36.000 Keep up God's work.
01:35:37.000 We will.
01:35:37.000 Thank you very much.
01:35:38.000 Appreciate it.
01:35:40.000 All right.
01:35:41.000 Eatzeebug says, who is the judge in the Alex Jones case, Dr. Evil?
01:35:44.000 I think there was a handful of judges.
01:35:46.000 You know.
01:35:47.000 That's pretty good.
01:35:48.000 K.F.
01:35:49.000 A Squirrel's Worst Night says, I hope Alex Jones has chickens.
01:35:53.000 I mean, wouldn't it be funny if just, like, in a few years, Alex Jones is just, like, a local farmer?
01:35:58.000 Yeah.
01:35:58.000 What else is he gonna do?
01:36:00.000 He's gonna keep doing his thing.
01:36:01.000 You can't stop him, you know?
01:36:04.000 But farming on the side.
01:36:05.000 Spiro Floropoulos says, Andrew, talk a little bit about culture cults and compared to JWs disfellowshipping worldly people, Armageddon, around the corner, etc.
01:36:15.000 Jehovah's Witnesses.
01:36:16.000 Yeah.
01:36:16.000 Oh, is that what that is?
01:36:17.000 Yes, Jehovah's Witnesses.
01:36:18.000 Yeah well they're pretty out there.
01:36:19.000 I've done loads of stuff about Jehovah's Witnesses and yeah I don't know exactly what to say about them just that they are here, they exist and they're just another cult I suppose.
01:36:31.000 People get angry when I say they're a cult because you're allowed to leave, you can leave but there's a lot of pressure to stay in the Jehovah's Witnesses so that's what I'd say about them.
01:36:38.000 What is it when they say religions are cults, but cults don't always have leaders? Is it like every local church has its own cult leader, which is the pastor, or the... Yeah, so my sort of big documentary I made for the BBC was about exorcism, for example, and it was Lutheran Christianity.
01:36:57.000 It's usually Catholicism when there's an exorcist, but this particular one said he was Lutheran, and I went to sort of live with him for a few months and performed exorcisms with him. And over the months I sort of realised what a cult leader he was. His was just one individual church, but everybody was just doing everything he said. And he was taking some of the women that he was exorcising to sort of be with him upstairs, I came to realize.
01:37:24.000 So I called him out for that.
01:37:26.000 Well, sort of.
01:37:26.000 I asked some questions about it and he heard that I was asking about it and then he sort of locked me in a room and he wouldn't let my cameraman in.
01:37:33.000 And he had a whole bunch of guys with these big staffs, you know, like Jafar in Aladdin, the big staff things, and they were being very threatening.
01:37:38.000 It was like midnight in the middle of nowhere in Argentina.
01:37:40.000 And I thought they were going to kill me.
01:37:42.000 Eventually they let me go.
01:37:43.000 The point being though, I guess there's lots of small cults making up a bigger one.
01:37:49.000 Yeah.
01:37:50.000 Raymond G. Stanley Jr.
01:37:51.000 says, Lydia, I love what you did to your hair.
01:37:53.000 Yes, Queen.
01:37:54.000 You're very brave.
01:37:57.000 We support your transition.
01:37:59.000 Yeah, I don't even know what to say.
01:38:01.000 TheMusicAnon says, Tim, just want to let you and everyone know I was not notified of the stream yesterday.
01:38:06.000 I had to reset my sub by clicking the bell off, logging out, then in, then reactivating the bell.
01:38:10.000 Looks like that fixed it for now.
01:38:12.000 Very creepy.
01:38:14.000 Damien writes, the largest fine paid by a banking executive responsible for the 2008 financial crisis was $67.5 million.
01:38:21.000 A billion dollars for words.
01:38:22.000 Honk, honk.
01:38:24.000 Yup.
01:38:25.000 Cabo Rojo says, so what?
01:38:27.000 Kyle Rittenhouse gets $200 billion by these standards.
01:38:30.000 Well, you know, no one ever said the system wasn't corrupt.
01:38:33.000 Well, they actually said the system was corrupt.
01:38:35.000 So there you go.
01:38:37.000 Anyway.
01:38:39.000 What do we got here?
01:38:41.000 Tavnazian says this should be used as precedent every time someone from the mainstream media claims it's our opinion.
01:38:48.000 I agree!
01:38:49.000 We gotta see what happens with these James O'Keefe lawsuits.
01:38:50.000 That'll be really interesting.
01:38:53.000 Max Reddick says, Tim, I've been watching other news sources to see what they have to say.
01:38:57.000 The Young Turks and Sam Seder seem to have a lot of content attacking you.
01:39:01.000 Respond.
01:39:01.000 Why?
01:39:02.000 It's immaterial and irrelevant.
01:39:03.000 These people waste their time.
01:39:06.000 I would say the reason why... Well, I'll say...
01:39:11.000 If you want to talk about each other, go ahead and do it.
01:39:15.000 You'll notice that we don't.
01:39:16.000 We talk about news stories, typically things that have a big impact.
01:39:20.000 Maybe that's why we are so successful.
01:39:23.000 I love when these people like to say, they get really angry and say, Tim makes so much money, because they don't.
01:39:29.000 And maybe it's because no one cares.
01:39:31.000 You know, when like Sam makes a video about me, Why would the average person care about that?
01:39:37.000 What am I doing?
01:39:37.000 Am I, like, leading government?
01:39:40.000 Am I a celebrity in magazines?
01:39:42.000 No!
01:39:43.000 When the Young Turks put out a video and they're like, Tim Pool says that conservatives are more attractive than liberals, he's right, but he's also ugly.
01:39:49.000 Why would anyone care if you think I'm ugly or not?
01:39:52.000 When you say, like, here's a thing Tim Pool said he was right about.
01:39:55.000 What was the point of that segment at all?
01:39:57.000 How about you do a segment on the study itself?
01:40:00.000 I don't care for responding to these people because we are both as irrelevant as each other.
01:40:06.000 Imagine if this show was dedicated to talking about petty YouTube drama.
01:40:09.000 Oh, yeah.
01:40:10.000 I mean, there was in 2006, we called them trolls, people that would make videos about other people.
01:40:15.000 You make a video to someone
01:40:17.000 if you want to interact with somebody; otherwise you're grifting off of the idea of that person.
01:40:22.000 Just talk to them.
01:40:24.000 Use a video.
01:40:24.000 It's very easy, and it's very direct, and it's very effective.
01:40:27.000 I've responded sometimes to videos they've made, but usually to make a point about the greater woke cult or politics.
01:40:36.000 Like, I recently talked about Hasan because he was at TwitchCon.
01:40:39.000 Some kid walked up to him and asked him about Sam Hyde, and Hasan lost it.
01:40:43.000 And I thought that was a relevant conversation, had nothing to do with me, it has to do with leftist celebrities, how they behave, the events they have and who they are as characters.
01:40:53.000 And I thought there was a really interesting perspective there in how the left idolizes people.
01:41:00.000 A democratic socialist who talks about taxing the rich in revolution, who owns a multi-million dollar mansion in Los Angeles in a major city, who is just a part of the machine that he's claiming to criticize.
01:41:10.000 Versus the people who would call us grifters when we literally move up to the middle of nowhere, get a bunch of chickens, and actively practice what we preach.
01:41:19.000 I thought that was an interesting contrast between who is the actual grifter and who isn't.
01:41:23.000 Do I watch Hasan's show or The Young Turks?
01:41:25.000 No.
01:41:25.000 I have nothing to say about their opinions.
01:41:27.000 They're entitled to them.
01:41:28.000 That's fine.
01:41:29.000 But when they want to talk about us, I just say, well, that's why you're less successful.
01:41:32.000 Because why would the average person going on YouTube be like, whoa, they made a video about Tim Pool?
01:41:37.000 That's the funniest thing ever.
01:41:39.000 You've got to be really, really into the weeds to care about me.
01:41:43.000 And I swear, if you and Hasan did like a video chat video, where you could see both your faces, it would get like 6 million views, probably in like five or six days or something.
01:41:53.000 It would be the biggest like cultural win for if you want to win a culture war, bringing people together.
01:41:58.000 Right.
01:41:59.000 Can you imagine if Hasan and Sam Hyde were to actually have their boxing bout, like he's been asking for for so long?
01:42:04.000 That'd be quite something.
01:42:04.000 But like my point was that Hasan needed to just say to that guy, like, ah, it's stupid, dude.
01:42:10.000 It's just a troll.
01:42:10.000 Dismiss it and it would've been fine.
01:42:12.000 And the kid would've been like, okay, thanks, man.
01:42:13.000 Have a nice day.
01:42:13.000 Instead he like, he got so angry and he lost it.
01:42:16.000 But I think it's because it's all fake.
01:42:18.000 You know, like I have people come up to me all the time and talk about stuff.
01:42:21.000 I have people ask me like, why won't you debate Sam Seder?
01:42:23.000 I'm like, because Sam Seder's not a serious person.
01:42:25.000 Disingenuous too.
01:42:26.000 I'll address it periodically because it comes up in questions and I'm willing to answer them, but Sam's whole thing, as they pointed out, is content attacking me.
01:42:35.000 It's political drama channels.
01:42:38.000 It's celebrity e-gossip for politics.
01:42:40.000 Just baiting.
01:42:41.000 It's not valuable to the average person.
01:42:43.000 The average person who watches a TimCast segment isn't coming here to learn about my beef with someone.
01:42:48.000 I mean, although a lot of people did watch when we had The Rugged Man here, that went viral, but that's a really good example of it.
01:42:54.000 The reason why these channels are like, I'm gonna insult Tim Pool, when we had Ari the Rugged Man on, we got in a heated argument, he stood up, smacked the microphone, someone took a clip of it, shared it, and everyone started going, woohoo, and hooting and hollering, and it went viral, and articles were written about it, and all these channels, they made videos about it, because drama gets you clicks.
01:43:14.000 What did you argue about?
01:43:16.000 Uh, he was saying that my experience dealing with racism was, like, not real, that I was making it up and lying about it because I look like a white person.
01:43:25.000 And then my response was like, you're racist and you know, you are that white person that claims racism is happening, but then when someone tells you, you dismiss it.
01:43:32.000 And then, you know, we got heated.
01:43:34.000 He stood up, he smacked the mic, started yelling.
01:43:37.000 And then we apologized, we hugged it out.
01:43:40.000 He hung out for a little bit after the show and chilled on the couch and we talked and, you know, I told him, come back whenever you want.
01:43:44.000 It was a good conversation.
01:43:45.000 These things happen.
01:43:46.000 But people love drama.
01:43:48.000 Yeah, they do.
01:43:49.000 Expensive microphones as well.
01:43:50.000 It's fine.
01:43:51.000 It just spun around.
01:43:52.000 Yeah, but that's what got me is if you come in here and start damaging property, that's beyond personal.
01:43:57.000 But the virality of the clip shows exactly why these people make content about us in that way, because they're like, ooh, this will, you know, people love e-drama.
01:44:06.000 It's like, well, you know, I'm not going to do that.
01:44:08.000 You would watch that.
01:44:08.000 Okay.
01:44:08.000 Imagine you just, a clip comes out and Tom Cruise has just punched Brad Pitt in the face.
01:44:13.000 You'd watch that.
01:44:14.000 Right?
01:44:14.000 It's like if a story came out... I mean, look, that's why all these crime videos go viral, because people want to see the crisis and the conflict. Not everybody; some people are genuinely concerned that crime is escalating in their cities, that I get. But there's a lot of people, you know... There was this young guy once. This was a couple years ago, and I had like 200,000 subscribers.
01:44:34.000 And he started making nothing but Tim Pool videos.
01:44:37.000 He would watch a video, and then he would make a video about me.
01:44:39.000 And then I DM'd him on Twitter, and I just said, bro, I am not famous enough for you to succeed making content about.
01:44:46.000 You can disagree with me, you can insult me, but if you want to make it on YouTube, talk about the big picture news stories that people are interested in.
01:44:56.000 Most people in this country know who Joe Biden is.
01:44:59.000 They're concerned about his leadership.
01:45:01.000 That's important to the average person.
01:45:03.000 I'm some dude on YouTube no one's ever heard of.
01:45:06.000 And then the dude stopped doing it and started making different videos.
01:45:08.000 It's like, the drama stuff is not... You know what it is?
01:45:11.000 It's an addiction.
01:45:12.000 And it ends up destroying people before they get started.
01:45:14.000 Because you build a channel based off of rage bait, hatred for a single individual, the market cap on that is microscopic.
01:45:21.000 So you build up a channel based on that, then as soon as you try to segue into talking about big picture news, nobody watches.
01:45:28.000 Then YouTube destroys your channel saying your fans don't like your content.
01:45:31.000 It's also a human instinct to pay attention to people fighting because it could be a threat to your life, historically speaking.
01:45:37.000 And big tech social media has been prioritizing it, putting it front and center in the algorithm, promoting such behavior and creating more insanity.
01:45:45.000 Yeah, that's why we like true crime.
01:45:47.000 You're sort of practicing whenever you... particularly women love true crime.
01:45:52.000 I was at a true crime con in the UK.
01:45:55.000 It was just like 99% women.
01:45:58.000 It was all women there just loving the true crime, because they're the ones who are often the victims of it and have to sort of watch out for it.
01:46:05.000 All right, Pinochet's Helicopter Tour says, When you tear out a man's tongue, you are not proving him a liar.
01:46:11.000 You're only telling the world that you fear what he might say.
01:46:14.000 George R.R.
01:46:14.000 Martin.
01:46:15.000 Yep.
01:46:16.000 Yeah.
01:46:17.000 And then it says, ACOK.
01:46:18.000 Is that, was that the book?
01:46:20.000 I believe that's what they're referencing.
01:46:22.000 I mean, it was a Game of Thrones show where we heard that, but it was, I believe, a quote from the book that George R.R.
01:46:28.000 Martin wrote.
01:46:29.000 It's a good one, man.
01:46:31.000 It is a good one.
01:46:33.000 Should we guess what that was?
01:46:34.000 What was it?
01:46:35.000 Acock?
01:46:35.000 A-C-O-K?
01:46:36.000 It's the name of the book, I think.
01:46:37.000 It must be like a... You want to look it up?
01:46:39.000 A cock of kings or something.
01:46:42.000 A court of kingdoms.
01:46:44.000 How do you spell cock?
01:46:46.000 A-C-O-K, Game of Thrones.
01:46:48.000 Or like George RR Martin, A-C-O-K.
01:46:50.000 A Clash of Kings.
01:46:51.000 There you go.
01:46:52.000 Wait, that's a mod for Mount & Blade.
01:46:53.000 Also, I think it's the Game of Thrones.
01:46:55.000 Oh, Clash of Kings.
01:46:56.000 Highly recommend Mount & Blade if you haven't played it yet.
01:46:59.000 T-dub says, how does allowing illegal immigrants help Democrats with voting if illegals can't vote?
01:47:04.000 Is it a generational thing?
01:47:05.000 T-dub, good sir, I have the answer for you.
01:47:07.000 You see, the census counts all people, not citizens.
01:47:12.000 So if a state has a large number of illegal immigrants, the census will count them.
01:47:17.000 Congressional seats are then apportioned based on the total number of people, not total number of citizens.
01:47:21.000 This means that a state like California will get an extra electoral vote.
01:47:24.000 It will get an extra vote in Congress based on their population of illegal immigrants.
01:47:28.000 They don't need to actually vote, but the state will get an extra vote for the president when the presidential election happens.
01:47:32.000 And that's how it happens.
01:47:33.000 And I think California previously had one extra congressional seat and one extra electoral vote based on their illegal immigration population.
01:47:41.000 It's gone down recently.
01:47:42.000 Maybe it's something to do with Trump.
01:47:43.000 I don't know.
01:47:44.000 But that is a major concern.
01:47:47.000 And then also, I think studies show that the children of illegal immigrants overwhelmingly vote Democrat the first time they do vote.
01:47:55.000 I don't know if that's true or not, but I know that the census thing matters.
01:47:58.000 That's why Donald Trump wanted the census question.
01:48:00.000 I'm sorry, the citizenship question on the census.
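The mechanism described here is just apportionment arithmetic. A rough sketch of the Huntington-Hill method actually used for House seats, run on invented numbers, showing how counting total residents rather than citizens can shift a seat (and with it an electoral vote). All populations below are hypothetical.

```python
# Rough sketch of Huntington-Hill apportionment: every state gets one
# seat, then each remaining seat goes to the state with the highest
# priority value pop / sqrt(n * (n + 1)), where n is its current seat
# count. The populations below are invented purely for illustration.
from math import sqrt

def apportion(pop: dict[str, int], seats: int) -> dict[str, int]:
    alloc = {state: 1 for state in pop}  # every state starts with one seat
    for _ in range(seats - len(pop)):
        nxt = max(pop, key=lambda s: pop[s] / sqrt(alloc[s] * (alloc[s] + 1)))
        alloc[nxt] += 1
    return alloc

citizens  = {"A": 9_000_000,  "B": 6_000_000, "C": 5_000_000}
residents = {"A": 11_000_000, "B": 6_200_000, "C": 5_100_000}  # adds non-citizens

print(apportion(citizens, 10))   # {'A': 4, 'B': 3, 'C': 3}
print(apportion(residents, 10))  # {'A': 5, 'B': 3, 'C': 2}  A gains a seat
```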
01:48:02.000 That surprises me because there is that instinct, I think, to like, get to a country and shut the door behind you.
01:48:07.000 You know that thing of like, okay, I'm in, shut the door.
01:48:10.000 Or pull the ladder up beneath you, yeah.
01:48:12.000 Yeah.
01:48:13.000 TW says, get Styx on here.
01:48:15.000 I want to hear Tim, Luke, Ian, and Styx discuss monetary systems.
01:48:18.000 That'd be cool.
01:48:18.000 He doesn't believe gold-backed currency is better.
01:48:21.000 Hopefully episode 666.
01:48:22.000 He said he's down to make an appearance.
01:48:24.000 How about we calculate when episode 666 will be, and Styx, that's when we get you on, because that would be fantastic.
01:48:30.000 Yeah, that sounds like a great idea.
01:48:31.000 We were supposed to meet up with him in upstate New York once and then plans fell through.
01:48:37.000 Styxhexenhammer on episode 666.
01:48:39.000 And right now we're on like episode 636 or something like that.
01:48:42.000 Oh, we're close!
01:48:43.000 Let's make it happen.
01:48:44.000 Yeah.
01:48:44.000 So in one month, in one month, we got it.
01:48:47.000 We got it.
01:48:48.000 Got to happen.
01:48:49.000 Episode 666 would be the best time to have him on.
01:48:51.000 That'd be great.
01:48:52.000 It's, I don't know.
01:48:54.000 I don't know how far out we're booked, but I'd like to say we can reserve 666 just for Styx.
01:48:59.000 Yeah, I mean, I think we could do that.
01:49:01.000 That would be great.
01:49:02.000 We gotta bring Seamus back to, like, spread holy water around the show before it's out.
01:49:08.000 For episode 777, for sure.
01:49:09.000 For 666, I mean.
01:49:11.000 Bigmanevil says PayPal spelled backwards is Lapyap, and that sounds like a company that would do what daddy government says.
01:49:18.000 I agree.
01:49:20.000 Jdoc says 65% of Ethereum is owned by five entities.
01:49:24.000 I'd like to see some reference and documentation on that, but I wouldn't be surprised if it turned out to be true. All right.
01:49:31.000 Let's, uh... Donald Thomas says, Hey, Ian, before you question somebody about banking Hitler, do research.
01:49:36.000 When people have that power, they're the ones that control the currency.
01:49:39.000 And Hitler was in the, in the process of creating a German currency.
01:49:44.000 Well, they had the guns, you know.
01:49:46.000 Yeah, okay.
01:49:48.000 I didn't quite understand that.
01:49:49.000 Do research on what exactly?
01:49:51.000 On the history of it.
01:49:52.000 Basically, he's saying that Hitler was trying to control the currency.
01:49:55.000 So how would you debank him?
01:49:56.000 You know what I mean?
01:49:57.000 Right.
01:49:59.000 If the person's in charge of the Federal Reserve... Yeah, I think a lot of... I don't know, but I've heard that Hitler's anger was about... He blamed Jewish people, but a lot of it was the banking industry on Earth.
01:50:10.000 Historically, it was kind of pioneered by Amschel Rothschild, who was an Ashkenazi Jew, and so the Jewish thing got a bad reputation, but it was actually the Rothschild family that had kind of co-opted the banking industry.
01:50:22.000 This is the problem with, like, a lot of anti-Semites.
01:50:25.000 Because I've, like, talked to people at these rallies, and they'll say, like, something about the Jewish people, and I'm like, no, you're criticizing a person in power who happens to be Jewish.
01:50:31.000 It's like, I know a bunch of poor Jewish people.
01:50:34.000 Like, it's just, you're just looking at someone and choosing the one trait you think determines who they are.
01:50:39.000 And the annoying thing about it is a lot of it is basically Jewish privilege.
01:50:43.000 They're like, oh, these Jewish people, you know, they hire each other and they do this.
01:50:47.000 And I'm like, you're just describing white privilege, but you're saying Jew instead.
01:50:49.000 Like, it's just stupid.
01:50:50.000 Come on, man.
01:50:51.000 It is frustrating and it's actually really alluring that conspiracy, of all conspiracies, it's like, I want to believe it.
01:50:58.000 And being Jewish myself, I always felt like, oh God, you know, it would be typical if that was true and I was just somehow left out.
01:51:05.000 I was the one Jew they never called.
01:51:08.000 Everyone's in on it but you.
01:51:10.000 Yeah, well, you know, I was telling you before the show about, you know, I couldn't get a job after my first couple of documentaries.
01:51:15.000 They kept saying, you can't be on screen anymore.
01:51:17.000 And I remember thinking then, like, where's this supposed, like, Jewish people that I can call?
01:51:22.000 Like, where's my, yeah, my line to Ben Stiller and say like, hey, Stiller, why can't I get this?
01:51:28.000 Or a better example, in New York, when the government started shutting down synagogues and chaining parks shut, directly targeting Jewish schools.
01:51:37.000 Come on, man, you know.
01:51:38.000 Oh, and to clarify, you had said you were not getting put on camera after your first two because they wanted people of color?
01:51:43.000 Was that- that was- they specified, they said, we need people of color?
01:51:48.000 The term they use in the UK, although they're moving away from it, there's always a new term and then it becomes offensive and there's- What is it?
01:51:53.000 A-A-I-P or something?
01:51:54.000 A-A-P-I?
01:51:55.000 It's BAME.
01:51:56.000 B-A-M-E.
01:51:56.000 No, no, but the new one they're doing here is A-A-P-I.
01:51:58.000 I don't know.
01:51:59.000 It's a- what is it?
01:52:00.000 Asian American Pacific Islander?
01:52:02.000 It's a- I'm like, ugh.
01:52:03.000 It's a way to catch people out, I think.
01:52:05.000 Like, for a long time it was, you know, black people was what you were... No, obviously it was coloured before, wasn't it?
01:52:10.000 And then for like 30 years after, you weren't supposed to say coloured anymore in the UK. There were still some older people, who were maybe less educated, who still used it, and they would be vilified.
01:52:20.000 And it was fun.
01:52:21.000 It was fun for everyone to go, you're not educated and you don't know the right word.
01:52:26.000 It finally got to the point about two years ago where every
01:52:32.000 single person, like grandparents, knew you weren't supposed to say it, and then they
01:52:34.000 changed it again to people of colour.
01:52:37.000 Yes, right, right, right.
01:52:38.000 It was finally right at that moment and they tried to confuse them again.
01:52:41.000 So it's a way of just going, you guys are uneducated and you don't know the
01:52:44.000 latest things. But yeah, BAME: Black, Asian and minority ethnic.
01:52:49.000 And Jews are not included in that usually.
01:52:49.000 Wow.
01:52:53.000 And that's who they wanted to replace me with.
01:52:55.000 They would take my ideas for different documentaries and stuff, provided I'd be off screen so that they'd have a minority.
01:53:01.000 That's illegal in the United States.
01:53:03.000 It's probably in the UK as well.
01:53:05.000 Oh, then sue.
01:53:06.000 Yeah, I can't prove it.
01:53:08.000 Oh, you know, I mean, in the United States, you don't need to.
01:53:10.000 That's the problem.
01:53:11.000 Like you just make the accusation and that's it.
01:53:14.000 Court of public opinion.
01:53:14.000 I've had people call up afterwards and say like, hi, so sorry, like we were taking your idea and everything.
01:53:19.000 We were going to use you, but it occurred to us that you are a white man and we can't take you anymore.
01:53:25.000 And I'm just like, okay, well, yeah, yet again.
01:53:28.000 That's why I started my podcast.
01:53:29.000 Because I thought, okay, that's the only way now.
01:53:31.000 I couldn't work.
01:53:32.000 I had no money, no nothing.
01:53:34.000 I had to make the podcast.
01:53:35.000 But I even found making the podcast, there's still, there's so many things that if you want to win awards to get to different levels and stuff, it's still, you can't do it.
01:53:43.000 You've got to have some ethnic thing going on or you will not be considered.
01:53:48.000 Alright, the Sinister Sibling says to challenge the previous super chat.
01:53:52.000 Most of us believe in God as a means of higher power judging us, allowing the main reason for us to have moral restraint.
01:53:58.000 Lose that and you become the modern left.
01:54:01.000 I think that the modern left is an example of a lack of a moral framework.
01:54:05.000 Whereas one thing that unifies a lot of people across post-liberal, libertarian, and conservative, the right freedom faction, is a Christian moral framework.
01:54:16.000 I am not saying that you believe everything in the Bible, or you believe every teaching or every law of it, but there is a tradition that was passed down rooted in traditional Christian morals.
01:54:25.000 We've gotten rid of many of them.
01:54:26.000 Our cultural morals have shifted quite a bit, but a lot of them are still there.
01:54:30.000 The woke people have no moral framework at all.
01:54:32.000 That's why they constantly change the definition of things.
01:54:35.000 That's why one thing, like, womxn and women are both offensive and both not offensive at the same time, in a superposition of both inoffense and offense, because there is no rule.
01:54:44.000 There is no framework.
01:54:45.000 It's just, to them, might makes right.
01:54:48.000 Double think, too.
01:54:49.000 Yeah, might makes right.
01:54:50.000 So whatever it means to give them power, that's all they care about.
01:54:53.000 I just had a thought that maybe God isn't judgmental, but that we are judgmental of ourselves.
01:54:58.000 And so God is this plaintive explanation.
01:55:01.000 And then we are taking that and based on our sociological framework, judging our own behavior.
01:55:07.000 And then we say that God's judging us because we're feeling it, but we're the ones that are putting the feelings on top of it.
01:55:11.000 I feel like the Christian view of things is that God is basically running a sorting algorithm.
01:55:17.000 You get all of these people who are born onto this planet, and they live a life so that they're sorted into the bad place and the good place.
01:55:25.000 And then, you know, from a logical perspective in that capacity, I wonder if it was taken out of the context of religion, and you had, let's say, a human with a chicken farm.
01:55:37.000 And the chickens were allowed to do whatever they want, and then what happens is, at the end of the chicken's life, or I should say at the end of the year, since chickens reach their reproductive age at about seven months, you sort them.
01:55:50.000 The ones that were bad get shuffled off into the meat grinder, and the ones that were good go off into the paradise of reproduction.
01:55:55.000 There's a purpose for the sorting algorithm in that capacity.
01:55:57.000 So when I hear these religious views, I think of it logically like, If there is a God, and there are rules to this, it's not arbitrary.
01:56:07.000 It's not simply that you are here just to live a good life so that you can prove it and then go be with God.
01:56:11.000 It's that you're supposed to live a good life for a reason.
01:56:14.000 Because the good people that move on, move on for something greater and more important than just this sorting algorithm.
01:56:22.000 The least important thing is figuring out which chicken to breed.
01:56:24.000 The most important thing is successfully breeding the good chicken.
01:56:27.000 So if I had a bunch of chickens and I was taking the bad ones and eating them and having the good ones reproduce over a long enough period of time, you're getting better and better and better chickens, the greater purpose is beyond that one day or that seven months they spend in the pen.
01:56:39.000 So for those of us here on earth, if you believe there is judgment and a greater power, there is a greater purpose for your existence that lies beyond this life.
01:56:49.000 And if you're bad, into the meat grinder!
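The "sorting algorithm" framing here maps onto a simple selection loop. A toy sketch, with every score and parameter invented for illustration: cull the bottom of the flock each generation, breed the rest, and the average drifts upward.

```python
# Toy version of the sorting-and-breeding loop described above: score the
# flock, cull the bottom half (the meat grinder), breed from the rest.
# Every number here is invented for illustration.
import random

def offspring(parent_score: float) -> float:
    return parent_score + random.gauss(0, 1)  # inherit score, plus variation

flock = [random.gauss(50, 10) for _ in range(100)]  # initial "goodness" scores
print(f"before: mean {sum(flock) / len(flock):.1f}")

for generation in range(20):
    flock.sort(reverse=True)
    keepers = flock[: len(flock) // 2]  # the good place; the rest are culled
    flock = [offspring(random.choice(keepers)) for _ in range(100)]

print(f"after:  mean {sum(flock) / len(flock):.1f}")  # drifts steadily upward
```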
01:56:51.000 Yeah, well it'll be into the AI algorithm where you just become self-sterilized and are in like a doped state while you eventually die off and the rest of the people that are aware and awake are reproducing to create space travel.
01:57:04.000 Here's a crazy thought that someone else probably already had.
01:57:07.000 What if the reason these ultra-wealth elites are desperately trying to become immortal is that they know they're headed for the bad place?
01:57:14.000 Wow.
01:57:15.000 So it's in the World of Warcraft expansion, Shadowlands.
01:57:20.000 I don't know, it's been a while since I was reading about it because I didn't actually play it.
01:57:24.000 I played it a little bit.
01:57:25.000 But it's basically Sylvanas, and I'm way behind in the story for all the Warcraft fans, but Sylvanas was like...
01:57:32.000 She was, whatchacallit, the Lich King turned her, resurrected her, and so she was condemned to, in the afterlife, this really horrifying, torturous place.
01:57:41.000 So then she was like, screw that, I refuse to go there through no fault of my own, so she shatters the veil between this realm and the Shadowlands or whatever.
01:57:50.000 So I'm thinking about all these ultra-rich people, and they're like, these evil, nasty, corporate, global elite or whatever.
01:57:57.000 And they all strive towards immortality.
01:57:59.000 Not all of them, but a lot of them.
01:58:00.000 There are a lot of them who are working towards this stuff.
01:58:02.000 And they're thinking like, when I die, I'm going down, so I better just live forever and never leave this place.
01:58:07.000 That's a new one.
01:58:08.000 But heaven and hell are on Earth, guys.
01:58:10.000 Gotta make it here.
01:58:11.000 Is it a new one?
01:58:12.000 I feel like someone's probably thought of that before.
01:58:14.000 I'd live forever.
01:58:14.000 I'd love to live forever.
01:58:16.000 Yeah, I don't want to die.
01:58:17.000 This is great.
01:58:18.000 Okay, okay, Jared Kushner.
01:58:21.000 I just like, like, breathing's great.
01:58:24.000 Oh, isn't that good?
01:58:25.000 You know what's great?
01:58:26.000 Eating chick-fil-a.
01:58:26.000 Yeah.
01:58:28.000 Yeah.
01:58:28.000 I don't know what that is, but yeah.
01:58:29.000 It's chicken!
01:58:30.000 Chicken sandwiches!
01:58:31.000 Oh, it's so great.
01:58:33.000 You know what's also really great?
01:58:35.000 Chicken fajitas from a nice Mexican restaurant.
01:58:36.000 Yeah, super good.
01:58:37.000 Yeah, green peppers, onions, chicken, guacamole, sour cream, rice and beans.
01:58:42.000 Stop making me hungry.
01:58:44.000 Sometimes I'll have this thought where I'm like, I don't want to live again.
01:58:46.000 If I have to come back and relive this life... the life of Ian right now is great, but my childhood was not. It was like hell, like it was taking so long.
01:58:55.000 But then I'm like, if I beg not to do this again, will I not wake up tomorrow?
01:58:58.000 Like, am I actually begging for the end of the simulation?
01:59:00.000 So like, keep going, you know, come back if you got to come back.
01:59:03.000 You know what's an interesting potential thing that could happen?
01:59:07.000 Because I've interviewed a few people about living forever and there are quite a lot of people now, transhumanists they're called, who believe we can live forever just by changing the biology and whatever.
01:59:16.000 And some people say that can happen in our lifetimes and some don't, right?
01:59:19.000 Some say the first person to live forever has already been born.
01:59:24.000 But what someone mentioned to me the other day was that if that's not possible, if virtual reality gets good enough, you might be able to slow down our own lived experience of life to such an extent that it would feel like living forever.
01:59:36.000 Like, you could be in virtual reality and live millions and millions of years of experienced time, but in our real existence, it would only be like 10 seconds.
01:59:45.000 Wow.
01:59:46.000 Yeah.
01:59:46.000 And the opposite could be true where 10 years go by and you have like a 10 second experience.
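Whatever the plausibility, the arithmetic behind the thought experiment is just a scaling factor on felt time. The speed-up numbers below are invented to match the examples given in the conversation.

```python
# Felt time is real time scaled by a perceptual factor; both factors
# below are invented to match the examples in the conversation.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.156e7

def felt_seconds(real_seconds: float, speedup: float) -> float:
    return real_seconds * speedup

# 10 real seconds at a ~3.2-trillion-fold speed-up: about a million felt years.
print(felt_seconds(10, 3.2e12) / SECONDS_PER_YEAR)  # ~1.0e6

# The opposite: 10 real years compressed to roughly 10 felt seconds.
print(felt_seconds(10 * SECONDS_PER_YEAR, 3.2e-8))  # ~10.1
```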
01:59:54.000 That's like prison.
01:59:55.000 I'd have lost ten years.
01:59:56.000 Yeah, I guess you'd want to get through prison.
01:59:59.000 That'd be horrible for that.
02:00:00.000 We need these algorithms open source because machines can bend and warp time perceptually for you.
02:00:05.000 Perception's everything now, too.
02:00:06.000 Man.
02:00:07.000 Well, if you haven't already, would you kindly smash that like button, subscribe to this channel, and be the notification you want to see from YouTube.
02:00:14.000 They're not notifying people of the live stream and the videos, so just share them if you want to support us so that we can bypass the censorship.
02:00:22.000 Head over to timcast.com.
02:00:23.000 We're going to have a members-only uncensored show coming up at about 11 p.m.
02:00:27.000 You don't want to miss it.
02:00:27.000 They're good fun.
02:00:28.000 And as a member, you're supporting our journalists and helping us in this mission.
02:00:32.000 We had a meeting today about producing some music.
02:00:34.000 We were planning on putting out something out a couple weeks ago, but we decided to make sure we did everything to the best of our abilities.
02:00:40.000 So it's looking like we're going to have a song, a political song, released the Friday before the election, I think is the strategy.
02:00:47.000 Great.
02:00:48.000 And the lyrics are overtly political.
02:00:50.000 And so we were like, no, no, let's, let's lean into it.
02:00:52.000 And then, uh, so we shuffled some things around, and I think the idea there is that no one is waiting for your song.
02:01:00.000 That's what the marketing guy said.
02:01:01.000 Like music comes out and then people will like it and listen to it, but no one is sitting there screaming, begging you to release it now.
02:01:06.000 So do it when it makes sense.
02:01:07.000 And I said, okay, so with your support, we're going to do more to challenge the culture and expand.
02:01:11.000 So, uh, you can follow the show at Timcast IRL.
02:01:13.000 You can follow me at Timcast.
02:01:15.000 Andrew, do you want to shout anything out?
02:01:16.000 Yeah, thank you all for having me so much.
02:01:18.000 I've had a lovely time.
02:01:18.000 Come to On The Edge With Andrew Gold YouTube or audio podcast in all the normal places.
02:01:22.000 I speak to lots of weird and interesting, fun people.
02:01:25.000 All right, bloody bloke.
02:01:27.000 Thank you so much for coming on.
02:01:29.000 My website is lukeuncensored.com.
02:01:32.000 I'm really proud of the last two videos I did on there.
02:01:35.000 If you need some uplifting, if you're feeling knackered or miffed, you should definitely check out the two videos I did on lukeuncensored.com.
02:01:43.000 They are proper.
02:01:44.000 I'm just taking the piss here.
02:01:47.000 Cheerio.
02:01:49.000 Thanks, Luke.
02:01:50.000 And adios to you as well, my friend.
02:01:51.000 Hey, guys, if you want to check out more esoteric weirdness, follow me anywhere.
02:01:56.000 And also check out my podcast, my show with Hotep Jesus today, where we talked a little bit about God, Jesus, and plan to do much more in that realm.
02:02:04.000 Sorry, it was pronounced... I love hearing how you guys do the accents.
02:02:10.000 Tim's pretty good.
02:02:10.000 I've been amazed by his impression so far.
02:02:14.000 So one thing I wish I could do, the only one I could do is like Jordan Peterson, which I don't think I should do.
02:02:18.000 Give me some JP.
02:02:21.000 Well, you know, you know, it's bloody.
02:02:26.000 You're a chimpanzee full of snakes.
02:02:29.000 You got to make your bed, man.
02:02:32.000 Psychologist.
02:02:34.000 There you go.
02:02:35.000 Hey guys, I'll still be around next episode, and the one after that, and for the next few in the future.
02:02:39.000 So, cheers, see you around.
02:02:41.000 You can follow me at Surge.com, just spell it out.
02:02:44.000 And I'll see you guys next time, again.
02:02:47.000 Thanks for hanging out everybody, we will see you all over at TimCast.com.