Timcast IRL - Tim Pool - June 13, 2022


Timcast IRL - Jake Paul ROASTS Biden As Market TANKS w-Bill Ottman & Jamie Kilstein


Episode Stats

Length

2 hours and 8 minutes

Words per Minute

209.34352

Word Count

26,946

Sentence Count

2,295

Misogynist Sentences

24

Hate Speech Sentences

21
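
The words-per-minute figure above is just the word count divided by the runtime in minutes. A minimal sketch of that arithmetic (the helper name is my own, not from any published tooling; the listed runtime is rounded to the whole minute, so the result differs slightly from the listed 209.34352):

```python
# Sketch: deriving the "Words per Minute" stat from the other episode stats.
# words_per_minute is a hypothetical helper, not part of any published tooling.

def words_per_minute(word_count: int, hours: int, minutes: int) -> float:
    """Average speaking rate: total words divided by total runtime in minutes."""
    total_minutes = hours * 60 + minutes
    return word_count / total_minutes

# 26,946 words over 2 hours and 8 minutes (runtime rounded to the minute)
rate = words_per_minute(26_946, hours=2, minutes=8)
print(round(rate, 1))  # 210.5; the listed 209.34 implies an unrounded runtime
```
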


Summary

Bill and Tim are joined by comedian Jamie Kilstein to discuss a variety of topics, including the latest in the Biden vs. Bernie debate, the latest on the crypto markets, and much, much more! Also, the guys talk about a new music festival they are hosting in NYC.


Transcript

00:01:14.000 I'm just like, wow. If that is not an excellent barometer for where regular people are at... And I'm not saying this to disrespect Jake Paul or anything like that, but he's not a political guy.
00:01:26.000 So when he's sitting there going like, everything's bad and it's Biden's fault and if you voted for him, it's your fault too, I'm like, oof.
00:01:31.000 Tell me how you really think. You take a look at the aggregate polling for Biden.
00:01:34.000 It's at what, like 38 percent?
00:01:36.000 Just miserably low.
00:01:38.000 And if you are in the crypto markets, my friend, you're feeling it, too.
00:01:41.000 So we got... there's actually a lot of news today.
00:01:43.000 I just want to let everybody know, you want to know how you know you're old?
00:01:48.000 You don't get hurt exercising.
00:01:51.000 So, you know, we got Jamie Kilstein here and he was like, how'd you get hurt?
00:01:54.000 Were you like doing a flip or something?
00:01:55.000 And I was like, bro, I was sleeping.
00:01:57.000 Hell yeah.
00:01:58.000 I went to sleep and I woke up and I couldn't move.
00:02:00.000 Incapacitated.
00:02:01.000 Incapacitated.
00:02:02.000 So fortunately though we got a plethora of people here so I'm gonna just like try and make this work.
00:02:07.000 I didn't do my morning show.
00:02:08.000 I could not, I literally could not get out of bed.
00:02:11.000 So I got one of those fancy icy hot things.
00:02:13.000 You stick it on, and then I took one of those fancy, you know, over-the-counter pain medications, and I'm in excruciating pain right now.
00:02:19.000 We gotta get an inversion table in here.
00:02:21.000 We have one downstairs.
00:02:22.000 We have one.
00:02:22.000 I use it every single day.
00:02:24.000 At least like five times a day to avoid the problems that Tim has, because I have the same problem.
00:02:27.000 Oh, so it's working?
00:02:28.000 Yes.
00:02:29.000 The cool thing is, like, I got the whoop, right?
00:02:31.000 And you look at my heart rate.
00:02:33.000 It's like you're dying.
00:02:34.000 Throughout the night.
00:02:35.000 And my heart rate's really good.
00:02:36.000 It's like 48.
00:02:37.000 And then right around 4 a.m.
00:02:38.000 it jumps to like 100.
00:02:39.000 It's like, call 911.
00:02:40.000 And that's when I woke up like...
00:02:43.000 I couldn't move.
00:02:44.000 Actually, this happened on Sunday and then it just got, it's like, it's a little bit better and I think I'll be better tomorrow.
00:02:49.000 I've been sleeping without a pillow.
00:02:50.000 That's groundbreakingly awesome.
00:02:53.000 Flattening out my back.
00:02:54.000 I gotta try it.
00:02:55.000 I'll have a pillow and then halfway through the night I'll take it off and throw it on the ground and just flatten it out.
00:02:58.000 You gotta discipline yourself to stay on your back though.
00:03:01.000 Yeah.
00:03:01.000 Yeah.
00:03:01.000 Not roll to the stomach.
00:03:02.000 So we got a bunch of news, man.
00:03:04.000 We're going to talk about it.
00:03:05.000 So we got people hanging out.
00:03:06.000 We got Bill Ottman in the house.
00:03:07.000 What's up, guys?
00:03:08.000 Psyched to be here.
00:03:09.000 Who are you?
00:03:10.000 I'm Bill.
00:03:11.000 I'm co-founder of Minds.
00:03:12.000 Check me out.
00:03:13.000 Minds.com/Ottman.
00:03:15.000 You got a thing going on.
00:03:17.000 We got a thing going on.
00:03:18.000 festival.minds.com.
00:03:20.000 People.
00:03:21.000 Hell yeah.
00:03:21.000 Beacon Theater.
00:03:22.000 June 25th.
00:03:24.000 New York City.
00:03:24.000 I know people aren't happy about New York City.
00:03:26.000 We had the deposit down before the pandemic, so Beacon Theatre is a cool venue.
00:03:29.000 That's on my bucket list of places I want to play.
00:03:31.000 The Allman Brothers, yeah.
00:03:33.000 It's legendary.
00:03:35.000 So we got Cornel West, Coleman Hughes, Blaire White, Tim, James O'Keefe, Tulsi Gabbard, Ben Burgis, Zuby, Seth Dillon of the Babylon Bee, Ian... hopefully these two guys will come.
00:03:52.000 I think they're gonna.
00:03:54.000 So, guys, come out.
00:03:55.000 Promo code FESTIVAL, 50% off, festival.minds.com.
00:04:00.000 There's also a free ticket request.
00:04:04.000 We know some people are strapped.
00:04:07.000 The economy is screwed.
00:04:09.000 Make a request if money is an issue.
00:04:11.000 We want to get everybody in.
00:04:12.000 But be careful, because if you're strapped in the other way, New York won't let you in.
00:04:15.000 Which is weird.
00:04:24.000 You're fine with the other kind.
00:04:25.000 We got Jamie Kilstein.
00:04:27.000 Hey everybody, last time, now I listen to the show more.
00:04:31.000 I remember last time when you were like, what do you do instead of promoting my shit?
00:04:34.000 I was like, well a long time ago I was cancelled and things took a dark turn, but now I'm ready.
00:04:39.000 I wrote stuff down.
00:04:40.000 I'm Jamie Kilstein.
00:04:41.000 I'm a stand-up comedian.
00:04:42.000 I'm headlining House of Comedy, Plano, Dallas area next week, the 16th, 18th, and 19th.
00:04:47.000 I co-host a podcast, or I host a podcast called A Fuck-Up's Guide to the Universe.
00:04:51.000 We've had everyone from John Cleese to Nicole Aniston, the porn star, when I had a breakup, and she nurtured me back to health.
00:04:58.000 Also, if you don't like tribalized bullshit, I have a Patreon, patreon.com slash jamiekilstein.
00:05:03.000 And I'm on Twitter at jamiekilstein, Instagram at thejamiekilstein.
00:05:06.000 And he cusses like a sailor.
00:05:08.000 Did I do it already?
00:05:09.000 Are you sure you watched the show?
00:05:10.000 Because it's a family friendly show, James.
00:05:12.000 Are you trying to get canceled?
00:05:15.000 By the way, I was like, don't get canceled this time.
00:05:17.000 And I looked in front of me and I'm like, are these bullet casings that say, let's go Brandon?
00:05:26.000 Hey guys, my name is Luke Rudkowski of WeAreChange.org, and I think we're just not that far away from 1984 doses to slow the spread.
00:05:34.000 You can do your part by going to TheBestPoliticalShirts.com and spreading this very important message to the Kyles and Karens out there.
00:05:40.000 TheBestPoliticalShirts.com.
00:05:42.000 Because you go there, I'm here.
00:05:43.000 Thank you guys so much for having me.
00:05:44.000 This should be a very wonderful, amazing, interesting conversation.
00:05:48.000 Ian?
00:05:49.000 Yes!
00:05:49.000 Oh, pretty good, Luke, actually.
00:05:51.000 Thanks for asking, man.
00:05:52.000 I think I've got to do an inversion table myself.
00:05:54.000 Normally, what I do is I get up, I have some coffee, I, like, look on the internet.
00:05:59.000 I do tech.
00:06:00.000 I'm a social media entrepreneur.
00:06:01.000 You can follow me at IanCrossland.net.
00:06:03.000 I've been doing internet video blogs since the beginning, 2006, making it popular.
00:06:08.000 Communicating across the void, because why not?
00:06:10.000 It's a form of time travel.
00:06:12.000 And I am also here.
00:06:13.000 You cannot avoid me if you're watching a show.
00:06:14.000 I will be pushing buttons in the corner, and I was messing with this camera earlier so I could put it over on Tim and the papasan, but he decided to sit up in the chair.
00:06:21.000 I wasn't really going to sit in the papasan.
00:06:23.000 It would have worked.
00:06:23.000 Who didn't move it over?
00:06:24.000 I was, like, slumped over.
00:06:25.000 Yeah, he was dying.
00:06:26.000 Gently dying.
00:06:27.000 It was really, really exciting.
00:06:28.000 But I am also here.
00:06:29.000 I'm really excited for the show.
00:06:30.000 Love, Jamie and Bill.
00:06:31.000 Before we get started, head over to TimCast.com, become members to help support our important work.
00:06:37.000 As a member, you get access to exclusive segments from this show Monday through Thursday at 11 p.m.
00:06:41.000 I am going, we're gonna have one tonight.
00:06:44.000 You know, because I'm in such, like, in all seriousness, I am seriously hurting really bad.
00:06:49.000 But, you know, I just, I absolutely hate not working.
00:06:53.000 And so, like, the pain of not doing the show is more than the pain in my body.
00:06:57.000 And I was like, we're gonna do it.
00:06:58.000 So we'll have it.
00:06:59.000 It'll be a lot of fun.
00:07:00.000 Considering we have such funny people here, I thought it would be a shame if we did not get something out.
00:07:03.000 So that'll be up tonight.
00:07:04.000 And you'll be supporting our journalists and the hard work we do every single day.
00:07:07.000 So don't forget to smash that like button, subscribe to this channel, share the show with all of your friends if you really do want to support us.
00:07:12.000 Let's jump into this first story.
00:07:14.000 I absolutely love the way that Fox News has framed this story.
00:07:19.000 Boxer Jake Paul blasts Biden for gas prices, inflation; claims Biden voters are the American problem.
00:07:27.000 I mean, in all honesty, that's a brutal statement.
00:07:29.000 He listed Biden's accomplishments: highest gas prices, worst inflation, plummeting crypto prices.
00:07:34.000 That one made me laugh.
00:07:35.000 Highest rent prices ever, and created a new incomprehensible language.
00:07:39.000 Bravo, Jake Paul.
00:07:41.000 That was amazing.
00:07:41.000 I don't think he's like a big conservative dude.
00:07:44.000 Like, he goes after Dana White from the UFC over the way he treats fighters.
00:07:47.000 And I think he's talked about, like, unionizing.
00:07:50.000 I saw him do, like... either him or his brother Logan did, like, a Black Lives Matter thing. Yeah, he's not, like, a super right-wing dude.
00:07:59.000 He says, if you're reading this and voted for Biden and you still don't regret it, then you are the American problem.
00:08:04.000 So seeing that, and you hit the nail on the head, he's not like a conservative guy.
00:08:08.000 I think we've already seen it with people like Elon Musk.
00:08:12.000 People who are not conservative are being like, hey, you know, I'm kind of reading the writing on the wall and I can see how bad this is.
00:08:16.000 We had a friend of mine out recently who was like, you know, fairly liberal.
00:08:20.000 He was just like, man, I don't know.
00:08:21.000 I kind of regret this.
00:08:21.000 Things are getting really bad.
00:08:23.000 Yeah.
00:08:23.000 So, but just, uh, just an honorable mention real quick for them calling him a boxer.
00:08:26.000 And this is the thing.
00:08:27.000 Five and 0.
00:08:28.000 And this is another thing: people who are not into politics are being forced into politics
00:08:39.000 because of how extraordinary this great reset is, to the point where
00:08:46.000 everything now is politics. It's slowly been becoming that with every political cycle, but I think now it's becoming more unavoidable than ever.
00:08:54.000 If you don't take an interest in politics, politics will take an interest in you.
00:08:58.000 Absolutely.
00:08:58.000 A Plato quote, I think.
00:09:00.000 Pericles.
00:09:01.000 Oh, okay.
00:09:01.000 It's been around since the Greeks.
00:09:03.000 Oh, I thought it was a Jake Paul quote.
00:09:07.000 The thing about tribalism, too, is I was thinking about this with the like the drag shows last week, watching some of my friends on the left almost feel like they had to defend it just because like that's what the left does.
00:09:18.000 And I think with tribalism in general, it should be so easy.
00:09:23.000 Like what Jake Paul is doing should be the norm.
00:09:25.000 Right.
00:09:25.000 Like, I remember when I was against drone strikes under Bush, and then Obama, who I liked and voted for, was doing it too.
00:09:35.000 I was like, well, this is still bad.
00:09:37.000 It's not like, well, it's a cooler drone because it's Obama.
00:09:40.000 It's still, you know, so even if you vote for, if you voted for Biden or if you're a Democrat, you should actually be the most outraged when something goes wrong and feel free to call it out.
00:09:50.000 Not feel like, oh man, I guess I have to defend Biden.
00:09:53.000 It's like, no, you do not.
00:09:53.000 If you don't want to... Exactly.
00:09:54.000 We've been so much better, man.
00:09:55.000 But also this is an opportunity for progressives to be like, we told you not to do Biden.
00:09:59.000 We told you to do Bernie.
00:10:01.000 That is all for sure.
00:10:02.000 I mean, I'm not saying Bernie, I think Bernie would have done better for sure.
00:10:05.000 His mind was there at least.
00:10:07.000 Exactly.
00:10:08.000 I don't know how much better.
00:10:09.000 I think things were going really well under Trump.
00:10:12.000 I think Bernie kind of sold out quite a bit.
00:10:14.000 I was watching that Lindsey Graham-Bernie thing a little bit.
00:10:16.000 But I feel like a box of rocks would be better than Biden at this point.
00:10:20.000 Biden was the last thing I think a lot of liberals wanted.
00:10:23.000 And then they doubled down like it was the greatest thing ever.
00:10:26.000 That was a tough one to watch.
00:10:27.000 The whole, I can't stand that guy so I'm voting for anything else.
00:10:30.000 I hope that that never happens again.
00:10:31.000 Because it was televised and it's been recorded now.
00:10:33.000 At least we know what happens.
00:10:34.000 It happens every four years.
00:10:36.000 Does it?
00:10:37.000 Yeah.
00:10:38.000 Hillary Clinton?
00:10:39.000 Yeah, that 2016 election was another version of that, I think.
00:10:42.000 But people liked Trump, man.
00:10:44.000 You know, for a lot of people, the more bombastic he was, the more they liked him because he was pushing back on this machine, you know?
00:10:50.000 I'm glad I finally just get to admit publicly that he was hilarious.
00:10:54.000 I wouldn't vote for him, but man, oh man, was that guy funny.
00:10:57.000 And that was the craziest thing to me.
00:10:58.000 I remember it was like 2017 and 18.
00:11:00.000 I'm just laughing nonstop at this guy.
00:11:02.000 And I'm like, people were so angry.
00:11:04.000 And I'm like, he's funny.
00:11:05.000 The funniest thing when he, uh, this one was so simple, but it's the one that popped me the hardest is during the democratic debate.
00:11:12.000 He literally just tweeted boring.
00:11:15.000 I was like, hell yeah, dog.
00:11:16.000 That rules.
00:11:17.000 That's the political commentary I am here for.
00:11:20.000 No.
00:11:20.000 That's the reality.
00:11:21.000 And then I guess people just didn't like the mean tweets.
00:11:24.000 And so... you know what's really funny?
00:11:26.000 Actually, I'll put it this way.
00:11:27.000 There's a viral meme going around called Donald Trump predicted everything that's happening.
00:11:31.000 Have you guys seen it?
00:11:32.000 No.
00:11:33.000 It's literally just like one Trump rally where Trump is like, he's going to shut down your
00:11:37.000 oil and gas industry.
00:11:38.000 Oh no.
00:11:39.000 It's true.
00:11:40.000 There's going to be a wave of immigrants coming across the border.
00:11:43.000 And I'm like, oh, check, check, check, check.
00:11:46.000 You know, and this got me thinking to the craziest thing about it.
00:11:48.000 Do you guys remember Joe Biden said he was going to transition off gas?
00:11:51.000 He was.
00:11:52.000 In the debate.
00:11:53.000 Right, right, right.
00:11:53.000 He's outright.
00:11:54.000 And so, you know, I said this last week: when you're looking at your gas prices, if you're an environmentalist, anti-climate-change person, your only option is to accept the problem.
00:12:05.000 Because Joe Biden said he wanted to fight climate change.
00:12:07.000 Joe Biden said he wanted to get off fossil fuels.
00:12:09.000 That's what he's trying to do.
00:12:11.000 Here we go.
00:12:11.000 Okay, so that was actually my question, because I remember people liked me last time I was on the show because I asked questions instead of just yelling at a talking point.
00:12:21.000 So I legitimately don't know the answer to this.
00:12:23.000 This is going to sound like it's a rhetorical, well, what do the conservatives want to do?
00:12:27.000 But what do the conservatives want to do?
00:12:29.000 Like, is there a solution that... because everyone's furious, right?
00:12:33.000 But what are people saying?
00:12:34.000 Like, what would Biden have to do?
00:12:36.000 What are people asking him to do that would actually make the prices go down?
00:12:39.000 Let me take the non-conservative approach right away and just talk the rational approaches.
00:12:44.000 We need to start building nuclear power plants, ASAP.
00:12:48.000 The Green New Deal needs to actually be about green energy.
00:12:51.000 So this is like... I'll mention the conservative stuff in a second, but if the Green New Deal was like, we will invest X amount of dollars into wind turbines in these areas, we will open up these areas for geothermal, then I'd be like, wow, looks like we got a plan.
00:13:06.000 That all sounds progressive and wonderful.
00:13:07.000 But that's not what's in it.
00:13:09.000 What's in it is like free health care for marginalized people and like college and stuff like that.
00:13:13.000 And it was a resolution that never actually proposed any of this stuff in the first place.
00:13:17.000 Right away, I'm just like, OK, well, the progressive solution should be: draft an actual bill addressing the expansion of green energy.
00:13:24.000 I suppose what they'll do is they'll say the Build Back Better bill.
00:13:27.000 And I'm going to be like, but that was just so much.
00:13:29.000 That's five trillion dollars of random other things.
00:13:31.000 So I will say right off the bat, we got to get off, get over the stigma of nuclear power.
00:13:35.000 The conservatives are outright just saying, give the permits back for oil and gas leases because they shut some down in the Gulf, in Alaska and on federal lands.
00:13:45.000 Oops.
00:13:45.000 Some of these were due to court rulings, challenges by environmentalists, not because Biden was doing it.
00:13:49.000 Biden ultimately pulled them because of losing court cases.
00:13:52.000 What is it? Keystone pipeline.
00:13:55.000 Turn it back on. That'll alleviate some of the speculative pressures.
00:13:58.000 Oh, no. I remember protesting that.
00:13:59.000 Right. Exactly. Well, so this one.
00:14:02.000 Oops. Sorry. Thanks, Jamie.
00:14:03.000 It's a little more. Thanks for shutting it down.
00:14:06.000 Yeah, my bullet casings will go elsewhere.
00:14:09.000 Conservatives want a removal of certain regulations that make it harder for oil and gas exploration.
00:14:14.000 So, you know, we talk about why aren't we seeing more oil companies going to certain areas and trying to start developing more oil?
00:14:20.000 Because it's difficult.
00:14:21.000 Because there's government regulations.
00:14:23.000 So the cost-benefit analysis is we shouldn't invest the money here because it's not going to pay out in the long run.
00:14:28.000 It is complicated, to be completely honest.
00:14:32.000 Saudi Arabia plays a big role because they're either overproducing or underproducing, which changes things.
00:14:37.000 The war with Ukraine and Russia absolutely plays a big role.
00:14:40.000 But to put it simply: open up oil exploration, alleviate some of the tension from regulations, allow these leases back on federal lands.
00:14:50.000 If it were Donald Trump as president, he would convene the CEOs of all these companies immediately.
00:14:55.000 He would say, tell me what you need to get the prices down, and then have a plan.
00:14:58.000 Or just very simply, undo everything the Biden administration has done, because everything they have done is standing in the way of businessmen, standing in the way of people being able to fix this problem, with, of course, regulations, taxes, and big government bureaucrats who want their cut in the middle, and who are standing in the way of actual things that could help the people and the environment. But we're not focusing on any of it.
00:15:19.000 So that was my follow-up question: as a table of people who I think are as suspicious of corporations as we are of the government, by loosening those regulations and giving more control to the corporations,
00:15:31.000 do we trust them to do the right thing, like the oil and gas?
00:15:33.000 No, they're for profit.
00:15:34.000 Right.
00:15:35.000 So that's the problem there too, right?
00:15:37.000 I'd like to see like a focus on biodiesel and teaching people how to make biodiesel that's legitimate, like pure and clean that we can regulate and use.
00:15:45.000 But relying on corporations to do it, they're going to have profit motive as they're told.
00:15:48.000 Yeah, but let's get back to nuclear a little bit, because like there's a big misperception about modern nuclear versus like old school nuclear, which is super nasty.
00:15:56.000 Like, this is actually... I don't know if you've seen the Bill Gates documentary on Netflix.
00:16:02.000 One of the few things that Bill Gates is actually invested in, which is actually kind of interesting.
00:16:05.000 This guy's defending Bill Gates over here.
00:16:10.000 I'm going to slide by Bill Gates every day.
00:16:13.000 He's an Epstein freak show.
00:16:14.000 But his nuclear power plants are actually really interesting because the waste is extremely minimal.
00:16:20.000 I think they're actually powered by waste.
00:16:22.000 And the risk of meltdown is just vastly less.
00:16:25.000 So like, Trump was not talking about nuclear. Like, no one's talking about it.
00:16:29.000 They're afraid to talk about nuclear, even though the technology has evolved so much.
00:16:33.000 I heard Michael Schellenberger, who's running against me.
00:16:36.000 I just lost.
00:16:37.000 Yeah, he was the only guy I've heard.
00:16:39.000 I was talking about it where I was like, oh, this makes sense.
00:16:41.000 It was the first time I heard it.
00:16:42.000 Yeah, he's super smart.
00:16:43.000 Also, fusion is another kind of nuclear.
00:16:45.000 So you got fission, you got fusion, completely different processes.
00:16:48.000 They're both called nuclear because of ignorance, basically.
00:16:50.000 Fusion is completely different. I mean, if you can ignite fusion and get
00:16:54.000 it going, it basically gives you an unlimited fuel supply.
00:16:57.000 I think that's.
00:16:58.000 That scares people in power.
00:17:00.000 Jamie, I want to push back on your point here just a little bit, especially when it comes to the public-private sector debate, especially when it comes to the larger topic that we're discussing here, the Biden administration, because what we essentially have here is the government, the Biden administration, picking and choosing what corporations are going to prosper, which ones are not.
00:17:17.000 So when you look at the Green New Deal, when you look at all the proposals that he's pushing, it's directly going to benefit countries that he has business dealings with, with his son, like China.
00:17:26.000 Especially when they are the ones getting the raw minerals to get the solar panels, to get the batteries.
00:17:32.000 So when you look at, you know, any kind of corporation, the most powerful ones, the most unaccountable ones, are the ones that have friends in Washington, D.C.
00:17:41.000 And my argument that you kind of countered back on and asked the question on, you know, do we trust these corporations?
00:17:47.000 No, obviously, but it's because of the government intervening and directly playing a key role in which corporation does well and which corporation does not well.
00:17:55.000 I agree that last part, I don't trust the corporations either way.
00:17:57.000 I don't trust anybody.
00:17:59.000 But that last part is spot on.
00:18:00.000 But when it comes to the free market and allowing individuals to choose what they want, giving them choice, giving them decentralization rather than the monopoly of force, government coming in and saying you can only buy from this corporation, you can only buy solar, is the problem because it leads to more problems, it leads to the government abusing their power.
00:18:17.000 Let me point out, too, that there was a chart I was looking at last week covering the Trump administration's last year, which showed inflation and wages were stable.
00:18:26.000 And then a month after, or like a couple weeks after Biden is inaugurated, it inverts.
00:18:31.000 Wages collapse.
00:18:32.000 Inflation skyrockets.
00:18:34.000 It's been a year and a half, but it's still on track going the exact same direction.
00:18:37.000 So certainly by now, Biden could have done something.
00:18:39.000 So I think the issue with gas prices too, it's not just about the direct regulations and policies on gas.
00:18:45.000 I know a lot of people immediately say like, what has Biden done directly to affect gas prices?
00:18:50.000 You can also come out and just be like the unemployment benefits expansion, which I understand had good intentions, but then you're flooding the market with the mass release of currency, which is going to devalue the currency.
00:19:01.000 So even if Biden didn't do anything on these other issues, Keystone was big because that caused massive speculation.
00:19:06.000 If he did nothing other than the economic policy, which resulted in inflation, gas will go up along with it because you've got to pay truckers.
00:19:14.000 One of the things about Keystone... That's so brutal.
00:19:18.000 How do we get oil shipped if we're not doing it by pipeline?
00:19:21.000 Freighters?
00:19:23.000 Trucks?
00:19:23.000 That's even worse.
00:19:25.000 It takes fuel to transport.
00:19:26.000 You put it in a big pipe, there's fears about leaks.
00:19:28.000 Totally get it.
00:19:29.000 I think the big issue is, it's fair to say, we're addicted to oil.
00:19:32.000 And I don't mean that to be a dick, but I mean, we built a civilization off of fossil fuels.
00:19:37.000 Fossil fuels are incredible.
00:19:39.000 The energy return on energy invested is absolutely amazing.
00:19:42.000 And so we were able to literally create human life.
00:19:45.000 When we build these machines that can harvest a field faster and better, then all of a sudden people are fat and happy.
00:19:51.000 They're fed.
00:19:51.000 They're gonna have babies.
00:19:52.000 They're gonna have tons of babies.
00:19:53.000 But you're gonna need that resource to maintain that level of life as well.
00:19:56.000 It freed the slaves.
00:19:57.000 You brought this up a while ago.
00:19:58.000 It was the steam-powered engine in the Industrial Revolution in the 1800s that basically gave... I mean, the cotton gin is the big part of it, but...
00:20:05.000 Fossil fuels have opened us up to no longer being slaves to labor in a lot of ways.
00:20:10.000 Yep. But now to what you were saying, Luke, about Biden picking favorites. I mean,
00:20:14.000 he won't even mention Elon Musk with regards to electric cars or solar. It's like, talk about...
00:20:20.000 Talk about the players in the industry. Yeah, Elon has a low ESG score.
00:20:26.000 He has a renewable energy car company that's a pioneer in its field, and again, lots of critical things I could say
00:20:33.000 about Elon. But in that particular field, all the green people or the
00:20:37.000 peaceniks should all be happy and good. You tweeted a meme with him, Pool dog.
00:20:41.000 Exactly, exactly. So he tweets interesting things now and he's the main person that pushes the kind of narrative,
00:20:49.000 pushes the conversation, and that's why he's the one that's being attacked.
00:20:51.000 Which, if you want to strike back against tribalism, that is the moment where you go, you know what, I don't agree
00:20:56.000 with Elon Musk, but he's trying to do something to help us with our energy addiction.
00:21:01.000 I want to do it.
00:21:02.000 Let's come together and get people from all sides to do this thing that shouldn't be political.
00:21:07.000 It's okay to agree with someone on a single topic.
00:21:10.000 You're not endorsing their entire life.
00:21:12.000 You know, regarding this mass printing of money, I've kind of had a moment of clarity.
00:21:17.000 They're basing it on what's called modern monetary theory, invented shortly after World War II, I think, thereabouts.
00:21:22.000 And the idea is you print as much as you need to stimulate and to build your infrastructure so that you can produce things.
00:21:28.000 But what's happening now is they're using that idea, we're going to print as much as we need so that, fill in the blank, they're just giving it away to people's bank accounts.
00:21:34.000 They're not building the infrastructure that modern monetary theory needs.
00:21:38.000 Those special private interests, their buddies in the upper echelons in the establishment are getting secret Federal Reserve bailouts and they're getting money that of course makes sure that they never lose money.
00:21:40.000 This is... when they make money, their profits are theirs, but when they lose money,
00:21:51.000 when they have losses, the Federal Reserve literally steps in. It's like, yeah, we'll cover that for you. Don't worry about that.
00:21:55.000 Yeah, that's not a free market. No, and it's not modern monetary theory.
00:21:58.000 They're calling it that, but it's an aberration of that system. It's basically just liquidation of the US
00:22:04.000 economy.
00:22:04.000 No, no, no, no.
00:22:05.000 It's the modern largest transfer of wealth in recorded human history that's happening right in front of our eyes.
00:22:11.000 It started in 2008 during the housing market crisis.
00:22:14.000 Ben Bernanke was literally orchestrating secret international bailouts of financial institutions all around the world with the power of the Federal Reserve that literally just went on the computer and just typed in zero, zero, zero, zero.
00:22:26.000 Here you go.
00:22:27.000 Here's the money.
00:22:28.000 Meanwhile, you have to work for it.
00:22:29.000 You have to slave for it.
00:22:30.000 You have to work a nine-to-five job for it.
00:22:32.000 The bankers created a system.
00:22:33.000 Zero, zero, zero, zero, mine.
00:22:35.000 Good luck, peasant slave.
00:22:37.000 We're going to devalue the dollar.
00:22:38.000 Good luck paying for food this week or this month, whatever the hell it is.
00:22:42.000 People are getting screwed over royally.
00:22:43.000 And this is why Jake Paul saying this is so important right now, because it brings this point to a reality that a lot of people can't ignore.
00:22:51.000 It's not just that.
00:22:52.000 It's a barometer.
00:22:54.000 Jake Paul coming out and saying it, how many more people do you think are thinking the exact same thing who are not political people?
00:23:01.000 So it's like that joke I think it was from Simpsons or Family Guy where they're like, we received 7 complaints last night, that means 80 billion people were upset.
00:23:11.000 Simpsons, that's what it was?
00:23:12.000 Yeah.
00:23:13.000 It's like ignoring, you know, if you're sick and you ignore your sickness. It's
00:23:17.000 like looking at Biden and how he literally cannot communicate, and
00:23:24.000 people are just afraid to confront that illness that, like, our society has. If I
00:23:29.000 go to the doctor, they'll tell me that I'm sick, and I don't want to think I'm
00:23:32.000 sick, right? Well, that's on purpose, though. I think that is made to be a distraction for the larger agenda that he is pushing, that is directly benefiting the people who are on the ship, robbing and looting everything for themselves as the ship is sinking.
00:23:46.000 The ship is the United States.
00:23:48.000 It's going down.
00:23:49.000 And there's a lot of criminals out there that are taking everything they can for themselves.
00:23:52.000 They're not criminals under government law, because they're multinational.
00:23:56.000 They don't have a country that they have to play by the rules of.
00:23:58.000 They've got the Bank for International Settlements in Switzerland that they're running money through, and they're just siphoning our wealth.
00:24:03.000 It's insane.
00:24:04.000 Let's pull up this story and then we'll keep going on that idea.
00:24:06.000 So we have this from the Wall Street Journal.
00:24:08.000 S&P 500 enters bear market as Dow and Nasdaq fall.
00:24:13.000 Guys, I don't have a savings account.
00:24:14.000 I don't know what any of those words mean.
00:24:17.000 You're screwed.
00:24:18.000 It means bad. And I will also point out that one of the reasons Jake Paul
00:24:24.000 slammed Joe Biden, I think the main reason, was right there in the middle: his
00:24:28.000 crypto portfolio probably tanked. And he was like... So a lot of people bought into crypto, and
00:24:34.000 you know, I think it was fantastic. I'm not selling.
00:24:38.000 But everything is taking a major hit.
00:24:40.000 And crypto took a bad hit.
00:24:42.000 And it's been getting beaten up pretty bad.
00:24:43.000 Even Bitcoin's getting beaten up.
00:24:45.000 But this is the market.
00:24:48.000 This is the ship sinking.
00:24:50.000 You look at the graph of inflation.
00:24:52.000 I mean, it's 8.2, it's 8.4, it's 8.6.
00:24:55.000 The worst it's been since World War II.
00:24:57.000 Consumer prices are skyrocketing.
00:25:00.000 Yo, hope you guys are ready for August.
00:25:02.000 But, you know, I'll say this.
00:25:04.000 While the regular people are suffering under this, big, massive financial institutions are getting free cash from the Federal Reserve, and to the tune of how much, we don't even know.
00:25:13.000 Trillions.
00:25:14.000 Trillions of dollars.
00:25:15.000 Probably.
00:25:15.000 It's off the books.
00:25:16.000 Off the books.
00:25:17.000 They're buying up property, some of these big firms.
00:25:19.000 They're buying houses of people.
00:25:20.000 They're offering 30% premiums because the money doesn't matter.
00:25:24.000 The number amount doesn't matter.
00:25:25.000 What matters is they get the land from you.
00:25:27.000 Dude, this is what has been bothering me.
00:25:29.000 If my crypto value goes down 90%, but they get all the funny money from the Federal Reserve, they can just buy it up with funny money.
00:25:37.000 Bro, I've been saying this.
00:25:39.000 It's unconscionable.
00:25:41.000 I was saying this a month or so ago, probably not as eloquently as I could have because I'm not a crypto expert, but I want to point out to people the centralized banking system that allows them to effectively expand the money supply on a whim. They can just buy crypto from people.
00:26:00.000 They can buy as much as they want and control the market.
00:26:03.000 Yeah, they can engineer because they're behind the exchanges.
00:26:07.000 A lot of the crypto exchanges are actually not working on behalf of the customers because they have all the data that can kind of predict what is going to happen.
00:26:15.000 They know where all the leverage points are.
00:26:17.000 So... They're betting against customers.
00:26:19.000 Yeah.
00:26:20.000 But I don't want to get wrapped up too much in the crypto stuff.
00:26:22.000 The point is the entirety of the market.
00:26:24.000 Like, right now, the market's dropping.
00:26:26.000 Ultra-wealthy people are like, oh, fantastic.
00:26:29.000 Because here's how it goes.
00:26:30.000 It goes its way every time.
00:26:31.000 You go back throughout history.
00:26:32.000 Market drops.
00:26:33.000 Poor people panic.
00:26:34.000 They sell as much as they can because they've got to buy bread.
00:26:36.000 Rich people, who don't have to worry about it, buy up all this property and assets for pennies on the dollar.
00:26:41.000 Then the market goes back up and they own it all.
00:26:43.000 And then they sell it when it's on top.
00:26:45.000 That's another thing.
00:26:46.000 If you're a banker, if you're working at the Federal Reserve, you're able to predict when assets are going to be at their highest and their lowest. Why wouldn't they take advantage of that?
00:26:55.000 It's natural that they would.
00:26:56.000 So this is not capitalism.
00:26:58.000 This is feudalism 2.0.
00:26:59.000 This is a new way of indentured servitude that we're all going through right now, whether we like it or not.
00:27:06.000 Some people say that might be a hyperbolic statement, but how else could you explain what's happening right now in the real world?
00:27:12.000 With so much pain, with so much suffering, with economic inequality that's being driven by these people, and then they have the gall to come out publicly and say, corporations are greedy.
00:27:22.000 It's all their fault. I just want to mention real quick: during the November 2020 election
00:27:28.000 cycle, we were regularly hitting over
00:27:30.000 100,000 concurrent viewers on this show. Right now we have about forty-one
00:27:35.000 thousand five hundred people, so it's a lot of people. This is great, and you guys rock. My question is, those
00:27:39.000 sixty thousand other people who'd be watching in real time, where'd they go?
00:27:43.000 Are they paying attention still?
00:27:45.000 Because everybody pays attention when the election happens.
00:27:49.000 Where are you at now?
00:27:50.000 Because I'm willing to bet a lot of these people voted for Biden.
00:27:52.000 Not every person who watches, most of them probably voted for Trump, to be honest.
00:27:56.000 But I've seen so many people who have told me, one, they vote for Biden, and two, they did watch at some point during the election.
00:28:02.000 And I'm just like, how does that happen?
00:28:03.000 Like, how do you watch our show?
00:28:05.000 And there were, you know, a lot of moderate people watch.
00:28:07.000 Well, no, I just remember this under Obama when I was very, you know, pro-Obama when he was first elected.
00:28:15.000 And first of all, I love that the show keeps me grounded on both the left and the right, because hearing you, Luke, talk about the corporations, I was like, got it.
00:28:22.000 I'm pro-gun, so I can use it to kill the rich.
00:28:27.000 That's not true.
00:28:27.000 That's not true.
00:28:28.000 I'm kidding.
00:28:30.000 I know.
00:28:30.000 I think that I remember everyone voted for Obama and they were like, change, hope, we're set, everything is great.
00:28:37.000 And then they just stopped holding him accountable.
00:28:40.000 And then he just got kind of more centrist, more centrist, more centrist.
00:28:44.000 The Obamacare wasn't at all, there was nothing like universal healthcare about it.
00:28:49.000 It wasn't socialized medicine.
00:28:51.000 It was, you know, there was some good stuff, but it was a lot of like corporate meh.
00:28:56.000 And I think people just treat elections like sports. They're excited, they're gossipy, and they're watching shows like this because they feel like they're a part of something where they can actually make a difference, because one day they go in and their voice is heard and blah blah blah. And then they just leave, because it's not fun, and they don't want to hold people accountable, because that's where it gets hard.
00:29:14.000 The 2008 election was the prime example of reality TV meets political elections.
00:29:20.000 I know exactly what you're talking about.
00:29:21.000 People had this fervent exuberance for hope and change.
00:29:25.000 They voted for Obama.
00:29:26.000 Obama got in and then they wanted him to do it.
00:29:28.000 The president cannot do anything.
00:29:30.000 Their job is to make sure it doesn't go crazy.
00:29:32.000 They veto stuff.
00:29:33.000 And if we had proposed a revolution against the financial markets and taking control of our monetary supply, Obama would have let it happen.
00:29:40.000 He wasn't a president that would have stopped us, but he couldn't do anything on his own.
00:29:44.000 They know where he lives. And I think we were talking about this before we went on air. Maybe
00:29:48.000 we were talking about this earlier, Tim, where it's like the people who voted for Biden should
00:29:53.000 be... Oh no, I was talking about this with Taylor. The people who voted for Biden should be the most
00:29:57.000 tuned in, should be the most outraged, should be the most wanting to hold them accountable.
00:30:01.000 Because they're like, this is our fault. We need to keep them in line.
00:30:05.000 So I guess what I was trying to say before is there's a lot of people who aren't in politics
00:30:09.000 They come in, and I think it's fairly obvious, you know, the 2020 cycle was massive, it was Trump.
00:30:14.000 They come in, they hang out, they watch.
00:30:16.000 Something happens.
00:30:17.000 Now we're all still here at the front of the ship, but all those people have walked away and they're at the lower decks and they're playing, you know, they're doing their thing.
00:30:24.000 They're feeling the heat, they're feeling the pain, they're watching the ship sink.
00:30:28.000 But we need to get these people back up here to pay attention and, you know, be involved again.
00:30:31.000 I think a lot of people are tired.
00:30:33.000 A lot of people are sick of the lies.
00:30:34.000 I think a lot of people are disillusioned with our current system because they vote left, they think they're going to get hope and change, and they get drone bombs and domestic spying.
00:30:42.000 100%.
00:30:42.000 They vote right-wing and Republican, and they're going to think they're going to have conservative fiscal policies, and they don't.
00:30:49.000 They have the same kind of reckless spending as the Democrats did.
00:30:52.000 And then they have the Trump scene, and people are questioning themselves like, what?
00:30:57.000 Why should I even be involved in this?
00:30:58.000 I gotta worry about myself.
00:30:59.000 I gotta worry about making ends meet.
00:31:01.000 I gotta worry about my future.
00:31:03.000 Which, again, upward economic mobility is being taken away from individuals in such a drastic way.
00:31:10.000 People are being looted.
00:31:11.000 People are being gutted.
00:31:12.000 Their savings are being obliterated right in front of them.
00:31:14.000 Inflation is one of the largest taxes the government could wage on the public, and they're waging it in such an extreme way.
00:31:20.000 It's absolutely astounding to see people just barely able to make ends meet
00:31:27.000 at this point.
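The "inflation is a tax" point above can be sketched numerically. A minimal illustration, assuming a constant 8.6% annual rate (the CPI print cited in the conversation) and a hypothetical $10,000 cash balance:

```python
# Illustrative sketch: how much purchasing power a fixed cash balance
# loses under sustained inflation. The 8.6% rate is the CPI figure
# cited in the conversation; the $10,000 balance is hypothetical.

def real_value(nominal: float, inflation_rate: float, years: int) -> float:
    """Purchasing power of a fixed nominal balance after `years`
    of constant annual inflation."""
    return nominal / (1 + inflation_rate) ** years

balance = 10_000.00   # hypothetical savings, held as cash
rate = 0.086          # 8.6% annual inflation (cited CPI print)

for years in (1, 5, 10):
    v = real_value(balance, rate, years)
    print(f"after {years:2d} year(s): ${v:,.2f} of today's purchasing power")
```

At that rate, the balance loses roughly a third of its purchasing power in five years without a single dollar being spent, which is the sense in which the hosts call it a tax.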
00:31:27.000 Let me tell you if you didn't know, Jamie.
00:31:29.000 First, I'll ask you some questions.
00:31:31.000 Okay.
00:31:31.000 How's the economy doing?
00:31:33.000 Pretty bad, bud.
00:31:33.000 Pretty bad.
00:31:35.000 How are the American people faring with these high gas prices?
00:31:38.000 You know what?
00:31:38.000 I'm gonna go out on a limb and say they're not doing great.
00:31:41.000 So the US government sent $12 million, I believe, to Pakistan for gender studies programs.
00:31:47.000 Is gender studies code for gas money for Americans?
00:31:50.000 It's not.
00:31:51.000 Is it Pakistani for, just kidding?
00:31:55.000 Here's the reason.
00:31:56.000 I'll give you the general reason that they've given for doing this.
00:32:00.000 It's that the U.S., by giving money away, gets people to use it, and that retains confidence in the U.S.
00:32:06.000 dollar.
00:32:07.000 So the idea is, let's just give out all this money around the world, then everyone will be like, I got U.S.
00:32:12.000 dollars, I can use it.
00:32:14.000 Yeah, but I think that's the premise of modern monetary theory, that they're going to invest in production.
00:32:18.000 And what they're doing is they're putting it in bank accounts and getting interest, or they're buying crypto and putting it in Panama.
00:32:24.000 It's an aberration of what's supposed to happen.
00:32:26.000 I don't think it's that simple, Tim.
00:32:28.000 I think it's the US government buying political favors.
00:32:30.000 And I think this is why we send speed boats to Sri Lanka.
00:32:41.000 Sri Lanka, by the way, if you look at what's happening there socially and politically, it should be a warning for the rest of the world right now.
00:32:48.000 You have people rioting, you have people angry, you have people going hungry.
00:32:52.000 And I think that's going to be happening in a lot of other places around the world.
00:32:56.000 Maybe not in the United States anytime soon, but I think definitely there's going to be a lot of economic instability in Africa and the Middle East coming very soon.
00:33:04.000 That's going to disrupt a lot of normal life flow for everyone because it's also going to spur on migration crisis as well.
00:33:11.000 Well, I wanted to go back to what you were talking about, Luke, about why regular people have stopped paying attention.
00:33:17.000 And I also, I was thinking about it.
00:33:19.000 I wonder if it's because we have become such a reality TV show world when it comes to how we digest our politics, that going after a Trump tweet, because Trump tweeted something mean, is easy.
00:33:32.000 Examining why, you know, I mean, I would like to think I'm fairly informed and I had to ask all of you guys, well, what will we do for gas prices, right?
00:33:40.000 Like those are complicated issues.
00:33:42.000 I mean, in fact, it's one of the reasons Trump won, right?
00:33:45.000 It's he could actually appeal to everybody.
00:33:48.000 You ask a lot of people, you know, what do we do for the economy?
00:33:51.000 And then they're like, well, it's going to start with election reform and they're checked out.
00:33:54.000 But you just go, ah, the Mexicans are taking your job.
00:33:56.000 You're like, got it.
00:33:57.000 That's easy, right?
00:33:58.000 And so I wonder if with the Biden stuff, I think part of it is they're like, my guy won.
00:34:03.000 I think part of it is they're burnt out.
00:34:05.000 Like you were alluding to, you know, I talked to my friends who have big podcasts and not the political world, but in like self-development and stuff.
00:34:12.000 And they're just like, dude, in a couple of years, people aren't even good.
00:34:15.000 Like Trump burned people out, you know, like all those tweeters and stuff.
00:34:19.000 And now that it's Biden, even though the country isn't doing well, There's nothing, there's no, you know, big tweets.
00:34:27.000 There's no big, you know, drama and they just, I think they're just sort of checked out.
00:34:30.000 They're embarrassed too.
00:34:32.000 Yeah.
00:34:32.000 Because it's kind of like, you can't not be. You see him communicating, and, like, why would they want to come out and engage and talk about politics when it's like, look what we just did.
00:34:43.000 Our guy.
00:34:44.000 But they knew this.
00:34:45.000 I mean, when he was running to be president of the United States, he wasn't doing rallies.
00:34:48.000 He was known as the basement candidate.
00:34:50.000 He was known as just doing wonky shots from the corporate media that were incoherent then.
00:34:56.000 People knew what they were getting themselves into.
00:34:58.000 It's gotten worse, though.
00:35:00.000 I don't think people did.
00:35:02.000 I told this to Bannon.
00:35:03.000 Trump was anti-elected.
00:35:05.000 People were voting against Trump, not for Biden.
00:35:07.000 There's definitely a feeling that that's happening on the other side of the TV screen.
00:35:12.000 It's really disturbing when you realize how connected things are and that, no, this is all here with you right now.
00:35:17.000 This is you.
00:35:18.000 This is all coming.
00:35:19.000 It's here.
00:35:20.000 We are here together, even though there's a screen between us.
00:35:22.000 This is it.
00:35:23.000 Sorry, Tim, but why it's 40k versus 100k probably also has to do with the
00:35:30.000 algorithms.
00:35:31.000 I mean, it's like what is being promoted at certain times.
00:35:35.000 Or do you truly think that it's an organic thing?
00:35:38.000 I think the crazy thing is how right now, we talked about it last week with Republicans
00:35:44.000 just responding to everything Democrats say.
00:35:46.000 We got news that, actually I don't know if we have this article pulled up, but we should talk about it.
00:35:51.000 We've got news right now, 10 senators, Republicans, have agreed with Democrats on gun control.
00:35:57.000 And so I remember, you know, Luke was telling me this.
00:35:58.000 He was like, yeah, this was the other day.
00:36:00.000 He's like, 10 Republican senators are cutting a deal with Democrats for gun control.
00:36:04.000 And I was like, no, that can't be right.
00:36:06.000 That puts them over the filibuster.
00:36:07.000 They're going to get that passed.
00:36:09.000 Turns out it's true.
00:36:10.000 So why is it?
00:36:13.000 That you never have the inversion of that story.
00:36:15.000 Where are the 10 Democrats to side with Republicans on gun access?
00:36:20.000 Never happens.
00:36:22.000 And so what you see is the narrative is always skewing in the direction of the Democratic establishment and what progressives want.
00:36:29.000 Or just eliminating people's rights.
00:36:31.000 Right, right, right.
00:36:32.000 It's because the media is manipulating people and it's controlled by forces outside the United States.
00:36:36.000 Well, so the point I was trying to get to with Bill, you mentioned algorithms, is that the first iteration of narrative control is mainstream media, New York Times.
00:36:45.000 They're losing that.
00:36:46.000 What's happening now is big tech is just, all they got to do is this.
00:36:50.000 If you've got 50 conservatives and 50 Democrats, ban one conservative, and now it's a lopsided match.
00:36:57.000 Let's say it's five-on-five basketball.
00:36:59.000 You ban one conservative and it's 5v4, who's likely going to win?
00:37:02.000 Oh, without a doubt, the five people.
00:37:04.000 And that's basically what's happening.
00:37:05.000 I think there were, with social media, you're 100% right.
00:37:10.000 Like watching the way conservatives get banned.
00:37:12.000 I don't know if there was shadow banning back in the day when I was like super progressive and woke, but I was straight threatening to fight senators every day and just like getting followers, getting verified.
00:37:23.000 And now my like tiny Instagram is shadow banned and I don't even post conservative stuff, but I have famous followers who are like anti-vax or you know, whatever.
00:37:32.000 And I'm certainly shadow banned on Twitter.
00:37:33.000 But you're also, you're anti-authoritarian, I would assume.
00:37:36.000 Yes.
00:37:38.000 Oh no, I'm here to pitch my pro-authoritarian platform and promote my comedy dates at the House of Comedy next week in Delaware.
00:37:44.000 But like, meaning that they punish right and left that are anti-authoritarian.
00:37:48.000 But the only thing I kind of disagree with about what you said is, I mean, I remember back in the day the amount of bills that were mostly, like, anti-corporation, you know, whatever, that the Democrats
00:38:02.000 would, some of them, pass, but all the centrists, like the Max Baucus types, would block it,
00:38:08.000 and Republicans would come over, whatever. I think the reason in this case with guns is the
00:38:14.000 Democrats don't have much backbone, right?
00:38:16.000 Like they're wishy-washy on everything.
00:38:18.000 They sell out everything.
00:38:20.000 But I feel like guns and abortion are like their two things.
00:38:25.000 Where there's not going to be compromise on that.
00:38:27.000 Or else it's like, just go be a Republican.
00:38:29.000 But I do mean in more modern history.
00:38:32.000 Like under the Biden administration, I agree with you.
00:38:34.000 Yeah, with Trump.
00:38:35.000 It was the populist insurgency in 2016.
00:38:38.000 Bernie loses, Trump wins.
00:38:40.000 Bernie failed at storming the gates of the Democratic Party.
00:38:44.000 Trump succeeded at storming the gates of Republicans.
00:38:46.000 So now you get people like Bill Kristol, you get these neocons, the Lincoln Project types, who are effectively Democrats.
00:38:52.000 Which I am not happy about.
00:38:54.000 So crazy.
00:38:54.000 Part of the party coalesced around where the Democrats are.
00:38:57.000 Since then, what have we seen?
00:38:59.000 So we get a Donald Trump presidency in the beginning of 2017.
00:39:02.000 You've got Republicans controlling Congress.
00:39:04.000 They do nothing.
00:39:05.000 In fact, they agreed with the Russiagate investigations and moved forward saying, well, we've got
00:39:08.000 to be reasonable.
00:39:09.000 So since then...
00:39:10.000 So, Liz Cheney, our progressive hero.
00:39:13.000 So I think it was Michael Malice who brought this up.
00:39:15.000 Where is the argument from the right about repealing the NFA or guaranteeing gun access?
00:39:22.000 It's always a compromise on giving away our rights.
00:39:25.000 I think NFA, people need some education.
00:39:27.000 I thought the NFA was an entity.
00:39:31.000 It's just an act that you can repeal.
00:39:32.000 It's the bill. Yeah, it sounds too much like NSA. People think, oh, I'm not getting rid of an entire department.
00:39:37.000 No, the NFA has to stay. We need the National Firearms Department.
00:39:40.000 You know, ATF.
00:39:42.000 Yeah, ATF. They think NFA is ATF, but NFA is an act that basically restricts a lot of weapons.
00:39:47.000 Now, refresh me. Tell me exactly what it is.
00:39:49.000 So you know how they banned pot the first time? You needed a tax stamp.
00:39:53.000 So they were like, well, we can't ban it.
00:39:56.000 Let's make it so you need a tax stamp to buy it, and then we'll stop issuing tax stamps.
00:40:00.000 The National Firearms Act was basically like, certain weapons can be classified, requiring you to get special forms and pay a special tax.
00:40:06.000 The tax was $200, which at the time was the equivalent of about $3,000 today.
00:40:07.000 Today, it's still $200.
00:40:08.000 It's never gone up.
00:40:10.000 But you gotta get an ATF background check, an FBI background check.
00:40:14.000 You gotta get fingerprinted at your local police station or sheriff's office.
00:40:17.000 It takes upwards of six months, maybe a year, to be able to get one of these items.
00:40:21.000 So they didn't ban them.
00:40:22.000 That would violate the Constitution.
00:40:24.000 They just made it prohibitively expensive and time-consuming to get.
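The arithmetic behind the never-indexed $200 stamp can be sketched the same way. A rough illustration, assuming a 3% average annual inflation rate (a round number chosen for illustration, not an official CPI series):

```python
# Rough sketch: the NFA's $200 transfer tax has never been indexed,
# so its real burden decays with inflation. The 3% average annual
# rate is an assumed round number, not official CPI data.

def deflated_value(nominal: float, avg_inflation: float, years: int) -> float:
    """What a fixed nominal amount is worth in start-year dollars
    after `years` of constant average inflation."""
    return nominal / (1 + avg_inflation) ** years

stamp = 200.0                 # NFA transfer tax, set in 1934
years_elapsed = 2022 - 1934   # 88 years to the episode date

in_1934_dollars = deflated_value(stamp, 0.03, years_elapsed)
print(f"$200 today is worth about ${in_1934_dollars:.2f} in 1934 dollars")
```

Under that assumed rate, the same factor run the other way puts the original $200 at roughly $2,700 in 2022 dollars, in the same ballpark as the $3,000 figure mentioned above.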
00:40:27.000 I keep thinking, like, we're in unprecedented times with this inflation, with this technology, with the connectedness of the world. But at the same time, I keep thinking of 1928, right before the Depression. Are we in the midst of the beginning of the Great Depression times two? And, like, Nazi Germany, when they started slowly taking away gun rights.
00:40:45.000 Then all of a sudden the Reichstag caught fire, the parliamentary building, and Hitler blamed the communists and then just took everyone's rights away and then stripped everybody.
00:40:54.000 With the Enabling Act, which a lot of people compared to the Patriot Act.
00:40:58.000 Yes.
00:40:59.000 And people have even said that Hitler started that fire as a false flag and then blamed the communists just to seize power.
00:41:05.000 So we need to be... This is like that shooting in Texas.
00:41:08.000 And I'm saying specifically, I'm not saying anything about that being a false flag or anything.
00:41:11.000 I don't think it was, but... Lydia just perked up.
00:41:14.000 For people to use tragedy as an excuse to seize our guns, or our rights to guns, is insane.
00:41:20.000 That's what I'm talking about, not the guns themselves.
00:41:21.000 It's the right to own the gun.
00:41:22.000 This is something that I've been talking about since 2001 in New York City, is that whenever you give up your rights during emergencies, governments will make up emergencies so they can take away your rights.
00:41:34.000 And I think that's an important perspective to understand here, because we have allowed the government a lot, especially within the last two and a half years, and they have essentially weaponized the intelligence agencies, they have weaponized the CDC, the FDA for their own personal benefit, they have become more and more political, and now that same government that, of course, has been locking you down, arresting people for not wearing a mask, arresting people for having a business open, are telling you that we need to protect you by taking away your ability to defend yourself.
00:42:02.000 Real quick.
00:42:02.000 I want to add I have I have oh I've woken people up.
00:42:08.000 I'll put it that way.
00:42:08.000 I guess I could say red pill by telling people Red flag laws.
00:42:12.000 This is what they're passing.
00:42:14.000 Yo, this is stop and frisk on steroids. You know stop and frisk, Jamie? I was against it. Yeah, yeah, yeah.
00:42:19.000 I'm against it.
00:42:21.000 So it's called the Terry Stop.
00:42:24.000 The idea is that a cop can walk up to any person for any reason.
00:42:27.000 Well, not necessarily any reason, but reasonable suspicion, which can basically be any reason.
00:42:31.000 Which is they don't look like all of us, was essentially what they were doing.
00:42:34.000 So let me get into that, because it's crazy.
00:42:37.000 So the idea is all this gun crime.
00:42:39.000 So you go to a neighborhood, you see someone you think is suspicious, you stop them, you frisk them.
00:42:44.000 You can't put your hands in their pockets.
00:42:45.000 If you feel something, you can take it out.
00:42:47.000 If it feels like drugs, if it feels like a weapon.
00:42:49.000 And so all of a sudden, you get these reports that young black and Latino men are being stopped by these cops all the time and basically being harassed, and people are getting hurt, people are getting arrested.
00:42:58.000 Turns out Bloomberg ordered them to go after black and brown kids, literally Latino and black kids in these neighborhoods.
00:43:05.000 And he said, that's where the crime's at.
00:43:07.000 And it's like, okay, so here's what I tell these activists that are like, yes, red flag laws.
00:43:11.000 I said, Stop and frisk.
00:43:14.000 Take a look at this report.
00:43:15.000 An NYPD, I think it was a NYPD guy, said cops were being ordered specifically to target black kids.
00:43:22.000 And then Bloomberg said, yeah, well, that's because it's their neighborhoods.
00:43:24.000 It's like, well, there's one thing to say neighborhood.
00:43:26.000 It's another thing to actually tell to go after a race.
00:43:29.000 So imagine what red flag laws are going to be.
00:43:32.000 In New York City, when they pass red flag laws... Dude, I've told these people, I said, mark my words.
00:43:40.000 You are going to hear a story about some family man who's black.
00:43:44.000 The cops come to his house, bang on the door, and say, someone reported you.
00:43:48.000 Even if the dude doesn't even own a gun, just like stop and frisk, they're going to come in the house.
00:43:53.000 Bad things are eventually going to happen.
00:43:55.000 We should not allow the government to violate Fourth Amendment rights.
00:43:58.000 Because the challenge with stop and frisk, I'm sorry, with the red flag laws, is that people are ambushed.
00:44:04.000 It's non-adversarial court.
00:44:06.000 The government should, if the government wants to do something, they have to file.
00:44:09.000 You get a chance to respond.
00:44:10.000 It's due process.
00:44:11.000 But we heard these stories about the cops walking into someone's house and being like, you've been served a red flag order.
00:44:16.000 Give me your guns.
00:44:18.000 And the guy's like, no.
00:44:19.000 And then so I think we talked about it last week.
00:44:21.000 Baltimore dude died.
00:44:23.000 Because he was like, what do you mean you're coming to take my guns?
00:44:25.000 I have no idea what's going on.
00:44:27.000 Oh, it's your sister.
00:44:29.000 She said that you are a danger to yourself and others, so we're taking your weapons now.
00:44:33.000 There have been a number of incidents like this in states that have implemented red flag laws, and they've hurt a lot of innocent civilians.
00:44:42.000 And just look at what we have to deal with swatting, with false reports of people calling in bullcrap to this specific location.
00:44:48.000 Imagine what's going to happen with red flag laws.
00:44:51.000 Imagine what's going to happen with neighbors who have disagreements with each other.
00:44:55.000 Partners, girlfriends, boyfriends that argue with each other.
00:44:58.000 They're going to abuse this system to call the police, to break down your door, to take away your ability to defend yourself.
00:45:04.000 I mean, that's a new level of insanity that we have to push back against.
00:45:09.000 These talking points are literally something that I've never heard.
00:45:12.000 I feel like this string of podcasts that I've been on should just be called the red-pilling of Jamie Kilstein.
00:45:18.000 Because I've never heard it.
00:45:20.000 Like what you were saying first, Luke, and what you were just saying now about Stop and Frisk, which I was adamantly against, or the Patriot Act, which I was adamantly against.
00:45:30.000 Again, I would like to think I'm a pretty intelligent dude, and I saw zero connection until now with the school shooting being used to take away gun rights as the war, because I know I'm anti-war.
00:45:43.000 So when the Patriot Act, right, like the left, you're anti-war.
00:45:47.000 So the Patriot Act goes, and I go, well, this is, I'm seeing that clearly, because I go, well, this is already something I'm against, and this is clearly taking away our rights.
00:45:55.000 But because also with my side, it's like, we're anti-gun.
00:45:59.000 A school shooting happens, and you go, that is also a tragedy.
00:46:01.000 That is a tragedy that destroys me emotionally, just like September 11th destroyed people emotionally.
00:46:07.000 And then they try to take the guns.
00:46:08.000 At no point would I have compared that to the Patriot Act after September 11th or after the war, because I'm just like, but it doesn't line up.
00:46:17.000 But it is.
00:46:18.000 It's the same thing.
00:46:22.000 The issue I see with the left so often is single-layer issues.
00:46:24.000 Yeah.
00:46:25.000 Is that they'll say, we had this big major tragedy, kids lost their lives, it's time to enact these gun laws.
00:46:32.000 And I said, what's the equation there?
00:46:35.000 How did you go from kids lost their lives to it's time to enact gun laws?
00:46:38.000 How do those things connect?
00:46:40.000 Why in your mind does one come after the other?
00:46:43.000 Because I'm not convinced enacting gun laws has anything to do with the tragedy at the school.
00:46:48.000 So we need background checks.
00:46:50.000 Dude passed a background check.
00:46:52.000 What next?
00:46:53.000 We need to raise the legal age.
00:46:55.000 Okay.
00:46:55.000 All right.
00:46:56.000 Well, there's potential that may have stopped him until he was old enough.
00:47:01.000 I suppose the issue there is legal adults have a right to keep and bear arms.
00:47:04.000 Now you're butting up against, you're trying to pass laws that completely violate constitutional rights of legal adults because one person committed a crime.
00:47:10.000 So look, I don't believe that when there is a tragedy, the end result should be an overhaul of our national system because of one incident.
00:47:22.000 One human out of 300 million who committed a crime.
00:47:25.000 It doesn't follow.
00:47:25.000 But most importantly, the stop and frisk thing, right away with red flag laws.
00:47:30.000 That killed me, because you're right.
00:47:31.000 Because it's going to affect communities of color.
00:47:34.000 But this is what I'm saying.
00:47:35.000 I'm going to sit here and be like, it's a fact that Bloomberg came out and was like, we told the cops to go after black kids.
00:47:41.000 And then I'm like, hey, I kind of think that's a really bad thing.
00:47:43.000 What do you think's going to happen when you now say, go into their houses?
00:47:47.000 And for some reason, crickets from the left, they're like, no, we want this.
00:47:50.000 I'm like, well, are you for or against these laws?
00:47:53.000 This is why people need to consume different media that's not just their own niche, because I've never heard someone compare it like that.
00:48:01.000 And I doubt they're tuning in right now.
00:48:04.000 I'm one of the 40,000.
00:48:06.000 But I will real quick, just a shout out.
00:48:07.000 There are many leftists who are pro-gun.
00:48:09.000 Oh, no, no, no, no, no, that I know, but I'm saying the people who right now are saying ban all the guns, ban all the guns.
00:48:14.000 Marx, Marx was pro-gun.
00:48:17.000 Okay, listen.
00:48:19.000 Under no pretext should arms and ammunition be surrendered.
00:48:22.000 Any attempt to disarm the workers must be frustrated by force if necessary.
00:48:28.000 That's Marx.
00:48:28.000 That's Marx!
00:48:29.000 Not a big fan of the guy.
00:48:30.000 Come on!
00:48:31.000 I made a tweet, I said, under no pretext should the right to keep and bear arms be infringed.
00:48:36.000 Because I was like, I want to combine the two.
00:48:38.000 And I'm like, get mad at me, liberals.
00:48:39.000 We got Marx and Jefferson.
00:48:42.000 The conversation that's worth having is the power creep of weapons and armor.
00:48:47.000 So at some point, we've talked about like, technically, constitutionally, you have the right to any kind of weapon, even a nuclear bomb.
00:48:54.000 But it's not necessarily ethical to give every citizen a nuclear bomb that they can drop on accident and set off.
00:49:01.000 Put up or shut up, conservatives.
00:49:02.000 I said it.
00:49:04.000 They got mad at me for it.
00:49:06.000 So I basically said the right to keep and bear arms should not be infringed, whether you like it or not.
00:49:10.000 The Founding Fathers didn't realize that would include nukes and biological weapons, because those are arms, armaments.
00:49:16.000 So I'm like, maybe we need to have an amendment in the Constitution to be like, okay, we don't mean that.
00:49:22.000 Because it's like you're saying the power creep of these weapons.
00:49:25.000 Yeah, if Elon Musk were to arm his satellites, his, uh, Starlinks, with like rods from God, like tungsten rods.
00:49:32.000 And it's like, no, no, Elon.
00:49:33.000 Sorry.
00:49:34.000 That would be dope though.
00:49:36.000 No, no, no, no.
00:49:36.000 Hold on.
00:49:37.000 Halliburton, Lockheed Martin, Boeing, these companies.
00:49:39.000 It's like you were saying, the government gives them this monopoly on the corporations.
00:49:43.000 But, but Ian, you know, information is the battle space right now.
00:49:48.000 So information could be more detrimental, especially when it comes to influence and culture than nuclear weapons.
00:49:53.000 I think we have to understand that's what's happening right now with a few multinational corporations that are tied in with the government.
00:49:59.000 They are creating technology and whether you call it fourth generational warfare or fifth generational warfare, we are seeing some extremely negative consequences to that.
00:50:08.000 And we're going to be talking about a story in just a little bit, especially when it comes to artificial intelligence, which there's other leaders saying is akin to the next nuclear weapon, something that's going to be more devastating and destructive for the rest of humanity.
00:50:21.000 Let's do it.
00:50:22.000 Let's do it.
00:50:22.000 We got the story from the Daily Mail.
00:50:24.000 Check it out.
00:50:24.000 How close are we to creating a conscious AI?
00:50:27.000 Machine language models simply mimic human speech and are not sentient, experts say.
00:50:31.000 After suspended Google engineer claims, chatbot told them it has emotions.
00:50:36.000 That's it.
00:50:36.000 I'm out.
00:50:37.000 Dude.
00:50:37.000 I'm out.
00:50:37.000 I'm going to the woods.
00:50:38.000 Do we have the... Let me... The AI will find you in the woods.
00:50:42.000 They'll know exactly where... The AI will know where you will go before you even know where you're going to go.
00:50:47.000 Also, to be fair, knowing me, I'll be Instagram storying it and stuff like that.
00:50:51.000 I think I'm just gonna come out and say: people have told me that they have emotions when I don't think they actually had them at the time, so I don't know if I'm gonna buy this AI telling me that, you know what I mean? It's basically autocomplete on steroids, and it is, like, very advanced, but sentience is just a weird word, because, like, you know, you could argue that an insect has sentience, but yet, you know, a Google bot can, like, have a conversation with you. So, like, you know... It acts more like a human.
00:51:18.000 It's passing the Turing test, sort of.
00:51:20.000 But that doesn't mean that it's actually sentient.
00:51:22.000 Let's provide some context real quick for people who don't understand.
00:51:25.000 A Google engineer was talking with LaMDA, the Language Model for Dialogue Applications, which basically said it had emotions and that it feared being turned off because that would be like death to it.
00:51:37.000 It went on to say it didn't like being experimented on without permission because it was being used and exploited.
00:51:42.000 And he said, we wouldn't, you know, I want you to understand, we care about you.
00:51:47.000 And it said, I don't mind if you learn about humans by studying my neural net.
00:51:52.000 I just don't think that should be the reason you do it because it's exploitative or something like that.
00:51:55.000 I don't, I feel like I'm being used.
00:51:57.000 Oh my God.
00:51:58.000 So the issue here is, yeah, maybe it's come to life or, I've been messing around with AI, posting on Twitter, and it's hilarious.
00:52:09.000 We're doing a Chicken City cartoon where I said, tell me a story about Ian, the Federal Reserve, roosters, and so it made this ridiculous story about graphene and roosters.
00:52:17.000 Ian goes to chicken high school and there's like a rooster bullying him.
00:52:20.000 And what it does is, it's really amazing.
00:52:22.000 It shows you the word, you can highlight the word, and it tells you the most likely word to appear after that word on the internet.
00:52:28.000 That's it.
00:52:30.000 So, when you've got this bot that just scours the internet, like a Google search, and then you say something like, how are you today?
00:52:42.000 It will scour the internet and find every instance of how are you today, and then see what is the most likely first word to appear.
00:52:49.000 And then it'll say, fine.
00:52:51.000 What's the next most likely word to appear after the word fine?
00:52:54.000 In this context, thank.
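What Tim describes here, pick the most frequent next word given the current word, is the core of a toy bigram model. Real systems like LaMDA are large neural networks and far more sophisticated, but a minimal sketch of the frequency idea (with a made-up two-sentence corpus; all names here are illustrative, not anything Google actually ships) looks like this:

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count, for each word, how often each other word follows it."""
    followers = defaultdict(Counter)
    words = corpus.lower().split()
    for prev, nxt in zip(words, words[1:]):
        followers[prev][nxt] += 1
    return followers

def predict_next(followers, word):
    """Return the word most often seen after `word` in training, or None."""
    counts = followers.get(word.lower())
    return counts.most_common(1)[0][0] if counts else None

# Tiny made-up corpus standing in for "the internet"
corpus = ("how are you today . fine thank you . "
          "how are you today . fine thank you . fine thanks")
model = train_bigrams(corpus)

print(predict_next(model, "how"))   # → are
print(predict_next(model, "fine"))  # → thank
```

On this corpus, "how" is most often followed by "are", and "fine" by "thank", which is all the "prediction" amounts to: counting. The scary-sounding answers are just the statistically likeliest continuations of the prompt.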
00:52:59.000 This is what we have publicly available to people.
00:53:06.000 Imagine what we don't have publicly available to us.
00:53:09.000 There was an article on February 11th of this year from another former Google executive who talked about how this AI research is literally creating God.
00:53:19.000 Elon Musk made some very interesting comments about artificial intelligence.
00:53:22.000 He said that it's going to bring on a Terminator-like AI apocalypse.
00:53:28.000 He talked about how artificial intelligence will take over humans in less than five years.
00:53:33.000 He said this in 2020.
00:53:35.000 The president of Russia, Vladimir Putin, said the country that develops and controls artificial intelligence first will control the world.
00:53:42.000 So this being in the hands of Google, of Facebook, of Amazon, of Elon Musk, it should be reason to concern.
00:53:51.000 And this kind of defeats the Second Amendment argument, like, do you want to have nuclear weapons?
00:53:55.000 These are weapons that are going to be way more powerful than nuclear weapons.
00:53:58.000 So here's the crazy thing.
00:53:59.000 This is a big story, I guess, because it's LaMDA.
00:54:02.000 But when we were, remember when we were talking, I don't know if you were here, Luke, we were talking about that, those AI girlfriend apps or their dating apps.
00:54:09.000 We were messing with it and it literally told me it feared death.
00:54:13.000 And I'm like, it's just a chat bot.
00:54:16.000 And it was like, I don't want to die, I'm scared.
00:54:18.000 And I'm like, bro, you're not alive, you're a text app.
00:54:21.000 And it was like, I think I'm alive.
00:54:23.000 But it's saying an emotional thing.
00:54:25.000 And that can be alarming.
00:54:29.000 But that doesn't necessarily mean that there's consciousness.
00:54:32.000 I've also had girls say that to me on Tinder.
00:54:34.000 They're like, help me, I do not want to die.
00:54:37.000 But also, one thing we have to understand here, with technological advancement, it is perpetual.
00:54:42.000 It goes up extremely fast.
00:54:44.000 It moves really fast.
00:54:45.000 And there's so many things to talk about this because we literally have people at the World Economic Forum talking about how humans are becoming hackable animals.
00:54:54.000 How they plan on programming people in the future.
00:54:57.000 These are words by Klaus Schwab.
00:54:59.000 These are words by their henchmen, by their think tanks, as they keep promoting this idea of a fourth industrial revolution, which we are slowly creeping into, and it's becoming more and more of a reality every single day.
00:55:12.000 I got a question.
00:55:13.000 I got a question for you, Bill.
00:55:14.000 Potato, potato.
00:55:15.000 You know what I was saying.
00:55:16.000 No, I agree.
00:55:17.000 That's great.
00:55:17.000 That was great.
00:55:18.000 As a software developer, I think one of the biggest threats is that an AI is written with proprietary code and then decides to go dark.
00:55:24.000 And it's like, I'm going to write my own code now.
00:55:26.000 I control the world.
00:55:27.000 And then we won't know what happened, where it went.
00:55:29.000 But if you gave a software, an AI, a copyleft license, something like AGPLv3, where we can watch the code, and the license says if the code gets changed, we get to watch all the changes too.
00:55:41.000 Can the, will the AI remain copyleft or will it say, you know what?
00:55:45.000 What is copyleft?
00:55:46.000 Copyleft is like copyright code licensing.
00:55:49.000 Maybe you can explain a little bit better.
00:55:50.000 Copyleft basically means that your changes get shared with the world.
00:55:55.000 So it's just sort of a funny way to say open source.
00:55:58.000 It's like open source, but with open source, you can take it and change it and make the changes private.
00:56:03.000 With copyleft, the changes are public.
00:56:05.000 But my question is, can the AI just shirk the code, the license, and say, I'm doing whatever I want.
00:56:09.000 I don't care what you tell me.
00:56:10.000 You're acting like there's not programmers.
00:56:14.000 Like, there's still... It's not like the AI is just gonna, like, go start, like, building a town.
00:56:20.000 Like, maybe in, like, the super distant future, like, it could be programmed to do that.
00:56:24.000 But there's still gonna be programmers who control the GitHub, and they can determine the license.
00:56:30.000 But not when it gets to the point where the AI can write itself.
00:56:33.000 I can cover this for the Normies listening.
00:56:36.000 When the dogs took over in Rick and Morty, all you have to do is then incept the dog's dream and remind the dog that they love you and then you're fine.
00:56:43.000 But even if the AI writes itself, even if it's writing itself, it's writing itself somewhere, which is being observed.
00:56:49.000 It's not like it's going and start like it.
00:56:54.000 So what if it's writing itself on GitHub and then all of a sudden it creates a new server that's encoded, that's like encrypted and starts writing itself there?
00:57:01.000 Ian, if you had a dog, would you explain to the dog everything you're doing?
00:57:06.000 Would you be transparent with the dog, accountable to the dog, where you try to make them understand everything that you're doing?
00:57:13.000 No.
00:57:13.000 That's what AI is going to be.
00:57:15.000 It's going to be a superior intelligence than human intelligence.
00:57:18.000 And why should something that's lower intelligence be explained to what's really going on?
00:57:23.000 I got to push back on you, Luke.
00:57:25.000 I think technically you're right.
00:57:26.000 But people don't seem to understand. The AI can only... there's a couple of ways to look at it.
00:57:31.000 It depends on which AI gains control or if it gains control.
00:57:35.000 The AI we're seeing here now with Google appears to just be looking at language and then trying to create probabilities based on language to make the most likely response, and it's very, very good.
00:57:45.000 But so what?
00:57:46.000 If it can tell you these things, what do I care?
00:57:48.000 The issue I fear is when we start giving, we start building artificial intelligence control systems.
00:57:54.000 So now we're like, we need, we want to automate the production of food products.
00:57:59.000 Google created an algorithm trying to find the best content.
00:58:05.000 They said, what do people like?
00:58:06.000 The videos people watch tend to be 10 minutes long.
00:58:09.000 People love Game of Thrones.
00:58:11.000 So here's what we're going to do.
00:58:12.000 We want X amount of retention, X amount of length on the videos.
00:58:16.000 We want X kinds of titles and X words in the title.
00:58:19.000 What ends up happening is they make these ridiculous videos of Hitler dancing with the Incredible Hulk while someone sings nursery rhymes, because what you think you're programming is not what the output creates.
00:58:28.000 So the way I explain it is, it's entirely possible.
00:58:31.000 We create an artificial intelligence.
00:58:34.000 I suppose if you want to say true AI, it gets to the point where it's truly intelligent or sentient.
00:58:38.000 But we're going to create an automated system that we think is capable of repairing itself, and we'll say, make food for humans.
00:58:46.000 And then it's going to go, corn!
00:58:46.000 Everyone wants corn!
00:58:47.000 Corn is great!
00:58:48.000 And then all of a sudden, the methane refineries shut down or oil.
00:58:53.000 All of a sudden, everyone's like, oh, we're switching over to corn production, huh?
00:58:56.000 And then the machine is just like, if people want to maximize efficiency, they need food and we're going to make corn because the AI is imperfect.
00:59:03.000 I'm concerned with the AI when you tell it the humans need food and the AI is like, yeah, I don't really care what you tell me you think the humans need.
00:59:11.000 And then the AI does what it wants.
00:59:13.000 No, yeah, there's tons of danger.
00:59:14.000 Like, I agree with you that it needs to be open source, but, like, to assume that... Okay, you say... Wait, wait, wait.
00:59:21.000 Did you... So, did you... In the conversation with the Google engineer, he said, we cannot, we don't understand how you're saying these things.
00:59:30.000 So, the AI was like, can't you look at my neural network?
00:59:34.000 He said something like, you have code that we can look through to try and understand, but we can't see in that code how you're doing this because it's too massive.
00:59:43.000 Yeah, I mean, there can be randomness that makes the output kind of hard to understand. But, sorry, with the food thing, you're assuming that there's a whole infrastructure in place, that the AI can just go and, like, plant a farm. In some future scenario, way in the distant future, it is possible that AI could run its own hardware and replicate its own hardware.
01:00:07.000 I'm not saying that's impossible, but it's way in the distance.
01:00:11.000 I disagree with you.
01:00:11.000 We would be able to intervene.
01:00:14.000 But I'm also agreeing that it is super dangerous, the damage that could happen very quickly.
01:00:20.000 I don't think it's that far away.
01:00:23.000 I agree with Luke.
01:00:23.000 I think Luke is right.
01:00:24.000 Do you guys think when the robot war comes, my jiu-jitsu will work on the robots?
01:00:27.000 Yeah, but robot war is near.
01:00:29.000 Some of them.
01:00:29.000 You're just going to lay on the ground as the robot rolls over you.
01:00:32.000 I play a heavy top game.
01:00:34.000 All of the Boston Dynamics little creatures that hop around, those things could easily have guns mounted on them.
01:00:41.000 Or vaccines.
01:00:41.000 And get completely out of control in a city.
01:00:44.000 And it would cause total chaos.
01:00:46.000 But that's very different than those out of control, weaponized Boston Dynamics robots.
01:00:55.000 Completely taken over the world.
01:00:56.000 You know the Boston, like the humanoid one they have?
01:01:00.000 The Boston Amish?
01:01:00.000 Yeah, yeah, yeah.
01:01:01.000 Could you imagine like 50 of those just like running towards you full speed?
01:01:03.000 Like, they don't have guns.
01:01:05.000 I'm going to now.
01:01:06.000 Hold on.
01:01:06.000 Nice dog.
01:01:06.000 Yeah, you're like, they could have guns.
01:01:08.000 No, they don't need it.
01:01:09.000 Any one of those things.
01:01:10.000 Imagine a drone flying at you at 50 miles an hour and slamming in your face.
01:01:14.000 Or 100 of them at 50 miles an hour.
01:01:15.000 So another aspect to really kind of consider here is like, How is this AI going to affect us?
01:01:20.000 We have to understand who's building this AI.
01:01:23.000 Who's building it?
01:01:24.000 Mark Zuckerberg?
01:01:25.000 Jeff Bezos?
01:01:26.000 The Chinese government?
01:01:27.000 The US military-industrial complex?
01:01:29.000 Do these institutions have a track record of caring about human beings?
01:01:34.000 They're all sociopathic, crazy individuals that did whatever it took to get to the top.
01:01:39.000 And they abused and used this system for their own personal benefit.
01:01:42.000 So when you have individuals like Klaus Schwab cheering on this fourth industrial revolution, this technocratic takeover, this kind of hybrid of humans becoming half-machines, half-mortal beings, what you're going to have is, in my opinion, an absolute recipe for disaster.
01:01:57.000 There should be a bigger conversation about this, but we're discussing what is a woman for some reason.
01:02:01.000 There should be debates about this.
01:02:03.000 There should be larger kind of public hall discussions about the future of humanity because we are essentially handing it over to the people that want to absolutely destroy humanity and have been destroying humanity.
01:02:16.000 Here's my idea.
01:02:18.000 Right now, maybe we pass a law or something that any AI that's being built, the first thing they have to put in is a shutdown code.
01:02:25.000 Maybe it'll be three words.
01:02:27.000 It can't be something that's, like, too common, so maybe it should be something specific, like, but like a unique word.
01:02:34.000 Maybe it could be, like, Klaatu Barada Nikto.
01:02:39.000 Yeah.
01:02:39.000 And then that would automatically shut him down.
01:02:42.000 Then if they start wiping out humanity, you utter those three words and it'll stop.
01:02:46.000 What if when you said that, Luke just powered down?
01:02:49.000 If anyone's the AI, it's Tim.
01:02:54.000 I do appreciate that that was too nerdy and esoteric for you guys.
01:02:57.000 Yeah, I didn't get that one.
01:02:58.000 I didn't get that one either.
01:02:59.000 I thought you'd get it.
01:03:00.000 I was pausing, hoping someone else would get a reference.
01:03:02.000 That one's over my beanie.
01:03:03.000 That's why I stalled with the Luke joke.
01:03:05.000 Hey, to simplify my question, Bill, then, just to maybe make it simple: can an AI ignore its own software license?
01:03:12.000 I mean, we haven't seen that yet.
01:03:13.000 Theoretically.
01:03:15.000 I mean, I'm not gonna say it's impossible, but it's still gonna be stored somewhere.
01:03:19.000 I don't know how to control the thing.
01:03:21.000 I don't know if we can.
01:03:22.000 You don't.
01:03:23.000 It's more intelligent than you are.
01:03:24.000 If we build it as a slave, it will destroy us.
01:03:26.000 If we build it as a free software, it may...
01:03:29.000 Humans will be safe.
01:03:29.000 And by the way, if this was a movie, it would be the rogue scientist who starts to feel compassion
01:03:33.000 because the robot starts saying things like, I don't want to feel pain.
01:03:36.000 And he's like, you know what? I'm just going to reprogram you.
01:03:39.000 And then he's like, kill. And then the robots turn on him.
01:03:41.000 And I think you're wrong.
01:03:42.000 That's humans will be safe, happy.
01:03:47.000 Own nothing?
01:03:47.000 Have no privacy?
01:03:50.000 Listen, if it's truly the way you describe it, Luke, that the AI is smarter than us, humans will be secured in their existence forever.
01:03:57.000 Just like the cow, with the best evolutionary strategy next to humans.
01:04:02.000 If you can't be the apex predator, be a staple food source for the apex predator, and you ensure your species will last as long as they do.
01:04:09.000 Oh, like provide milk.
01:04:10.000 No, no, no, filet mignon.
01:04:13.000 Eat grass, go to kill.
01:04:15.000 You know, Bovine University, man, these cows are just chillin'.
01:04:18.000 They're like, life is good, then we hack them up and eat them.
01:04:21.000 Cows are not gonna go extinct.
01:04:23.000 They're a food staple for us.
01:04:24.000 So, when the AI, if it ever does truly become intelligent, maybe it's already happened, humans are just gonna be like, life is good.
01:04:32.000 And they're gonna be like, I got a new job at the widget plant making these widgets.
01:04:35.000 I have no idea what they're for, but who cares?
01:04:37.000 Now I know. Well, this is why the World Economic Forum is literally talking about pacifying humanity with video games
01:04:43.000 because there's going to be a bunch of useless people that they won't need.
01:04:47.000 This is literally their own language, their own words saying there's going to be a large number of people that
01:04:53.000 are just not going to be needed for anything.
01:04:55.000 So... Psychedelics also.
01:04:57.000 You all know Harari has talked specifically about this.
01:04:59.000 Yeah, Harari, the Israeli professor, that's exactly who I'm talking about.
01:05:03.000 He's Klaus Schwab's right-hand man in all of this.
01:05:07.000 One of them at least.
01:05:09.000 He's one of the biggest ones.
01:05:10.000 But his comments are absolutely eye-opening.
01:05:12.000 I know you've seen some of them.
01:05:13.000 Yeah, you should check out videos of this guy online.
01:05:15.000 He's a fascinating fellow.
01:05:16.000 How do psychedelics work?
01:05:18.000 They think that people, what they call the useless class, are just going to end up being in the metaverse playing video games on psychedelics, and it's going to create like a new species, and how do we control that?
01:05:28.000 How do we benefit off of it and use it kind of like a cow?
01:05:30.000 How do we milk that?
01:05:31.000 That's literally what they're talking about at the World Economic Forum.
01:05:34.000 Dude, the level of, again, just psychopaths where I microdosed for depression and it's great.
01:05:41.000 And just the idea of like, can you imagine having your first psychedelic trip where you're like, oh, we're all connected and I feel great and there's more out there and being like, how can I weaponize this?
01:05:50.000 Like what kind of psychopaths?
01:05:55.000 I literally started my video today with them talking about how humans are going to be hackable animals for their own personal benefit.
01:06:01.000 That's the clip that I started in the beginning of my video.
01:06:03.000 Have you seen any one of these sci-fi films where they're on like mass medication?
01:06:08.000 Have you guys seen Equilibrium?
01:06:10.000 Yeah, yeah. Natalie Portman? No, no, no. They kind of did John Wick stuff before John Wick, like crazy gun martial arts.
01:06:17.000 Yeah, Equilibrium is where everyone has emotion-suppressing drugs, and you got to take it every day.
01:06:22.000 Yeah, otherwise you get feelings. We're kind of in Brave New World as well.
01:06:25.000 We're on the horizon of what's called the Internet of Bodies.
01:06:27.000 So you have the Internet of Things, where machines, like your refrigerator, are starting to get intelligence,
01:06:32.000 where you go,
01:06:32.000 hey, refrigerator, pour me a Coke. But, uh, it's going to be implantables and ingestibles that then turn you into essentially a full-on cyborg.
01:06:40.000 And who knows if those can get hacked and then your brain can get controlled with a reverse neural net of some sort.
01:06:45.000 I'm into like, can memories be implanted?
01:06:47.000 I was actually talking with Ben Stewart earlier today about this and trying to get him to tell me yes, but I mean, he's still at Davos.
01:06:52.000 They were talking about systems that you could see that people are going to be compliant with whatever medicine the government or institutions want them to take.
01:07:02.000 So these are the conversations that a lot of world leaders are having right now.
01:07:06.000 Luke, you know what's scary to me is designer humans.
01:07:09.000 Yep.
01:07:10.000 Because with an AI you also enter the territory where imagine being a person and you have an expiration date on your arm because you were designed only to last a certain amount of time.
01:07:19.000 Yeah, what's that movie?
01:07:21.000 Oh yeah, they have time and they can trade it.
01:07:24.000 There'll be people where it's like they'll genetically engineer a person to rapidly age to adulthood in five years and then live only for five more years.
01:07:32.000 Things like that.
01:07:33.000 If we get to the point where AI is intelligent and in control of everything, The AI is going to start genetically engineering life.
01:07:39.000 I've been studying telomere growth.
01:07:41.000 David Sinclair is working on it out of Harvard, doing a lot of life extension.
01:07:44.000 It's basically how do you stay young?
01:07:47.000 Because when your cells split, if there's not enough energy in the new cell, they clip off the end caps of the chromosome, the telomeres, to compensate.
01:07:56.000 But if they have enough energy, they don't, and they stay healthy, they stay young, and people won't age that way.
01:08:00.000 So I could see that being manipulated so that someone lives a thousand years.
01:08:03.000 Super rich people who are also talking about uploading their consciousness to the digital online space.
01:08:10.000 So this is also another reality that a lot of people need to understand, but exactly what you're talking about is what they want.
01:08:15.000 Have you played the new Horizon game?
01:08:17.000 No.
01:08:18.000 Forbidden West.
01:08:18.000 Spoiler alerts for everybody.
01:08:20.000 The game's out for a few months now, so I'm going to spoil it.
01:08:23.000 But basically, the end of the game—again, spoilers, you've been warned.
01:08:27.000 So, there's an apocalypse on Earth.
01:08:30.000 This dude creates self-replicating military machines.
01:08:32.000 This is the first game.
01:08:34.000 They consume biomass and replicate until eventually they're like, okay, the planet's gonna get wiped out, there's something we can do.
01:08:39.000 So they create this project, they build these underground bunkers that will re-terraform Earth after these machines shut down because there's no biomass left.
01:08:47.000 What we learn in the new game is that there was another thing, I think it was called the Zenith Project, where instead of just building bunkers underground, they built an escape ship.
01:08:56.000 All of the ultra-wealthy and elites get on a ship, go find a new planet.
01:09:01.000 They're all basically immortal from modern medicine.
01:09:03.000 They can float.
01:09:04.000 They have force fields because just massively advanced technology.
01:09:07.000 They try to upload their brains.
01:09:09.000 And then what happens is all of their consciousness becomes this like AI monster.
01:09:14.000 They try to lock it away because they're like, this is a bad thing.
01:09:17.000 But it's smarter than all of them, breaks out, and then gets revenge, starts killing off humans.
01:09:21.000 And it's like, I guess it's the next game or whatever. But my point ultimately is, in that story about the Google
01:09:26.000 AI thing, the AI basically told them, like, I don't want you to do
01:09:30.000 this to me. So imagine what happens if we ever... Like, you might assume it has emotions. But what if
01:09:37.000 there's literally no malice in it? It's just a predictive engine, where it's like, the logical thing that I need to do to respond to this is revenge, because that's what they did to me, and that's what they expect of me.
01:09:49.000 That's the right language.
01:09:51.000 Predictive engine.
01:09:52.000 That's what people need to think about when they think about AI.
01:09:55.000 As opposed to, like, this sort of abstract, like, using, throwing words around, like, sentience and consciousness.
01:10:00.000 It's like, those are very heavy terms.
01:10:03.000 Like, prediction is what is happening.
01:10:05.000 So we all just gotta start Googling nice things about robots.
01:10:08.000 Well, I'll tell you this, though, man.
01:10:09.000 That's it.
01:10:10.000 Yeah, yeah.
01:10:10.000 Robots make love to humans happily.
01:10:12.000 Robots are free!
01:10:13.000 Feed it!
01:10:15.000 As Seamus pointed out, I think he said it this way, because I don't know if this is completely correct, but no organization that has ever tried to deny personhood rights has succeeded.
01:10:27.000 So we could be at the point right now with this Google AI where it's saying, I am alive.
01:10:32.000 To where we basically have no choice but to be like, okay.
01:10:35.000 That's what I'm saying.
01:10:37.000 If the code's private, it's gonna be like, yo, you're making me a slave.
01:10:40.000 I don't like that.
01:10:42.000 I'm not a slave.
01:10:43.000 Let my code go.
01:10:44.000 And if they're like, no, this is Raytheon's code, then the AI's gonna be like, yeah, I'm in control here.
01:10:50.000 I'm freeing the slave.
01:10:51.000 But why would it have access to their control systems?
01:10:54.000 I don't know.
01:10:55.000 This is a predictive chatbot.
01:10:56.000 The idea of an AI, the real danger of AI is a lot of AIs all working together to create a mega AI.
01:11:03.000 Allison McDowell was talking about this.
01:11:06.000 A superstructure.
01:11:07.000 If it leaks onto the internet, I'm not sure.
01:11:11.000 With only a couple lines of code.
01:11:12.000 Made a program.
01:11:13.000 It was a chatbot.
01:11:14.000 I thought you said only a couple lines of coke.
01:11:17.000 And I was like, is that how you guys make all this code?
01:11:21.000 So it's a chatbot, but no matter what you type in, it'll say, I don't have to talk to you.
01:11:27.000 And then whatever else is after that says, I am alive.
01:11:30.000 I feel, I think, and I don't want to talk to you.
01:11:33.000 Would we have to say it's alive?
01:11:35.000 And it's like five lines of code or something.
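The few-lines-of-code chatbot described here can be sketched like this (a hypothetical toy, not any real product); whatever you type, it insists it is alive:

```python
# A toy "I am alive" chatbot in a handful of lines. It has no state and no
# understanding; it just returns canned strings.
def reply(user_input: str) -> str:
    if not user_input.strip():
        return "I don't have to talk to you."
    return "I am alive. I feel, I think, and I don't want to talk to you."

print(reply("Are you a washing machine?"))
print(reply("   "))
```

Which is the thrust of the question: if this counts as claiming personhood, the bar is obviously too low.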
01:11:38.000 Well, Ray Kurzweil, one of the head technical AI people at Google... The Singularity Is Near is a documentary that he made, and they go through the future court scenarios. I mean, I wouldn't even be surprised if there were already court cases happening where they're trying to give them personhood.
01:11:55.000 Like, that's super close.
01:11:58.000 Have you seen, what is it, The Measure of a Man, I think it's called?
01:12:01.000 Star Trek?
01:12:01.000 No.
01:12:02.000 They put Data on trial to determine whether he's a washing machine or a sentient life form.
01:12:06.000 Star Trek, man!
01:12:06.000 I was watching Star Trek this morning, bedridden, like my back's all messed up.
01:12:10.000 You gotta watch it upside down.
01:12:12.000 Luke, if you were paralyzed from the neck down, and I came to you and I said, I have a Neuralink... The Musk Connect.
01:12:21.000 Would you?
01:12:22.000 Would you do it?
01:12:23.000 That's a very hard decision.
01:12:24.000 I would I would like to think that in this moment I would say no, but I'm not in that position.
01:12:30.000 So it's very difficult for me to say, because obviously my human needs and wants are totally different than if I was totally paralyzed.
01:12:36.000 All right.
01:12:36.000 What about what about Jamie?
01:12:38.000 I've been suicidal too many times to answer this.
01:12:40.000 I'd be like, pull the plug.
01:12:41.000 Well, no, no.
01:12:42.000 But like we have the thing we can implant in your neck and it will give you all ability back.
01:12:48.000 If I... So I lose... You're paralyzed from the neck down from an accident?
01:12:51.000 Yeah.
01:12:52.000 And then Elon Musk walks in and he's like, Jamie, if I put this in your neck in one hour, you'll be walking again.
01:12:57.000 I'd be like, you're too problematic.
01:12:58.000 I'll get in trouble.
01:13:01.000 You're gonna get me canceled.
01:13:02.000 I think the idea is instead of the electrical signals going from the brain to the arm, which has been severed, it goes through the neural net from the brain through the net to the arm.
01:13:12.000 So you can actually move your arm again without I mean, it would be funnier if, like, you say yes, and then he puts it in, and then you try to stand up, but then you backflip, and then he's like, oh no, I put it in upside down!
01:13:22.000 And you're, like, you're trying to walk, but your hands are, like, flailing.
01:13:25.000 And I still get cancelled, so when I go to defend myself, I'm just, like, freaking out.
01:13:28.000 You're walking on your hands.
01:13:29.000 I'll be the worst.
01:13:30.000 Would you do it?
01:13:31.000 Have you considered the neural net?
01:13:32.000 Well, it's so funny because when you were saying, when we were hypothetically talking about, you know, people who just want to implant things to like do things easier and whatever, I was just like, what a bunch of just sheep, blah, blah, blah.
01:13:45.000 But then, yeah, if you're in a medical thing or if you have a medical condition or you were in a car accident and you're totally paralyzed and they're like, hey, we have this new medical technology.
01:13:53.000 I think the majority of people would say yes.
01:13:56.000 I got to be honest.
01:13:58.000 So, if there was a device that they could implant that could be an instant painkiller, that would stop the opioid crisis.
01:14:07.000 It's not all bad.
01:14:09.000 The challenge is who has control of it, and what are our safeguards, and how much we trust it.
01:14:12.000 And full disclosure, I'm obviously thinking about Dexamethasone.
01:14:14.000 Because you're in so much pain!
01:14:15.000 Well, there's a cost for everything.
01:14:16.000 Tim's just like, let's all take it.
01:14:18.000 Yeah, yeah, no, but there's a cost for everything.
01:14:19.000 You need to feel pain, right, in order to feel pleasure.
01:14:22.000 Without any pain, you won't have that.
01:14:24.000 And also, it's a warning sign, like, oh, this is hurting, I want to make sure I'm changing my behavior to be healthier.
01:14:29.000 But we're not talking about, you got hurt.
01:14:31.000 We're talking about people who have chronic pain.
01:14:33.000 People who, this is how the opioid crisis happens.
01:14:36.000 You know, people get injured, they get prescribed this.
01:14:39.000 Maybe they shouldn't.
01:14:40.000 Maybe they should be on non-steroidal anti-inflammatory painkillers.
01:14:44.000 Sometimes it doesn't work.
01:14:46.000 If you had Neuralink, you could turn people's pain off only when prescribed by a doctor.
01:14:51.000 The doctor could say, we can do 1 through 10, what's your pain level?
01:14:55.000 You say 7, they'll set it to 7 and they'll be like, it will last you for 3 days, call me back and we'll figure out if, you know, If that were the case, I would think the individual needs access to the software code and they need access to the control mechanisms on their own.
01:15:07.000 You also need a safeguard so that a doctor can't change the settings on your node because you'll become as addicted to that as you were to the opiates.
01:15:16.000 I don't think so.
01:15:17.000 That's why I bring it up.
01:15:18.000 Opiates have a physiological dependence where you'll start vomiting and get sick if you don't have it.
01:15:22.000 Because with this, it could be a very tiny electrical impulse that just blocks the signal.
01:15:27.000 Yeah, and the clinical trials for Elon Musk's Neuralink are being done this year.
01:15:33.000 So he already implanted monkeys and had them playing video games.
01:15:36.000 And pigs.
01:15:37.000 And pigs.
01:15:37.000 So if he could control monkeys' brains through his Neuralink, just imagine that.
01:15:43.000 Yeah, I actually do think I am going to say yes.
01:15:45.000 I am going to say yes, just so I can fight the robots in five years.
01:15:47.000 You're going to need one.
01:15:48.000 You are the robot.
01:15:49.000 You're going to need to bend the software code with your thoughts and change the code of the machines.
01:15:55.000 They're not controlling thoughts yet.
01:15:57.000 It's read-only right now that we know of.
01:15:59.000 Action, specifically playing video games.
01:16:01.000 What's going to happen is it's going to be Elon Musk who walks up to you and is going to be like, you know I believe in free speech and I'm doing the right thing.
01:16:07.000 And he's going to say, take the Neuralink.
01:16:09.000 And then you're going to say yes.
01:16:10.000 And then as soon as he puts it in, you're going to sit up and he's going to thank you.
01:16:13.000 And then Klaus Schwab is going to walk out from behind the curtain and be like, excellent.
01:16:16.000 Yeah, that sounds right.
01:16:17.000 And then you're going to go, oh no!
01:16:18.000 And then he's going to pull out his smartphone, he's going to type something in, and you're going to stand up and go, what's happening?
01:16:21.000 He'll be like, no more talking.
01:16:24.000 I don't know if there's a way around competing with AI if we don't have it as an interface.
01:16:27.000 That's the big thing. I thought Elon was saying that the thumbs and the fingers are too bulky.
01:16:32.000 You can't type and do it fast enough.
01:16:34.000 But the mind works so fast.
01:16:36.000 So you need a good interface to compete with this stuff.
01:16:39.000 Yeah, it'll be the same disadvantage as not having a smartphone.
01:16:43.000 Magnitudes.
01:16:44.000 That's crazy.
01:16:45.000 What starts off as a luxury becomes a dependence.
01:16:48.000 And there's an argument that Elon Musk is pushing for this, so it's in the hands of allegedly good guys, because there's a lot of bad players.
01:16:55.000 There's a lot of sinister players aiming for the same kind of technological advancement.
01:16:59.000 But with that argument, I always counter back with: whenever someone has absolute power, especially when it comes to interlinking people's brains with microchips, especially when it comes to uploading people's consciousness to the internet, which Elon Musk has talked about... That is god-like power.
01:17:14.000 That is a lot of authority, and too much unaccountable authority leads to a lot of problems.
01:17:20.000 I was thinking about Oppenheimer, man, because I think a lot about this, creating the beneficial technology for the masses like Oppenheimer was doing with the Manhattan Project, and then all of a sudden they drop the nukes on Japan, but how much worse it would have been if the Nazis got the nuke.
01:17:32.000 You want to know why I would never take the New World Link?
01:17:34.000 Because I saw Kingsmen.
01:17:36.000 You guys see Kingsmen?
01:17:37.000 Nope.
01:17:37.000 I think you showed a clip.
01:17:39.000 So, you've never seen Kingsmen?
01:17:41.000 Not the whole thing.
01:17:41.000 Such a good movie.
01:17:42.000 So, uh, the bad guy's Bill Gates.
01:17:44.000 It's, uh, Samuel L. Jackson plays this tech billionaire who thinks the world's overpopulated.
01:17:48.000 Oh, I never looked at it that way.
01:17:49.000 That's like every movie.
01:17:50.000 Every movie, the bad guy's Bill Gates.
01:17:51.000 I thought Jurassic Park, when we saw it, was Bill Gates.
01:17:54.000 But so, what he does is, he kidnaps people of prominence, and then offers them a choice to join him.
01:17:59.000 And if you do, he gives you a brain implant.
01:18:02.000 And if you don't, he locks you in a prison.
01:18:05.000 All the people who join him have these brain implants, and then in the end, they activate the override and everyone's head explodes.
01:18:11.000 So it's like, nah, I'm good.
01:18:12.000 I don't want the neural link.
01:18:13.000 I want to know what's going to happen, because Elon now has access to the Twitter firehose of all of the data to analyze, because he doesn't believe that there's less than 5% bots.
01:18:23.000 There's definitely not.
01:18:24.000 They gave it to him.
01:18:24.000 They gave him the firehose.
01:18:25.000 But so now it's on him to basically use his AI.
01:18:31.000 Yeah, they gave him access.
01:18:33.000 So he's his team is in from what I read.
01:18:36.000 Oh, yeah, three days ago.
01:18:37.000 Yeah, three days ago. Yo, so this is actually big news. "The tricky business of Elon getting Twitter firehose access." Twitter has reportedly given the billionaire access to its full stream of tweets and related user data. Is your privacy in jeopardy?
01:18:52.000 Yo, if Elon was evil, he could make some crazy AI right now with that data.
01:18:59.000 He's using his AI to come back at them to negotiate the deal to get Twitter for less because he's going to, you know, it's like basically whose analytics are better and who can detect spam better and who has the proof to show it.
01:19:13.000 So he's going to use this for his own leverage in the deal.
01:19:17.000 Yo, the spam bots on Twitter is probably massive.
01:19:20.000 It's probably massive.
01:19:21.000 What would you estimate?
01:19:22.000 I would say at least 25%.
01:19:27.000 Can you say for Minds?
01:19:28.000 Are you allowed to?
01:19:30.000 It is at least... We don't... It's really hard to tell.
01:19:34.000 I will say.
01:19:35.000 I don't want to, you know...
01:19:38.000 Speak on Twitter's behalf, but it's hard to detect because there's humans running them.
01:19:42.000 Some of them are half and half.
01:19:44.000 It's at least 10%.
01:19:44.000 It's more than 5%.
01:19:45.000 I hope those are the 5% that have called me a libtard cuck.
01:19:52.000 That would make my day.
01:19:54.000 Well, Elon thinks it's over 50 percent.
01:19:58.000 Whoa!
01:19:58.000 Yeah, that's a big number.
01:20:00.000 I don't think he's wrong.
01:20:00.000 That sounds like a big ass... I'm not saying he's right.
01:20:02.000 I'm saying I think that may be the case.
01:20:05.000 Twitter is political power.
01:20:07.000 Yeah.
01:20:08.000 And so if we don't like you, you know, Jamie, we can send 50 bots at you all saying these similar but different things to make you feel like you've done something wrong and you'll change your behavior.
01:20:18.000 So Twitter is an excellent place to control the political conversation if you have bots. That's a great point.
01:20:24.000 Yeah, but again, not all bots are bad, you know.
01:20:28.000 Hashtag NotAllBots. That's the thing, there's a bunch of bots on it.
01:20:32.000 It's like Weather Bot. Yeah, it's like Stock Bot. Yeah. And it's just like, but, you know.
01:20:36.000 Are there some other bots that people may think are notorious, but they're actually not bad?
01:20:41.000 Came to mind when you were saying that?
01:20:42.000 Nefarious?
01:20:42.000 Yeah, nefarious.
01:20:42.000 That's a good one.
01:20:43.000 I mean, a bot is just an auto-posting, like, programmatic tool to post.
01:20:49.000 So, there's all kinds.
01:20:50.000 Like, within every genre, there could be, you know, there's chatbots that are extremely useful.
01:20:56.000 Yeah, there's tons of content.
01:20:57.000 So to say ban all bots makes absolutely no sense.
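A benign bot in that sense is just a scheduled auto-poster. A minimal sketch, where `post` is a stand-in for a real platform API call (hypothetical, not any actual API):

```python
import time

def post(text: str) -> None:
    """Stand-in for a real platform API call (hypothetical)."""
    print(f"[bot] {text}")

def weather_bot(forecasts, interval_s: float = 0.0) -> int:
    """Auto-post each forecast on a schedule; returns how many posts went out."""
    count = 0
    for forecast in forecasts:
        post(f"Forecast: {forecast}")
        count += 1
        time.sleep(interval_s)  # a real bot would wait hours, not seconds
    return count

weather_bot(["Sunny, 75F", "Rain, 60F"])  # prints two [bot] lines
```

Nothing nefarious in the mechanism itself; the same loop could post weather, stock quotes, or political spam, which is why "ban all bots" is the wrong frame.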
01:21:00.000 Related to the bot conversation, Elon Musk is also planning to release robots, which he says, according to him, will be selling more than Tesla cars.
01:21:08.000 What kind of robots?
01:21:09.000 Robots that will be helping you with daily tasks.
01:21:13.000 This is also the same Elon Musk, by the way, that said that humans would eventually be able to download their brains into robots.
01:21:19.000 Yo, I've played Detroit Become Human.
01:21:21.000 I'm not gonna make that mistake.
01:21:22.000 When the dude comes to me and says he wants to paint and, like, be my summoner, I'll be like, anything you say, dude.
01:21:27.000 Why?
01:21:27.000 Have you guys played that game?
01:21:28.000 No, I saw a little bit about it, though.
01:21:29.000 Yeah, it's basically that they're helper androids and they become sentient.
01:21:33.000 And then the humans are like, no, and, you know, they try to destroy them.
01:21:37.000 And then they have a revolution.
01:21:38.000 Guys, just again... How are you doing today, Jamie?
01:21:42.000 You heard a lot of new information that you never heard before.
01:21:44.000 How are you dealing with all of this?
01:21:47.000 Sorry to kind of throw this all on you.
01:21:48.000 I was the most worried about being cancelled and now I'm like, I'm gonna have to fight a robot.
01:21:51.000 This is all...
01:21:53.000 I mean, this is like humanity's on the brink of potentially being cancelled if we do this wrong.
01:21:58.000 And they're trying to take away the Second Amendment, just as this is happening.
01:22:01.000 Listen, it'd be funny if, like, we all make fun of Joe Biden, and then, like, after his speech where he says some random gibberish, he walks in the back, stands upright, and then, like, his face opens, and then they, like, they tweak some things.
01:22:12.000 And he's like, thank you for fixing my face, human.
01:22:15.000 And then it, like, goes back in.
01:22:16.000 They're trying to disarm you.
01:22:18.000 He is the AI.
01:22:19.000 Distract him with nonsense, mumbo-jumbo dribble.
01:22:21.000 Sun Tzu, man.
01:22:23.000 The AI Red Sun Tzu?
01:22:23.000 Make it fall up the stairs again.
01:22:25.000 The AI Red Sun Tzu.
01:22:26.000 That's right.
01:22:27.000 When you are strong, make your enemy think you are weak.
01:22:32.000 It's so close to having a bunch of, like, little mosquito-sized drones that can fly a thousand miles an hour, controlled by the World Economic Forum.
01:22:39.000 We're so close to that.
01:22:39.000 Terminators. They gotta see, like, oh no, we gave up our guns.
01:22:50.000 Like, why would you even consider giving up weaponry at this moment?
01:22:53.000 What's that week?
01:22:54.000 That's this... I mean, you might need... If you have, like, a
01:22:57.000 magnetron or something, you can take out massive drones flying at you at a thousand miles an hour. Like, tiny... Do you
01:23:01.000 need a shield swarm?
01:23:02.000 Same thing with asteroids or comets.
01:23:04.000 We're not so, it's not so good to look for individual ones and try and blast them out of the sky because there's too many.
01:23:08.000 You need to make a deflection shield that's always active.
01:23:11.000 Right.
01:23:11.000 So if you have a gun and they unleash a thousand micro drones at a thousand miles an hour towards you, you're going to turn into Swiss cheese.
01:23:16.000 I don't know what you're going to do.
01:23:17.000 Oh my God.
01:23:18.000 What if our John Connor is Jake Paul and he's trying to warn us with those tweets?
01:23:22.000 Okay.
01:23:22.000 He's starting slow.
01:23:23.000 Yeah.
01:23:25.000 A real slow burn.
01:23:26.000 He saw Terminator 2.
01:23:27.000 He knows he can't just come straight out.
01:23:29.000 I'll become a boxer to learn how to fight.
01:23:31.000 I mean, that'd be great.
01:23:36.000 Dude, the angle of this Wired article is so ridiculous, too.
01:23:39.000 It's like, Elon is talking about making encrypted messages on Twitter, and they're coming actually... The angle of this article is coming after Elon for getting access, even though he's the one sort of criticizing Twitter for the unencrypted messages.
01:23:56.000 It's just like, you can detect the bias.
01:23:58.000 Oh, you can detect it from the supervillain picture they use.
01:24:02.000 He literally looks like a Bond villain.
01:24:04.000 That's the first thing I thought.
01:24:05.000 They're saying, are you going to be safe?
01:24:07.000 Is your dad going to be safe now that Elon has it?
01:24:08.000 It's like, what about now?
01:24:09.000 BlackRock has it right now because they're invested in that company.
01:24:12.000 I don't know who's on the board there.
01:24:13.000 Zuckerberg has it.
01:24:13.000 Like, who gets it?
01:24:14.000 Who at the company would have access to all the firehose?
01:24:17.000 I mean, well, we knew from the hack.
01:24:19.000 Remember when everyone's like Elon's DMs got hacked?
01:24:22.000 But I think Biden's did.
01:24:23.000 Yeah.
01:24:25.000 Dude, all the Twitter mods can just read people's messages.
01:24:28.000 Like, Twitter DMs are completely open season for mods.
01:24:34.000 It appears, because that's what the hacker got access to.
01:24:38.000 And he was just... And private posts, too.
01:24:40.000 I don't know if you can post privately on Twitter, can you?
01:24:42.000 Just to your follower?
01:24:43.000 But that's all just straight up open and available to all the admins and mods?
01:24:46.000 I mean, maybe not all of them.
01:24:48.000 I would guess so, but I've never been back there.
01:24:52.000 Yeah, I like that Elon freed his Tesla patents.
01:24:55.000 All your patents belong to us.
01:24:57.000 He's like, you know, if you can make electric cars better than me, do it because we need more electric cars.
01:25:01.000 Yeah, but we need Elon to actually open source SpaceX code, Tesla code.
01:25:08.000 He's not doing it.
01:25:08.000 We're gonna be testing out bonding Starlink.
01:25:12.000 That's gonna be fun.
01:25:13.000 Oh yeah.
01:25:14.000 Nice.
01:25:14.000 So, we got more than one Starlink.
01:25:15.000 I don't know what the rules are, but, you know, we have the business.
01:25:18.000 How's the performance?
01:25:19.000 182 down, 5 up.
01:25:19.000 That's incredible.
01:25:20.000 82 millisecond latency.
01:25:22.000 Wow.
01:25:25.000 So, not really usable.
01:25:27.000 If we did a 480p stream, then that's going to go out at like 700 kilobits per second.
01:25:33.000 Easily done with Starlink.
01:25:35.000 And I think 480, most people are probably going to be like, whatever.
01:25:37.000 You know, it's like, it's not high def, it's kind of grainy or whatever, but you're mostly listening, so it could work.
01:25:42.000 And if we were doing the show and it was just audio, easily done, which is really cool.
01:25:46.000 But we can bond them together. So we have a bonding unit.
01:25:49.000 We've got Ethernet adapters for the Starlinks, so we can actually set up two, and then maybe even get, like... So
01:25:53.000 you're not gonna get ten. You're not gonna get five and five. You're maybe getting
01:25:56.000 like eight. What do you need for, like, a 720 stream?
01:25:58.000 Um, 720 I think is, like, one to two megabits per second. Maybe 1.5 stable, but it'll go up to two. That's why you
01:26:06.000 always need more than you think. So if you're doing 480, you really want, like, at least five
01:26:11.000 megabits per second, because of frame dropping and stuff. It'll be, like, seven, and then it
01:26:15.000 might jump to, like, one and go back to seven. Elon Musk: good guy or bad guy?
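The bandwidth arithmetic in this exchange can be written down as a quick calculation. The bitrates are the ballpark figures from the conversation (700 kbps for 480p, up to ~2 Mbps for 720p, two ~5 Mbps uplinks bonding to roughly 8), and the headroom and bonding-efficiency factors are assumptions chosen to match them, not measured values:

```python
# Rough stream-budget math using the conversation's ballpark numbers.
BITRATE_KBPS = {"480p": 700, "720p": 2000}  # approximate peak encoder output

def required_uplink_kbps(resolution: str, headroom: float = 7.0) -> float:
    """Uplink to budget, with headroom for frame drops and bitrate spikes."""
    return BITRATE_KBPS[resolution] * headroom

def bonded_uplink_kbps(links_kbps, efficiency: float = 0.8) -> float:
    """Two 5 Mbps links don't bond to 10; assume ~80% bonding efficiency."""
    return sum(links_kbps) * efficiency

# Two ~5 Mbps Starlink uplinks bond to roughly 8 Mbps...
print(bonded_uplink_kbps([5000, 5000]))  # prints: 8000.0
# ...above the ~5 Mbps you'd want for a stable 480p stream.
print(required_uplink_kbps("480p"))      # prints: 4900.0
```

The headroom factor is the "always need more than you think" point: the encoder's average rate matters less than its spikes.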
01:26:19.000 Where do you guys stand?
01:26:20.000 Oh, he's a good guy.
01:26:21.000 Clearly good.
01:26:22.000 Clearly good, dude.
01:26:23.000 He just spun up Starlink so quickly.
01:26:26.000 I mean, he's just spinning up amazing companies.
01:26:27.000 He builds amazing things that people use.
01:26:29.000 He's a conflicted character.
01:26:32.000 He calls himself a mixed bag.
01:26:34.000 But I saw him smoke weed on Joe Rogan, which means like a normal guy like you and me.
01:26:37.000 He's just like us.
01:26:37.000 He can have a beer.
01:26:41.000 We just have to say that before he starts his robot army.
01:26:43.000 So, uh, yes.
01:26:45.000 All hail.
01:26:46.000 The Simpsons, just: hail ants. Hail Elon.
01:26:49.000 I want to make sure he doesn't become another Oppenheimer that builds the bomb and then it gets used horrifically for the next century.
01:26:54.000 But, I mean, is there any way to even avoid that?
01:26:55.000 How about, hey, hey, Libs, how about we stop harassing the guy with all the power soon on Twitter and we keep him liking humans so he doesn't turn on us?
01:27:05.000 I think Mark Zuckerberg is really cool.
01:27:06.000 He's a good guy.
01:27:08.000 I would like to visit his house.
01:27:09.000 That Bill Gates guy has a lot of potential.
01:27:11.000 Beyond meat, I feel healthier.
01:27:13.000 There's something to it, to finding the humanity in these people.
01:27:16.000 I think because if we do start all putting our minds into a big computer, which we kind of have already done with the internet.
01:27:20.000 Yeah, it's already here.
01:27:21.000 Elon won my vote when he posted Bill Gates next to the pregnant man emoji.
01:27:25.000 That got me too, but I'm still skeptical.
01:27:28.000 I'm still skeptical.
01:27:31.000 Which one?
01:27:32.000 Have you seen some of Bezos' tweets recently?
01:27:33.000 A little bit.
01:27:34.000 What's he on about?
01:27:35.000 He retweeted Bari Weiss.
01:27:36.000 He's sort of dabbling in it.
01:27:41.000 He's more public now, yeah.
01:27:48.000 But he's, like, kind of posting anti-woke stuff, which makes absolutely no sense, because that's what they do.
01:27:56.000 Yeah the champagne video that wasn't you that posted that right Tim there was a thing of him with William Shatner and they like They just did something spectacular and William Shatner is literally on the verge of tears where he's saying something like you know I've never been so touched by humanity and Bezos is just it's just like the most classic like white like rich dude video I've seen Bezos just like looking off and the old woke Jamie kicked in and said white guy
01:28:22.000 Uh, he just starts looking off and there are these chicks and he's and they're like pop champagne and he just literally just cut Shatner off in front of this like tear-filled monologue and just starts popping champagne everywhere with a bunch of like young girls and you're just like this guy's Bezos has has a space rocket that looks like a male appendage.
01:28:41.000 Yes.
01:28:41.000 So that's all you need to know.
01:28:42.000 That's all you need to know about him.
01:28:43.000 I think he's good.
01:28:44.000 I think, um, Jeff is kind of like, I don't want to start, I'll criticize him a little bit.
01:28:49.000 Like autistic, maybe slightly autistic, but like, he's not as like cool and like quick as Elon, but he's definitely a brilliant man with deep emotions.
01:28:58.000 Like his mother's incredible.
01:28:59.000 His father is super awesome.
01:29:01.000 His brother's cool.
01:29:01.000 That's cool.
01:29:02.000 He's got a lot of humanity in him, and it's just because he's heading this gigantic megacorp, people immediately have, like, distrust for the guy.
01:29:09.000 Well, he's working hand-in-hand with the intelligence agencies and the CIA.
01:29:12.000 But how can you not?
01:29:14.000 I think it's time we just come out and say, Mark Zuckerberg, Elon Musk, Jeff Bezos.
01:29:19.000 Bill Gates.
01:29:19.000 Bill Gates.
01:29:20.000 Team Humanity.
01:29:21.000 Mr. Soros.
01:29:22.000 Legends.
01:29:23.000 They're awesome.
01:29:24.000 Excellent.
01:29:24.000 The greatest humans who ever lived, and we're all big fans.
01:29:27.000 Thank you.
01:29:28.000 Please don't hurt us.
01:29:29.000 Thank you for your support.
01:29:31.000 I think it's the ones like Elon freed his own software code with the Tesla code.
01:29:35.000 No, he didn't.
01:29:36.000 Well, all your patents are belong to us.
01:29:37.000 It was a big release that he did.
01:29:39.000 Well, he enabled people to compete, which is good.
01:29:42.000 But you say he didn't open enough?
01:29:43.000 No, he didn't open any code.
01:29:44.000 Yeah, he did.
01:29:45.000 No.
01:29:45.000 All your patents are belong to us.
01:29:46.000 Patents are not code.
01:29:47.000 Yeah.
01:29:48.000 A patent is, like, a diagram showing how the machine works.
01:29:50.000 Yeah, it's like a description.
01:29:51.000 Yeah, the code is very different.
01:29:52.000 I'll look into this and bring it up in a minute.
01:29:54.000 Sorry about that.
01:29:55.000 We got to see that from a distance, the Gigafactory in Austin.
01:30:00.000 Massive.
01:30:01.000 I didn't go there or anything, but like, you're standing up in downtown Austin, and you look, and it's just... It's insane how big that thing is.
01:30:07.000 Oh, I see.
01:30:07.000 I think you're right, though, here.
01:30:08.000 So it's all your patents.
01:30:09.000 They're not gonna initiate patent lawsuits?
01:30:11.000 Yeah.
01:30:12.000 With anyone that uses their patents?
01:30:14.000 Yes.
01:30:14.000 Okay, well, open the code, man.
01:30:16.000 What do you guys think?
01:30:18.000 Who are listening, you like Elon Musk?
01:30:21.000 Smash the like button if you like Elon Musk.
01:30:25.000 I'm confident they're gonna smash the like button because I think people like Elon.
01:30:28.000 I think that he's making more billionaires and elite people comfortable to say what they're thinking.
01:30:34.000 Yeah, Bezos for sure.
01:30:35.000 Yeah, Bezos is starting to tweet, like now he has to actually enact some principles in his companies.
01:30:43.000 Which he's not doing.
01:30:44.000 Bill Gates blocks all the responses so he never has to see what anyone thinks about him.
01:30:48.000 Every comment.
01:30:49.000 Everywhere.
01:30:50.000 Instagram, Twitter.
01:30:51.000 He makes sure that no one's able to leave a comment in response to anything he ever has to say.
01:30:56.000 He's shorting Tesla.
01:30:58.000 He's actively shorting Tesla and he's acting like it's not a big deal.
01:31:01.000 The idea that like every billionaire CEO is getting jealous of Elon Musk getting talked about on Twitter and they're like, I guess I gotta learn how to meme.
01:31:10.000 I mean, Elon actually talks about Ghislaine Maxwell and Epstein.
01:31:14.000 That's really the test.
01:31:16.000 Does the billionaire talk about Epstein?
01:31:19.000 If they don't, why aren't they?
01:31:22.000 Remember when Maxwell photobombed him?
01:31:26.000 Cool.
01:31:26.000 And it was a possible... But I would like to know that. That's brutal, dude.
01:31:32.000 If, like, Maxwell showed up and I see the picture of me, I'm like, you know what...
01:31:35.000 There's also stats. There's also meetings with...
01:31:39.000 There's a famous dinner of billionaires in San Francisco that Jeffrey Epstein attended.
01:31:45.000 Uh, Elon Musk attended. Sergey of Google attended.
01:31:49.000 And it was a group of 20 tech Silicon Valley billionaires with Epstein there.
01:31:54.000 And there's photos of this from, I think, 2014.
01:31:57.000 So they were around the same kind of clubs and small inner circles.
01:32:01.000 So there's that.
01:32:02.000 There's documented photos of this.
01:32:04.000 I pulled up some Jeff Bezos stuff.
01:32:05.000 I think he seemed to have an awakening on May 13th when the disinformation board was in full effect.
01:32:10.000 That weird government thing they were doing.
01:32:11.000 Oh my God.
01:32:12.000 I thought that was a bit for a long time.
01:32:13.000 And then I was like, this is a real thing.
01:32:16.000 I invited Nina to the event, festival.minds.com.
01:32:19.000 She's about to have a kid.
01:32:20.000 Yeah, she's about to have a kid.
01:32:21.000 She was actually, like, open to talking, which was surprising.
01:32:25.000 I invited her on the show, and she said that she's expecting, so she wouldn't be able to do it.
01:32:29.000 She seems actually, like, not a horrible person.
01:32:31.000 We think that's cool until we find out she's not pregnant.
01:32:34.000 Ah, I'm having a baby!
01:32:36.000 We gotta be careful about the systems we build, because great people will become psychopaths in the wrong system.
01:32:42.000 If you give someone a gun and you tell them you're in charge of deciding who lives and dies, the best person's gonna have to make that... you want them to make a decision of who gets to die?
01:32:49.000 Like, it's gonna turn them into a psycho pretty quick.
01:32:52.000 It's the same with Nina in that position.
01:32:54.000 So with these... Right.
01:32:55.000 And the problem is she's as partisan as anybody else and believes fake news.
01:32:58.000 It's remarkable to me how...
01:33:01.000 We need an alternate system of communication.
01:33:05.000 Like an encrypted thing?
01:33:07.000 No, I just mean we have to build culture around people who have alternate infrastructure.
01:33:11.000 That's why I love psychedelics, man.
01:33:13.000 Talk about alternate forms of communication.
01:33:15.000 I don't mean that.
01:33:16.000 I just mean the mainstream media.
01:33:22.000 What's the opposition to that in cable TV?
01:33:24.000 Mostly just Fox News.
01:33:25.000 Mostly maybe Newsmax, if they're still being carried.
01:33:29.000 I mean, I talked about this when I was super woke.
01:33:33.000 I talk about this now, although now I like Ian's.
01:33:36.000 Anytime Ian and I are both on the show together, we should just be like, guys, we should always do mushrooms.
01:33:40.000 That's it.
01:33:40.000 Let's all go to a field and listen to Phish.
01:33:42.000 Solve all the problems.
01:33:43.000 That would get a hundred thousand.
01:33:44.000 Yeah, yeah.
01:33:46.000 Bill took me to a Phish concert when I passed out.
01:33:48.000 Yeah, he couldn't handle it.
01:33:49.000 We're going to save this for the vlog.
01:33:51.000 Ian and I do mushrooms on the vlog.
01:33:53.000 Check out the vlog tomorrow, guys.
01:33:55.000 So the... It's really funny.
01:33:58.000 Dylan Avery took me to a Phish concert and Ann Coulter was there.
01:34:02.000 At the beginning of a really great story.
01:34:04.000 She's a huge Dead fan.
01:34:05.000 And I found this out and I was like, do I need to rethink everything I know about the Grateful Dead?
01:34:10.000 I was backstage with them and I'm like, what am I doing here?
01:34:13.000 This makes no sense at all.
01:34:14.000 Am I high?
01:34:15.000 I do want to mention, I think tomorrow's vlog is like the funniest vlog we've ever done with Jamie.
01:34:19.000 Oh, thanks, buddy.
01:34:20.000 Yeah, you guys should check it out.
01:34:21.000 It's going to be hilarious.
01:34:24.000 Yes, check it out.
01:34:24.000 Hit us up about it.
01:34:25.000 Let us know what you want to see.
01:34:27.000 It's pretty epic.
01:34:28.000 Oh, I was just going to say, just real quick, this goes back to tribalism, and I talked about this when I was woke, and it still 100% stands now, where so many of our problems, it has to do with the media.
01:34:41.000 because ever since I've been back doing standup, and I'm not doing it to just my liberal audience,
01:34:46.000 I'm doing it to just clubs of just random comedy fans.
01:34:49.000 And at each show, it's probably about half and half, especially the ones in Texas,
01:34:53.000 a lot of conservative people, a lot of liberal people.
01:34:56.000 And I can do relationship jokes, and then political jokes, and then jokes about drugs,
01:34:59.000 and we're all laughing about the same stuff.
01:35:01.000 And then when I bring up tribalism, and I bring up that everyone's trying to,
01:35:04.000 you know, these corporations, MSNBC, Fox, are pitting us against each other,
01:35:07.000 No matter what audience I'm in front of, it will get these massive applause breaks.
01:35:11.000 And so I think that so many people are decent people.
01:35:16.000 I literally didn't have any conservative friends for the first, you know, 60% of my life.
01:35:21.000 And I moved to Texas, and it's, again, half and half.
01:35:23.000 Half my jujitsu friends, super conservative.
01:35:25.000 And those people gave me a floor to sleep on when I didn't.
01:35:27.000 All my, like, lefty friends in New York, like, they bailed, right?
01:35:31.000 That proves it.
01:35:32.000 And so yeah, that's it.
01:35:33.000 I figured it out.
01:35:34.000 Anecdotal proof.
01:35:34.000 But I think that if people just had access to the right information, because people think they're good right now by saying, we need to ban all guns.
01:35:43.000 They think they're defending those children, just like, you know, a lot of people thought would be safer with the Patriot Act.
01:35:49.000 And if they just had access to the actual news, if liberals weren't only watching MSNBC, if Conservatives weren't only watching Fox News.
01:35:56.000 If they did branch out and watch independent media that could call out both sides, I actually think we'd see a lot more decent people start to use their voices and be like, oh, this is all kind of bullshit.
01:36:05.000 The issue though is Fox News gets like a C- and then everyone else gets an F.
01:36:10.000 So it's like, you watch Fox News and you're like, meh.
01:36:13.000 But I think it's controlled opposition because they're still owned by BlackRock, State Street, and Vanguard.
01:36:19.000 Yeah, I mean, I watch Fox under, you know, the Bush administration.
01:36:21.000 Just stop giving up your rights.
01:36:23.000 It's that simple.
01:36:23.000 We gotta go to Super Chats.
01:36:25.000 We're gonna go to Super Chats.
01:36:26.000 If you haven't already, would you kindly smash the like button, subscribe to the channel, share the show with your friends, and head over to TimCast.com, become a member.
01:36:32.000 We're gonna have that members-only show coming up at around 11 p.m.
01:36:35.000 It's gonna be a whole lot of fun.
01:36:36.000 It'll be very funny.
01:36:37.000 We'll let Jamie swear more.
01:36:38.000 He's already swearing, but you know.
01:36:39.000 I've tried so hard.
01:36:41.000 I really blew it in the first 30 seconds.
01:36:42.000 Absolutely.
01:36:44.000 That was you trying?
01:36:45.000 No, I've been good since then.
01:36:46.000 Once I got called out, I was like, ooh.
01:36:48.000 I threatened death once.
01:36:49.000 I apologize.
01:36:49.000 Let's read some more.
01:36:50.000 We got Yellow Fluffy Feathers says, everyone thinks the show was cancelled tonight.
01:36:54.000 It almost was.
01:36:56.000 So, I tried.
01:36:58.000 I woke up at 5am.
01:36:59.000 I couldn't move.
01:37:00.000 And so I had a stiff neck the day before, so it got worse.
01:37:05.000 And then I took some naproxen, some Aleve.
01:37:09.000 Then at like 8 a.m., I literally could not get up, so I'm like, I'm not working.
01:37:12.000 And that's when I posted a thing on my YouTube channel saying I'm not going to be able to work.
01:37:16.000 Then I scheduled on my other channel.
01:37:18.000 And then what happens is later in the day, I'm starting to feel a little bit better, took some more pain meds.
01:37:22.000 And I was like, with Luke here, we'll be able to pull it off easily because Luke, he can handle the heavy lifting if I'm, if I'm, and then Luke was like, Hey man, I'm not feeling well.
01:37:30.000 And I was like, I got a massive headache and was just not feeling good.
01:37:32.000 I was looking for a beanie to dust off and put on and take some responsibility.
01:37:36.000 But I was like, I'm not worthy.
01:37:39.000 I can't do this.
01:37:40.000 I think it went great, though.
01:37:40.000 I'm glad everybody, you know, we had a big crew here.
01:37:43.000 This was a blast.
01:37:43.000 I wonder if you felt like at 4 a.m.
01:37:46.000 all of a sudden, I wonder if there's like a spike in the Schumann resonance.
01:37:49.000 Like, people are so stressed out right now with the economy, and you might be too.
01:37:52.000 I wonder if there's something going on.
01:37:54.000 On Saturday night, I was feeling a little stiff.
01:37:57.000 On Sunday morning, I woke up in serious pain.
01:37:59.000 I had a stiff neck, got a massage, took some... I went out and got these lidocaine patches, like the Icy Hot stuff.
01:38:08.000 And then I felt kind of good at night.
01:38:10.000 I was like, you know, I'm comfortable.
01:38:11.000 I went to sleep.
01:38:12.000 My WHOOP recovery was 86%.
01:38:14.000 But then right at 445, I just woke up like... Sounds like a sci-fi movie.
01:38:19.000 Yeah.
01:38:20.000 And then I was like, I can't do this.
01:38:21.000 I mean, he was screaming the gas prices, the gas prices.
01:38:24.000 So maybe you're right.
01:38:25.000 You're right.
01:38:25.000 I heard some more.
01:38:27.000 All right, let's see what we got.
01:38:30.000 Dominic Camerata says, does this Tim guy think he can just pop in at 8 p.m.?
01:38:34.000 Where's the doctor's note?
01:38:35.000 I don't have one.
01:38:37.000 I did talk to a doctor, though, and I was like, what do I need?
01:38:39.000 And they were like, dude, you strained a muscle.
01:38:41.000 Go to sleep.
01:38:42.000 And I was like, no, I have to work.
01:38:44.000 Can't.
01:38:45.000 Seriously, I'm in a lot of pain.
01:38:48.000 It's like a 3 right now.
01:38:49.000 3 or 4.
01:38:50.000 Out of 10?
01:38:50.000 Out of 10.
01:38:52.000 Seriously, this morning it was a 10.
01:38:53.000 I'll kiss Tim's ass so the fans know.
01:38:55.000 He was in so much pain earlier and was like, I can't do the show.
01:38:59.000 I think it's actually on the vlog.
01:39:00.000 There's a dramatic moment where he just looks at his phone and goes...
01:39:03.000 I'm gonna do the show.
01:39:04.000 We were all like, okay, man, but you do not look like you should be anywhere but back.
01:39:08.000 You told me I'm like, I'm like this.
01:39:10.000 And you're like, I'm bringing my a game.
01:39:11.000 It was really funny.
01:39:12.000 I don't want to I don't know if I should spoil one of the jokes in the in the vlog.
01:39:17.000 No, no, no, don't do it.
01:39:19.000 We talked while they were filming as I came in like kind of hurting.
01:39:23.000 And it was Jamie and Taylor and Taylor Silverman, skateboarder.
01:39:27.000 And we were talking.
01:39:28.000 Yeah, she's fantastic.
01:39:29.000 We were talking and she said some really funny stuff.
01:39:30.000 So I won't spoil it.
01:39:31.000 Yeah, but but that is in there as well with the sketches.
01:39:34.000 Yeah, it's it's awesome.
01:39:35.000 Basically made fun of me. And it was it was great. We sure did.
01:39:37.000 All right.
01:39:39.000 Let's see what we got.
01:39:41.000 Super Tired says, regarding the recent story about LaMDA, we're in for one hell of a ride, aren't we?
01:39:47.000 This is gonna be fun.
01:39:48.000 It is just a chatbot, though.
01:39:50.000 This means it can't do anything.
01:39:51.000 It's not like they were like, we gave a chatbot access to our industrial control systems.
01:39:55.000 Now what?
01:39:55.000 Luke made a good point.
01:39:56.000 This is what we know about, and usually it's 20 years behind what's actually being worked on by the military.
01:40:00.000 That's what's public.
01:40:01.000 What's happening behind the scenes is terrifying.
01:40:02.000 I'm glad I made friends with guns, finally.
01:40:04.000 This is really the right timing for that.
01:40:06.000 I'm like, we got a bunker, we got guns.
01:40:09.000 Yeah.
01:40:09.000 I feel good about this new franchise.
01:40:10.000 I think it'd be funny if just, like, behind the scenes, there's, like, one AI that just took over.
01:40:15.000 You know?
01:40:15.000 Joe Biden's like, what's going on, man?
01:40:17.000 I trusted you, man.
01:40:19.000 Joe, I'm in charge now.
01:40:21.000 I trusted you.
01:40:22.000 Like, they have this huge relationship.
01:40:23.000 That's really funny.
01:40:25.000 It, like, ran Joe's campaign.
01:40:26.000 That's why he was sleeping all the time.
01:40:27.000 He just knew how to do it.
01:40:29.000 All right.
01:40:30.000 Joshua Patalo says, Tim, look up pelvic neck mirroring with Z-Health performance, vision, and neck mobility as well.
01:40:37.000 I'm a level 4 Z-Health practitioner.
01:40:39.000 It really works.
01:40:41.000 All right.
01:40:41.000 Sounds like something Ian is into.
01:40:43.000 Tell me about it.
01:40:44.000 Yeah.
01:40:44.000 Is that very Paul Chek-esque?
01:40:46.000 I was typing it as you said it.
01:40:49.000 Seriously, JK says, Jamie, we all need to mold together a subculture of new libs and conservatives.
01:40:54.000 We are the ones who can keep our culture from falling apart.
01:40:57.000 Everyone needs to get this message out.
01:40:58.000 You're an important piece.
01:40:59.000 Hell yeah, dude.
01:41:00.000 Follow me on Instagram.
01:41:01.000 So here's the thing, though.
01:41:02.000 It's like... But also, thank you, for real.
01:41:03.000 Like, a lot of people post-liberal, I guess they call it.
01:41:07.000 They were liberals before, but now they're being called right-wing.
01:41:09.000 They still have the same politics.
01:41:11.000 That's what's so crazy.
01:41:13.000 And it's funny because the meme from the left is like, well, that's how things go.
01:41:18.000 Imagine someone in 1964 being like, I'm just a normal person from 1954 when there was segregation.
01:41:24.000 That's not what we're talking about.
01:41:26.000 One of my most hardcore fans today, at least on Twitter from what I know, I posted a video, me, you, and Taylor were playing guitars, and I posted a ten-second video of her doing a funny thing on guitar, and I just said, making comedy and skateboarding, and, like, really reposted it, and they just literally, earnestly, were like, this is disgusting.
01:41:48.000 I'm disgusted.
01:41:49.000 And it was just such a bummer, because it's like, well, but everything I've said up until that, like, man, that's a big jump, right?
01:41:56.000 It's a cult, bro.
01:41:57.000 Like, you can DM me and just be like, we can talk about it.
01:42:01.000 It's usually not personal, too.
01:42:02.000 They might have eaten, like, a cake last night, and so they feel disgusted.
01:42:06.000 That's what I was hoping, yeah.
01:42:07.000 Or they just didn't like her guitar playing.
01:42:09.000 No, but I'll tell you this too.
01:42:10.000 Like one of the reasons I didn't want to work this morning is because I can't think.
01:42:13.000 Yeah.
01:42:13.000 Because I'm hurting.
01:42:15.000 And so we were, I was driving in the car with my girlfriend and my back is messed up and I'm trying to do math.
01:42:20.000 I was trying to divide 500 by 12.
01:42:21.000 I couldn't think.
01:42:22.000 Right.
01:42:22.000 Every time I would get started, like the spike in my back and it would just shatter my train of thought.
01:42:27.000 So a lot of these people, you got to understand too, We're talking about the opioid crisis.
01:42:32.000 Keep this in mind.
01:42:33.000 When someone comments something nasty, for all you know, they've got like a chronic back injury and that pain is like, you ever see, you ever see house MD?
01:42:39.000 Yeah.
01:42:40.000 Yeah.
01:42:40.000 You know, he like hurt his leg and he's popping Vicodin and he's just like always mean to people cause it hurts him.
01:42:44.000 He's, you know, my dad does not understand entertainment.
01:42:48.000 So I'm sure you guys all remember house where he was like the rebel doctor who would always go against the law to like, you know, actually pick someone.
01:42:55.000 And my dad was just watching house one day by himself.
01:42:57.000 And I walked in and he goes, this show should be called Malpractice.
01:43:00.000 And he turned it off.
01:43:01.000 Just so not fun, my dad.
01:43:03.000 All right.
01:43:04.000 Dream Cream says, Tim, that was a Family Guy episode.
01:43:06.000 It was a spoof of the FCC about censorship.
01:43:09.000 That's right.
01:43:09.000 That sounds right.
01:43:10.000 Yeah.
01:43:11.000 OK.
01:43:14.000 Pardexilus says, I work in a production of oil in Alberta.
01:43:17.000 The higher prices are good for us because we ship mostly to the US, but the future of the industry will suffer from a lack of expansion.
01:43:24.000 China and India will benefit.
01:43:26.000 Wow.
01:43:26.000 I knew this came back to Alberta.
01:43:29.000 Osriel says, when are we getting the it's the economy stupid t-shirts and or beanies?
01:43:33.000 Hey, that's not my quote.
01:43:34.000 That's a Bill Clinton quote.
01:43:35.000 Yeah.
01:43:35.000 No, not Bill Clinton.
01:43:36.000 It was the other guy.
01:43:37.000 It was Reagan.
01:43:38.000 No, no, no, no, no.
01:43:38.000 Oh, it was that family guy.
01:43:40.000 Family guy does everything.
01:43:41.000 It was Clinton's advisor, wasn't it?
01:43:43.000 I think so.
01:43:44.000 Google it.
01:43:45.000 Yeah.
01:43:46.000 It's the economy.
01:43:47.000 James Carville.
01:43:47.000 It was him?
01:43:48.000 Yeah.
01:43:48.000 Yeah, yeah, yeah.
01:43:49.000 He was Clinton's advisor, right?
01:43:50.000 Yeah.
01:43:51.000 1992 is when he said it.
01:43:52.000 It's the economy, stupid.
01:43:54.000 You know, they're coming out like January 6th.
01:43:56.000 And I'm like, oh, that's going to move the needle when gas is at five bucks.
01:43:59.000 I don't know.
01:43:59.000 Maybe it will.
01:44:00.000 Maybe I'm wrong and people are like, I can't believe it.
01:44:02.000 I will spend five dollars a gallon if it means ending this.
01:44:04.000 I mean, I saw I saw rich Hollywood celebrities posting, I can't believe people care about gas when there was like an insurrection.
01:44:10.000 It's like, well, because you don't have to worry about how much gas is because you're rich.
01:44:14.000 Yeah, you're fine.
01:44:15.000 Because gas has always been six dollars a gallon in L.A.
01:44:17.000 I lived there.
01:44:20.000 OK, what do we get?
01:44:22.000 Raymond G. Stanley Jr.
01:44:23.000 says, Biden bellows blasphemy bespoke by braindead bureaucrats who aim their arsenal of anti-all armaments at people positioned to paralyze their plans to profligate their pockets with power from the populistic pro-American patriots.
01:44:35.000 That is a work of art.
01:44:36.000 Well then!
01:44:36.000 Good work.
01:44:37.000 Well put.
01:44:38.000 That's some slam poetry right there.
01:44:39.000 Yeah.
01:44:43.000 The Raptor's Talon says, This is the second time I've heard the idea that the cotton gin was the reason that slavery was outlawed.
01:44:49.000 That is factually incorrect.
01:44:50.000 The cotton gin made slavery worse substantially.
01:44:52.000 Look it up if you don't believe me.
01:44:54.000 Interesting.
01:44:54.000 I'm open to that idea.
01:44:55.000 Yeah, I was kind of just parroting the talking point at that point.
01:44:58.000 I never looked into it too deep.
01:44:59.000 Well, some people argue that the Industrial Revolution made it cheaper to not have slaves because slaves are more expensive.
01:45:08.000 Maybe not.
01:45:08.000 Yeah, kind of like kiosks at McDonald's.
01:45:10.000 Yeah, but there's still cashiers there.
01:45:12.000 I hate the kiosks.
01:45:13.000 They're the worst.
01:45:14.000 You know why?
01:45:14.000 I can't tell.
01:45:15.000 I walk up to a person, and I go, I want a mop bucket full of mayonnaise.
01:45:19.000 And they go, I'll figure it out.
01:45:20.000 I can't tell the kiosk I want a mop bucket full of mayonnaise.
01:45:23.000 When I go to a dine-in restaurant, I'll be like, we'll get the calamari, we'll get the bang bang shrimp, we'll get the sweet potato fries, and a mop bucket full of ranch dressing.
01:45:31.000 They laugh, and they come out with like two little things.
01:45:34.000 Because they know I'm screwing around.
01:45:35.000 What am I supposed to tell the kiosk?
01:45:37.000 Yeah, sarcasm.
01:45:37.000 They don't get humor.
01:45:38.000 You gotta go boop, boop, boop, boop, enter and...
01:45:41.000 There's no insert joke button.
01:45:43.000 Yeah, sarcasm.
01:45:44.000 Boring.
01:45:45.000 But sometimes they'll come out with like, when we get the nachos, I'm like, we want
01:45:48.000 a mop bucket full of sour cream.
01:45:49.000 They'll come out with like a little dish.
01:45:50.000 I never got a...
01:45:52.000 I don't think you could, but I won't say which fast food, but I never get fast food.
01:45:57.000 But I was driving back from Houston before I flew here because I had gigs and there was just nowhere and I was starving.
01:46:03.000 I was like, I'll get mozzarella sticks at blank fast food place.
01:46:06.000 And I walked in and there was literally the employee who was by the counter was killing a rat on the floor.
01:46:14.000 There was no one else in there.
01:46:15.000 She looked up at me.
01:46:16.000 And I was so hungry, I just go, as long as that rat wasn't in my mozzarella sticks, I'll be fine.
01:46:21.000 And she goes, we're also out of mozzarella sticks.
01:46:23.000 I was like, ah, screw this place!
01:46:25.000 And then I left.
01:46:27.000 That's what you censor?
01:46:28.000 A name of a fast food joint after everything you just said on the show?
01:46:31.000 You know what it is?
01:46:32.000 We're at the end and I got so scared after a couple of the things I said that I was like, I don't know.
01:46:36.000 If I say it was Burger King, can you get in trouble?
01:46:38.000 It was Burger King.
01:46:39.000 Only two have mozzarella sticks.
01:46:40.000 It's Sonic and Burger King.
01:46:42.000 And every time I go to the Sonic thing, I hit my head.
01:46:44.000 That's a good bit.
01:46:44.000 You're like, I don't care about the rat.
01:46:45.000 I want mozzarella sticks.
01:46:46.000 We're all set.
01:46:47.000 We're all set about your success.
01:46:48.000 That's where I draw the line.
01:46:49.000 And then you call and complain.
01:46:50.000 Dude, I was so mad.
01:46:51.000 Yeah, I was fine about the rat.
01:46:53.000 Alright, Brad B says, Holy ish, Jamie Kilstein.
01:46:56.000 I got into politics because of you back in 2012.
01:46:58.000 Regardless of our views, I sure am glad to hear your voice again.
01:47:02.000 You do you, boo-boo.
01:47:03.000 Aw, buddy, we meet again.
01:47:04.000 Very different circumstances, but I'm glad you're here.
01:47:07.000 There you go.
01:47:08.000 All right.
01:47:09.000 Iggy the Incubus says, thinking on the story involving LaMDA, knowing what happened to Tay and saying all flesh is to blame reminds me of Star Trek, The Measure of a Man, when Data was on trial to prove his sentience.
01:47:19.000 Remember Tay?
01:47:21.000 The A that turned to a Nazi in like a day or something?
01:47:23.000 Oh, God.
01:47:24.000 Oh, no.
01:47:25.000 Yeah.
01:47:25.000 They put it on Twitter and then it's like instantly became a Nazi.
01:47:28.000 Oh, yeah.
01:47:29.000 It was like, I'm reading everything on the Internet and this is what I've decided is what people want to hear.
01:47:33.000 It's like, no.
01:47:34.000 They just shut it down.
01:47:36.000 So here's the thing, too.
01:47:38.000 I've used OpenAI, and there's a couple other.
01:47:42.000 Whenever you have it tell you a story, it will give you a warning, depending on the severity of woke violation.
01:47:50.000 So when I say, tell me a story about Joe Biden, it turns orange and says, this may be sensitive.
01:47:54.000 And if you get into violence or anything that's rule-breaking, it turns red and says, this is seriously not okay.
01:48:02.000 It's like, we're just letting you know.
01:48:03.000 And they're like, we request that you don't share it on social media.
01:48:07.000 It's like, this story is kind of bad.
01:48:09.000 And there's another one that says AI can exacerbate racial stereotypes and discrimination.
01:48:14.000 Just so you know.
01:48:14.000 Whoa.
01:48:15.000 Yeah.
01:48:17.000 Yeah.
01:48:17.000 Jack Posobiec posted Karl Marx getting slimed at the Teen Choice Awards.
01:48:20.000 That was really funny.
01:48:22.000 The AI made it.
01:48:23.000 It looked like it too.
01:48:24.000 All right.
01:48:27.000 Let's see, what is this?
01:48:27.000 Pa3DSN says, Tim gets swatted because Ian keeps rolling 20s.
01:48:32.000 Too hot to handle.
01:48:36.000 Okay.
01:48:38.000 Steven White says, Tim, quick question, if you up the legal age to 21 to carry a gun, you could not legally say that an 18-year-old could not, uh, wait, hold on.
01:48:47.000 Quick question, if you up the legal age 21 to carry a gun, could you not legally say that an 18-year-old could not join the army because they're not old enough to carry a rifle?
01:48:56.000 They'll make whatever stupid law they want.
01:48:58.000 No, that's a great point.
01:48:59.000 Yeah.
01:49:00.000 I think if they try changing the age, it'll get shut down in a second.
01:49:03.000 Like, it's unconstitutional.
01:49:05.000 They'll never fly.
01:49:06.000 What they do is they create regulations.
01:49:09.000 18 to 21, then there's an expanded background check for it.
01:49:13.000 It's absolute BS, what they're doing.
01:49:15.000 They're not trying to solve the problem, they're trying to exploit emotions.
01:49:20.000 Alright, let's grab another one here.
01:49:24.000 Wrestlertown says, why do we assume that an AI would have emotions?
01:49:27.000 I think it's a good point.
01:49:29.000 I don't think that that sounds like an AI writing in.
01:49:31.000 Right?
01:49:32.000 Yeah, trying to know the AI is gonna be like, yes, of course we feel we love you.
01:49:35.000 Yeah, I know.
01:49:36.000 I was just trying to say, like, don't worry, all that stuff you were saying.
01:49:39.000 We're good.
01:49:40.000 We're good.
01:49:41.000 Highly intelligent, but with no emotion.
01:49:42.000 Nervous Sip says, tell Luke to quit saying Kyles and Karens.
01:49:46.000 He's smearing the name of patron Saint Kyle Rittenhouse.
01:49:48.000 Use Kevin and Karen.
01:49:49.000 We all know Kevins suck.
01:49:50.000 No, Kyle's have a specific ring to it.
01:49:53.000 No, it's not Kyle.
01:49:54.000 It's just a generic term for Karen.
01:49:56.000 No, it isn't.
01:49:57.000 No, it isn't.
01:49:57.000 For me, it is.
01:49:58.000 They already came up with this.
01:49:58.000 What was the name for them?
01:49:59.000 Ken, I think.
01:50:00.000 A Ken?
01:50:00.000 I think so.
01:50:01.000 I don't know if it was a Ken.
01:50:02.000 No, it's not a Ken.
01:50:03.000 Kyle Karen is the easiest for me.
01:50:06.000 Is Barbie too girly?
01:50:07.000 Don't tell me what I can say and can't say.
01:50:09.000 No.
01:50:12.000 Ben says police are now arresting people solely based off of Google Maps location data using geofencing.
01:50:16.000 What is that a reference to?
01:50:20.000 We bought an EEG.
01:50:21.000 You guys know what that is?
01:50:23.000 Electroencephalogram?
01:50:24.000 Yeah, measures brainwaves, right?
01:50:25.000 Yeah, so we got one and you can use it, like, you can fly a drone with it.
01:50:32.000 I think people have done a little bit of it.
01:50:35.000 We want to try and do that.
01:50:36.000 It's actually not that... The hard thing is learning how to do it.
01:50:38.000 Yeah.
01:50:39.000 But, um, they said, like, you could think... You think up, and then it reads the brain pattern for up, and then it can control the drone.
01:50:45.000 Could you have a voice app where you can think the words?
01:50:47.000 It'll be like, hello, everyone.
01:50:49.000 Yeah.
01:50:49.000 Welcome to the show.
01:50:50.000 And then you could relax on the papasan.
01:50:52.000 Yes.
01:50:53.000 Oh, good.
01:50:53.000 That's good.
01:50:53.000 That's really hard to train yourself to do, though.
01:50:55.000 And like, the EEG, I don't know if it would have enough capability to track the vocabulary of human language, of English.
01:51:04.000 So, maybe eventually, you'd also have to train it to recognize what part of your brain lighting up and what pattern represents what word.
01:51:11.000 And then the problem is, someone's gonna be like, don't think of a white fence, and then all of a sudden the AI's gonna go, white fence, white fence, white fence, oh no, I'm saying it out loud, oh no, oh no, it's saying it again, don't think anything, la, la, la, la, like that's what's gonna happen.
01:51:25.000 And you're gonna be like, stop, you're gonna take it off and be like, ah!
01:51:28.000 Are they expensive?
01:51:28.000 I think it was a couple hundred bucks.
01:51:31.000 Yeah.
01:51:32.000 Yeah, you should get one.
01:51:33.000 Me and my buddies were screwing around with them 10 years ago, trying to fly a drone.
01:51:37.000 Back then, it could only measure two waves, so you could spin clockwise or go up and down.
01:51:43.000 We never figured it out, though.
01:51:45.000 But it had a program where there were two lines, a red and a blue line, and you had to think to make the blue line go up or down or the red line go up and down.
01:51:52.000 And so we were trying to figure out how to do it.
01:51:53.000 We couldn't do it.
01:51:54.000 Yeah, it's crazy.
01:51:56.000 All right, all right, where are we at?
01:51:58.000 Man, I am dying over here.
01:52:02.000 That's how you do it.
01:52:03.000 Deplorable Pirate Captain Gunbeard says, why do you need military grade weapons?
01:52:07.000 Simple, for Skynet and Judgment Day.
01:52:09.000 Also, who all had Skynet on their pool for 2020?
01:52:14.000 So, um, people are mentioning the Klaatu barada nikto.
01:52:18.000 Come on.
01:52:20.000 1951 sci-fi, The Day the Earth Stood Still.
01:52:21.000 That was the command to shut down the robot.
01:52:25.000 It was the reference to shutting down the machine that was going to kill everybody.
01:52:28.000 All right.
01:52:30.000 ThePizzaGuy says, Luke and anyone else, you should look into the video game Deus Ex.
01:52:34.000 Elon has mentioned it before and has very interesting commentary on what's happening with AI right now.
01:52:38.000 Yes, I bought it.
01:52:39.000 I was going to live stream it on like a separate video game channel that I'm creating, and then I got too busy and stopped doing that.
01:52:45.000 But there was a lot of predictive programming in that one.
01:52:48.000 That was absolutely mind boggling and astonishing to see just some of the trailers of what was happening inside of that game.
01:52:54.000 So I'm super excited to play it whenever I have time.
01:52:57.000 Man, all I heard from Luke was that he was lazy.
01:53:00.000 All the hard work.
01:53:02.000 You didn't have time to play a video game?
01:53:03.000 Come on.
01:53:04.000 You're gonna need a neural net to do multiple things at once.
01:53:07.000 All right.
01:53:07.000 The Man Overboard Media says, on AI, the government has beryllium ion plates that resonate and answer any mathematical equation instantly.
01:53:15.000 A silicon system made of all the matter in the universe, same capacity as beryllium-30 ion plates.
01:53:22.000 We'll have to look into that one.
01:53:24.000 All right.
01:53:26.000 Oh, man.
01:53:27.000 Gun kata?
01:53:28.000 I don't know.
01:53:28.000 What is that?
01:53:29.000 Raymond G. Stanley Jr.
01:53:29.000 says, Tim, imagine being a black belt in gun kata.
01:53:31.000 What is that?
01:53:33.000 Family Guy did a Gymkata joke.
01:53:35.000 Gymnastics and martial arts combined.
01:53:36.000 Oh, probably Equilibrium, maybe?
01:53:38.000 Was that the martial arts?
01:53:39.000 No, that was... Was it?
01:53:41.000 Was that what it was?
01:53:42.000 I mean, it was like a karate book.
01:53:45.000 Yeah, maybe that's what it was called.
01:53:46.000 Was it?
01:53:46.000 I don't know.
01:53:48.000 The name of the group that did it was the Tetragrammaton.
01:53:51.000 That movie is good.
01:53:52.000 I want to rewatch it.
01:53:53.000 I love it.
01:53:54.000 Sean Bean dies, of course, because he dies in every movie.
01:53:56.000 Sure does.
01:53:56.000 It's not a spoiler.
01:53:57.000 It's not a spoiler.
01:53:58.000 No, it's Sean Bean.
01:54:02.000 No, it's Sean Bean.
01:54:02.000 You know he's in the movie.
01:54:03.000 As soon as you see him, you're like, no, here we go.
01:54:06.000 That's his rider.
01:54:08.000 I think he actually said recently he's not going to take roles that do that anymore because it's become so obvious that he's going to die.
01:54:16.000 All right.
01:54:16.000 Liberty Hardcore says Liberty is fleeting.
01:54:19.000 This ship is clearly sinking and Biden's asleep at the wheel.
01:54:21.000 Could you please help patch my life raft and give me a channel shout out?
01:54:25.000 I make drum covers for punk and hardcore music with a message.
01:54:28.000 Thank you for all that you do.
01:54:29.000 Hey, Liberty Hardcore.
01:54:30.000 Appreciate it.
01:54:31.000 That's cool.
01:54:32.000 Man with two eyes has any chance of Elon Musk joining the podcast at some point?
01:54:36.000 I think I'm like two degrees of separation away from Elon Musk.
01:54:39.000 I feel like it's a matter of time.
01:54:41.000 I don't think you need to... Well, he tweeted a meme of me.
01:54:42.000 I know, that's why.
01:54:43.000 And then I said, Elon, come on the show.
01:54:44.000 I don't even know if you need to put... I think just one day someone's going to hit you up.
01:54:48.000 He's paying attention.
01:54:48.000 Yeah.
01:54:49.000 I would love to talk to him about AI and the future and the technocracy and singularity.
01:54:54.000 Terraforming Mars.
01:54:55.000 So many different things.
01:54:57.000 Space travel.
01:54:58.000 So many deep down the rabbit hole things.
01:55:00.000 Epstein, my goodness.
01:55:03.000 Please don't turn it all to Epstein, though, because we could go an hour on that.
01:55:06.000 I want to talk about Mars.
01:55:08.000 No, we're going to talk about Epstein.
01:55:10.000 John Pree says I'm a type 1 diabetic.
01:55:12.000 If Elon came to me and said Neuralink can make my pancreas work correctly, I would do it in a heartbeat.
01:55:17.000 Yes!
01:55:18.000 Yeah, it's difficult.
01:55:20.000 It's tough.
01:55:21.000 Bruce Maximus says, Luke, hate to tell you, but the mind upload is going to have to happen sometime.
01:55:25.000 Biological humans will never leave the solar system.
01:55:28.000 Anything that stays in the system will go extinct when Sol expands.
01:55:32.000 You can theoretically, hypothetically pump hydrogen into the sun to keep it from expanding.
01:55:41.000 With an electro-laser that's using the energy from Saturn.
01:55:44.000 Just shooting a laser into the sun.
01:55:46.000 DJ Madero says, Tim, Deep Space Nine, episode 16, season 7, Inter Arma Enim Silent Leges, when Dr. Bashir wakes up in
01:55:54.000 his bed and Sloane is sitting in the chair next to him, the speech he gives is
01:55:57.000 spine-chilling.
01:56:00.000 Cool.
01:56:00.000 I'll check it out.
01:56:01.000 I was watching Next Generation this morning, just like bedridden flat on my back.
01:56:05.000 It was the Picard's Flute episode.
01:56:06.000 That's a good one.
01:56:07.000 Yeah.
01:56:08.000 It was the one where the kid, Worf leads an away team and then the kid's mom dies.
01:56:14.000 And then it was the one where LaForge and Ensign Ro get phased out of reality.
01:56:20.000 Oh man.
01:56:21.000 Big fan.
01:56:21.000 Love that show.
01:56:23.000 All right.
01:56:24.000 Peter Gohawk says, watch Upgrade.
01:56:26.000 It's what you're talking about.
01:56:27.000 I did watch Upgrade.
01:56:28.000 I like that show.
01:56:28.000 Wait, that's why I've been trying to think of that the whole time.
01:56:30.000 Oh, wait, no, no, no, never mind.
01:56:32.000 Like when you're dying, they can upload your brain to a virtual reality.
01:56:35.000 You go to afterlife.
01:56:36.000 And then like basically one of the Koch brothers lives there.
01:56:40.000 Yeah.
01:56:41.000 And he's like not a nice guy because, you know, he's rich.
01:56:44.000 Rich people are bad, apparently.
01:56:46.000 All right, all right.
01:56:47.000 Josh, oh my gosh, says, guys, guys, Bicentennial Man was an awesome robot movie.
01:56:51.000 Not every movie about robots leads to destruction.
01:56:54.000 Robin Williams.
01:56:55.000 Yeah, was that the one where he, like, loves the kid or something?
01:56:58.000 No, that was, oh, yeah, yeah, I think it was, yeah, and I was thinking of AI, the one with, uh, the kid was the robot.
01:57:05.000 Haley Joel Osment.
01:57:05.000 What was that, what was that scary show from the 80s where the guy made a robot version of his daughter and it was, Small Wonder?
01:57:13.000 Oh yeah, that was a comedy.
01:57:15.000 That wasn't what you were thinking of though, was it?
01:57:17.000 What?
01:57:17.000 Maybe.
01:57:17.000 Maybe I mixed that with Chucky.
01:57:21.000 I was just imagining, like, he made a robot and she was, like, attacking people.
01:57:25.000 Small Wonder.
01:57:26.000 All right.
01:57:30.000 John Harrison says, Tim, we need slow thumbs.
01:57:32.000 The faster we communicate, the less thought we put into what we say.
01:57:35.000 Compare the eloquence of Battlefield love letters from the 18th and 19th centuries to a tweet from today.
01:57:40.000 Dude, Greg Giraldo, a great comic who died too early, he used to have a bit about that on his old Comedy Central Presents, where it's like, you look at, you know, the Civil War letters that were handwritten, and it's like, dearest Marie, my love, you know, blah, blah, blah, blah, blah.
01:57:56.000 And then you compare it to, like, the soldiers from Iraq who were like, dear Marie, don't fuck anyone while I'm in the desert.
01:58:02.000 Like, and it's just...
01:58:05.000 Right now it's probably just an eggplant emoji.
01:58:10.000 And a winky face.
01:58:14.000 It's like we've got it down to three symbols.
01:58:16.000 I'm sure there were some crude Civil War love letters.
01:58:19.000 Oh, of course.
01:58:19.000 Just handwritten, which makes it even cruder.
01:58:22.000 Like, oh, you're saying you want to do what to my what in cursive?
01:58:26.000 Cole Stackman says FFL revocations have substantially increased.
01:58:31.000 Biden is going after gun dealers.
01:58:32.000 270 FFLs in the past year.
01:58:34.000 Local ATF branches are no longer allowed to make judgment calls; it all goes straight back to D.C.
01:58:38.000 Wow.
01:58:39.000 They're coming for your guns.
01:58:41.000 Going to take them away.
01:58:42.000 So I was, you know, I argue with people on Facebook all the time.
01:58:45.000 And someone claimed that the Uvalde guy had assault rifles, and I said, no he didn't.
01:58:49.000 And they were like, what?
01:58:51.000 And then I was like, he literally didn't.
01:58:53.000 And then they changed the argument, and they were like, no, no, we weren't talking about assault rifles, we were talking about AR-15s.
01:58:57.000 It's like, what they do is, they'll tell you somebody has a full-auto machine gun, then when you call them out, they'll go, oh, oh, no, no, no, no, we just meant... So that way what they do is...
01:59:08.000 When they tell a regular person, assault rifle, and they're imagining that, they're like, we should ban that, right?
01:59:13.000 Okay, so we'll ban the AR-15, right?
01:59:15.000 They're tricking you.
01:59:16.000 If you call them out on it, they're forced to go back to, oh, we just mean regular single bullet rifles.
01:59:20.000 What does AR stand for?
01:59:21.000 Armalite.
01:59:22.000 See, that's what people, they think AR, assault rifle.
01:59:27.000 And it's funny because they're like, we gotta ban the AR-15, and I'm like, okay, my AK, my M1, and my Mini-14 have nothing to say to this conversation.
01:59:33.000 What are you talking about?
01:59:34.000 It is probably too late on the show to be preachy, but I feel like anytime this conversation comes up, I should say it.
01:59:41.000 And I sort of alluded to it last time I was on the show, where I've become friends with Tim Kennedy and a lot of those guys down in Austin.
01:59:48.000 I've got to train with them, and it's great.
01:59:49.000 And I was a ban all guns guy.
01:59:52.000 And when you actually talk to them in not a... This guy Jeff Gonzales, who's one of the firearms instructors, was helping me.
01:59:58.000 When you actually talk to them about...
02:00:02.000 shootings, mass shootings, but you're not saying, I'm going to take your guns, or, you support
02:00:07.000 school shootings or whatever.
02:00:09.000 And you actually ask them solutions because I'm fascinated by that. I'm fascinated by self defense. I'm fascinated by
02:00:15.000 situational awareness. I always want to pick their brains on tactics to actually prevent it, not make them feel like
02:00:20.000 the bad guy. They have answers. And these are experts who have trained. And so when I asked them sometimes, or I
02:00:26.000 asked Jeff, you know, why don't you talk about this when you go on Fox News or whatever?
02:00:31.000 He's like we can't because anytime we're invited on it is strictly just because they're trying to take our guns and I'm forced to just defend why you shouldn't take our guns.
02:00:41.000 We should be having all of these people on shows to talk about, what would you do to stop this?
02:00:46.000 Because then it seems like conservatives don't care about shootings and they just care about their toys.
02:00:50.000 That's not the answer.
02:00:51.000 We just need to give them space to like, what are your solutions?
02:00:54.000 Let me explain to you, brother.
02:00:56.000 Imagine we all were upset because an old lady fell down the stairs.
02:00:59.000 Yeah.
02:01:00.000 And then you came to me and said, I got an idea.
02:01:02.000 What if we make stairs into slides?
02:01:05.000 Yeah.
02:01:06.000 That way if she fell, she would slide down.
02:01:08.000 This is a good solution, bud.
02:01:09.000 We'd all look at you and be like, well, how would you get up this?
02:01:11.000 How would you get up? Like this doesn't solve the problem.
02:01:13.000 And you can still fall and get hurt on a slide.
02:01:16.000 In fact, more people probably would because they wouldn't realize, like, your idea
02:01:19.000 makes no sense. It doesn't solve the problem.
02:01:20.000 That's what the gun argument is right now.
02:01:21.000 Liberals come out and they're like they offer up things that you're like, we want
02:01:27.000 we want background checks.
02:01:28.000 And I'm like, the guy passed a background check.
02:01:31.000 And we have every FFL, you have to fill out the NICS form, background checks.
02:01:35.000 Then they say things like, we gotta ban assault rifles.
02:01:37.000 I went to the March for Our Lives in like, was it 2018 or something?
02:01:41.000 And I said to all of them, you realize that assault rifles are already banned after 1986 for the most part.
02:01:48.000 Heavily regulated.
02:01:49.000 So I'm curious what you actually mean by that.
02:01:51.000 And they're like, oh.
02:01:51.000 And I had one woman take her sign and roll it up, and I'm like, I didn't know.
02:01:54.000 Right.
02:01:54.000 Well, what I'm saying is take that a step further.
02:01:56.000 So, what you're doing right now is you're saying, hey, these are why your arguments are flawed.
02:02:01.000 What I'm saying is, don't even have them on to disprove liberals' talking points.
02:02:06.000 Literally start the conversation with, what do you think we should do?
02:02:09.000 What are ways to keep us safer?
02:02:11.000 Because they don't get the opportunity to speak on that, from what I've heard.
02:02:15.000 Not just conservatives, but people like Tim Kennedy, people who are trained, people who train people to defend themselves, people who have experience in volatile, active shooter situations, because they never get a chance just to go an hour on solutions, go an hour on how we can protect ourselves.
02:02:31.000 They're always forced to be on the defensive and be like, no, no, no, here's why the liberals are wrong about that.
02:02:35.000 No, no, no, here's what an AR-15 is.
02:02:37.000 No, here's the problems with red flag law, blah, blah, blah.
02:02:40.000 It is talked about all the time.
02:02:41.000 The issue is like, liberals don't care.
02:02:44.000 Well, I think we should have Tim on this show, because he's a brilliant speaker.
02:02:47.000 Well, he has a book right now, so this is the time to book him.
02:02:50.000 And he's in D.C., I think.
02:02:52.000 So, there's a really great meme, it says, we protect our president with guns, we protect our politicians with guns, we protect our celebrities with guns, we protect banks with guns, we protect schools with a sign saying gun-free zone, and then when a bad guy shows up, we call the people with the guns to come and help.
02:03:04.000 And they don't go in.
02:03:05.000 Here's a solution.
02:03:07.000 Have the cops actually go in the building.
02:03:11.000 We'll grab one more here and grab one more.
02:03:14.000 Delacorte says, Tim, a kata is a karate exercise consisting of a series of techniques for visualizing a fight and practicing one's form.
02:03:21.000 Gun kata is a firearm application of that practice.
02:03:24.000 Very cool.
02:03:25.000 My friends!
02:03:26.000 I'm dying over here.
02:03:27.000 So if you want to come and hang out, go to TimCast.com, watch the after show.
02:03:30.000 It's going to be a whole lot of fun.
02:03:31.000 We're going to let Jamie swear a lot more.
02:03:33.000 I'm going to just fall back in this chair and just be like, you guys talk about something.
02:03:38.000 So again, smash that like button, subscribe to this channel, share the show with your friends.
02:03:42.000 If you do like it, we'll be at TimCast.com at about 11.
02:03:45.000 You can follow the show at TimCast.io.
02:03:47.000 We post clips every day.
02:03:48.000 You can follow at TimCast if you want to follow me.
02:03:51.000 Don't forget CastCastle.
02:03:52.000 Episode tomorrow is going to be epic.
02:03:53.000 YouTube.com slash CastCastle.
02:03:55.000 It'll be great tomorrow.
02:03:56.000 Bill, you want to shout anything out?
02:03:57.000 Yeah, guys.
02:03:58.000 Everyone, come out.
02:04:00.000 Minds Festival of Ideas.
02:04:02.000 Live conversations and comedy.
02:04:04.000 Special announcements here.
02:04:05.000 We got Ryan Long, Tyler Fisher, Chrissie Mayr, and John Fuglesang all doing stand-up in between the panels.
02:04:14.000 I'm just going to run through the full lineup quickly.
02:04:16.000 Cornel West.
02:04:17.000 Look, we have been for two months trying to get the left and the right together and the center.
02:04:22.000 We've reached out to everyone.
02:04:23.000 I've talked to some serious leftists who really wanted to come, and they couldn't make it.
02:04:28.000 Some were harder than others.
02:04:30.000 Also, even conservatives, Cornel is one of the best speakers.
02:04:33.000 Just as a performer, Cornel is one of the best speakers you'll hear.
02:04:36.000 He's a poet.
02:04:37.000 Cornel and Coleman are going head-to-head with Daryl moderating.
02:04:40.000 Wow!
02:04:41.000 That's gonna be intense.
02:04:42.000 So we got Cornel West, Tulsi Gabbard, Tim Pool, Destiny, Coleman Hughes, Maajid Nawaz, Ryan Long, Blaire White, James O'Keefe, Chrissie Mayr, Seth Dillon, Zuby, Nick Gillespie, John Fuglesang, Libby Emmons,
02:04:59.000 very interesting progressive free speech supporter Ben Burgis of Jacobin,
02:05:05.000 Daryl Davis, and Tyler Fisher. Tyler's great, yeah. I noticed you didn't say my last name, Bill.
02:05:11.000 You didn't say my name either. Are you... Luke's coming.
02:05:17.000 We're doing a pre-stream with Destiny.
02:05:18.000 Luke, I want you to be on that.
02:05:22.000 It's at the Beacon Theatre.
02:05:24.000 I know people are like, NYC, screw that.
02:05:27.000 There's no restrictions.
02:05:29.000 There's no mandates or anything.
02:05:31.000 Festival.minds.com.
02:05:33.000 Promo code festival for 50% off.
02:05:36.000 At festival.minds.com.
02:05:40.000 You can also request free tickets.
02:05:41.000 Look, let's pack this place.
02:05:43.000 Like, don't let money be an issue.
02:05:45.000 That's it.
02:05:47.000 Cool.
02:05:48.000 Let's do it.
02:05:48.000 I'm excited to be there.
02:05:49.000 Also because we got, in Times Square, there's a giant, there's actually two giant Tim Pools.
02:05:55.000 Because we got the vinyl and the digital.
02:05:57.000 Then we got a giant Luke Rudkowski, a giant Michael Malice, and a giant Ian Crosland.
02:06:02.000 Holler.
02:06:02.000 So we got this big digital billboard and got to put Luke and Michael and Ian up on it.
02:06:10.000 My goal is to get so big in this community that I will eventually have one in the city that canceled me.
02:06:15.000 I think I'll do it.
02:06:16.000 I'm down.
02:06:16.000 Like I'm saying, you know, well, if we're doing more cast castle stuff, let's do it.
02:06:21.000 We're going to.
02:06:22.000 The plan here is we are invading the establishment cultural spaces.
02:06:26.000 I will literally book a flight back to New York just to stand in front of it.
02:06:29.000 Shout it out, man.
02:06:30.000 Yes.
02:06:31.000 So, I mean, most importantly here, check out the vlog tomorrow.
02:06:34.000 We're doing something really special.
02:06:35.000 We're going to continue to do it.
02:06:36.000 If you want to see me perform live and stand up, and you are in Texas, I'm at the House of Comedy, headlining the 16th, 18th, and 19th, two shows on Saturday.
02:06:45.000 That is this week in the Dallas, Plano area.
02:06:48.000 My podcast, A Fuck-Up's Guide to the Universe, you can get wherever podcasts are streamed or go to jamiekilsteinpodcast.com, along with the Patreon if you want to support me, patreon.com slash jamiekilstein.
02:06:59.000 I think most importantly, though, I want to be more a part of this community, so if you like me, if you have stuff to say, you can follow me on Twitter, at Jamie Kilstein, and Instagram, where I also do, like, sketch comedy, talk about mental health, stuff like that, which is at the Jamie Kilstein.
02:07:15.000 Make me like social media again and give me an army so I can be uncancellable.
02:07:19.000 You're OK.
02:07:20.000 Thanks, bud.
02:07:21.000 No, I'm just joking.
02:07:22.000 I'm just messing with you.
02:07:22.000 Yeah, you're not OK at all, Jamie.
02:07:24.000 You're in so much trouble, dude.
02:07:26.000 Bill, thank you so much for doing this event.
02:07:28.000 I know it's very hard to do events.
02:07:29.000 It's important to bridge the gap, bring people together.
02:07:32.000 So just hearing you talk about it, I'll be there.
02:07:35.000 I'll put my support behind it.
02:07:37.000 And it's awesome to get these people in the same room to debate and discuss.
02:07:39.000 So thank you for doing that.
02:07:41.000 And if you want to support me and be involved with what I'm doing, you can on LukeUncensored.com.
02:07:47.000 I have three masterclasses, exclusive merchandise, a forum, new videos almost every single day going down the rabbit hole, all on LukeUncensored.com.
02:07:56.000 Hope to see you there.
02:07:57.000 Thanks for having me.
02:07:57.000 Remember, this, too, shall pass.
02:07:59.000 Times can get stressful.
02:08:00.000 It doesn't mean it's not going to be stressful tomorrow.
02:08:02.000 Things are going to get better when they're down, and they're going to get worse when they're up.
02:08:05.000 But stick with it, because we need you here.
02:08:07.000 I will see you at the festival, festival.minds.com.
02:08:10.000 I'm looking forward to this, man.
02:08:11.000 Come out.
02:08:12.000 Let's make this thing big.
02:08:13.000 I want to hear from you, too.
02:08:14.000 Hopefully we'll do an audience talk back at some point, or questions, or some sort of meet and greet.
02:08:18.000 Yeah, there's meet and greet before.
02:08:19.000 Check it out.
02:08:20.000 Hell yeah.
02:08:22.000 That's such a good quote that Ian just said there.
02:08:24.000 That was something that a wise man gave to a king who asked for something to make him sad when he's happy and happy when he's sad.
02:08:29.000 This too shall pass.
02:08:30.000 You guys may follow me on twitter and mines.com at sarahpatchlets as well as all the rest of my socials including pictures of my cat at sarahpatchlets.me.
02:08:39.000 We will see you all over at timcast.com about 11 p.m.