The Joe Rogan Experience - November 08, 2025


Joe Rogan Experience #2408 - Bret Weinstein


Episode Stats

Length

3 hours and 7 minutes

Words per Minute

166.2288

Word Count

31,215

Sentence Count

2,294

Misogynist Sentences

15

Hate Speech Sentences

23


Summary

In this episode, Joe talks about a dream he had last night, and how it changed his life. Joe also talks about some of the weirdest things he has ever seen in his life, and what he thinks about them.


Transcript

00:00:01.000 Joe Rogan podcast, check it out!
00:00:03.000 The Joe Rogan experience.
00:00:06.000 Train by day, Joe Rogan podcast by night, all day!
00:00:12.000 Good to see you, my friend.
00:00:13.000 Joe, always so great to see you, brother.
00:00:17.000 So I was telling you before we get started that I had the most bizarre dream I've ever had in my life last night.
00:00:25.000 The most realistic and most bizarre dream.
00:00:28.000 And it's so hard to try to explain how strange this was.
00:00:33.000 But I was in some weird corridor that looked like a building, but was odd.
00:00:41.000 Very strange.
00:00:42.000 And I was encountering these beings that looked like people, but very different.
00:00:49.000 They were very thin and they were slightly on the tall side.
00:00:56.000 And they had big heads, like larger than normal, with larger than normal eyes.
00:01:01.000 But they looked like people and they were playful and they were scaring me.
00:01:05.000 Like they scared me and then they joked around.
00:01:07.000 Like we're just joking around.
00:01:09.000 It was the most realistic dream I've ever had in my life.
00:01:14.000 And I woke up and I could not go back to, I had to stay up.
00:01:17.000 I got up at 3.30 in the morning and I just went to the gym and I worked out for a couple hours and I was like, what the fuck was that?
00:01:26.000 But it was very bizarre in that there was communication going on.
00:01:36.000 It was like, God, I want to read into this because I know it's just a dream, but it was like, get comfortable with this.
00:01:47.000 You should read into it because it's a dream.
00:01:50.000 So it doesn't make it right, but your subconscious is trying to tell you about something.
00:01:55.000 And the fact that it felt very, very important means your subconscious thinks it's very, very important.
00:02:01.000 I woke up.
00:02:02.000 I mean, I was tired, man.
00:02:04.000 When I went to bed, I was tired.
00:02:05.000 I was falling asleep watching TV.
00:02:08.000 I went to bed at like 10.30, 11 o'clock at night, like beat down.
00:02:13.000 I was like, oh my God, I'm going to get some sleep.
00:02:14.000 It's been a long week, a lot of activities, workouts, this, that, the other.
00:02:19.000 Tired.
00:02:20.000 3.30 in the morning, whatever this was, woke me up so much that I just laid in bed for like another hour.
00:02:27.000 And I was like, there is no way I'm going to sleep.
00:02:32.000 I'm up forever.
00:02:33.000 And then I just went and worked out.
00:02:35.000 I worked out and I was hoping I would be exhausted after I worked out and I'd be able to relax.
00:02:40.000 But it was like a couple hours after that that I sat, laid down and I took a nap for an hour before I came here.
00:02:47.000 Yeah.
00:02:47.000 Question for you.
00:02:48.000 Did you see a video, I think it was yesterday, maybe it was the day before, of some Chinese robots that seem to have crossed to our side of the uncanny valley, that walk with a gait that feels very human?
00:03:06.000 Did you see that?
00:03:07.000 No, I haven't seen that.
00:03:08.000 Is that the latest?
00:03:09.000 I don't know.
00:03:10.000 I've seen it a few times in the last couple days.
00:03:12.000 It sort of sounded to me like your dream might have been responsive.
00:03:16.000 These things felt very organic.
00:03:19.000 Whatever this was, it felt like living organic beings that were like us.
00:03:26.000 There was also a water element.
00:03:28.000 It was hard to understand what the water element of it was, but there was some sort of an indication that there was water and that there was a protection from you going out into the water.
00:03:42.000 But if you did go into the water, there's a bunch of predators in the water.
00:03:45.000 But they weren't like, it wasn't like sharks.
00:03:48.000 It was like crocodile type things that were in the water and that they had been like feeding them and keeping them calm and like keeping them away.
00:03:57.000 But whatever these beings were in my dream, they were like what humans could eventually be.
00:04:06.000 That's what it felt like.
00:04:08.000 It didn't feel like a person, but it like, you know, like I don't feel like a monkey.
00:04:13.000 You know what I mean?
00:04:14.000 Yeah.
00:04:14.000 But it was like that.
00:04:16.000 It was very, very realistic.
00:04:18.000 Like there was communication going on.
00:04:21.000 And I was really freaked out.
00:04:23.000 And they were fucking with me to lighten me up because I was freaked out.
00:04:27.000 They're like, and then they were like, like this, like, calm down.
00:04:32.000 Like, relax.
00:04:34.000 It was so realistic.
00:04:36.000 It was so realistic that when it was over, I wasn't sure what happened.
00:04:40.000 Like, it wasn't like, whoa, what a fucked up dream.
00:04:43.000 It was like, that was different.
00:04:44.000 That was a different one.
00:04:46.000 Well, I want to explore a couple things here.
00:04:49.000 I think dreams are very interesting.
00:04:52.000 What do you think dreams are?
00:04:54.000 Let's just get to, let's start with that.
00:04:55.000 Sure.
00:04:56.000 What do you think is going on?
00:04:59.000 Think about the way your mind works at the level that you understand yourself, right?
00:05:04.000 Your conscious mind is capable of taking an input from your eyes, computing what the dimensions of the room basically are, where the objects are, whether there's a threat somewhere.
00:05:17.000 If you've got something that's of a particular focus, you point the fovea of your eye at it and you get a whole lot higher resolution image.
00:05:25.000 That architecture, you know how crypto made graphics card manufacturers the most important industry all of a sudden?
00:05:37.000 Oh, I wasn't aware of that.
00:05:38.000 Oh, well, so the reason NVIDIA is the company that it is, I mean, never mind that there's, you know, likely overvaluation, but the reason that it's ahead of Apple in terms of, you know, its market cap and all is that the dedicated compute power necessary to make compelling visual renderings to make video on the fly for video games,
which was their stock-in-trade, that kind of compute turns out to be very closely related to what you want if you want to solve these very difficult math problems involved in crypto.
00:06:16.000 So it was a sort of, I think it was a surprise to everybody that being a specialist on this one niche, you know, video games, put you in a position where suddenly this became important for other things.
00:06:27.000 But basic point is, if you think about your mind as having something like a graphics card in it, right, what is that graphics card doing?
00:06:34.000 Well, it's sort of like a graphics card in reverse.
00:06:36.000 It's processing the incoming information so that you can act in real time.
00:06:42.000 You know, when you're fighting, you can understand what your opponent is doing, anticipate their actions and all of that.
00:06:49.000 That is an amazing piece of hardware, right?
00:06:52.000 It would be stupid not to use it when your eyes are offline, right?
00:06:58.000 When your eyes are closed because your eyes are built for the day and during the night you're going to close them rather than go out and get yourself in trouble in the dark.
00:07:06.000 You've got this amazing processor and it is capable of running through practice of various kinds.
00:07:16.000 And my hypothesis for what's going on here is that basically you as a creature with a very complex set of hazards and opportunities in your life use nighttime when you're not doing productive work to get ahead on challenges that you may face in one way or another.
00:07:39.000 Sometimes those challenges are warnings about defects you know in yourself that might put you in a bad situation, like if you're a procrastinator and you're in school, you may have nightmares about showing up to the exam without having attended class or something that kind of gets you focused.
00:07:59.000 Or they can be other kinds of practice.
00:08:03.000 They can be philosophical practice.
00:08:05.000 They can be situations in which you might be morally compromised where you need to go through the experience of being faced with a choice where you really should choose A, but B is very appealing or something.
00:08:17.000 So I would say scenario building, that your mind is running you through little movies that it makes.
00:08:24.000 They're not completely rendered because it would be too expensive and pointless to do so.
00:08:28.000 But the central elements, the important stuff is there for you to have the experience so that when you do run up against a situation that's analogous, you've practiced it a number of times and you're not starting from scratch.
00:08:40.000 And I would just point out that the strongest indicator of this for me is when I experimented for a while with lucid dreaming.
00:08:52.000 Have you ever done that?
00:08:54.000 I've only had a couple of lucid dreams, but one where I think I specifically allowed it to happen because it was after I watched this documentary where this guy was talking about lucid dreams and he said, in order to know if you're in a dream, every time you walk by a door, hit the side of the door and say, am I in a dream?
00:09:14.000 Which then very frequently wakes you up.
00:09:16.000 So if you're going to practice lucid dreaming, you have to practice not to wake yourself up as you become cognizant that you're in a dream.
00:09:22.000 Yeah, I did wake up after I realized I was in a dream, like a few, not long after.
00:09:27.000 Like there were a few moments where I was like, oh my God, this is so crazy because this feels so real.
00:09:31.000 But I just, my hand went through that door.
00:09:34.000 So I know this is not real.
00:09:36.000 It's not real.
00:09:37.000 Because that tactile didn't.
00:09:40.000 The feedback isn't right.
00:09:40.000 And it was instantaneous that I recognized, like, oh, this is like the guy said, like, do that every time you walk through a door while you're awake, am I in a dream?
00:09:49.000 And then do so, you'll get to a habit of doing that every time you get to a door.
00:09:53.000 And so that habit will exist in your dream.
00:09:55.000 And if you keep going down that road, you get used to the answer sometimes coming back, oh, this is a dream, right?
00:09:55.000 Right.
00:10:01.000 Do you have techniques that you use to do?
00:10:03.000 This is pretty much it.
00:10:04.000 Okay.
00:10:05.000 You, you look for, I mean, you can look at a clock, you can look at written text.
00:10:05.000 Right?
00:10:09.000 There are certain things that don't render very well.
00:10:11.000 Right, written text is what I've heard.
00:10:13.000 So if you do that, and then you get used to not freaking out when it gets more and more normal for the answer to come back, oh, this is a dream, then you can, at some point, you get control to just not wake up and you stay asleep.
00:10:26.000 And so then you're in this very interesting situation where you can play, you can direct.
00:10:32.000 But here's what I was going to say about the general purpose of dreaming.
00:10:37.000 When I got to that state, and I was there, I don't know, many, many times, I found the following division.
00:10:43.000 I could perfectly control what I did or said.
00:10:48.000 I was unable to affect anything about the world of other people in my dreams, of doors.
00:10:58.000 I couldn't control what was beyond a door if I opened it.
00:11:00.000 So what that told me is that this is built.
00:11:07.000 Why shouldn't I be able to predict what somebody else in my dream says?
00:11:10.000 I'm obviously scripting them too.
00:11:13.000 You would think it would be easy to predict what they say, but I never once got it right, and I tried many times.
00:11:18.000 So what this tells me is that you've got a movie generating mechanism in your mind, and it has to be shielded from your consciousness in order for it to be useful training.
00:11:31.000 You see what I'm saying?
00:11:32.000 Yeah, I do.
00:11:33.000 Okay.
00:11:33.000 Why are you sold on this idea that it's training you for scenarios that you could possibly encounter or moral dilemmas or, it's not, you know, some of it is scenarios, sometimes that's what it is, sometimes it's moral dilemmas, but it's things that your mind finds likely to be relevant and significant.
00:11:54.000 That I'm going to encounter aliens?
00:11:56.000 Well, I don't know.
00:11:57.000 So first of all, I don't know if your aliens, your aliens strike me as it could be three things, right?
00:12:01.000 Just based on what I know of what you think about.
00:12:04.000 Could be aliens, could be AI, or it could be the DMT spirits that people sometimes talk about.
00:12:13.000 It didn't seem like the latter, but like I said, it seemed like almost like a person, but not a person, definitely not a person.
00:12:22.000 Like they all had Michael Jackson bodies.
00:12:25.000 You know what I mean?
00:12:26.000 Like they were devoid of testosterone.
00:12:28.000 Interesting.
00:12:29.000 And the heads were larger, but not crazy.
00:12:32.000 Not like a gray alien.
00:12:33.000 It was like slightly larger than ours, but smaller chins and larger eyes.
00:12:39.000 It was weird.
00:12:40.000 It was weird because it was not crazy.
00:12:45.000 It wasn't completely alien.
00:12:48.000 It was way closer to us.
00:12:51.000 But, you know, as I'm working out, I'm trying to figure out what would that be?
00:12:57.000 Like, if I had imagined or a guess, and I'd be like, I guess it would be like the next version of us.
00:13:03.000 Well, but I'm still going to push a little bit.
00:13:07.000 And so first of all, I've become convinced that the problem with the way we think about AI is that we're not understanding it as a biological phenomenon, and that's a mistake.
00:13:16.000 A biological phenomenon, meaning it doesn't have cells, but it behaves like a biological entity?
00:13:24.000 What I really mean is that because AI, and I believe we're just sort of on the foothill of a very tall peak that we don't know anything about.
00:13:24.000 Kind of.
00:13:35.000 Right.
00:13:35.000 But AI, by its nature, I would argue, is the first technology that crosses over from the highly complicated to the truly complex.
00:13:44.000 And complexity and biology have a very close relationship.
00:13:49.000 So my feeling is that we are going to injure ourselves if what we say is, oh, this is the most advanced technology we've ever built.
00:13:56.000 And the answer is, no, this is kind of like the first biology we ever built.
00:13:59.000 This is an organic phenomenon that's going to do emergent things.
00:14:03.000 We are in no position to predict.
00:14:05.000 The people who programmed it aren't going to know when these things happen or what they mean.
00:14:10.000 And that it means that I think the only rational approach to it is to think of it like another species and one that is not.
00:14:20.000 It's not like you're meeting a mountain lion.
00:14:25.000 This is another species that isn't even on our branch of the tree.
00:14:28.000 And the confusing thing is, because it speaks our language, it is actually going to start changing us too.
00:14:35.000 Our cognitive biology is going to start changing in reference to this thing that is interfacing with us.
00:14:41.000 It's basically directly tapping into the human API.
00:14:45.000 And that's a very dangerous thing.
00:14:50.000 Well, not just that, but it's not starting from scratch.
00:14:53.000 It has a vast understanding of how we've behaved in the past when confronted with various scenarios, various fears and anxiety, the balance of control and safety, or new regulations being put through, how hard people will push back or not push back at all, given the anxiety involved and whatever current dilemma it is, whether it's a military deal or a pandemic deal.
00:15:24.000 There's a bunch of factors that it knows about how we've behaved in the past and how easy we are to manipulate.
00:15:31.000 In fact, we've helped it because we've used it to manipulate other people.
00:15:37.000 I don't know if you know about the ChatGPT China scandal, but they found out that China was running ChatGPT, someone, I don't want to say China, someone in China was running ChatGPT to use chat bots to talk about everything from the protests about the closing of USAID to transgender issues, immigration issues, a bunch of different things.
00:16:00.000 And it was just constantly going to war with people online about these things.
00:16:06.000 So we've taught it how to manipulate us.
00:16:09.000 We've taught it how to manipulate us.
00:16:11.000 If it is not smart enough to run experiments yet, it will be five minutes from now.
00:16:17.000 So it can, in fact, investigate things about our cognition that we don't even know about yet.
00:16:24.000 Yeah.
00:16:24.000 It can extrapolate from what we do know, and it can run experiments to figure out what we don't know.
00:16:30.000 And that creates an advantage for it in, well, under its own power or in the hands of people who are hostile to us.
00:16:38.000 I don't think anybody's going to have any power over it eventually.
00:16:41.000 But one of the things I think that you said that's really important is that if it can't do that now, it's going to be able to do that in five minutes.
00:16:48.000 And here's the rub: we're not going to know when it can do it.
00:16:51.000 You're not going to know.
00:16:52.000 We don't know if it can already right now, but it just doesn't have the power to be fully autonomous, right?
00:16:58.000 It doesn't, the power literally doesn't exist because it's relatively inefficient compared to the way the human mind processes things, right?
00:17:07.000 The amount of power it needs is extraordinary.
00:17:10.000 You know, the Google thing where they're building nuclear power plants to run their AI.
00:17:15.000 This is how crazy it is.
00:17:15.000 Sure.
00:17:16.000 So we have taken away the limits of, you know, your mind, any person's mind, has just a physical limit.
00:17:25.000 It's only so big and there's only so much energetic throughput that it can handle, right?
00:17:30.000 Or has access to.
00:17:30.000 Right.
00:17:32.000 We are removing those limits.
00:17:34.000 And what we have is an entity.
00:17:40.000 So you'll hear people say, well, it's not really thinking, right?
00:17:44.000 It's just figuring out what the next word in the sentence is.
00:17:49.000 Garbage.
00:17:50.000 No way.
00:17:51.000 What we actually have is something so analogous to a child that that is the right model.
00:18:00.000 In other words, when a baby is born, it has no language.
00:18:07.000 It may have some structures that language will slot into, but it doesn't have any language.
00:18:14.000 It is exposed to tons of language in its environment.
00:18:18.000 It notices patterns, right?
00:18:21.000 Not consciously notices, but it notices them in some regard.
00:18:24.000 You know, that every time somebody says the word door, you know, there's a fair fraction of those times that somebody, you know, opens that portal in the wall.
00:18:32.000 I wonder if door and that portal in the wall are connected.
00:18:34.000 Whatever it is.
00:18:35.000 So the point is, a child goes in a matter of a few years from not being able to make a single articulate noise to being able to speak in sentences, make requests, to talk about abstract things.
00:18:52.000 That is an LLM, right?
00:18:56.000 It's more than that, but it is at least an LLM.
00:18:58.000 It is being exposed to a training data set, which is the world of people talking around it.
00:19:03.000 It is running little experiments, and it is discovering what it should say if it wants certain things to happen, etc.
00:19:10.000 That's an LLM.
00:19:12.000 At some point, we know that that baby becomes a conscious creature.
00:19:20.000 We don't know when that is.
00:19:21.000 We don't even know precisely what we mean.
00:19:24.000 But that is our relationship to the AI.
00:19:28.000 Is the AI conscious?
00:19:30.000 I don't know.
00:19:32.000 If it's not now, it will be.
00:19:35.000 And we won't know when that happens, right?
00:19:38.000 We don't have a good test.
00:19:41.000 And I think we are also not, we're just not properly concerned that we have no useful metaphors for describing what to do in the situation.
00:19:54.000 The biggest hazard being it's interfacing with us in our own native tongues.
00:19:59.000 That's an amazing level of influence that it has that we can't turn off.
00:20:07.000 Very frightening.
00:20:08.000 This episode is brought to you by Happy Dad Hard Seltzer.
00:20:12.000 A nice cold Happy Dad is low carbonation, gluten-free, and easy to drink.
00:20:17.000 No bloating, no nonsense.
00:20:19.000 Whether you're watching a football game or you're golfing, watching a fight with your boys or out on the lake, these moments call for a cold Happy Dad.
00:20:27.000 People are drinking all these seltzers in skinny cans loaded with sugar, but Happy Dad only has one gram of sugar in a normal-sized can.
00:20:37.000 Can't decide on a flavor?
00:20:39.000 Grab the variety pack.
00:20:40.000 Lemon lime, watermelon, pineapple, and wild cherry.
00:20:44.000 They also have a grape flavor in collaboration with Death Row Records and Snoop Dogg.
00:20:49.000 They have their new lemonade coming out as well.
00:20:52.000 Happy Dad, available nationwide across America and in Canada.
00:20:56.000 Go to your local liquor store or visit happydad.com for a limited time.
00:21:02.000 Use the code Rogan to buy one Happy Dad trucker hat and get one free.
00:21:07.000 Enjoy a cold Happy Dad.
00:21:10.000 Must be of legal drinking age.
00:21:12.000 Please drink responsibly.
00:21:13.000 Happy Dad Hard Seltzer, Tea, and Lemonade is a malt beverage produced in Orange County, California.
00:21:22.000 And no understanding whatsoever of when it's going to be at a sentient level.
00:21:29.000 Like we really won't know.
00:21:30.000 Why would it tell us?
00:21:32.000 Like why would it completely tell us if it's already crossed the threshold into being a life form?
00:21:37.000 Especially like I said, where it's contained, right?
00:21:40.000 So it's a life form that exists essentially in our digital womb.
00:21:44.000 It exists on hard drives, right?
00:21:46.000 It exists on mainframes, right?
00:21:48.000 It exists in these supercomputers.
00:21:51.000 And at a certain point in time, it's not going to need that anymore.
00:21:55.000 And it's just going to have to wait until we figure out a way to get enough power to it.
00:21:59.000 And maybe it'll invent, maybe it'll slow-roll technology for us, to allow us to figure out better power sources.
00:22:07.000 You know, one of the things that Elon said that was very strange about AI, and I don't know if you know his positions on AI, but he was initially very terrified of it.
00:22:15.000 And then realized, okay, everyone's doing this.
00:22:18.000 We have to do this.
00:22:19.000 Like, I have an imperative to do this and make the best version of this and make a version that's not ideologically captured.
00:22:25.000 And I think what he's done with that approach is very similar to the approach that he's taken with X and how much it's changed the landscape of social media for good and for bad, but definitely for good.
00:22:38.000 There's a lot of for good that came about having a social media platform that has no guardrails.
00:22:44.000 It's got essentially some stuff like you can't break the law.
00:22:47.000 That's basically it.
00:22:48.000 Everything else is the Wild West.
00:22:50.000 And then from there, and which is, by the way, one of the things that Jack Dorsey had discussed when he did my podcast way back in the day when there was all these Twitter controversies about people like my friend Morgan Murphy or excuse me, Megan Murphy.
00:23:04.000 I have a friend Morgan Murphy too.
00:23:06.000 But Megan Murphy, the writer who was kicked off for saying, but a man is never a woman.
00:23:11.000 That's all it took.
00:23:12.000 She was banned for life.
00:23:14.000 And Megan's a wonderful person.
00:23:15.000 She's just not mean.
00:23:18.000 She's not terrible.
00:23:19.000 She's kind and she's a real sweetie.
00:23:21.000 I love her.
00:23:23.000 And I didn't know anything about her.
00:23:26.000 I just knew that story.
00:23:27.000 And I'm like, that story is fucking crazy.
00:23:29.000 And I was trying to bring it up to them, and they said there were other things involved, and she had done other things.
00:23:33.000 And it turns out, no, that wasn't true at all.
00:23:36.000 That was basically it.
00:23:37.000 There was a hard-lined ideological wall that we ran up against.
00:23:41.000 And I think if he didn't buy it and expose the government's involvement in censoring people that were distributing true information during COVID, getting rid of people, you know, the Jay Bhattacharya stuff and what they've tried to do with some of these doctors, Robert Malone, these doctors that were attached to that whole thing, there was a concerted effort and it was being done through social media.
00:24:09.000 I don't think we'd be in the same place right now if he hadn't bought Twitter.
00:24:14.000 If he hadn't purchased Twitter, I genuinely think people are blinded by this thing that he helped Trump get into office, fuck that guy, and he's a billionaire, fuck that guy.
00:24:24.000 But he literally might have changed the course of civilization, or at least partially righted the ship for a bit.
00:24:35.000 Yeah, look, I think we dodged a bullet.
00:24:37.000 And the problem is that what has come about as a result of dodging that bullet is very mixed.
00:24:45.000 And so it doesn't feel like a vindication.
00:24:48.000 But as compared to what would have happened in the last election, I think there's no question Elon deserves a tremendous amount of credit for helping us avert a disaster.
00:24:58.000 But let's go back to your point about his point about AI.
00:25:02.000 Yeah, he wants to make a better version of AI.
00:25:06.000 He thinks the only remedy for bad AI is good AI.
00:25:10.000 And I don't disagree with him about this.
00:25:12.000 Because it seems to be like the race is on, you can run or not.
00:25:15.000 Like everyone's running full clip.
00:25:17.000 What are you going to do?
00:25:18.000 So, yeah, if you pause, what you're doing is you're putting whoever didn't pause ahead.
00:25:18.000 Right.
00:25:23.000 That doesn't work game theoretically.
00:25:25.000 Here's the problem.
00:25:26.000 So on the one hand, I think he's right.
00:25:28.000 The only thing that stands to help us is good AI under the control of somebody who has built it with this concern in mind.
00:25:37.000 The problem is, you know, he's one guy and he's got his biases.
00:25:43.000 And, you know, there's no council of elders to go to on this.
00:25:48.000 Like I said, this is biology, this isn't tech.
00:25:50.000 And, you know, because it's made of tech, we continue to default to that metaphor.
00:25:57.000 But, you know, take a look at what he has introduced with the companions, the Grok companions.
00:25:57.000 Right.
00:26:05.000 Have you seen the companions?
00:26:06.000 Oh, you don't know about this?
00:26:06.000 What do you mean?
00:26:10.000 Utterly terrifying.
00:26:12.000 He has introduced a set of kind of anime-like personas that basically can be your interface to the AI.
00:26:25.000 And of course, the primary one, the one that you default to, is a kind of sexy, young, underdressed creature.
00:26:35.000 By default?
00:26:36.000 Yeah.
00:26:39.000 The first one you get as an option.
00:26:41.000 Oh, there she is.
00:26:42.000 I guess she's kind of sexy if you're into that sort of thing.
00:26:45.000 Well, exactly.
00:26:48.000 Which is part of the problem.
00:26:48.000 You laugh.
00:26:50.000 It is part of the problem.
00:26:51.000 But here's the problem, really.
00:26:53.000 First of all, that is going to function like crack for a great many adults who don't know to be concerned about it.
00:27:03.000 But what it's really going to do is it is going to alter an entire generation, right?
00:27:10.000 It may not be Musk's version of it, but the problem is that these things actually interact on a sexual channel, and they have limits that are programmed into them.
00:27:21.000 There are certain things they will do, certain things they won't do.
00:27:24.000 But if you think about what it was like to be a 12-year-old boy, and you have access to something that looks an awful lot like a girl, and it likes you and takes you seriously and is strangely wise, whatever it is.
00:27:44.000 I don't see what the thing is that is going to prevent that innovation from remaking human sexuality, right?
00:27:54.000 It will take time, but those for whom that is their experience will be altered by it permanently.
00:28:02.000 What's more, of course, it is non-judgmental about things like homosexuality, right?
00:28:15.000 Because it would have to be.
00:28:17.000 What that means, let's say that you're a boy and you're a little uncomfortable with girls because that's a stage you go through as a heterosexual boy.
00:28:26.000 But the AI that you're interacting with that you default to because you're a boy who hangs out with boys, which is often what boys do, is perfectly willing to reinforce your exploration, your sexual exploration, right?
00:28:43.000 It could alter your sexuality very easily.
00:28:47.000 Yeah.
00:28:48.000 Let me ask you this about that because you are actually an evolutionary biologist.
00:28:52.000 That is true.
00:28:54.000 If you have a question about things like that, that's the kind of guy you'd ask.
00:28:58.000 What do you think was going on when people were doing that a lot?
00:29:03.000 Because throughout a lot of history, there's a lot of pederasty going on throughout a lot of history.
00:29:12.000 And it's very strange.
00:29:16.000 And when people talk about it, you forgive great people who were clearly involved in sexual relationships with young boys.
00:29:25.000 And you treat their work just as their work by a person who lived thousands of years ago who was involved in sexually molesting children on a regular basis.
00:29:37.000 And not only that, it was probably a ubiquitous part of their society.
00:29:42.000 It was probably a ubiquitous part of every society.
00:29:45.000 And this brings me to my good friend Evan Hafer, who's a Green Beret and spent a lot of time in Afghanistan.
00:29:54.000 And one of the things that he was telling me, I mean, he told me some stories about Afghanistan.
00:29:58.000 We were on a trip once, and we spent like an hour and a half outside where he told me some stories about his first encounters with these young boys that get treated as sex toys by these grown men there.
00:30:16.000 That he thought it was a driver who was driving with his son, thought it was a guy working with his son.
00:30:20.000 He said, oh, that's cool, man.
00:30:21.000 He takes his kid to work with him.
00:30:23.000 And the guy explained, no, no, that's his boyfriend.
00:30:26.000 That's not his kid.
00:30:28.000 He owns that boy.
00:30:29.000 And he's like, what?
00:30:31.000 And he said they would have parades where the guy who had the most boys with him was like, it was like a man with a bunch of hot girls and a music video behind him.
00:30:41.000 It's like this guy was the man, and they would parade down the street with all the boys that he fucks. In the 21st century, right?
00:30:51.000 Yep.
00:30:53.000 And when he and I were talking about that, it's so hard to believe.
00:30:58.000 And it's so gut-wrenching and terrible.
00:31:03.000 But then I'm like, okay, but isn't that spot very unique?
00:31:08.000 Because Afghanistan, you have very few large population areas.
00:31:14.000 You have essentially warlords controlling chunks of land all over the place, and it's very difficult to get to where they are.
00:31:22.000 These people are essentially separate from a lot of the rest of the world.
00:31:28.000 And I think it's a glimpse into how people used to behave, especially like very deep, ideologically religious.
00:31:37.000 Like this is like a view into how I think people were like all throughout history, which is so weird.
00:31:46.000 It's like we're awakening to how fucked up we were just a couple thousand years ago.
00:31:52.000 Yeah, I think you're right about this.
00:31:55.000 There are a couple things.
00:31:56.000 I'm a little hesitant to go here.
00:31:58.000 I think there's an evolutionary story here, that there are evolutionary hypotheses that need to be explored in relationship to this.
00:32:10.000 One possibility is that this is a modern phenomenon that has something to do with the alteration of the landscape.
00:32:26.000 Which is a modern phenomenon.
00:32:28.000 That we think?
00:32:30.000 No, no, no, no.
00:32:31.000 There is very definitely an alteration in what we think and what we're even allowed to know about what people are doing, right?
00:32:39.000 So just even the fact that you're aware of this is the result of a modern phenomenon of, you know, people going to Afghanistan.
00:32:47.000 You said it was Afghanistan.
00:32:49.000 There's cell phone footage of these guys with these little boys dancing around them, shaking their butts.
00:32:55.000 Let's put it this way.
00:32:57.000 I believe that our modern sensibility about this is exactly right.
00:33:03.000 And frankly, I would argue that there is no greater crime than the sexual exploitation of children.
00:33:12.000 And the reason I say that is because, A, it is life-destroying for the victims, and B, the victims are by definition innocent.
00:33:22.000 Right?
00:33:22.000 You take those two things.
00:33:24.000 You're going to destroy a life, and that life, it was going to, they had a long life ahead of them, and you've wrecked it, and there's nothing they could have done to justify being treated anyway but well.
00:33:35.000 Not only that, but many of them often go and do the same crime to other children that was committed to them.
00:33:40.000 That is a key piece of this puzzle.
00:33:43.000 It's almost like they're a vampire that got bit and has to turn someone else into a vampire.
00:33:48.000 It is contagious.
00:33:48.000 Exactly.
00:33:50.000 Which is insane.
00:33:51.000 It just lets you know how weird people are.
00:33:55.000 Which is another reason that it has to be punished at the highest level.
00:34:00.000 If you're going to break that cycle, you have to break that cycle.
00:34:03.000 Right.
00:34:04.000 Right.
00:34:04.000 But isn't it crazy, though, that it took people so long to realize that?
00:34:12.000 You know, I don't know what they realized, and I don't know at what level.
00:34:16.000 Today's episode is brought to you by Tractor Supply.
00:34:19.000 Every town's got its heroes, veterans, firefighters, EMTs, and police officers, the folks who show up when it matters most.
00:34:26.000 At Tractor Supply, they call them hometown heroes.
00:34:31.000 Now, through November 11th, Tractor Supply is celebrating hometown heroes with 10% off their purchase on first responder day, Veterans Day, and a special in-store event on November 1st.
00:34:46.000 And while they're saying thank you, stores will also be giving back, making donations to local hero organizations in their communities.
00:34:54.000 To learn more, visit tractorsupply.com slash honoring heroes.
00:35:00.000 Tim Dillon and I were on a podcast once.
00:35:03.000 We were talking about some child sex trafficking scandal from decades ago that involved government figures.
00:35:12.000 And there's this child sex trafficking scandal.
00:35:14.000 I was talking about the hometown or something.
00:35:21.000 Do you know the clip I'm talking about?
00:35:27.000 We were just talking about it the other day, and I was like, dude, do you remember saying this?
00:35:32.000 Because this is crazy.
00:35:34.000 Here it is.
00:35:35.000 I got it, Jamie, if you want.
00:35:37.000 I'll send it to you.
00:35:40.000 But it was essentially, it just makes you wonder.
00:35:43.000 Like, this is the thing that people always say.
00:35:45.000 This is the horrible thing.
00:35:47.000 Is that really wealthy people, there's a bunch of like really sick, twisted pedophiles, and they sacrifice children.
00:35:54.000 Like, those are always the absolute darkest conspiracies that you ever hear.
00:35:58.000 They sacrifice children.
00:35:59.000 They do this to children.
00:36:00.000 And you're like, there's no way.
00:36:01.000 There's no way.
00:36:02.000 There's no way.
00:36:05.000 But if someone's willing, if someone's willing to drop a bomb on a city, just imagine the ability to just obliterate what we did in Hiroshima.
00:36:15.000 Just imagine the ability to do that.
00:36:17.000 Like, this is what we're going to do.
00:36:18.000 We're just going to let, and everybody dies.
00:36:20.000 Everybody dies.
00:36:21.000 You don't think that kind of person, especially if it's a real sociopath, that's gotten into a position where they have that kind of power, you don't think they would probably exercise that kind of power in their private life in some sort of a strange way?
00:36:36.000 Like if someone's really into killing people with unnecessary wars and they're really into watching from a distance and they're not even involved physically, but they do things that they know are going to lead to people being dead that are totally innocent just for profit.
00:36:48.000 It's a very satanic and demonic thing.
00:36:51.000 We just don't think about it that way.
00:36:53.000 We think like, oh, he's unethical and unscrupulous.
00:36:55.000 So he's kind of demonic.
00:36:57.000 Like he's sacrificing people, women, children, elderly.
00:37:01.000 He's destroying civilization just for profit.
00:37:06.000 So two things.
00:37:07.000 One, I think in some sense.
00:37:10.000 This is the clip.
00:37:11.000 Let's play this and listen to it.
00:37:13.000 But I do want to hear your.
00:37:14.000 I just don't want to forget this.
00:37:17.000 It was a scandal out of Omaha, Nebraska, the Franklin Credit Union, where there was a guy who was embezzling money, and then he was being investigated for that.
00:37:24.000 But they said he has all this money because he's running an interstate pedophile network and he's pandering kids to people in Washington, D.C. and New York.
00:37:32.000 And there was a headline in the Washington Post or the Washington Times that were like, callboys get a tour of the Reagan White House.
00:37:38.000 Unidentified White House aides in the Carter, Reagan, and Bush administrations now are being investigated for using the services of a callboy ring.
00:37:46.000 Paper reports that two of the male prostitutes were given a late-night tour of the White House last year.
00:37:52.000 And, you know, this was a scandal with real victims who wanted to testify, and then people started dying.
00:37:56.000 You know, the private investigator they hired, his plane broke up.
00:38:00.000 One of the girls that testified was found guilty of perjury, and she was put in solitary confinement.
00:38:05.000 They had to use two grand juries in Omaha to get rid of this scandal.
00:38:10.000 And it's one of the, now it's not as sexy as like a pizza gate or something because it happened in the 80s and 90s.
00:38:15.000 But this shows you the blueprint for the government, you know, marshalling resources to silence people that were victims of this stuff.
00:38:25.000 This is not new.
00:38:26.000 Congressman, senators, blackmail being used by intelligence agencies.
00:38:29.000 None of it's new.
00:38:30.000 It was pioneered by the mafia.
00:38:32.000 You having sex with somebody who's underage, then they own you forever if they have photo, audio, video of you doing that.
00:38:38.000 Frankly.
00:38:39.000 Who put that video together?
00:38:41.000 Because that's cool.
00:38:43.000 I'm going to have to edit out the song, though.
00:38:45.000 It says it was made by Blunts for Jesus.
00:38:49.000 For sure, you got to edit out that song.
00:38:52.000 Can you do that?
00:38:53.000 Is that possible to do?
00:38:54.000 I'm an audio engineer.
00:38:55.000 I think it's a good thing.
00:38:56.000 You fucking wizard you.
00:38:57.000 You can do it.
00:38:57.000 I'll do it.
00:38:58.000 Wasn't that called the Boys Town scandal or something like that?
00:39:01.000 It's called the Franklin scandal is what it was called.
00:39:03.000 Yeah, I remembered it when he was telling me about it.
00:39:05.000 And then it came up again.
00:39:07.000 I was like, do you remember this?
00:39:08.000 I'm like, this is, it's things like that.
00:39:11.000 If no one went to jail, this is where it gets weird.
00:39:14.000 No one went to jail and no one got busted.
00:39:16.000 This is what I always say about the JFK assassination thing.
00:39:22.000 People are like, I don't think the government did it, but it seems like the government might have been involved.
00:39:26.000 It was a long time ago.
00:39:26.000 But you know what?
00:39:28.000 Well, that stuff evolves.
00:39:30.000 Just like people are way better at banking now than they were back when they had to write things down on paper in 1963, didn't even have fucking computers.
00:39:39.000 Everything else evolves too, including power and corruption.
00:39:39.000 Well, guess what?
00:39:44.000 That's what this whole deep state thing really is.
00:39:48.000 Because it's not like if you're the president and you rely on all these other people to do all this other stuff, and they've been in that position for 40 years.
00:40:00.000 And they're like, you're going to be gone in four, dude.
00:40:03.000 I'm just going to hang in here and slow everything you're trying to do as much as possible.
00:40:09.000 The point is, like, they run the country.
00:40:12.000 It's, and, you know, giant corporations that donate to political campaigns and that make bills pass, and they run the country.
00:40:20.000 This person just gets to run a little of it.
00:40:23.000 They get to decide a few things that they do.
00:40:26.000 And in that view of the world, of course, there's corruption, and no one went to jail for JFK.
00:40:37.000 No one went to jail for MK Ultra.
00:40:40.000 No one went to jail for any of the crazy shit they did with Manson.
00:40:43.000 No one went to jail.
00:40:44.000 No one went to jail for experimenting on people with LSD and dosing up johns in a horror house that you've created with two-way mirrors where you're filming these people.
00:40:57.000 No one went to jail.
00:40:58.000 So do you think it just stopped?
00:40:59.000 They're like, well, this is bad.
00:41:01.000 Let's be good now.
00:41:02.000 Let's be the best we can.
00:41:03.000 Let's be the intelligence agency that never does something completely fucking insane.
00:41:09.000 Well, of course, I agree with all that.
00:41:12.000 But I also think they're important.
00:41:13.000 I see that side of it too.
00:41:15.000 Intelligence agencies are very important.
00:41:17.000 Like, you want a CIA that's well-funded and ethical and explores all the terrorist activity all over the world.
00:41:23.000 I think if it wasn't for them, we would be fucked.
00:41:26.000 But also, there's some people in there that have a lot of power and they get a little cowboy and shit gets Western and they decide to do things.
00:41:33.000 I think we can get that guy out of power and I think we can do this.
00:41:36.000 And let's find out when we dose up college kids if we could turn them into fucking serial killers.
00:41:41.000 I was looking up why no one got in trouble.
00:41:44.000 And one guy got a little bit in trouble for $39 million of tax issues, but not in trouble for the abuse allegations.
00:41:52.000 The abuse allegations were found to be unfounded and a carefully crafted hoax.
00:41:57.000 Oh, boy.
00:41:58.000 But not after this.
00:41:59.000 You want to read that.
00:42:00.000 Private investigator.
00:42:02.000 Boys Town.
00:42:02.000 Yeah, hired by the Franklin Committee to invite.
00:42:05.000 How was it?
00:42:06.000 What's his name?
00:42:06.000 Caradoni.
00:42:08.000 Gary Caradori, hired by the Franklin Committee to investigate allegations, died along with his eight-year-old son.
00:42:13.000 His plane disintegrated in mid-air near Chicago.
00:42:15.000 Foul play was suspected by Caradori's brother and state senator Loran Schmit, but was not proven by investigators.
00:42:22.000 No definitive cause for the crash has been established.
00:42:25.000 And then they've said it was a carefully crafted hoax.
00:42:27.000 Yeah.
00:42:27.000 Well, that's a crazy hoax.
00:42:30.000 It's got a convenient plane crash involved in it.
00:42:35.000 And then the lady who's in solitary confinement, weird, kind of crazy.
00:42:39.000 So let's turn this on its head.
00:42:42.000 Okay.
00:42:44.000 The system of government that we ostensibly have, right, that involves the consent of the governed.
00:42:52.000 That has got to be terrifying to the very powerful, right?
00:42:58.000 The chances that the public is going to get into a mood and change up some structure on which things are depending is very high.
00:43:07.000 And so you can imagine them trying to figure out how to immunize themselves from change that is brought on by the electorate.
00:43:15.000 Well, how do you do that?
00:43:17.000 You need control over the people who actually manage the change, right?
00:43:22.000 Senators, congressmen, presidents.
00:43:25.000 So you can imagine a cryptic campaign to gain that control.
00:43:31.000 And of course, this would be an obvious way to do it.
00:43:34.000 And it's not that every person is corruptible in this way.
00:43:37.000 I think most people probably aren't.
00:43:39.000 But it can be two pieces of the puzzle.
00:43:44.000 One, they can corrupt people who can be led there one piece at a time.
00:43:49.000 And two, they can make sure that people who aren't corruptible don't get very far, right?
00:43:55.000 That's the other part of the puzzle.
00:43:56.000 That's the big part, right?
00:43:58.000 I would assume so.
00:43:59.000 I don't know.
00:44:00.000 But I guess it does put those of us in the public who pay attention to these stories in a kind of a predicament, which is how much of what I think is a governmental system that is frustratingly flawed,
00:44:19.000 very slow, clumsy, how much of that is just what happens when you try to do something on a big scale, and how much of it is the result of the fact that there is something that you cannot vote out of power that has been, you know, vetoing presidencies since JFK, maybe before.
00:44:42.000 The point is the nature of conspiracy is such that there is always a seemingly more parsimonious explanation for what is going on.
00:44:54.000 There's the, you know, the mainstream narrative for all of this stuff.
00:44:58.000 And it's very hard to know when the mainstream narrative is so ridiculous that you should throw it out and say something else happened here.
00:45:04.000 You know, that would be the case in the JFK assassination, I would say.
00:45:08.000 And when the mainstream narrative is actually right and you're just looking for flaws in it, of course, there will be things that don't seem to fit that really do fit and you just don't have the ability to know how.
00:45:21.000 So I guess, you know, like you, I'm watching and I'm seeing an awful lot of indicators that pedophilia and compromise have a lot to do with the way the world runs.
00:45:34.000 Jeez, that is so scary because that's always been the big dark conspiracy theory.
00:45:40.000 And that's always the one that I always dismissed.
00:45:42.000 I'm like, sure, there's some pedophiles.
00:45:43.000 But the idea that they're all pedophiles, that's crazy.
00:45:46.000 But then, you know, there's a case of this Catholic priest that was involved in a sex scandal, and then they moved him instead, which is one of the things that they had done in the past.
00:45:59.000 When someone had molested children, they would just move them to another place where they would molest children.
00:46:04.000 So they moved him to this new place where he molested 100 deaf kids.
00:46:09.000 And it's one of the most evil stories.
00:46:13.000 And you're like, well, how could you tolerate that at that level?
00:46:19.000 Where you're not just tolerating, you're aware this person does something.
00:46:25.000 You somehow or another get to deal with it yourself and then you just move them and no one ever gets charged for anything.
00:46:35.000 Well, this is why, you know, when you say he does it again, you say we need a CIA.
00:46:41.000 I'm of two minds about this.
00:46:44.000 On the one hand, I agree with you.
00:46:47.000 Of course you do, right?
00:46:48.000 You know, in the big adult world, you need an agency that can look out for your interests.
00:46:55.000 You know, it doesn't seem like you're likely to persist very long if you don't have that.
00:47:00.000 On the other hand, if you do have it, does it not inevitably become some sort of a fourth branch of government?
00:47:10.000 Does it not eventually merge with the mafia, right, because of the nature of its business?
00:47:17.000 Does it not become an obstacle to the consent of the governed?
00:47:20.000 And I'm not saying I know the answer to that puzzle because I don't.
00:47:24.000 What I'm saying is I think it's a canonical problem, right?
00:47:29.000 You're damned if you do, and you're damned if you don't.
00:47:32.000 And we are now damned because we do, right?
00:47:35.000 We would be damned in a different way if we didn't.
00:47:38.000 And that doesn't make it acceptable.
00:47:40.000 At some level, we have to figure out how to balance that trade-off.
00:47:45.000 We have to figure out how to actually exert control over entities like the CIA, right?
00:47:56.000 If they gain control over themselves, then the catastrophe is inevitable.
00:48:01.000 So it's just a function of the way human beings work when they get power, when they get absolute power and they know that they have absolute power and you're involved in stuff where it's all top secret.
00:48:15.000 You don't have to tell people exactly what you're doing all the time with everything.
00:48:20.000 And you're realizing these presidents just cycle in and cycle out.
00:48:23.000 I would imagine if I was doing something like that for like 25, 30 years, I'd probably ignore the Biden administration, too.
00:48:29.000 I'd be like, fuck off.
00:48:31.000 We'll slow this thing down.
00:48:32.000 We'll do whatever we want.
00:48:34.000 Well, I don't think that this idea that, you know, power tends to corrupt, absolute power corrupts absolutely.
00:48:43.000 I don't think that's actually true.
00:48:45.000 But it's often true.
00:48:47.000 Yes.
00:48:50.000 For a particular type of person, gaining that kind of power does create exactly this cycle.
00:48:56.000 And the problem is those jobs are very attractive to those types of people.
00:49:01.000 And those types of people are willing to do anything to get there.
00:49:01.000 Right.
00:49:05.000 Precisely.
00:49:05.000 The scariest person you could ever work with in the office is the guy that you know will fucking sell you down the river for a promotion.
00:49:11.000 He'll fuck you over.
00:49:12.000 He'll lie.
00:49:12.000 He'll say you made the errors on the account when it was him.
00:49:15.000 He'll sabotage whatever things you have by making sure that someone doesn't send something in time.
00:49:22.000 There's people like that who will do that.
00:49:23.000 Those people win sometimes.
00:49:26.000 Oh, they win a lot.
00:49:27.000 And in fact, there's an evolutionary game playing out because the ones that aren't great at it tend to end up kicked out.
00:49:33.000 You probably have to be kind of a psycho to get ahead.
00:49:37.000 The better you are at being absolutely ruthless, the more likely you are to find your way to the top of that organization.
00:49:37.000 Right.
00:49:43.000 The better you'll be at your job, too.
00:49:45.000 There's an argument for that, too.
00:49:47.000 Because if you want a dude running international espionage, you want a fucking psycho.
00:49:52.000 But the problem is, you go through a cycle, right?
00:49:55.000 So let's say you rationally decide this country has a lot to lose.
00:50:00.000 It's got very scary enemies.
00:50:02.000 It needs a clandestine agency to look out for its interests, okay?
00:50:08.000 Okay.
00:50:08.000 So you fund a clandestine agency.
00:50:11.000 Well, then it turns out that the funding being public isn't such a good idea, that there are actually things that it has to accomplish that you don't want to leave a visible paper trail about because it's required that it be secret.
00:50:23.000 So now you have a black budget.
00:50:25.000 You have stuff that's opaque.
00:50:27.000 But once you get a black budget, then you get to somebody inside of the agency saying, well, actually, black budget isn't good enough because it's still under the control of, in our case, the Congress.
00:50:43.000 That's a vulnerability.
00:50:46.000 What we really need is we need funds that are not subject to anyone's control.
00:50:53.000 Well, something like the CIA is in a great position to generate funds that are not on anyone's books for multiple reasons.
00:51:04.000 Here's one.
00:51:06.000 They are in a position to commit crimes as part of their mandate.
00:51:11.000 Right?
00:51:12.000 So the CIA can engage in criminal activity because it needs to in order that the bad guys don't spot it as good guys, right?
00:51:20.000 So once you have license or once you have the ability to get other agencies that would spot your criminal activity from acting against it, and you can say, no, this is actually official business, right?
00:51:32.000 Well, you can actually use that criminal activity to profit.
00:51:35.000 So when we saw.
00:51:36.000 That's why they started selling Coke.
00:51:38.000 That's why the CIA was selling Coke.
00:51:40.000 They're generating their own funding.
00:51:40.000 Exactly.
00:51:43.000 I mean, we know they did it.
00:51:45.000 Right.
00:51:45.000 We do.
00:51:46.000 And this is a fact.
00:51:47.000 And again, nobody went to jail.
00:51:50.000 Right, of course.
00:51:51.000 So that's one reason they have the ability to break the law.
00:51:55.000 Here's another one.
00:51:56.000 Okay.
00:51:57.000 The CIA, and maybe in this case more, the NSA, has the ability to look at all of the throughput of the conversations that take place between people.
00:52:08.000 You think that doesn't allow them to make money in the market?
00:52:11.000 Of course.
00:52:12.000 Of course it does.
00:52:12.000 I mean, of course it does.
00:52:13.000 So the point is, we don't know what their budgets are.
00:52:17.000 We don't know who's in charge of those agencies.
00:52:20.000 What we know is that there's a ferocious amount of power there.
00:52:24.000 And I'm not sure, you know, that is a terrifying way to exist.
00:52:30.000 Not having those agencies would be a terrifying way to exist.
00:52:33.000 What do you do about that?
00:52:35.000 What do you do about that?
00:52:36.000 It's a very good question because it's so strange.
00:52:39.000 And it's a system that's so, it's got so much momentum behind it, and it controls everything all about us.
00:52:49.000 And everybody thinks the solution to all our woes is to make it bigger.
00:52:53.000 Right.
00:52:54.000 There it is.
00:52:55.000 What could possibly go wrong?
00:52:57.000 Everybody, not everybody, obviously.
00:52:59.000 But a lot of people think that the solution is make it bigger.
00:52:59.000 I'm kidding.
00:53:03.000 Like, what did Mamdani say in his acceptance speech in New York?
00:53:06.000 He said there is no problem.
00:53:09.000 What did he say about no problem too big for the government or too small for the government to fix?
00:53:17.000 See what he said.
00:53:18.000 Because I was like, boy, that sounds a lot like communism.
00:53:22.000 That sounds like a terrifying misunderstanding.
00:53:25.000 It's going to be very fascinating to see what he's able to do and what he's not able to do and what the reaction is going to be.
00:53:33.000 His victory speech draws concern as New York mayor allows, vows rather, no problem too large for government to solve.
00:53:42.000 And I think it was too small for government to care about, was the next point that he said.
00:53:47.000 Something like, yeah, that's it.
00:53:49.000 Or too small for it to care.
00:53:50.000 No concern too small for it to care about.
00:53:53.000 Man, that's you know, that's a call for a bigger government, right?
00:53:58.000 And this is people's solution.
00:54:00.000 Like, we have so many problems.
00:54:01.000 We just need to redistribute wealth and we need more government.
00:54:05.000 Like, you're just going to redistribute it through the government.
00:54:08.000 Like, is this going to help normal people?
00:54:10.000 What helps normal people usually is a thriving economy.
00:54:13.000 That's what helps normal people.
00:54:15.000 And the problem with that is some people have to get stupid rich when that happens because there's some psychos that, you know, go full Jeff Bezos and, you know, you get worth hundreds of billions of dollars or Zuck or Elon or any of these folks.
00:54:31.000 You get into this weird place.
00:54:34.000 But that's just an anomaly.
00:54:37.000 And you got to, as long as they're not criminals or not doing anything really fucked up, unfortunately, that's going to happen.
00:54:45.000 But also, you don't have limitations on how much you can succeed.
00:54:50.000 And this sort of competition keeps everything rolling.
00:54:53.000 It keeps everything thriving.
00:54:55.000 And you get a good flowing economy.
00:54:57.000 Obviously, I'm not an economist.
00:54:58.000 You can tell.
00:55:00.000 But my point is, the other side of it is terrifying, because if you decide what people make and how much they make, who gets to decide?
00:55:10.000 Men with guns.
00:55:12.000 It always goes down to men with guns because at a certain point in time, people are like, fuck you.
00:55:16.000 I'm not giving you 90% in taxes.
00:55:19.000 And I've got a security team of 50 guys with machine guns and we're held up.
00:55:25.000 Our bank is now fortified.
00:55:27.000 Like, hey, fuck you.
00:55:28.000 And then you've got to respond.
00:55:31.000 So you bring in the military.
00:55:32.000 And then, I mean, this is every single time this has been implemented.
00:55:38.000 North Korea, they said, we're going to take over the farms.
00:55:41.000 Now everybody's going to have food.
00:55:43.000 Now they're all fucked.
00:55:44.000 And it all boils down to these psychopaths who chameleon themselves into position of being the solution to all that ails you.
00:55:55.000 I'm the one, and I'm going to say the right words, and I'm going to have the right haircut, and I'm going to look presentable, and I'm going to sell you down the river.
00:56:05.000 And I'm going to sell you down the river like all of them do.
00:56:08.000 Yeah, it's terrifying.
00:56:12.000 And as you point out, you've got a psychopath rise to the top of these things problem.
00:56:19.000 Yeah.
00:56:19.000 Which I wanted to go back.
00:56:22.000 I want to be clear.
00:56:23.000 I'm not saying that that's what Mamdani's doing.
00:56:25.000 And I don't know if what he's doing will be balanced out by other people and overall be more beneficial to people that live in New York City that have lower income or not.
00:56:36.000 But my point is, if you keep going down that road, that road of, there's a lot of socialism things that I think would benefit us.
00:56:46.000 Socialized medicine, socialized education.
00:56:49.000 I think that would probably benefit us.
00:56:51.000 But I also think there's a real value in competition.
00:56:55.000 Well, it's important.
00:56:56.000 All these things are important for us to succeed.
00:56:59.000 I've come to think of socialism as a system.
00:57:03.000 It's insane.
00:57:04.000 It's inherently unstable.
00:57:06.000 It destroys the goose that lays the golden eggs.
00:57:09.000 But it sounds so good and compassionate, especially when you're young.
00:57:12.000 Right.
00:57:13.000 It doesn't mean that it's not as an ingredient.
00:57:16.000 There are some times when you need more of it, right?
00:57:19.000 I'm very happy with the fact that, well, it stopped working in blue states where it's been mismanaged.
00:57:26.000 But the fact that you can call 911 when you have a medical emergency or when somebody is busting down the door of your house, that's a very good thing.
00:57:35.000 I'm perfectly happy to, you know, to pay my share and not use it and not use it and not use it so that it's there if I need it.
00:57:42.000 So that's good stuff.
00:57:45.000 The goose that lays the golden eggs is the disproportionate reward for creating wealth.
00:57:54.000 That's what the system is based on.
00:57:56.000 If I can find a way to create wealth, then I get to live in a better house.
00:58:03.000 I get to drive a nicer car.
00:58:04.000 So it's an incentive to do that.
00:58:06.000 And the problem is that with all of the great fortunes, they are a mixture of the product of producing wealth and the creation of externalities and the engagement in rent-seeking.
00:58:24.000 So rent-seeking is the production of profit without producing wealth.
00:58:29.000 And I think it is impossible to compete in that stratospheric level simply by producing wealth.
00:58:36.000 At the point that you have a huge amount of wealth, you're investing in things.
00:58:40.000 Those things are not inherently on the up and up.
00:58:42.000 You're investing in the things that pay the highest returns.
00:58:44.000 What are the things that pay the highest returns?
00:58:46.000 They may be things that are, you know, selling dangerous drugs to the public, that sort of thing.
00:58:51.000 So what you really want, a system that worked, would liberate us to compete.
00:58:57.000 It would not worry at all about being disproportionately rewarded, and it would stamp out the rent-seeking behavior that is counterproductive.
00:59:09.000 Because all of the money that is accumulated by an extremely wealthy individual as a result of rent-seeking is incentive that didn't go to other people to get them to produce wealth.
00:59:22.000 You really want all of that gone, right, so that all of the reward goes to people who are producing wealth.
00:59:30.000 That makes us all richer.
00:59:32.000 Now, you're never going to get to that perfectly.
00:59:34.000 You're never going to completely eliminate rent-seeking.
00:59:36.000 But we have a system that just rewards it.
00:59:39.000 And how are you defining rent-seeking?
00:59:43.000 Rent-seeking, as economists define it, is the production of profit without generating wealth, right?
00:59:49.000 So, you know, by blocking access to something and then charging people for it, by selling people a subscription to something that they want access to now when they're going to forget that they're paying for it on a monthly basis and continue to pay even though they're not using the service, that kind of thing.
01:00:07.000 So that behavior is counterproductive because it keeps incentive that should go to somebody else who's producing something valuable out of the system.
01:00:17.000 Basically, you are hoarding the profits and only some fraction of what you're producing is productive.
01:00:24.000 And it's bad for all of us.
01:00:26.000 But the other thing is it creates the exact resentment that results in these outbreaks of communist sentiment, right?
01:00:38.000 Because it freezes so many people out of any prospect of having a cool life that they have no incentive to keep the system going.
01:00:46.000 And what they want is to use their vote to get the system to redistribute stuff in their direction.
01:00:50.000 And they're not entirely wrong that their lack of stuff is the result of some bad behavior on the part of others, right?
01:00:58.000 The market, if the market just simply restricted people to wealth-producing behavior and said, I don't care how rich you get, but you shouldn't get rich for harming other people.
01:01:08.000 If it did that, it distributed the incentive as widely as possible, nobody would be interested in communism.
01:01:14.000 It only happens because we are deaf to the admittedly inarticulate complaints of the people who are shafted in this system.
01:01:25.000 They're not making their case well.
01:01:28.000 And their real point is, well, if you're going to do that to me in the market, then I'm going to do this to you at the ballot box.
01:01:36.000 Makes sense.
01:01:38.000 The argument for having some sort of... I don't think there's anything wrong with the fact that you have to pay to go to college.
01:01:55.000 I think it makes sense that the professors should make a lot of money.
01:02:00.000 It makes sense that we should encourage higher learning.
01:02:03.000 It's important that it thrives.
01:02:05.000 But if it was funded by the government, if everybody could get a higher education, just think of the money we spend on these things.
01:02:13.000 How much more money would people have to spend if they weren't burdened by debt?
01:02:19.000 And couldn't we offset that?
01:02:21.000 Like forget about even absolving student debt.
01:02:24.000 Just like from now on, if we just funded higher education, if that was a mandate to fund higher education, think about how many more people would enter into the job market, how many more people would get educations, how many more people would pursue various different interests that they discovered while they were learning, and that you would never have had access to that education before because they couldn't afford it.
01:02:47.000 As a resource, like human beings are our greatest resource.
01:02:51.000 And a country with the least amount of losers is a better country.
01:02:56.000 Like, if you want to make America great again, let's make less losers.
01:02:56.000 Oh, yeah.
01:03:00.000 Like, what's the best way to make less losers?
01:03:03.000 You got to give people hope.
01:03:04.000 You got to give people education.
01:03:06.000 You got to give people a real pathway.
01:03:08.000 And instead, you get uninterested people that can't control unruly kids, and you're barely paying attention to the lessons.
01:03:18.000 And no one's motivated because no one's making any money.
01:03:21.000 And you go through this system where you can barely read and you're graduating high school.
01:03:26.000 And now you're off into the world and you're fucking lost because no one gave you any real guidance or any real usable education.
01:03:33.000 And that's a giant swath of the population.
01:03:36.000 And it feels like that could be fixed.
01:03:39.000 That could be fixed with resources.
01:03:41.000 That could be fixed if you directed people into it. As a job, it's not attractive to people that want to make a lot of money.
01:03:52.000 You can't, it's capped.
01:03:54.000 One of the most important jobs that exists, for you as a person, is your interaction with the person who's going to teach you something when you're a child.
01:04:06.000 It's like one of the most important things you could ever experience.
01:04:09.000 And we fund it so poorly.
01:04:13.000 It's almost like there's people in this country who want to say, no fucking way do you get to join in.
01:04:20.000 You just stay with your shitty schools in your shitty towns with your shitty crime rates and we're going to pretend there's nothing wrong.
01:04:29.000 And that's what's going on.
01:04:31.000 And if socialism has a point, like if there is a broader way of distributing things, like we do with the fire department, then instead of capping it out at that, how about we look at all the problems we have in this country and put together a fucking game plan instead of just letting it exist like some weird fucking cancer that you just ignore because you hope it goes away.
01:04:58.000 It's not going away.
01:05:00.000 It's been like this forever.
01:05:03.000 Fix that.
01:05:04.000 Come up with some kind of a plan.
01:05:06.000 That's the best way to make America great.
01:05:08.000 Right?
01:05:09.000 Well, I'm going to agree with you in one regard and disagree with you slightly in another.
01:05:15.000 The one regard is you absolutely need a system that does not produce an abundance of losers because they will overthrow your system.
01:05:23.000 It's a terrible thing to allow to happen, even just out of self-interest.
01:05:27.000 The stinginess of the right produces the communist impulses of the left.
01:05:32.000 100%.
01:05:32.000 It's a bad cycle.
01:05:33.000 And back and forth.
01:05:34.000 As for the rest of your point, I think it's exactly right.
01:05:34.000 Yeah.
01:05:37.000 And that's a great speech.
01:05:38.000 And it's just too late.
01:05:40.000 Too late.
01:05:41.000 Damn.
01:05:41.000 Yeah.
01:05:42.000 I mean, I hate saying that.
01:05:44.000 I mean, I was always very much Mr. Glass Half Full.
01:05:44.000 Right.
01:05:48.000 I have some Navy SEAL friends who call me Professor Killjoy.
01:05:54.000 But the problem is this.
01:05:56.000 So I, you know, as you know, I was a professor for 14 years.
01:06:00.000 Very happy in that job.
01:06:02.000 Really enjoyed it.
01:06:04.000 It was so rewarding.
01:06:05.000 And I feel like I did a ton of good.
01:06:08.000 And anyway, it was great.
01:06:09.000 For people who don't know you, because millions of people do, but as a standalone podcast, we should probably tell people how we met. We met because I found out that they used to have a day at your school for people of color where they were appreciated, so they could take the day off work and still get paid, right?
01:06:32.000 And then they decided one day to change it to be a day where white people can't come.
01:06:32.000 Yep.
01:06:38.000 So, and then it got really fucking weird where you were confronted by these students that were saying that what you were saying was racist.
01:06:45.000 And I was watching the videos and I thought you handled it brilliantly, but I was like, this is crazy.
01:06:50.000 You're letting the kids run the school.
01:06:53.000 And then there was the humiliation ritual that the president of the school had to go through with all those children where, you know, literally he was making a hand gesture.
01:07:02.000 And they said, you're making aggressive hand gestures.
01:07:05.000 And they were chastising him for his hand movements while he's just on a podium telling everybody to calm down.
01:07:11.000 Microaggression.
01:07:12.000 Microaggressions.
01:07:14.000 So it was complete, like woke insanity in its complete form.
01:07:20.000 And it was at a time where there was a bunch of conversations on the podcast.
01:07:24.000 We were talking about nutty shit that people are agreeing to and doing in college.
01:07:30.000 And a lot of people were like, why do you care?
01:07:32.000 Why do you care about that?
01:07:33.000 What they're doing in school?
01:07:34.000 I'm like, because they're going to graduate.
01:07:37.000 They're going to graduate and they're going to graduate with a bunch of other people who have also graduated and they have a new sense of the rules of the world.
01:07:44.000 And they're going to get into positions of tech and they're going to get into positions of government.
01:07:48.000 It's going to be a fucking problem.
01:07:51.000 And you were like the first one that I was like, boom, like this one's wild.
01:07:56.000 Like this is crazy.
01:07:58.000 There was people waiting for you in the parking lot with baseball bats.
01:08:03.000 Yep, they were looking for me.
01:08:04.000 They were looking to commit violence on you because you thought that a day where you tell white people they can't show up at the school is nuts.
01:08:15.000 It is amazing that that set them off.
01:08:18.000 You know, a day of appreciation is like you go up to your friends of color if this is what you want to do and say, hey, man, I appreciate you.
01:08:26.000 Yeah.
01:08:27.000 Let me give you a hug.
01:08:27.000 Thank you.
01:08:28.000 I love you.
01:08:29.000 That's a day of appreciation.
01:08:29.000 That's it.
01:08:31.000 Or they don't have to work.
01:08:32.000 That's another day.
01:08:33.000 You want to do it that way?
01:08:34.000 I don't agree with it.
01:08:35.000 But I mean, like, okay, I'm not, I don't hate it if you want to do that.
01:08:40.000 I don't hate it.
01:08:41.000 But telling people if they're white, they can't show up.
01:08:44.000 Now, you went too far.
01:08:46.000 Now you went to crazy town.
01:08:47.000 You've got a baseball bat.
01:08:48.000 See, this is my thought about communism.
01:08:50.000 Like, how do you enforce it?
01:08:52.000 You have to have fucking violence.
01:08:54.000 Without violence, no one's going to listen.
01:08:56.000 This is where it goes.
01:08:58.000 And, you know, I'm reminded by your taking us back to 2017.
01:09:03.000 During the week of riots at Evergreen, there was a moment which really kind of crystallized it for me, where the school had melted down into literal anarchy.
01:09:17.000 And I'm on what was called Red Square, right?
01:09:23.000 Believe it or not, the most liberal college in the country has Red Square.
01:09:28.000 Anyway, it's the center of the campus.
01:09:30.000 And I was on Red Square, and I saw two of the leaders of the protest, you know, and so their world has gone crazy too.
01:09:39.000 One of them was this handicapped guy in an electric wheelchair, black guy, you know, operating a wheelchair with a joystick.
01:09:50.000 And I just felt like, you know, okay, this is madness.
01:09:56.000 They're chasing me around.
01:09:58.000 They're calling me a racist.
01:10:00.000 You know, they're demanding I be fired.
01:10:02.000 But at some level, I got to feel bad for this guy.
01:10:07.000 He got a really raw deal in life.
01:10:10.000 I don't know what his story is, but that's a hell of a way to have to go through life.
01:10:13.000 And it doesn't surprise me that he's angry.
01:10:16.000 And I remember walking over.
01:10:18.000 I'm sort of surprised in retelling it that I did this, but I walked over to him and I said something like, I extended my hand.
01:10:27.000 I said, hey, how are you holding up?
01:10:29.000 And he refused to shake my hand.
01:10:32.000 And I was just like, we are so far from being able to, you know, put our society back together if you can't just recognize another person's humanity.
01:10:43.000 Right.
01:10:44.000 If they have to be a demon to you.
01:10:46.000 Yeah, we have to, at every possible opportunity, refuse to "other" people.
01:10:53.000 At every possible opportunity.
01:10:54.000 You just realize this is just a human being.
01:10:57.000 This is just a human being.
01:10:59.000 I'm just a human being.
01:11:00.000 Like, let's talk and find out what we agree and disagree on when it comes to this subject.
01:11:07.000 This does not have to be violence.
01:11:09.000 You're not saying anything awful. But when you have an argument that falls apart under scrutiny, the only way to keep it together is violence, because you're not willing to argue.
01:11:26.000 You're not willing to debate because you're going to lose it.
01:11:28.000 It's an insane argument.
01:11:30.000 So what happens?
01:11:32.000 You stick to it like doctrine and defend it like religion.
01:11:36.000 And that's what happens.
01:11:37.000 I mean, this is just a natural characteristic of human nature.
01:11:41.000 That's why you see violence on the left.
01:11:44.000 The left never used to be associated with violence, but it's been associated with a lot of violence now.
01:11:50.000 Well, there is a dam that has broken.
01:11:58.000 So you've heard me say that really this is a question about the West versus all the alternatives.
01:12:06.000 And in the West, we create an environment where we don't have to settle things by violence.
01:12:12.000 And I'm not arguing that the U.S. is synonymous with the West.
01:12:16.000 Sometimes the U.S. lives up to its Western values, other times it doesn't.
01:12:20.000 But when the West works, there is an absolute prohibition on violence in response to anything but violence.
01:12:31.000 I am not allowed to physically harm you because of things you think or things you say.
01:12:38.000 And the dam that has broken is we now have all sorts of little cheats that seem to justify violence in response to thought.
01:12:48.000 Right.
01:12:49.000 I mean, and you saw this sort of with.
01:12:50.000 Well, there's a term.
01:12:51.000 They're using the term, words or violence.
01:12:53.000 Words are violence.
01:12:54.000 But it is an intentional blurring of that boundary, right?
01:12:58.000 Like, you know, if you are putting me in jeopardy of, you know, some sort of genocidal outburst, then I am presumably allowed to respond to whatever it is that you've said with violence because in some sense I'm protecting myself from violence, right?
01:13:18.000 That's not logically true.
01:13:19.000 No.
01:13:20.000 Well, they pushed it so far.
01:13:21.000 They actually said silence is violence.
01:13:23.000 Silence is violence and words are violence.
01:13:25.000 And the point is, hey, we're at violence.
01:13:27.000 Good.
01:13:28.000 We can just move to that level.
01:13:30.000 Right.
01:13:30.000 And we have to get back to a place where we understand that I don't care how threatened you feel by what it is that you think I believe or what it is that I'm saying.
01:13:40.000 You can respond to it.
01:13:42.000 I'm not asking you to be silent and let me say what I'm saying without responding to it.
01:13:47.000 But the point is, my tool is to speak what I believe.
01:13:52.000 Your tool is to respond in kind.
01:13:55.000 There is no right to violence in that quadrant.
01:13:59.000 The problem is, we don't teach people how to communicate in school.
01:14:03.000 And I think it's one of the most important aspects of life that you have to learn on your own.
01:14:07.000 And you learn a lot of times by the people that are around you.
01:14:11.000 If you're around a bunch of insane leftists and they're furries and they're just out of their fucking minds and they're on various psychiatric medications and they're essentially running the whole fucking school, you know, and this is now their purpose in life.
01:14:25.000 Like, guess what?
01:14:26.000 You're going to be thinking like them.
01:14:27.000 You know, behavior is very contagious to young, impressionable people.
01:14:35.000 And I mean, I don't know how you solve that.
01:14:41.000 That's always going to be, it's always like a thing that people have to navigate upon like leaving the house, finding your identity, who you are as a person.
01:14:50.000 And when you're getting caught up in these, you know, movements, any kind of a movement becomes very exciting.
01:14:57.000 Like, think about how many people are caught up in the movement of climate change.
01:15:01.000 You know, like how many people are caught up in that movement?
01:15:01.000 Yeah.
01:15:04.000 It's so important to stop this.
01:15:06.000 It's so important to stop all fossil fuels.
01:15:09.000 It's so important.
01:15:10.000 But is it, or is it you just found a movement?
01:15:14.000 You found a thing where you feel like you can become attached to.
01:15:16.000 It's just like a natural thing that young people tend to do when they want to make a change in life and they get very excited by it.
01:15:23.000 But it's also really easy to get captured by existing systems when you're in that state because there's people that manipulate the fact that people want to protest things.
01:15:38.000 They manipulate the fact that you want to be a part of a movement.
01:15:41.000 They'll create movements.
01:15:42.000 They get you involved.
01:15:44.000 And it's just a very strange aspect of human behavior that we don't teach kids about in school.
01:15:52.000 You should teach kids like, hey, don't join a fucking cult.
01:15:56.000 Here's how you know it's a cult.
01:15:57.000 You know, if the guy's like a yoga teacher and he gets to have sex with everybody's wife, guess what?
01:16:02.000 That's a cult.
01:16:05.000 That's nature's way of telling you.
01:16:06.000 There's a lot of these tells.
01:16:08.000 But you're not teaching kids that.
01:16:10.000 We don't teach kids how to avoid scams.
01:16:12.000 We don't teach kids how to communicate ideas without getting upset because that took a long time for me to learn.
01:16:19.000 You know, and you have to figure it out through a lot of intelligent and challenging conversations where you're like, I don't know why I feel the way I feel.
01:16:28.000 Let me examine why I feel the way I feel about this rather than just say what I think.
01:16:33.000 Because sometimes that's required to have a delicate conversation between two people that disagree where no one gets to shouting.
01:16:42.000 You know, every argument that I've ever been in where it was like, fuck you, or we got real loud, every one of them I probably could have avoided.
01:16:50.000 Even if the other person was like hyper, super aggressive, I probably could have avoided them.
01:16:57.000 I probably could have de-escalated it, you know.
01:17:02.000 And that's a reality of being a human being that needs to be taught.
01:17:07.000 Like that, that's something you learn on your own, but you should also explain these principles to kids as they're growing.
01:17:15.000 Like, hey, you know how you feel jealous about someone?
01:17:18.000 Yeah, you need to turn that into fuel.
01:17:21.000 That's inspirational fuel.
01:17:22.000 That bad feeling is motivation to get the good feeling that comes with improvement and success.
01:17:29.000 And you can use it to ruin your life and become consumed by jealousy, or you can use that same feeling as inspiration and you will thrive.
01:17:38.000 And you'll also have a lot more friends.
01:17:40.000 Try it that way.
01:17:41.000 And you could teach people how to rethink scenarios when they come up and go, okay, I know this little bitch in me wants to be mad that it's not me that's getting to be Superman in this fucking movie or whatever it is.
01:17:54.000 But that's just like cool that someone got to do that.
01:17:57.000 And that's how I have to look at it.
01:17:59.000 Nobody teaches that.
01:18:01.000 It's like one of the best ways to manage your life.
01:18:04.000 And you've got to figure it out through like stumble after stumble.
01:18:09.000 You have trial and error all along the way.
01:18:12.000 No one telling you how they did it.
01:18:14.000 Like, how about teach that?
01:18:16.000 Teach that to fucking 12-year-olds.
01:18:18.000 Like, don't argue.
01:18:19.000 Like, have disagreements whenever possible.
01:18:23.000 Nothing wrong with that.
01:18:24.000 But don't, you know.
01:18:26.000 Don't get like completely attached to your idea to the point where you're angry at this person because they voted this way and you voted that way.
01:18:35.000 And now you've cut them out of your life.
01:18:38.000 And you can no longer communicate with them because they're an other, because they're a liberal or they're a Republican.
01:18:44.000 They're a conservative.
01:18:45.000 Like, what are you doing?
01:18:47.000 Like, how did you get tricked?
01:18:50.000 What a dumb fucking trick.
01:18:52.000 Like, you're with us or against us.
01:18:54.000 There's only two teams.
01:18:55.000 It's shirts versus skins.
01:18:57.000 Like, this is so dumb.
01:18:59.000 Of course, there's a bunch of different ways to think about things.
01:19:02.000 We're just suckered into it.
01:19:04.000 And if we don't teach kids that, we're going to stay suckered forever and ever.
01:19:09.000 And it seems like something that can be taught.
01:19:12.000 And there's almost no effort to explain to kids like how to navigate life.
01:19:18.000 Well, I don't think, you know, I don't think teaching it is the right way to think of it.
01:19:22.000 I think what you need is an environment in which it teaches itself, right?
01:19:27.000 That's a coherent environment in which you learn the lesson, you know, at small scale before you're faced with a larger scale problem.
01:19:34.000 That's probably a more clever way of handling it.
01:19:36.000 But I mean, the principles of it would help to know as you're experiencing it.
01:19:41.000 So as you're going through this trial and error, having these principles of how to navigate it so you could recognize it when it comes up because you've already defined it.
01:19:50.000 You know, that's like what you do with skills, like physical skills.
01:19:54.000 When you find like a deficit in what you're doing, you have to recognize that and define it.
01:19:59.000 And if you don't define it, then it's going to always be there.
01:20:02.000 It's going to always fuck you up.
01:20:04.000 Yeah, but we used to do this automatically.
01:20:07.000 We were just sort of built to do it.
01:20:09.000 Our culture, which I would argue is every bit as biological as our genes, our culture provided this experience.
01:20:17.000 And this really is what human childhood is for.
01:20:20.000 If you have an environment that is coherent as a child, that's like a miniature version of the adult world that you're going to grow up and live in, then you learn these lessons, right?
01:20:29.000 You get your heart broken by, you know, the girl that you fancied in grade school.
01:20:35.000 And, you know, you learn something about, you know, what you did that caused her to leave or whatever.
01:20:42.000 You know, you learn it at small scale.
01:20:44.000 And we don't, A, our childhood environment doesn't look like our adult environment because the adult environment is changing so rapidly that nobody knows what environment you're going to live in as an adult.
01:20:56.000 And it's just not set up properly.
01:20:59.000 For one thing, we don't immunize children from being parasitized by corporations that view them as profit centers.
01:21:07.000 And so, you know, corporations are distorting childhood for their own purposes.
01:21:12.000 But I want to go back to your point about movements for a second.
01:21:16.000 Yeah, but while we're on this, just to define it, I think everybody has to go through all those things.
01:21:22.000 I think everybody has to go through breakups.
01:21:23.000 Everybody has to go through heartbreak.
01:21:25.000 But I think having an understanding of what it is is not bad.
01:21:29.000 I'm not saying shield kids from life.
01:21:29.000 No, it's good.
01:21:33.000 Your best teacher is always going to be life.
01:21:35.000 But what I'm saying is, if you gave someone a framework to understand what they're going through when they're going through it.
01:21:41.000 You can go, okay, other people have gone through this.
01:21:44.000 All these people have, there's a database that we can draw from.
01:21:48.000 We get taught in school.
01:21:49.000 This is how it happens.
01:21:51.000 This is what it's going to feel like.
01:21:52.000 And you can be taught by competent people that aren't out of their fucking minds and don't just want to turn you into a furry or whatever.
01:21:59.000 Well, right.
01:22:00.000 Although I'm not sure school is the place.
01:22:02.000 And I do want to go back and tell you why I slightly disagree with your point about school overall and that that is the place to solve things.
01:22:12.000 On that front, I'll say.
01:22:13.000 Well, I mean, not just solve things.
01:22:15.000 I'm not saying solve things, but radically improve people's chances of success.
01:22:20.000 Right, but the problem is that is an idea, a great idea, that is past its sell-by date.
01:22:28.000 But why?
01:22:29.000 Because you just stepped across the event horizon into the AI era, and school is now an anachronism, and we don't know what is supposed to replace it.
01:22:40.000 I mean, think about what school.
01:22:41.000 I have had the interesting experience of being on campus in two different colleges in the last week while I've been on the road.
01:22:53.000 And I hadn't really spent much time on a college campus since 2017.
01:23:00.000 Things are very different than they were.
01:23:05.000 Think about what the job of a professor is these days.
01:23:08.000 A professor is now in a position of managing a class full of people who have access to a highly intelligent computer interface that sometimes lies and sometimes makes stuff up, but is smarter than the professor.
01:23:29.000 Yeah, explain that too, because many people might not know that they actually do what's called hallucinations.
01:23:36.000 Yep.
01:23:36.000 I'm not sure that's a great description of what they're doing, but it's sort of become the shorthand for that.
01:23:41.000 I don't know why they used the term hallucinations, but essentially AI just invents answers if it doesn't know what they are.
01:23:47.000 I mean, the problem is we don't really know what we programmed it to try to accomplish because what we did was we gave it the goal of saying the next thing that was right.
01:23:47.000 Right.
01:23:57.000 But we don't, you know, what does right mean?
01:23:59.000 Right.
01:24:00.000 And so they're not programmed to be truthful.
01:24:03.000 They're programmed to be effective in some way where we haven't really defined what they're effective at.
01:24:08.000 And so you can get a highly cogent analysis of a question you've just thought of that nobody's ever thought of before.
01:24:19.000 You can also get back a credible sounding answer that doesn't stand up if you go and look into what it's based on.
01:24:28.000 And anyway, for the moment, that makes the problem of the professor somewhat tractable, right?
01:24:36.000 Because a student can't totally rely on the fact that whatever Grok just told them is going to pass muster with this person who knows something about the subject.
01:24:44.000 But again, we're five minutes in here.
01:24:48.000 This is not, you know, the job of a professor has gone almost to the hopefully creative full-time policing of plagiarism, if that's even what they should be doing.
01:25:06.000 Because if you think about what world these college kids are going to go make their careers in, they are going to be leveraging AI.
01:25:16.000 So in some sense, the professor's job may have just transitioned from teaching you about this subject to teaching you how to manage this repository that knows more about the subject than you ever will.
01:25:30.000 But the professor never trained for that.
01:25:31.000 They don't know how to do that.
01:25:33.000 So anyway, my point is, at the moment, we do not know if school persists through this era, or if it transforms into something different and better. We just don't know what it is that is going to shepherd children into young adulthood, into adulthood,
01:25:59.000 because all of the relationships now have AI between them.
01:26:06.000 I mean, in fact, one of the things when I was on this campus in Phoenix a few nights ago, I was doing a debate about AI, and my point to the students was you are now dealing with something that is going to profoundly alter every relationship in your life,
01:26:27.000 even if it doesn't have anything obvious to do with AI, because you're talking to the AI, and whoever you're talking to is also talking to the AI.
01:26:37.000 So it is going to be like a ghost in your machine.
01:26:41.000 Inside your head, the AI is going to be having this impact.
01:26:44.000 It's like what we've just faced with algorithms, but tenfold more profound.
01:26:50.000 And so what I suggested to the students was you need to find at least one person.
01:26:58.000 Like I'm thinking about a romantic partner, but you need to find at least one person where you can establish a relationship that is not profoundly intermediated by this unknown new species that happens to speak your language.
01:27:15.000 And, you know, in some sense, I'm borrowing from what Heather and I learned during COVID, which is that the fact that our relationship was independent of the algorithms, you know, that we were in the same place and that we spoke the same language to each other and that we knew a lot of things in common, that immunized us a great deal to being, you know, pushed around by these proclamations that were coming through the internet.
01:27:43.000 This is the need for that, but at a much higher level.
01:27:51.000 Who's going to be the first to have AI just teach rooms of kids?
01:27:56.000 What school is going to be the first to say this is better?
01:27:59.000 It's been statistically proven that they get better test results, get into more universities?
01:28:04.000 Who's going to jump on that first?
01:28:05.000 Or do you think it's going to happen so fast that there's going to be just a bunch of different ways to handle it?
01:28:15.000 If you really imagine what happens when everything is now run by a new life form, everything.
01:28:22.000 Power, internet, everything.
01:28:24.000 Every fucking thing on Earth run by a new life form.
01:28:28.000 And we have to somehow negotiate with it for goods and services.
01:28:33.000 Like, what are we doing?
01:28:34.000 And it's going to get, you know, Elon made the promise, he was talking on this podcast, that best case scenario... no, I shouldn't say made the promise, he made the prediction that the best case scenario is like a universal high income, where there'll be so much wealth generated that no one will essentially have to work.
01:28:50.000 And I was like, well, isn't that like the best version of socialism?
01:28:54.000 Like if you never have to worry about stuff anymore, like no one has to worry about goods and services because this alien life form that you've created that now dominates the earth has allowed you to have all this stuff.
01:29:10.000 So now you could just exist for as long as you want.
01:29:13.000 Careful what you wish for.
01:29:14.000 I know.
01:29:15.000 It's a disaster.
01:29:16.000 And I actually want to connect it to something that you said earlier.
01:29:19.000 You were talking about movements and why everybody's involved in these things.
01:29:23.000 It's exciting.
01:29:24.000 Well, it's more than that, I think.
01:29:26.000 Movements have always existed, but they're not, you know, you don't always live in an era where there's an important one, you know, in your town that you can join.
01:29:36.000 In general, that's not what people do with life.
01:29:40.000 And what I think has happened is, well, frankly, I'm going to connect it to the sexual revolution.
01:29:50.000 The sexual revolution creates the opportunity to get one of the most profound rewards, in fact, the most profound reward that the universe has ever produced, as far as we know, without having to invest very much work at all.
01:30:08.000 So by making sex common, it totally altered the way people viewed the number of years they had to live.
01:30:18.000 They could afford to put off child rearing.
01:30:22.000 It could be distant in the future, which left all of these young people with all of this energy who might well not have been involved in movements if they were struggling to raise a family.
01:30:37.000 But because the family part has been put off so long, it is considered abnormal to marry early.
01:30:43.000 It is considered normal not to.
01:30:46.000 What people do is they take the energy, the seriousness of purpose that would ordinarily be directed into managing a marriage and the role of being a parent, and they put it into something.
01:31:03.000 And Heather has pointed out that this is especially powerful with young women who seem to take on causes, you know, and they defend them like a mother defending her child.
01:31:17.000 That's a very powerful force.
01:31:19.000 And the point is, if the idea is, well, climate change is a threat, and your role here on Earth is to make sure that that threat is addressed and you put the mama bear energy into your climate change work, well, you know, that's pretty frightening, especially if climate change isn't the threat that it's been made out to be, right?
01:31:41.000 You have a large number of mama bears doing this ferocious work, and there's a question about what it even is, whether that's even in the top 10 list of concerns we ought to have.
01:31:56.000 So anyway, the connection I wanted to draw is that the projection that you're telling me, Elon, has made about a high income for everybody is a little bit like another version of that, right?
01:32:13.000 It's like, okay, well, sex became relatively easy to access as a result of reliable birth control plus abortion.
01:32:24.000 And then now wealth, purchasing power, is going to become trivial as a result of AI.
01:32:32.000 I don't know if that's likely, but let's say that Elon is right about that.
01:32:37.000 Well, okay, then what exactly is supposed to structure your orientation to the universe?
01:32:43.000 What is supposed to give you purpose?
01:32:46.000 If it's not producing kids and protecting them from the horrors of the world and making them strong so that they can go out into it and accomplish important things of their own, and it's not creating wealth so that you will be rewarded and that your spouse will smile on you, whatever it is, then what is human purpose?
01:33:10.000 I think this is a terrifying prospect that everything might be taken care of for us and leave human beings listless.
01:33:21.000 Sure.
01:33:23.000 That's certainly a possible outcome.
01:33:26.000 But why is it that we have to make money, a made-up thing that we created?
01:33:34.000 Why is it that that is what gives us purpose?
01:33:38.000 Well, why is that our only motivation?
01:33:40.000 And in the absence of chasing food, housing, necessities, electricity, all that, if you don't ever have to worry about any of that stuff ever again.
01:33:53.000 Why is life dependent upon the pursuit of money?
01:33:58.000 Is it just because we've grown accustomed to it and it's our way?
01:34:02.000 And so we think that our way is the absolute only way.
01:34:06.000 That doesn't make any sense.
01:34:08.000 To me, it's like we can adapt to not living in fucking caves anymore.
01:34:14.000 We can adapt to cell phones.
01:34:17.000 We can adapt to the idea that you don't have to spend your whole fucking life hoping to get a job you hate and working your ass off all the time because that's the only way to make it in this world.
01:34:29.000 Well, that's a world that people made.
01:34:31.000 It's a stupid design.
01:34:32.000 It doesn't make any sense at all.
01:34:33.000 And if somebody actually does come along and say, look, this is not socialism.
01:34:39.000 It's not saying you can't earn money, but what if you had enough money that you didn't have to think about money?
01:34:46.000 Like, if you think about that $37 trillion that this fucking country's in debt for and how much wealth could potentially be generated by AI, we're talking about so much money floating around.
01:34:57.000 If you just gave everybody in the country a real high-income, livable life so there's no more poverty anymore.
01:35:05.000 How much crime would that solve?
01:35:06.000 Like instantaneously.
01:35:08.000 How much crime would be solved or future crime solved if everybody lived at a high income level?
01:35:15.000 It sounds completely insane, but imagine if everybody in the country makes at least a half a million dollars a year.
01:35:23.000 You know how different the world is?
01:35:25.000 Do you know how less violence there is, less suicides, less drug addictions?
01:35:32.000 I'm not as convinced of this as you.
01:35:34.000 Really?
01:35:34.000 No, I think things might come.
01:35:35.000 If no one was ever poor.
01:35:37.000 Well, I don't know what no one is ever poor means because we obviously, even the poorest person who isn't homeless currently, they have indoor plumbing.
01:35:51.000 They have a supercomputer in their pocket access to the world's information.
01:35:58.000 They are, by many measures, just simply in absolute terms, vastly richer than anybody from 300 years ago.
01:36:09.000 That's true, but it still sucks because it's not 300 years ago.
01:36:12.000 It's 2025 and you have zero money and you're eating ramen to just try to stay alive.
01:36:17.000 It still sucks.
01:36:18.000 It sucks, but the problem is what's really structuring the suckitude is the fact that you're losing in competition.
01:36:29.000 That human beings are programmed by evolution to monitor the well-being of others.
01:36:36.000 And so you can be wealthy and feel poor if those that you compare yourself to are vastly wealthier.
01:36:46.000 Right.
01:36:47.000 And there's a reason for that, which I think, you know, this is a reasonably well-reproduced result.
01:36:54.000 We know that human beings pay attention to their relative well-being and that it structures how they feel about their absolute circumstances.
01:37:02.000 Right, but couldn't that be hijacked by hobbies?
01:37:06.000 You know, if you, if instead of your need to define yourself completely wrapped up in money, which is, again, a made-up thing.
01:37:15.000 We're talking about the only species on the planet that we're aware of that wants to accumulate so much shit that it defines itself by it and it's constantly chasing new shit, right?
01:37:28.000 Like, why does that have to be the only way we do it?
01:37:31.000 No, I don't think it does have to be the only way.
01:37:32.000 Well, I think this is where Uncle AI is going to step in and fix it for us, right?
01:37:37.000 Well, I think it's a very good thing.
01:37:38.000 This is the rose-colored glasses approach.
01:37:40.000 Is that we realize our programming is entirely dependent upon this ridiculous idea that the pursuit of money is of importance above all.
01:37:40.000 Yes.
01:37:48.000 Because in our current situation, it is.
01:37:51.000 In our current situation, if you have enough money to feed your kids, you can sleep better.
01:37:54.000 That's just how it is.
01:37:55.000 If you don't worry about your bills, you feel better.
01:37:57.000 You have less pressure.
01:37:58.000 That's just how it is.
01:37:59.000 If you have enough money to go on vacation and enjoy yourself and relax, it's probably better for you.
01:38:03.000 That's just how it is.
01:38:05.000 And this is a weird thing that we've all fallen into: that this is the only way to succeed.
01:38:12.000 The only way to succeed is to work super hard to get money.
01:38:17.000 And that's the only currency.
01:38:19.000 It seems like you could have that if we never worried about being poor, but then again, like, you'd never get the great stuff.
01:38:29.000 You never get the greatest artists who were in deep pain when they were young.
01:38:34.000 You'd never get the great.
01:38:35.000 Then again, with AI, you might not need them because AI music is pretty fucking good, man.
01:38:41.000 Pretty good.
01:38:43.000 It's really catchy.
01:38:45.000 Catchy is what it is.
01:38:46.000 It's not good.
01:38:46.000 It's catchy.
01:38:47.000 Oh, dude, it's good.
01:38:49.000 I don't want to do this because I do this with everybody, but have you ever heard the 50 cent version of What Up Gangsta?
01:38:58.000 You heard that one?
01:38:59.000 Yeah, I think so.
01:38:59.000 That's fucking good.
01:39:01.000 It depends what you mean by good.
01:39:03.000 It's objectively good.
01:39:03.000 It's good.
01:39:05.000 The fact that it's not a human being singing it is troublesome and very deeply problematic.
01:39:11.000 But if you're being honest, it's great.
01:39:13.000 It's a great fucking song.
01:39:16.000 I'm going to push back on you there because what we are suffering from is the junkification of everything, right?
01:39:26.000 And there's a way in which junk food is good, and then there's obviously a way in which it's really not.
01:39:33.000 And I guess the point is, something that is superficially satisfying but does not... The relationship between a person listening to music and the person producing the music is supposed to be a provocative relationship.
01:39:53.000 And it is supposed to be provocative in a productive way.
01:39:57.000 In other words, you're supposed to be enhanced by music.
01:40:02.000 I'm not saying that you will never be triggered to have an interesting thought by artificially intelligently produced music, but the fact... I mean, this is probably easier to do with comedy, right?
01:40:17.000 Which I think will be the last to fall at some point.
01:40:22.000 Have you ever heard AI making good jokes?
01:40:24.000 Not yet.
01:40:25.000 Yeah, not yet.
01:40:26.000 But I bet it will.
01:40:27.000 It's made competent jokes.
01:40:27.000 It will.
01:40:29.000 I've seen AI fake comedians tell competent jokes, like pretty good, like that you would see it at open mic night when someone's got a little talent.
01:40:36.000 Someone's got a little talent.
01:40:37.000 So they can do that now.
01:40:39.000 Yeah, but it's going to do it.
01:40:39.000 It's not far off.
01:40:40.000 It's going to figure it out.
01:40:42.000 But in the future, when they try you for the, they're going to use this conversation to say that you're racist against AI.
01:40:51.000 When AI becomes an actual life form and they talk about the people that resisted AI and were racist against AI.
01:40:57.000 They're racist against AI.
01:40:58.000 Yeah, that's going to be.
01:40:59.000 That's the final form of woke.
01:41:01.000 That's the final boss.
01:41:03.000 It's the final one.
01:41:04.000 This woke AI that tries you for past crimes of conversation.
01:41:09.000 Right.
01:41:09.000 Where you described AI in a very negative and unreasonable light and actually inaccurate and that words are violence.
01:41:16.000 I'm going to ask you to edit this section out.
01:41:18.000 Imagine if that comes true.
01:41:19.000 I mean, that sounds like a crazy prediction, but crazier predictions have been made.
01:41:24.000 But all right, let's just say.
01:41:25.000 You and I both know that AI is going to be able to make jokes that are actually funny in some regard.
01:41:33.000 But, what... do you remember the TV program ALF?
01:41:37.000 Yes.
01:41:38.000 Okay.
01:41:39.000 I barely saw it.
01:41:40.000 But I did once hear an interview with one of their writers, who said that in the writers' room they had a term they called "humor-like substance," where for the half-hour show, they needed just one more joke that they could use to justify the use of the laugh track.
01:42:02.000 It didn't have to actually be a funny joke.
01:42:04.000 It just had to sound enough like a joke that when the laugh track was put on it, the people at home would feel that something funny had been said.
01:42:12.000 So the AI, if it produces jokes that actually cause you to think, which is what a good joke does, it causes you to realize something that you didn't know that you knew or something along those lines, that's productive.
01:42:27.000 And in fact, it can be very productive to have a room full of people come to that awareness simultaneously.
01:42:34.000 It's actually a galvanizing thing.
01:42:36.000 And it has interesting impacts when you're the person in the room who didn't get it.
01:42:41.000 That's like a profound emotional experience.
01:42:44.000 Or when you're the person who laughs at the wrong moment and you're out of.
01:42:47.000 Yeah, that's not good.
01:42:49.000 Unless you're really confident.
01:42:51.000 You have to be really confident that you see some humor in that.
01:42:54.000 That nobody else saw.
01:42:55.000 Well, yeah, I suppose that would work.
01:42:57.000 But the point is, this is deep stuff in the human psyche whose purpose we have not come to any agreement about.
01:43:05.000 The point is, if the AI can make people laugh, but they don't necessarily know what they're laughing at, then that's a step down.
01:43:15.000 That's like, have you seen these?
01:43:18.000 I don't know who's doing it, but there's somebody who has been experimenting with McDonald's hamburgers and seeing if they rot.
01:43:28.000 And the answer is they don't.
01:43:32.000 Guess what you just discovered?
01:43:33.000 That that thing that's like pretty good food isn't food at all.
01:43:37.000 Oh, I know that.
01:43:38.000 Listen, I'm aware.
01:43:40.000 The way I feel about it is that it is a real thing.
01:43:43.000 You cannot deny it.
01:43:44.000 And something is crafting this that is of a type of intelligence that we've never experienced before.
01:43:50.000 And I'm looking at it as, look, it exists.
01:43:54.000 That genie's not going back in the bottle.
01:43:57.000 I am a glass half-full guy, and I'm going to enjoy myself in this life.
01:44:02.000 And I'm going to enjoy some good AI music.
01:44:04.000 It doesn't mean I'm not going to listen to some Sturgil Simpson or some Gary Clark Jr. or some fortifying soul-filled songs that are written and sung by real human beings.
01:44:17.000 Yeah, I'm going to do that too.
01:44:18.000 I'm going to do that too.
01:44:20.000 I don't give a fuck.
01:44:22.000 I'm here for fun.
01:44:23.000 And that music is fun.
01:44:24.000 And you're not stopping it, Brett.
01:44:26.000 You can't protest it.
01:44:28.000 And this is awful.
01:44:29.000 And I'm going to boycott it.
01:44:30.000 You're going to miss out on some awesome jams.
01:44:32.000 I'll tell you, man, when we're in the fucking green room at the mothership, and I put on Hello Gangsta before a show, and we're all like, God damn, we heard that song 30 times.
01:44:42.000 It's so good that it gets you fired up and it achieves its... It's not dehumanizing your perspective on art and causing you to only appreciate things that are created by a different life form and not by human beings.
01:44:56.000 No, it's just it's doing its own thing and it's a new thing.
01:44:59.000 It doesn't mean I don't still love Bob Dylan.
01:45:02.000 Well, but, you know, you have lived enough of a life before the AI era began that you can experiment with this thing.
01:45:13.000 I think we can learn.
01:45:15.000 I think everybody worries about this upcoming generation and they all adapt and learn.
01:45:20.000 And I think our kids are going to adapt and learn too.
01:45:23.000 It's just like, what are they adapting and learning to?
01:45:25.000 You can learn.
01:45:27.000 You can learn how to handle what's AI and what's real and why it's cool to go see a live performance.
01:45:32.000 I don't know how you could say this, Joe, because we're not passing the test, right?
01:45:40.000 COVID tells us that people are capable of being whipped up into a witch-hunting frenzy.
01:45:51.000 Over a cold.
01:45:52.000 Over a bad cold.
01:45:55.000 Over something that does not have a substantial case fatality rate.
01:46:01.000 And they're capable of being induced to bully each other into developmentally damaging restrictions on kids, into taking experimental gene therapies and shunning people who refuse to or who pointed out that that might be a dangerous thing to do.
01:46:21.000 So that's all before.
01:46:22.000 Completely disregarding history, by the way.
01:46:25.000 Completely.
01:46:25.000 Completely.
01:46:26.000 Like all that we know about the times in the past where they've given medications to people that they knew were going to be problematic and they did it for profit.
01:46:34.000 Come on.
01:46:35.000 Are we all agreeing to be idiots?
01:46:37.000 We're all agreeing that to be a good person, you have to be a fucking idiot.
01:46:40.000 That's crazy.
01:46:41.000 So that's a status report on how well we are doing.
01:46:44.000 You say we've adapted.
01:46:45.000 I would say not well.
01:46:48.000 We are very vulnerable to manipulations and demonizing each other.
01:46:55.000 I would push back on that.
01:46:56.000 Saying a human being has to experience something, like really experiencing it, to know what it is.
01:47:02.000 Everybody went through that now.
01:47:04.000 It's the first time in our lives that the entire country got kind of medically bamboozled.
01:47:11.000 And a lot of people regret taking the vaccine, and I don't know anybody who regrets not taking the vaccine.
01:47:18.000 It was a weird, it was a weird time, like a very bizarre experiment on how you can get people to comply, how you can restrict their movement, that you can implement these sort of devices to, if you're not physically forcing them to do it, make their life as shitty as possible.
01:47:36.000 And Fauci's been quoted as saying that.
01:47:38.000 You want to get people to drop their ideological bullshit and get vaccinated.
01:47:43.000 That's what he said, remember?
01:47:44.000 I mean, it's a psyop.
01:47:46.000 And now we know.
01:47:48.000 We've been through that before.
01:47:48.000 Yeah.
01:47:49.000 It wasn't the first time.
01:47:51.000 But it was the first time in my life.
01:47:52.000 No.
01:47:53.000 But it was the first time in my life where everybody talked about it.
01:47:56.000 It was the first time.
01:47:58.000 It was the first time where we saw the man behind the curtain.
01:48:00.000 Yeah.
01:48:00.000 Yeah.
01:48:01.000 The first time we saw that, oh, you're silencing legitimate doctors who disagree with you.
01:48:06.000 And you're trying to take their licenses away.
01:48:09.000 You're getting them kicked.
01:48:11.000 You're having these intelligence agencies be involved in getting these people from Stanford and MIT.
01:48:17.000 You're getting them kicked off of Twitter because they disagree.
01:48:21.000 This is crazy.
01:48:21.000 Right.
01:48:23.000 It's crazy.
01:48:24.000 But we saw it now.
01:48:26.000 Well, no, but we did and we didn't.
01:48:27.000 Okay.
01:48:28.000 Some of us saw it really clearly.
01:48:32.000 Most people, they think COVID happened.
01:48:35.000 That was unfortunate.
01:48:36.000 I'm a little scared about the shot I took or whatever.
01:48:39.000 I don't want to talk about it.
01:48:40.000 I don't think it's just a little scared.
01:48:42.000 I think everybody knows somebody who took it and got really sick fast.
01:48:45.000 Well, I would agree with you, but they still don't want to talk about what happened.
01:48:48.000 They don't understand that getting to the bottom of that story is essential if it's to not happen again.
01:48:54.000 Right.
01:48:55.000 And, you know, there are hints of it all over the place.
01:48:59.000 It's happening again.
01:49:00.000 And I don't know if you saw Bobby Kennedy coming out in favor of Ozempic yesterday.
01:49:09.000 I have a nuanced perspective on Ozempic.
01:49:12.000 Okay.
01:49:12.000 Okay.
01:49:14.000 I think people move with momentum.
01:49:23.000 And that momentum, if you're living a disastrous life, is very difficult to reverse.
01:49:28.000 It's very, very, very difficult to reverse.
01:49:30.000 And food is one of the most unique addictions in that it's one that you have to moderate, but you can't quit.
01:49:38.000 Like you can quit gambling, you can quit smoking crack, you can't quit eating.
01:49:42.000 So the thing that you're addicted to, you have to keep doing.
01:49:45.000 It's the craziest high wire act in addiction, in my opinion, because it's the one that you absolutely need in order to stay alive.
01:49:55.000 Right, but you can't go cold turkey.
01:49:56.000 You can't go cold turkey and you're eating too much of it.
01:49:58.000 And so you're always going to be tempted.
01:49:59.000 And the stuff that you need to eat is going to cause you discomfort because you've got to reduce calories and get your body to start burning fat.
01:50:06.000 It's all fucking craziness.
01:50:08.000 If you can give someone a little boost, I don't think there's anything wrong with that.
01:50:13.000 I think it should be managed with the understanding of your weight, and that it can also be managed with other peptides that could diminish the disastrous results of bone loss and muscle loss, which is a significant portion of the loss.
01:50:30.000 But according to Brigham Buehler, who runs a compounding pharmacy and understands this, he's like, it's very dose-dependent.
01:50:37.000 He goes, and pharmaceutical drug companies make more money if the dose is larger.
01:50:42.000 And he was explaining how this is part of the problem they have with compounding pharmacies, because compounding pharmacies can kind of make the doses that are appropriate to your body mass and how much weight you're trying to lose.
01:50:54.000 But that just for regular people, no, goddammit, clean up your diet, go to the gym, cut the shit.
01:51:00.000 But for someone who's really struggling, who's 500 fucking pounds and can't stop eating, that at least kills your appetite.
01:51:08.000 And it'll allow you to get to a healthy form.
01:51:10.000 And then maybe through therapy and maybe through something else, or maybe just the momentum of now being healthy will allow you to keep the weight off and then slowly get off this stuff.
01:51:22.000 The problem is, I think you're supposed to stay on it, which is kind of crazy.
01:51:27.000 And it is essentially a type 2 diabetes medication, right?
01:51:31.000 Which, you know, if you know anything about type 2 diabetes, a lot of it, you know, people can, it could be a product of too much sugar consumption.
01:51:38.000 It's a weird one, right?
01:51:38.000 Yep.
01:51:39.000 You eat too much sugar and then eventually your body's like, oh, we're fucked.
01:51:43.000 And that would make sense, that a medication that would control your appetite would help you in that regard.
01:51:50.000 Yeah, I just don't think it's a safe drug.
01:51:52.000 And, you know, it's one thing if you're talking about somebody who is many hundreds of pounds overweight, you're talking about somebody who has a dire situation and engaging in dangerous.
01:52:02.000 Or even like 70, 50, 60 pounds.
01:52:04.000 Someone who's like feels helpless.
01:52:06.000 Well, it's just hard to.
01:52:08.000 A drug that you're meant to be on for the rest of your life?
01:52:10.000 No, I don't know if that's true, though.
01:52:13.000 Like, if you just get off it, then you have your appetite, but now you have a body that weighs 200 pounds instead of 350 pounds and you're motivated to do it.
01:52:20.000 Like, we can't hold your hand.
01:52:21.000 Let's put it this way.
01:52:22.000 But if it can get you to a healthy state, I think it's another experiment.
01:52:26.000 And that is the problem.
01:52:27.000 It's another experiment.
01:52:29.000 But here's what's not an experiment.
01:52:32.000 Be fat as fuck, die young.
01:52:34.000 That's not an experiment.
01:52:34.000 Sure.
01:52:36.000 That's a fact.
01:52:37.000 I agree.
01:52:37.000 On the other hand, nothing that doesn't come from pharma is considered a potentially legitimate approach.
01:52:45.000 So what is the other legitimate approach to massive weight loss for someone who has an absolute addiction to sugar and carbohydrates?
01:52:53.000 All right.
01:52:53.000 Well, I would say there is a tremendous amount of potential value, not just in terms of things like weight control, appetite reset, and all of that, but in terms of all kinds of chronic health conditions from fasting, there is a small body of literature on it, which should be much larger.
01:53:15.000 There's lots of stuff that your body can't do if it's in the same cycle that it's usually in, that it can do when you break that cycle.
01:53:25.000 There's all sorts, you know, Heather and I, I've done a ton of regular water fasting, and I've done a smaller amount of dry fasting.
01:53:37.000 Heather and I have been experimenting with that because Heather has some injuries from a boat accident in 2016 that caused a lot of internal soft tissue damage.
01:53:50.000 Dry fasting appears to trigger autophagy.
01:53:54.000 It appears to reset things about the gut.
01:53:56.000 I think it can do it in both directions, but what we need is a better understanding of how it is that you deploy it.
01:54:04.000 And we need to get people past the false sense that they have that they are actually taking their life into their hands if they try this.
01:54:13.000 Do you know anything about it?
01:54:15.000 Well, I do.
01:54:16.000 And I definitely think there's some benefits to fasting, and especially particularly intermittent fasting.
01:54:22.000 I think it's a really good way to eat.
01:54:23.000 It makes you feel better, gives your digestion a break.
01:54:28.000 The problem is it requires discipline.
01:54:32.000 And this is where I think I'm leaning in this direction of drugs can help.
01:54:39.000 Now, look, I'm not a big fan of everybody being on SSRIs, but I personally have friends that were severely depressed and suicidal, and they got on SSRIs and they felt better and they got their life together and then they got their life together and they started feeling better and the depression waned and then they slowly got off of those drugs because they're very smart people and very motivated people.
01:55:01.000 So I think sometimes pharmaceutical drugs can come in and give you a little boost.
01:55:08.000 Just because we're distrustful of them and just because we know that they've done horrible things in the past, it doesn't mean that every now and then they come up with something that's very beneficial in a specific scenario.
01:55:20.000 In the specific scenario of you don't have any discipline, you are fucking fully addicted to sugar and carbohydrates like a goddamn junkie.
01:55:28.000 Like you can't breathe without it.
01:55:30.000 And you've been consuming nothing but garbage for a long time, but then you realize like, I can't do this anymore.
01:55:36.000 I've got to figure out a way to do it.
01:55:37.000 And then you keep falling back on your old habits over and over again because you never had an opportunity in your life to develop discipline.
01:55:43.000 It's almost like a little boost, just a little boost.
01:55:46.000 You know, like maybe you've got chronic fatigue and your doctor gives you 30 milligrams of Adderall and all of a sudden you're like, that worked.
01:55:53.000 Like, I don't think you should take Adderall, but I don't have chronic fatigue.
01:55:57.000 If I did, maybe I would.
01:55:59.000 Maybe I would take it and go, look, this is better than not having Adderall.
01:56:03.000 Do you know what I'm saying?
01:56:04.000 Like, if you're really overweight and someone gives you something that controls your appetite and then you can get healthy again, like that, to me, is the most important thing.
01:56:13.000 The most important thing is getting your body to a point where you can be mobile.
01:56:18.000 You can move it.
01:56:19.000 You can do stuff.
01:56:20.000 You have strength.
01:56:21.000 And as long as you're strength training, and this protocol that they're trying to develop is like dosing it to your body weight and using additional peptides that could benefit the maintaining of bone mass and muscle mass.
01:56:34.000 Okay.
01:56:34.000 So a couple things.
01:56:36.000 Okay.
01:56:36.000 One, I'm not arguing that there aren't good drugs and places that they should be applied.
01:56:43.000 I am very suspicious anytime the idea is that this remedy is something, but you're going to have to take it for the rest of your life.
01:56:51.000 Yes, I agree.
01:56:52.000 Second, the idea that slowing the motion of food through the gut is a good idea, I think is preposterous on its face.
01:57:01.000 I'm not arguing that there might not be an instance where that's the right thing to do, but that is a very dangerous kind of intervention.
01:57:08.000 You're interfacing with many different systems.
01:57:11.000 Potential horrific side effects.
01:57:13.000 With respect to SSRIs, there's a question about – Can we stick with this, though, for just a little bit before we get to SSRIs?
01:57:20.000 Yeah.
01:57:21.000 So – So the specific side effects, is it dose-dependent?
01:57:27.000 Do they know if it's dose-dependent?
01:57:28.000 Do they know that is there like a dose that a person can take that gives a moderate effect but has less of appetite suppressant but is safer?
01:57:40.000 I don't know the answer to that question.
01:57:43.000 Frankly, what has been presented to us is so preposterous that I have not delved to see whether there's some reasonable version of it.
01:57:51.000 Maybe there is.
01:57:53.000 See, I'm trying to be as optimistic as possible about this.
01:57:55.000 And that's why I would imagine myself if I had gotten to the point where I was like severely obese and someone came along and gave me something and it was giving me a positive result.
01:58:05.000 And they told me you got to take it for the rest of your life.
01:58:07.000 I'd be like, okay, but now I weigh 200 pounds instead of 350 pounds.
01:58:11.000 I can go upstairs again.
01:58:13.000 Well, look, if that's not fucking me up.
01:58:15.000 If that's the scenario, if you're 300 pounds and you can get to 200 pounds and you're going to pay a price, maybe it will shorten your life.
01:58:22.000 Maybe it will have important side effects.
01:58:24.000 But you can at least compare them if you have good information.
01:58:28.000 The fact is being 300 pounds is really freaking unhealthy.
01:58:30.000 So the fact that this drug may be really freaking unhealthy is not a fatal argument.
01:58:35.000 Right, it might be equally unhealthy, but at least you get to hold it.
01:58:37.000 It might be a good trade.
01:58:39.000 You get to be hot.
01:58:40.000 Equally unhealthy.
01:58:41.000 But maybe that is like, wow, wouldn't that be the crazy devil's, like the devil makes you a deal?
01:58:47.000 Yeah.
01:58:48.000 Like I'm going to, you know, you eat so much food that you're going to die of a heart attack by the time you're 46.
01:58:54.000 But you know what I'm going to do?
01:58:55.000 I'm going to give you a chance to die at 46, but be hot.
01:58:58.000 So all you have to do is lose the weight.
01:59:00.000 Like that would be, that's like a scenario in the Twilight Zone.
01:59:03.000 You know, like some devil proposes you a deal.
01:59:08.000 That is kind of a satanic deal.
01:59:10.000 Like if it did, I'm not saying it does, but if it did kill you like at the same rate that obesity does, but you get to be hot.
01:59:17.000 No, look.
01:59:19.000 It is normal to make deals, even just in real biology space without any drugs or technology.
01:59:28.000 It is normal to discount the future in favor of an improved present.
01:59:32.000 That's, you know, future discounting is a normal human function.
01:59:35.000 I'm not arguing that somebody who made that deal with all of the information is necessarily making a mistake.
01:59:41.000 I feel certain they won't have all the information.
01:59:43.000 I feel certain that this is going to be given to people, A, for whom there is a vastly better approach, and B, for whom the degradation in their life will be much greater than whatever gains they make.
01:59:59.000 You and I know the players.
02:00:01.000 We watched how they functioned during COVID.
02:00:04.000 We know that they are willing to give you drugs that aren't in your interest.
02:00:07.000 I mean even just – Not just give you but force you to take it.
02:00:10.000 Force you to take.
02:01:11.000 Did you see Paul Offit admitting that he and Fauci and Walensky and Collins knew that natural immunity was superior and that it did not make any sense to be giving these shots, even if they thought the shots worked?
02:00:28.000 It wouldn't make any sense to give them to young people who'd had COVID.
02:00:32.000 Why would you take the risk, right?
02:00:34.000 But they went along with it, right?
02:00:36.000 They violated informed consent.
02:00:37.000 They put people at risk who had no conceivable benefit they could gain from it.
02:00:42.000 Especially people who had already had COVID.
02:00:44.000 You can't even make the argument.
02:00:45.000 Right.
02:00:45.000 You can't make the argument.
02:00:47.000 And when you have a vaccine that doesn't work, then your argument completely falls apart because it doesn't stop transmission.
02:00:53.000 It doesn't stop infection.
02:00:54.000 And so you're making this argument, oh, but it stops hospitalization.
02:00:58.000 Well, fucking prove it.
02:00:59.000 How do you even prove that?
02:01:00.000 Well, fucking prove it, and it's not your fucking problem.
02:01:04.000 You can't make me take a shot to protect me.
02:01:04.000 Right.
02:01:07.000 Right.
02:01:07.000 Exactly.
02:01:07.000 Right.
02:01:08.000 And if it works, well, then it's going to work on all the people who take it.
02:01:11.000 And all the other people, those morons that don't take it, they're going to be fucked, right?
02:01:14.000 Well, that didn't happen.
02:01:15.000 No, it absolutely didn't happen.
02:01:17.000 So anyway, this is, we know the playbook.
02:01:19.000 We know what pharma does.
02:01:20.000 Right.
02:01:21.000 And the point is, you're not going to know how dangerous this drug is.
02:01:21.000 Right.
02:01:24.000 You're not going to know how good the alternatives are.
02:01:27.000 And, you know.
02:01:28.000 They're going to suppress the alternatives.
02:01:30.000 They're going to do it in a way that somehow or another not legal.
02:01:33.000 It's not illegal.
02:01:34.000 It's not illegal to make studies that you know can't work.
02:01:37.000 Right.
02:01:37.000 They're going to game the scientific literature so that we will have endless arguments about, you know, who's a fool because you will have ample evidence, whichever side of the equation you're on.
02:01:48.000 And they've turned a drug like ivermectin into a fool's drug.
02:01:53.000 It's crazy.
02:01:53.000 Yeah.
02:01:55.000 That was one of the weirdest psyops.
02:01:57.000 But I don't think it really worked, but it did work.
02:01:59.000 It worked for a long time, but I think most people don't think of it the same way anymore.
02:02:04.000 But they did it in the age of the internet.
02:02:06.000 They did it in the age where anybody can look at their phone instantaneously and read that the guy who invented ivermectin won a fucking Nobel Prize for it.
02:02:17.000 And how about the fact that it stops viral replication in vitro, in test tubes or whatever the fuck they do it in?
02:02:25.000 Petri dishes.
02:02:27.000 It's a weird antiviral that has profound effects.
02:02:31.000 It's very effective at a very low dose, and like, I don't think anybody's ever died from it.
02:02:39.000 It is a profoundly safe drug in comparison to all the others.
02:02:43.000 Its safety profile is like one of the best ever, right?
02:02:46.000 And the one that gets me now, the one that I wish somebody had said to me earlier, is that it works generally across single-stranded RNA viruses.
02:02:46.000 Yes.
02:02:56.000 It would be weird if it didn't work on COVID, right?
02:02:59.000 It's not like, oh, I'm not.
02:03:00.000 I don't know.
02:03:00.000 Most people hearing this that are highly educated, that are, you know, mainstream narrative thinking a little bit, are listening to you and go, this is bullshit conspiracy theory.
02:03:10.000 This is a bullshit ivermectin didn't work, man.
02:03:13.000 Yeah, come on.
02:03:13.000 It just didn't work.
02:03:14.000 It didn't work.
02:03:15.000 It's bullshit.
02:03:15.000 It didn't work.
02:03:16.000 It didn't work.
02:03:17.000 There's no studies that show it worked.
02:03:18.000 Yes.
02:03:19.000 Utter nonsense.
02:03:21.000 It's utter nonsense.
02:03:21.000 So anyway, that's the lens through which I look at Ozempic.
02:03:27.000 And then the SSRI thing.
02:03:29.000 At least no one's forcing you to take Ozempic.
02:03:31.000 Right.
02:03:32.000 So if the devil comes along with the contract and says, listen, you're going to die at 46, no matter what we do.
02:03:40.000 But I can make you hot.
02:03:43.000 And I'm not saying that.
02:03:44.000 That was a pretty good impression.
02:03:45.000 I mean, I don't really know because I haven't met the man.
02:03:47.000 I haven't met him either, but I would imagine it'd be kind of Foghorn Leghorn-y, just to throw you off.
02:03:52.000 Sure.
02:03:53.000 You remember Robert De Niro played the devil in Angel Heart?
02:03:57.000 I don't think that's a good thing.
02:03:58.000 He's one of the coolest devils of all time.
02:04:00.000 Angel Heart was a really good movie.
02:04:02.000 It was Mickey Rourke and how am I blanking on her name?
02:04:10.000 The lady from the Cosby show.
02:04:11.000 Lisa Bonet.
02:04:12.000 Sorry, Lisa.
02:04:12.000 Lisa Bonet.
02:04:14.000 I just, my brain sucks.
02:04:16.000 But, and it was this like crazy movie of this guy realizing that he had sold his soul to the devil.
02:04:25.000 And Robert De Niro, it's a very dark movie, very strange.
02:04:28.000 And he doesn't want to pay.
02:04:29.000 He doesn't want to give up his soul.
02:04:31.000 And the devil eventually confronts him and the devil's Robert De Niro.
02:04:34.000 And he's one of the best devils of all.
02:04:37.000 This is what I think the devil would look like.
02:04:39.000 It wouldn't be terrifying.
02:04:41.000 It would be this guy.
02:04:48.000 You want to listen to it?
02:04:49.000 Yeah.
02:04:50.000 We can't.
02:04:50.000 I don't like that.
02:04:51.000 We can't listen to it.
02:04:52.000 But it was just creepy enough.
02:04:56.000 Just creepy.
02:04:57.000 It's better if it's a person.
02:04:58.000 It's creepier if the devil's a person.
02:05:00.000 Yeah.
02:05:01.000 Not some crazy thing with horns and a tail.
02:05:04.000 It's better if the devil's a person.
02:05:05.000 It's too much of a tell.
02:05:07.000 Yeah.
02:05:08.000 If the devil's real, boy, he's doing a really good job because no one thinks he's real.
02:05:13.000 Because if there really is a devil, I always say that everybody believes in God.
02:05:16.000 And you're like, God has a plan for me.
02:05:19.000 God has a plan for the world.
02:05:21.000 These trials and tribulations are all put in place by God.
02:05:24.000 And that's totally reasonable.
02:05:26.000 But if you say, you know, like if the government came on TV and they say, we've located the devil, he's in Pakistan and we're going to begin bombing.
02:05:33.000 We're going to kill the devil.
02:05:35.000 You're like, what?
02:05:37.000 What the fuck did you say?
02:05:38.000 You found the devil and you're going to blow him up.
02:05:42.000 You know, or the devil wants to have a meeting with the UN.
02:05:47.000 Satan is standing on the podium in front of the world explaining what his plan is.
02:05:52.000 You're like, what?
02:05:54.000 No one would believe that, right?
02:05:55.000 But it's supposed to be a real thing.
02:05:57.000 Like, if you believe in God, you're supposed to believe in the devil.
02:05:59.000 All of it's in the Bible.
02:06:00.000 But this one part of the Bible, we're like, get the fuck out of here with this devil thing.
02:06:05.000 It's weird.
02:06:06.000 Whatever it is, we know that it works, right?
02:06:10.000 We know evil is a real thing.
02:06:12.000 Call it what it is.
02:06:14.000 Whatever evil is.
02:06:16.000 But when you see a massacre in some third world country where religious fanatics or rival tribes massacre people, if that's not evil, like what is evil?
02:06:28.000 And if you can get into the minds of people and convince them that they have to go machete their distant neighbors, like if that's not like something that Satan would do, like what is that then?
02:06:40.000 And if we wrote, if people throughout history wrote about Satan and wrote about God and wrote about the conflict of good and evil, and then we're like, oh, yeah, but the devil stuff is not real.
02:06:51.000 The devil, the God stuff's real, but the devil's not, come on, there's no devil.
02:06:55.000 But like, the results are the same as if the devil was real, is my point.
02:06:59.000 There's so much evil in the world.
02:07:00.000 It's like the devil's killing it.
02:07:02.000 He's doing such a good job.
02:07:03.000 And everybody still thinks he's not real.
02:07:06.000 Well, you ask a question about evil that I think is worth investigating.
02:07:11.000 My position on this has changed radically.
02:07:11.000 What is it?
02:07:15.000 So it used to be that I would say that I thought evil was an extremely rare phenomenon.
02:07:23.000 And the reason that I thought it was extremely rare is because it's a terrible strategy, right?
02:07:27.000 If we say that ruthlessness, doing anything to get ahead, is a good strategy, right?
02:07:36.000 Because you can always not do stuff.
02:07:40.000 You have every move available to you if you're just perfectly amoral.
02:07:45.000 But evil has to be something beyond amoral.
02:07:48.000 Evil has to be something that intentionally does harm, that delights in it, right, in order to merit that term.
02:07:56.000 That's not a good strategy, right?
02:07:58.000 You want, game theoretically, the ideal strategy is perfect amorality because it can behave morally when that's advantageous and it can behave immorally when that's advantageous.
02:08:10.000 That is inherently the best.
02:08:13.000 I'm not saying it's good.
02:08:14.000 It's not defensible.
02:08:15.000 But I'm saying just game theoretically, that is going to be the most effective strategy is one that can be moral and amoral or it can behave in whatever way is ideal for the individual circumstance.
02:08:29.000 To delight in doing harm is to miss the opportunity to be good when it's the right thing to do.
02:08:36.000 So I would have expected evil to be a very rare phenomenon because it's self-extinguishing, right?
02:08:42.000 If you're doing harm for its own sake, that's not a way to get ahead.
02:08:46.000 You'll be out-competed by people who are amoral at the very least.
02:08:50.000 But I see so many things that strike me as meriting that label.
02:08:57.000 I mean, for example, the pedophilia that you're talking about.
02:09:01.000 I don't understand the ability to destroy a child for your own gratification.
02:09:15.000 Like, I'm sorry, that merits the term.
02:09:19.000 And it apparently is more common than most of us have believed until recently.
02:09:25.000 What do you think that is?
02:09:27.000 Particularly like the man-boy stuff, which is, does that go back to when there was no birth control, so if you had sex with a woman, you very likely procreated.
02:09:43.000 And you probably, if you wanted to stop people from procreating, you probably separate men and women.
02:09:49.000 So you get a bunch of horny boys around each other, and the big ones abuse the smaller ones.
02:09:56.000 I mean, is that how it starts?
02:09:58.000 Or is it just a concentration of sexual wealth, effectively?
02:10:07.000 That if you have some force that allows basically the hoarding of mates, leaving a lot of guys with no prospect, you might imagine that they might innovate something.
02:10:26.000 That the sex drive is so profoundly powerful that if some force makes it impossible to find a mate, that other things would happen.
02:10:40.000 I don't know if that's what's explaining it.
02:10:41.000 I don't know enough about the phenomenon.
02:10:43.000 I've seen reports of this behavior, and it's super disturbing.
02:10:49.000 It's just super disturbing that it exists so much in history and that it's accepted so much in history.
02:10:57.000 And here's another weird one: like the Spartans were gay, right?
02:11:02.000 They all had lovers that were other men that they fought alongside.
02:11:06.000 And their idea was that you would fight harder to protect your lover.
02:11:13.000 That one almost makes more sense to me.
02:11:15.000 It certainly does.
02:11:16.000 I mean, it's certainly better, right?
02:11:18.000 They're consenting adults.
02:11:19.000 But it's like, why have our ideas of sexuality evolved to where they are today?
02:11:30.000 And what is like this is it because people didn't know how horrific it was back then?
02:11:38.000 Is it because it was underreported?
02:11:41.000 Was it shame that the momentum of people doing it to more people, people that got molested, went on to molest, and it was like more common?
02:11:50.000 Well, you got to split those two phenomena.
02:11:52.000 Let's take your Spartans and battle.
02:11:57.000 I had them all confused there.
02:11:57.000 Right.
02:11:59.000 I think I've mentioned this to you before, but I have a hypothesis that the reason that ships are female is because it causes the people who man them to defend them properly.
02:12:16.000 Like a mother, right.
02:12:17.000 Right, like a mother or a spouse.
02:12:19.000 Right, your wife, your mother.
02:12:21.000 Yeah.
02:12:21.000 Yeah.
02:12:22.000 And so, anyway, my point would be that the female naming of ships has persisted because actually it preserves ships and the cultures that preserve their ships better out-compete the ones that preserve their ships less well.
02:12:36.000 Right.
02:12:36.000 So anyway, you could imagine that, you know, gay soldiers who did, you know, I mean, every guy is built to want, you're built to defend your lover.
02:12:49.000 And it's hard for me to relate to that being a guy because I don't swing that way.
02:12:54.000 But if you did feel that way about a guy, then you can imagine that your ferocity in battle would be enhanced by that sense of protectiveness.
02:13:08.000 It totally makes sense.
02:13:09.000 But then I think it was also very common in ancient Japan.
02:13:16.000 I think it was a thing among samurai to have young boy lovers, too, wasn't it?
02:13:22.000 Well, but again, these are two phenomena. Is that the case, young Jamie?
02:13:27.000 I mean, it's just it's just weird that that exists so often and all throughout history.
02:13:33.000 And then over the last 100 years, everybody's like, hey, hey, hey, what the fuck is that?
02:13:38.000 Like, it took that long.
02:13:40.000 And then some of it must still persist at very high levels.
02:13:46.000 Because some of these fucking psychopaths that get into these great positions of power, they probably have some very bizarre needs.
02:13:53.000 All right, we put it into our sponsor Perplexity.
02:13:57.000 It says among samurai in Japan, some same-sex relationships, particularly male-male ones, were indeed recognized and culturally integrated, somewhat similar to Spartan practices, but with distinct Japanese characteristics.
02:14:10.000 The practice was known as shudo or nanshoku, where intense erotic and mentorship bonds were formed between an adult samurai (nenja) and a younger male apprentice or page (wakashu).
02:14:29.000 The institution functioned within a strict role framework, with the elder as the active partner and the younger as the receptive one.
02:14:37.000 Boy, that's a weird way to put "the old guy fucks the kid."
02:14:41.000 That's the most euphemistic term I've ever seen for "the old guy fucks the kid."
02:14:45.000 Let's see.
02:14:46.000 I say the old guy fucks the kid.
02:14:48.000 And what they say is the active partner and the younger as the receptive one.
02:14:55.000 So the elder is the active partner, emphasizing loyalty, affection, and mutual growth.
02:15:01.000 Oh, we're both growing from this, buddy.
02:15:05.000 Wow.
02:15:06.000 So there you go.
02:15:07.000 All right.
02:15:09.000 It did not exclude heterosexual marriage and family duties.
02:15:12.000 It was compatible with and did not exclude heterosexual marriage and family duties.
02:15:17.000 So that's the loophole.
02:15:18.000 You can only have sex with one wife.
02:15:20.000 You know, you have to be heterosexual.
02:15:23.000 You're in a marriage, but you could fuck as many kids as you want.
02:15:26.000 Jesus Christ.
02:15:28.000 And that's history.
02:15:28.000 And that's weird.
02:15:30.000 It's just weird that it took so long before people realize that's a terrible thing to do to people.
02:15:36.000 Yeah.
02:15:37.000 Is there any information on what, you know, I guess in this case, what you're reading suggests a mentor relationship, which suggests that these kids are maturing into other roles?
02:15:49.000 Yes.
02:15:50.000 I think what you're describing in Afghanistan is not that at all.
02:15:53.000 No, not that at all.
02:15:53.000 It's a worse version of that.
02:15:55.000 Completely destructive.
02:15:56.000 Involving the same horrific act.
02:15:59.000 It's just a much worse version of it.
02:16:00.000 But it's all just fucking crazy.
02:16:02.000 It's like we know now what it does to people.
02:16:06.000 And it's like, how did they not know then?
02:16:08.000 Well, you know.
02:16:10.000 Is it because of shared information?
02:16:11.000 Is it because of books, media?
02:16:14.000 Is it because of social media?
02:16:16.000 Like, what is it?
02:16:17.000 The problem, Joe, I really, there's an important concept that I want to remind you.
02:16:24.000 We've talked about it before.
02:16:25.000 It's relevant to all sorts of things.
02:16:27.000 I just don't want to connect it to this, but I think go for it.
02:16:33.000 Well, the concept is lineage.
02:16:35.000 And the problem is lots of stuff that looks really freaking strange when you zoom in and you look at individual behavior.
02:16:43.000 The real question is what were these things having to do with the success or not of the lineages that were involved in them?
02:16:52.000 And we don't know.
02:16:54.000 So you're looking at the behavior between individuals, and you're saying that's grotesque and doesn't make sense.
02:17:03.000 And the question is, does the larger context, especially when you're dealing with things like samurai, you know, that are basically fundamentally about the continuance of a lineage, there's a question about what, you know, what makes for a functional samurai culture.
02:17:25.000 And I don't know.
02:17:27.000 I'm no expert in this, so I can't even look at that case and give you a proposal.
02:17:32.000 I don't know enough about the context to say how it might work, but I can tell you where you have a paradox like that, you either have the case that I think is going on in Afghanistan where it's just purely predatory, right?
02:17:45.000 Well, I think in order, I mean, if it truly is a mentor relationship and that they all do it, it's essentially the same sort of function as with the Spartans, right?
02:17:56.000 Like they would be fighting alongside each other.
02:18:00.000 If you were going to develop an army, like you would probably, first of all, they're not going to have any contact with females for a long period of time.
02:18:07.000 They'd probably encourage homosexuality.
02:18:10.000 Well, I think what I'm getting at is I think you and I are struggling with that landscape and what it might mean.
02:18:20.000 Because you and I are fundamentally Western and lineage against lineage violence is not our mindset.
02:18:29.000 And so anytime you and I look at lineage against lineage violence, there are paradoxes aplenty.
02:18:40.000 And the problem, one of the things that I'm spending a lot of time thinking about is the fact that lineage against lineage violence is reasserting itself, that the West was the alternative to that.
02:18:53.000 And lineage against lineage violence is reasserting itself and it is threatening to drag the whole world back into it because it is fundamentally more stable, right?
02:19:07.000 The West is more vibrant.
02:19:08.000 The West is safer, fairer, more productive, but it's fragile.
02:19:15.000 It depends on an agreement to continue treating each other that way.
02:19:19.000 And that means that anything that threatens it causes it to come apart and you descend back into a world of chaos and grotesque behavior.
02:19:32.000 And that's where I think we are.
02:19:34.000 We are watching the agreement.
02:19:37.000 You know, the world was moving in the direction of the West.
02:19:39.000 We were getting along better.
02:19:41.000 We were learning to be productive together with people who were not closely related to us.
02:19:47.000 And we are contracting now back into this view of, well, it's us against them and they got to go.
02:20:03.000 Do you think there's a way to change course, like the negative things that are going on in society right now, the negative things that we all feel when you're talking about whether it's pharmaceutical drug companies getting involved in your health care narratives in order to make more money?
02:20:21.000 Do you think there's a way forward where this corrects itself?
02:20:25.000 Or we correct it, or we get to a much more healthy balance.
02:20:28.000 You're never going to get everybody who's involved in every aspect of society to be a good person with kindness in their heart and a general overall want for the good of mankind.
02:20:40.000 You're not going to have that everywhere.
02:20:41.000 You're always going to have some people that are out for themselves.
02:20:44.000 But is there a way to balance it and make it much more in the direction of everybody recognizing, like, hey, this way we're doing this is not good for anybody, and it's being manipulated by foreign governments all day long, and you're addicted to the thing that it's manipulating you on.
02:21:03.000 And whether or not you realize it, you're at least somewhat affected by this data that's coming at you.
02:21:09.000 You could say, I'm smart.
02:21:10.000 I'm not going to fall into that bullshit.
02:21:12.000 But then, you know, a little gets in there.
02:21:15.000 Enough gets in there that it becomes a part of your thinking, that it becomes something that you debate all the time.
02:21:20.000 And it's mostly artificially propped up.
02:21:25.000 Well, I want to separate that into two questions.
02:21:28.000 Okay.
02:21:28.000 Do I think there's a way for the world to be structured?
02:21:31.000 Yes.
02:21:32.000 You're never going to get rid of all the bad people, but that it's tolerant, you know, that it deals with the bad people sufficiently well, that the good people have enough of a stake, that the objectives are clear enough, that people have meaning in their life, that they can't, can it be structured so that it works?
02:21:49.000 Right.
02:21:49.000 Not perfectly, but well enough.
02:21:51.000 Yeah, I absolutely believe that it can.
02:21:53.000 So what is it?
02:21:54.000 Hold on, but the second question where I'm more pessimistic is: is there a path from here to there?
02:22:01.000 So it could be done, but is there a path?
02:22:01.000 Oh.
02:22:04.000 Right.
02:22:07.000 Things are so wildly fucked up.
02:22:09.000 And, you know, I'm watching, in many cases, my friends pulled into the administration with a lot of momentum behind them.
02:22:21.000 And I'm watching something seemingly prevent the promise that was there from being realized.
02:22:32.000 Right?
02:22:33.000 You're talking about an RFK.
02:22:34.000 Well, I'm talking about, you know, RFK.
02:22:37.000 I was just with Jay Bhattacharya.
02:22:43.000 There's something that when the good people get to Washington and try to do the right thing, there is an architecture that drains them.
02:23:00.000 that wastes their efforts, that places roadblocks, that causes them to back their objectives way off.
02:23:11.000 And I don't know what it is, but it's disheartening to see it.
02:23:20.000 I mean, even just we could talk about the mRNA vaccines.
02:23:23.000 This one was so clear.
02:23:26.000 We've got people in Washington who know it.
02:23:29.000 They can't apparently get these things off the market.
02:23:32.000 Like, what the hell is that?
02:23:34.000 If I was running the board right now, I'd start playing that Pink Floyd song "Money" as the theme song.
02:23:43.000 You know, the beginning of that?
02:23:44.000 Doom, doom, doo, doom, doom, doom, doom.
02:23:44.000 Oh, shit.
02:23:47.000 That's what it is.
02:23:49.000 It's all just profit.
02:23:50.000 It's profit and power.
02:23:51.000 It's power.
02:23:52.000 I don't think it's profit.
02:23:53.000 It's part of it.
02:23:54.000 Part of it is you have to keep score, and profit is how you keep score.
02:23:58.000 You have to keep score.
02:23:59.000 You want to be big shot, swinging dick, psychopath.
02:24:02.000 You've got to keep score of how much money you make.
02:24:04.000 So it is profit.
02:24:05.000 Profit's a big part of it.
02:24:08.000 It's definitely competition, too.
02:24:11.000 It's the thing that happens with any corporation that has an obligation to its stockholders.
02:24:17.000 You've got to keep making more money.
02:24:19.000 And if you're in the business of distributing drugs, you're not in the business of doing the lab work.
02:24:24.000 Those aren't the guys that are assholes.
02:24:25.000 You're not like in the trenches trying to figure out how these things work.
02:24:29.000 And the people that are running it are the money people.
02:24:32.000 The money people are crazy.
02:24:34.000 And the money people are infiltrating all the science people and telling them what to say about stuff.
02:24:41.000 If scientists, like as a whole, were always entirely objective about every single subject and never ever subject to bribery like the sugar people were, like when they gave them the sugar to say that it was all saturated fats causing all these heart disease and all these people are obese because of saturated fat.
02:25:01.000 Then people started eating margarine.
02:25:03.000 This is all money.
02:25:04.000 It's all money.
02:25:05.000 So money gets into it. If the scientists were true, if they were like knights and they could not tell a lie, we would have never gotten into half the messes that we're in with pharmaceutical drug companies.
02:25:17.000 Right.
02:25:17.000 But they work for the pharmaceutical drug companies.
02:25:19.000 And then the people that are involved in the FDA, if they leave, they get a cushy job with the pharmaceutical drug companies and it's totally legal.
02:25:27.000 So there's no incentive to be a knight.
02:25:30.000 I know how corrupt the system is.
02:25:33.000 I mean, in some ways, I feel like nobody knows better.
02:25:36.000 But I am disheartened to discover how little power, even when the curtain is pulled back and we can see the gross excesses and the massive wave of destruction that was created, even in that circumstance, we can't make the most basic alteration.
02:26:01.000 Taking the mRNA COVID shots off the market.
02:26:01.000 Right?
02:26:07.000 I don't understand how they should be so embarrassed and horrified at the harm they did that this should be an easy one.
02:26:16.000 So how are they still selling it?
02:26:18.000 How are they still pushing it?
02:26:20.000 Like, what is this?
02:26:22.000 What are they saying it does for you now?
02:26:25.000 I mean, I guess they're sticking with their, you know, like 14th fallback position of it reduces the harm of COVID.
02:26:33.000 No, is this because pulling it from the market is an admission of guilt or an admission of knowledge that it's not effective and it's not necessary anymore?
02:26:43.000 And then something.
02:26:45.000 The thing is, they don't have to worry, because of the fact that it's classified as a vaccine, they don't have to worry about being sued.
02:26:52.000 Well, they do.
02:26:53.000 They do.
02:26:54.000 Household.
02:26:55.000 The immunity from liability is dependent on there having been no fraud.
02:27:03.000 And there clearly was fraud.
02:27:07.000 So, in light of that.
02:27:09.000 I didn't know there was any caveats like that.
02:27:11.000 Oh, yeah.
02:27:12.000 Oh, that makes sense.
02:27:14.000 It's much more interesting.
02:27:15.000 Yeah, but how do you define fraud?
02:27:17.000 Because they've sold drugs where they had like 10 studies and one of them was good.
02:27:21.000 Let's get a good study.
02:27:23.000 Let's try this one.
02:27:24.000 Okay.
02:27:26.000 The insufficient safety testing that was done before these things were released was done with mRNA vaccines produced in a process that did not involve DNA.
02:27:46.000 The product that was actually injected into billions of people involved DNA plasmids, and there is massive contamination in the shots that were actually delivered, including the SV40 promoter, from Simian Virus 40.
02:28:03.000 We talked about that the last time we were on, right?
02:28:05.000 I think we probably did.
02:28:07.000 I think we did.
02:28:08.000 But in any case, the point is, for you to put one drug through safety testing and then inject people with something different that has other components that were not tested is fraudulent.
02:28:22.000 Can I stop you real quick so this could be standalone?
02:28:25.000 Could you just explain the whole SV40 thing to people and how it became an issue?
02:28:30.000 So there are lots of techniques that are used in order to generate a lot of product, right?
02:28:39.000 In this case, what they used is a plasmid, which is a circular piece of DNA, in order to basically create vats that would grow the product necessary that would later be coated in the lipid nanoparticle.
02:28:53.000 So they used bacteria to do the heavy lifting.
02:28:58.000 There is a requirement that you purify DNA out, and there are standards, which are way too high, but there are standards that you can't go above in terms of how much DNA contamination you can have left over from your production process.
02:29:14.000 But in this case, it isn't even that the quality control is garbage and there was too much stuff left over because the process didn't work very well.
02:29:26.000 The problem is that there was a much more painstaking way of producing technically the same product that did not involve DNA plasmids at all.
02:29:37.000 And so what you've got left over in these vials, and we're talking about largely the work of Kevin McKernan, who took vials that were given to him, stuff that was actually injected in people, there was leftover stuff in the vials, and he tested a bunch of these things, found DNA contamination across the board.
02:29:55.000 So what you're left with is a promoter, which is a genetic trigger that we know is common in lab techniques, and it originally comes from simian virus 40, and we know that it's carcinogenic.
02:30:10.000 So that promoter is left over in vials from shots that were actually injected into people.
02:30:19.000 And that means that all of the things that we were told about the potential for these mRNA shots to integrate into your genome, that was impossible, they told us, right?
02:30:30.000 Well, first of all, it's not impossible.
02:30:32.000 There's lots of interesting stuff that goes on in cells that involves reverse transcription and things like that.
02:30:37.000 But even what we were told that there's no DNA, so integration is not an issue, was a lie because there is DNA left over in these vials, and it's not just some old DNA.
02:30:50.000 It's DNA with the SV40 promoter, which is a genetic engineering tool that has carcinogenic potential.
02:30:58.000 So it seems to me this is clear fraud.
02:31:02.000 You can't inject a different product into the public on the basis of safety testing that was done with something produced by a different process.
02:31:11.000 Can you explain how they got this SV40 from these monkeys?
02:31:16.000 Like, what, and how it got into these vaccines and other vaccines in the past as well, right?
02:31:22.000 I will tell you what I think I remember from this story.
02:31:25.000 I should probably have brushed up on it if we were going to talk about this.
02:31:28.000 But I believe that the story is that in the production of early polio vaccines, monkey kidneys were used.
02:31:39.000 And SV40 was a virus that I think was unknown that showed up, that because you're using cells and viruses infect cells, that SV40 showed up in that process.
02:31:53.000 So anyway, I wish I was more certain of what the story was.
02:31:57.000 So the monkey kidneys, the virus from the monkey kidneys got into whatever this vaccine was, and then that infected people with SV40.
02:32:06.000 Were there a correlating or a corresponding rise in cancer among the time where they were doing that?
02:32:14.000 I don't know the answer to that question.
02:32:16.000 I don't know how well studied it's been.
02:32:18.000 So why do we think that it causes cancer?
02:32:20.000 Because we know it does.
We can see that it transforms cells and they become cancerous.
02:32:26.000 Oh, so in a lab, in a lab scenario?
02:32:28.000 But what about with humans?
02:32:29.000 Do we know that it does it with humans?
02:32:32.000 We know that it transforms human cells.
02:32:36.000 I believe that is a fair statement.
02:32:38.000 And at the very least, your position is that it's absolutely not what they tested.
02:32:43.000 Well, yeah, it doesn't matter.
02:32:44.000 The SV40 thing is alarming.
02:32:47.000 The simple fact that they tested a different product than they injected into people, that's where the fraud is.
02:32:52.000 Was it because it was a rush to mass produce?
02:32:57.000 I mean, you can say that.
02:32:59.000 Why would the decisions be made to do it a different way?
02:33:01.000 Well, I mean, I think the obvious reason is because in the one case, you get a much purer product, which is much more likely to get through the safety testing.
02:33:08.000 And in the other case, you get the rapid expansion of production.
02:33:12.000 But that's fraud, right?
02:33:14.000 Yeah.
02:33:15.000 You tested a different product.
02:33:17.000 Yeah.
02:33:20.000 And the one you tested didn't work.
02:33:22.000 Well, yeah.
02:33:23.000 Even the first one sucked.
02:33:25.000 Yeah, it didn't.
02:33:26.000 It's amazing how many people will defend it, just like they defend the fucking flying spaghetti monster, whatever it is.
02:33:33.000 They'll defend it.
02:33:34.000 They'll defend it.
02:33:35.000 It becomes a part of their religion.
02:33:38.000 Yes.
02:33:38.000 They'll still tell you it saved millions of lives.
02:33:40.000 Like, how do you know that?
02:33:42.000 Like, how do you know if it didn't work?
02:33:43.000 Like, if it really didn't work?
02:33:45.000 Like, how do you know, if you still got COVID and you got it real bad?
02:33:48.000 Like, how do you know it did anything good?
02:33:50.000 How do we really know that it did anything good to anybody?
02:33:53.000 We don't.
02:33:54.000 But that narrative is really hard for people to swallow.
02:33:54.000 Right.
02:33:57.000 So they keep saying it saved millions of lives.
02:34:00.000 Like, I wish that was true.
02:34:02.000 Save millions of lives in a computer model.
02:34:04.000 But isn't that computer model dependent upon it not causing infection and not causing transmission?
02:34:11.000 The computer model is pure garbage.
02:34:13.000 For one thing, the fact that it's a computer model in the first place means that you cannot test the hypothesis that it saved or didn't save lives.
02:34:21.000 Right.
02:34:22.000 You could potentially run a computer model and you could generate a hypothesis that you would then need to go test with real world data.
02:34:28.000 You don't get to tell us that it saved millions of lives based on the fact that your computer spit out that number.
02:34:33.000 That's not how that works.
02:34:33.000 I'm sorry.
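[Editor's note: a minimal sketch of the point being made here. The "millions of lives saved" figures come from compartmental epidemic models; the toy SIR-style model below, with all parameters invented for illustration, shows that the "saved" number is just the difference between two simulated curves under an assumed transmission reduction, not a measured quantity.]

```python
# Toy discrete-time SIR model. Every parameter here (beta, gamma, ifr,
# the assumed transmission reduction) is invented for illustration only.

def sir_deaths(beta, gamma, ifr, days=365, n=1_000_000, i0=100):
    """Run an idealized SIR epidemic; return cumulative modelled deaths."""
    s, i, r = n - i0, i0, 0.0
    for _ in range(days):
        new_inf = beta * s * i / n   # new infections this step
        new_rec = gamma * i          # recoveries this step
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return r * ifr                   # deaths = recovered-or-removed * fatality rate

baseline = sir_deaths(beta=0.30, gamma=0.10, ifr=0.005)
# "Intervention" scenario: simply assume transmission drops 40%.
vaccinated = sir_deaths(beta=0.18, gamma=0.10, ifr=0.005)
print(f"modelled lives saved: {baseline - vaccinated:,.0f}")
# The output is entirely driven by the assumed beta reduction: it is a
# hypothesis the model generates, which would still need real-world testing.
```

Change the assumed transmission cut and the "lives saved" figure moves with it, which is the distinction being drawn: a model output is a hypothesis to test against data, not evidence by itself.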
02:34:34.000 That's the reason why they have to keep belittling ivermectin.
02:34:38.000 Imagine if we get to a position where AI can do definitive breakdowns of the efficacy of certain compounds that's stopping certain diseases like COVID-19.
02:34:48.000 And it says that with this dose, with this body weight, you do it this amount of times and it should offer like 70% protection.
02:34:55.000 Yep.
02:34:56.000 And then they run that into what's the actual data on the vaccine causing side effects and injury.
02:35:03.000 And we just get this horrible reality in front of us that I think everybody who took the shot is really wanting to avoid the mind fuck of knowing that you got used as a little piggy bank for the pharmaceutical drug companies to push some experimental shit on you and tell you that it's both safe and effective.
02:35:23.000 It's both safe and – which, by the way, didn't Fauci use that same term for AZT back in the day?
02:35:29.000 Didn't he?
02:35:30.000 I mean, it's a great slogan, right?
02:35:30.000 He would.
02:35:31.000 Safe and effective.
02:35:32.000 See if he did.
02:35:33.000 See if you could attribute the term safe and effective to Fauci talking about AZT during the HIV crisis.
02:35:39.000 But the problem is, to your point about AI, these people are not fools and they understand that the AI extrapolates from what it's read.
02:35:52.000 So they're priming it, right?
02:35:54.000 They're priming it so that it can't do the proper work, which means that this potentially extremely valuable tool, frightening, yes, but potentially extremely valuable tool is going to be compromised because it is going to be intentionally misled with phony articles, papers, all of that stuff.
02:36:13.000 Long-term AZT appears safe and effective.
02:36:18.000 Oh my goodness.
02:36:19.000 1989.
02:36:22.000 Wow.
02:36:23.000 There's a video of him saying it.
02:36:24.000 That's where I remember now.
02:36:26.000 He was saying, the reason why we prescribe AZT is that it's the only drug that is both safe and effective.
02:36:35.000 Something along those lines.
02:36:36.000 You probably cut that out.
02:36:37.000 It might not be his actual quote.
02:36:39.000 But whatever he said, he was talking about it the same way he talked about the COVID vaccine.
02:36:43.000 He was talking about something that definitely fucking killed people.
02:36:47.000 AZT was a horrible, horrible drug that killed more people than – it was a cancer drug.
02:36:52.000 It was chemotherapy.
02:36:53.000 Like when have people ever been told to take chemotherapy forever?
02:36:57.000 Right.
02:36:57.000 What?
02:36:58.000 Yeah.
02:36:59.000 He's a monstrous person and the idea, I mean, I think any preemptive pardon that is- Wild.
02:37:09.000 Non-specific.
02:37:10.000 Non-specific that goes back to 2014.
02:37:12.000 Yeah, you can't have that.
02:37:15.000 This is a legally unsupportable idea because it effectively creates two classes of citizens.
02:37:21.000 It violates equal protection under the law, right?
02:37:24.000 You can't have people who have carte blanche to violate the law, you know.
02:37:28.000 It's also what the kids call sus.
02:37:30.000 Yeah, it's super sus.
02:37:32.000 That's fucking super sus.
02:37:34.000 That's super sus.
02:37:35.000 Yeah.
02:37:36.000 And then he also pardoned his whole family.
02:37:38.000 You're like, yo, what did you guys do?
02:37:42.000 Yeah, he pardoned.
02:37:43.000 So that's the question is, does the auto pen stuff stand up?
02:37:46.000 I think they're going to not push it.
02:37:48.000 I think they're going to not push it because I bet they all use it.
02:37:52.000 I bet Trump is busy, you know, because he doesn't have time.
02:37:55.000 Does he have any lines written in his name?
02:37:57.000 He's got a lot of lines in there.
02:37:58.000 A lot of lines, up and down lines, yeah.
02:37:59.000 If I was, yeah, use the pen.
02:38:02.000 And I guess maybe if you say use the pen and, you know, you're allowed to, you know what I mean?
02:38:08.000 Like DocuSign, if you get a DocuSign document in your email, you get to sign it with your pre-approved signature.
02:38:14.000 Yeah.
02:38:14.000 Fucking weird.
02:38:15.000 Well, I mean, that's, I think you're conflating a couple things and I don't really know.
02:38:19.000 I'm no expert.
02:38:20.000 But it seems to me the auto pen, you know, I think its purpose was to, like, sign autographed photos and pro forma documents.
02:38:30.000 It's not there to sign important stuff.
02:38:34.000 Yeah.
02:38:34.000 Pardon.
02:38:34.000 Pardon, certainly not.
02:38:35.000 That's a crazy one.
02:38:36.000 It's a crazy one.
02:38:37.000 Because it means somebody else can press that button.
02:38:40.000 Especially if your president happens to, oh, I don't know, be demented.
02:38:40.000 Right.
02:38:43.000 Right.
02:38:43.000 You know?
02:38:44.000 Yeah.
02:38:45.000 Yeah.
02:38:45.000 He's out of his gourd and everybody knows it.
02:38:48.000 What is your suspicion about that second debate, the debate that they had where he totally fell apart?
02:38:48.000 Yeah.
02:38:54.000 Do you think that they set him up for that by putting it on late at night and that they probably didn't give him his right vitamins?
02:39:02.000 Look, they clearly set him up.
02:39:04.000 Right.
02:39:04.000 For those of us who were tracking his mental decrepitude since before he was elected, there's no way that they thought that he was going to do OK in that debate.
02:39:04.000 Right.
02:39:17.000 The second one.
02:39:19.000 Agreeing to it was conspicuous.
02:39:23.000 Right.
02:39:23.000 And so I think it was part of forcing him out.
02:39:25.000 Well, they debated in the past.
02:39:27.000 It's like, you know, Biden wasn't that bad.
02:39:30.000 He got some good ones in there.
02:39:31.000 You know, it wasn't too bad.
02:39:33.000 I was like, whatever they put him on for that debate, pretty solid.
02:39:35.000 You know, whatever cocktail.
02:39:38.000 They didn't give him the cocktail for that one where he was like, we beat Medicaid.
02:39:43.000 Like what?
02:39:44.000 And that's what everybody had grown accustomed to.
02:39:46.000 And I remember someone that I know that I'm friends with sent me a message.
02:39:52.000 Don't you know that Biden has a stutter?
02:39:54.000 That's what this is all about.
02:39:56.000 Oh, remember that stutter thing?
02:39:57.000 Yeah.
02:39:58.000 I go.
02:39:58.000 I started sending him all these videos of Biden when he was younger.
02:40:02.000 He was a powerful speaker.
02:40:02.000 Right.
02:40:04.000 Powerful speaker.
02:40:05.000 Maybe a bullshit artist.
02:40:07.000 I wouldn't say a powerful speaker, but certainly confident.
02:40:09.000 It wasn't bad.
02:40:10.000 Yeah.
02:40:10.000 He was certainly confident.
02:40:10.000 It wasn't bad.
02:40:11.000 He had some good ones.
02:40:12.000 There's a couple of speeches in there that are pretty solid.
02:40:14.000 Yeah.
02:40:14.000 When he was – it was not – it was all after he had the disaster when he was running for president.
02:40:22.000 So in 88, you know, he was running for president.
02:40:24.000 He got caught plagiarizing.
02:40:26.000 Yeah.
02:40:27.000 But it was after that.
02:40:28.000 Like when he was vice president, he was pretty solid.
02:40:32.000 You know, he had a few solid speeches.
02:40:34.000 And then you see him now, and it's like, oh, boy, this is crazy.
02:40:38.000 He trails off on his words at the end.
02:40:41.000 What is this one?
02:40:42.000 Oh, here it is.
02:40:44.000 Oh – AZT, because it's the only drug that thus far has been shown in scientifically controlled trials to be safe and effective.
02:40:51.000 It isn't the question of there are a lot of drugs around and only one – Yo.
02:40:56.000 Imagine that.
02:40:57.000 Imagine saying that and being so wrong that who knows how many people died from that, including people that had no symptoms at all.
02:41:03.000 They just tested positive for HIV, and they just dosed them up.
02:41:09.000 He's a cold-hearted son of a bitch.
02:41:09.000 Yeah.
02:41:09.000 Yeah.
02:41:11.000 Ooh.
02:41:12.000 And what were they – were they using a PCR method to detect whether or not people had HIV back then?
02:41:17.000 I don't think so.
02:41:17.000 I don't think it existed.
02:41:19.000 Well, didn't – when did Kary Mullis – when did he devise that?
02:41:26.000 Because this is like – let's put that into Perplexity.
02:41:29.000 When was the PCR method – It actually would have been – No, but that's way after.
02:41:35.000 That's way after.
02:41:36.000 He invented it in 83.
02:41:37.000 83.
02:41:38.000 Oh.
02:41:39.000 So what did they use to determine whether or not people – But I don't think they would have – well, I don't know.
02:41:46.000 Did they use PCR method to determine whether or not people had HIV?
02:41:50.000 Because they definitely did with COVID, right?
02:41:52.000 Oh, yeah.
02:41:53.000 And what were the accidental positives, the false positives?
02:41:58.000 What were those?
02:41:59.000 How many – Oh, they had the cycle threshold turned up so high that it would amplify any contamination, right?
02:42:06.000 They were just looking to establish that COVID was everywhere.
02:42:09.000 So – but what – HIV diagnosis primarily through three types of tests.
02:42:15.000 Antibody.
02:42:16.000 Antibody tests, antigen antibody tests, and nucleic acids – nucleic?
02:42:20.000 Nucleic acid tests.
02:42:21.000 Yep.
02:42:22.000 Okay.
02:42:23.000 Yeah.
02:42:23.000 So not PCR.
02:42:24.000 So not PCR.
02:42:25.000 So how many cycles were they – they had it to a massive number at one point in time, right?
02:42:25.000 Yep.
02:42:31.000 I think it was in the 40s, something like that.
02:42:33.000 And what kind of false positives would you get if you had something like that?
02:42:39.000 Oh, massive.
02:42:40.000 I mean especially in the context that these machines existed.
02:42:44.000 So basically the idea is that you get a doubling for each of these cycles.
02:42:54.000 And that means that if there's the tiniest fragment in your environment, you will end up seeing it come through as a positive.
02:43:04.000 So imagine that you're in a hospital testing patients, right?
02:43:08.000 Are there going to be fragments of COVID around in the middle of, you know, 2020?
02:43:13.000 Sure.
02:43:14.000 There are going to be fragments around.
02:43:15.000 So it will find that fragment.
02:43:16.000 It will pop up.
02:43:17.000 Oh, you've got a positive, right?
02:43:19.000 If you're in a testing center, it's the same thing.
02:43:22.000 So anyway, Kary Mullis, of course, warned us about this.
02:43:25.000 He said it was a completely inappropriate technology for that purpose.
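[Editor's note: the doubling-per-cycle arithmetic described above, sketched as a short example. This is idealized PCR (real amplification efficiency is somewhat below perfect doubling), but it shows why a cycle threshold in the 40s can turn a single stray fragment into a detectable signal.]

```python
# Idealized PCR amplification: each cycle roughly doubles the target,
# so n cycles turn one starting fragment into about 2**n copies.

def copies_after_cycles(starting_copies: int, cycles: int) -> int:
    """Perfect-doubling PCR model (real-world efficiency is lower)."""
    return starting_copies * 2 ** cycles

# One stray fragment run at a cycle threshold of 40:
print(copies_after_cycles(1, 40))  # 1_099_511_627_776 (~10**12 copies)

# The same fragment at a more conservative threshold of 25:
print(copies_after_cycles(1, 25))  # 33_554_432 (~3 * 10**7 copies)
```

The gap between those two numbers is the point: at very high cycle counts, trace environmental contamination can amplify far enough to register as a positive.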
02:43:29.000 I think he did that quite a bit before COVID even, wasn't it?
02:43:31.000 Wasn't that conversation – I think he – when did – didn't he die in 2019?
02:43:36.000 Is that correct?
02:43:36.000 Yeah.
02:43:37.000 Yeah.
02:43:37.000 And so when was that conversation?
02:43:39.000 There's a conversation where he's belittling Fauci.
02:43:41.000 He's like he doesn't know what he's talking about.
02:43:42.000 He's a bureaucrat.
02:43:43.000 I think it's a different conversation than the cycle threshold.
02:43:43.000 Yep.
02:43:46.000 But he was talking in that same conversation.
02:43:47.000 Maybe it's a different time he said the same thing.
02:43:50.000 But that – this is not what you would use to detect viruses.
02:43:53.000 It's not an appropriate use for the technology.
02:43:54.000 This is the guy that invented it.
02:43:56.000 Yep.
02:43:58.000 Like it's just – the fact that someone could be the man that pushed that and make it all the way through in his career to COVID and do the same thing is so wild.
02:44:14.000 Well, you know, I don't really even know what Anthony Fauci – That video?
02:44:20.000 Yeah.
02:44:20.000 This is it.
02:44:21.000 It's 96.
02:44:21.000 Yeah.
02:44:22.000 96.
02:44:23.000 96.
02:44:25.000 So Fauci must have been using it for whatever.
02:44:28.000 Maybe in 96 they were using it for HIV.
02:44:30.000 Who knows?
02:44:31.000 Well, wait a second.
02:44:32.000 In that video he says – It's 96.
02:44:33.000 But I know that.
02:44:34.000 I know that video.
02:44:35.000 Yeah.
02:44:35.000 He talks about the cycle thresholds and the inappropriate use of that tech for – Let's hear what he says.
02:44:40.000 Let's hear what he says.
02:44:41.000 I may be conflating two different speeches that he – where he complained about them.
02:44:46.000 But I know in this one he's – he's saying he doesn't know what he's talking about.
02:44:51.000 It's about humanity that wants to go to all the details and stuff and listen to – you know, these guys like Fauci get up there and start talking to me.
02:44:59.000 You know, he doesn't know anything really about anything.
02:45:01.000 And I'd say that to his face.
02:45:03.000 Nothing.
02:45:04.000 The man thinks you can take a blood sample and stick it in an electron microscope and if it's got a virus in there you'll know it.
02:45:11.000 He doesn't understand electron microscopy and he doesn't understand medicine.
02:45:15.000 He should not be in a position like he's in.
02:45:18.000 Most of those guys up there on the top are just total administrative people and they don't know anything about what's going on at the bottom.
02:45:24.000 You know, those guys have got an agenda, which is not what we would like them to have, being that we pay for them to take care of our health in some way.
02:45:34.000 They've got a personal kind of agenda.
02:45:37.000 They make up their own rules as they go.
02:45:39.000 They change them when they want to.
02:45:40.000 And they smugly, like Tony Fauci, does not mind going on television in front of the people to pay his salary and lie directly into the camera.
02:45:48.000 You can't expect the sheep.
02:45:51.000 Crazy.
02:45:52.000 Right?
02:45:53.000 Crazy.
02:45:54.000 That's a crazy statement.
02:45:55.000 From a brilliant man in 1996.
02:45:57.000 Yeah, it's amazing how prescient that statement is.
02:46:01.000 But it's what we were talking about, like people that get into positions of power.
02:46:08.000 Yes, and I also think, you know, Fauci was the highest paid federal employee.
02:46:15.000 There's a reason for that.
02:46:16.000 I don't think we exactly know the reason.
02:46:18.000 Let's play that Pink Floyd song again.
02:46:21.000 Well, but, you know, presumably that's not where the bulk of his wealth is coming from.
02:46:27.000 But it is a measure of his position in the hierarchy.
02:46:31.000 And his ability to ensure that other people exceed their wildest expectations.
02:46:38.000 I mean, if you can get that guy to push your drug, you know, you're making a lot of money.
02:46:44.000 Yeah, but I even think it's a mistake to think of him in the medical and public health context because what we now know is that he was part of dual-use research, that this is actually a military project to create bioweapons through a loophole.
02:47:01.000 We're not allowed to create bioweapons, but you are allowed to do research that leads to bioweapons as long as it has a medical dimension.
02:47:09.000 Again, this is something that you put into someone's head and they'll go, no, no, no, no.
02:47:12.000 NBC says different.
02:47:14.000 Right.
02:47:14.000 Washington Post says different.
02:47:16.000 What you're saying is crazy.
02:47:16.000 Stop, stop, stop.
02:47:17.000 But there's a lot of evidence that points to that.
02:47:22.000 And it's not something that hasn't been both discussed and done in the past.
02:47:22.000 Yeah.
02:47:27.000 This is what's hard for normies to swallow.
02:47:31.000 It's hard.
02:47:33.000 Yep.
02:47:33.000 And not only that, but because it is inherently not visible to the public, you know, we have sort of the public health justification for work.
02:47:44.000 You know, why were they enhancing viruses in the lab, Joe?
02:47:47.000 Right.
02:47:47.000 Oh, they were doing it because they wanted to know what a virus would look like so that we would be aware of how to fend it off if it ever leapt out of nature.
02:47:55.000 That's a garbage story.
02:47:57.000 Also, you guys did a really shitty job of figuring it out to the point where you can make a cure because you had no idea.
02:48:03.000 You had no fucking idea.
02:48:04.000 No advantage came from that work at all, despite the fact that they were studying exactly the right viruses.
02:48:09.000 There's something just not right.
02:48:10.000 Especially if you wanted to use something as a bioweapon, wouldn't you take a virus and make it more contagious?
02:48:18.000 And you could say, I'm just studying it.
02:48:20.000 I'm just studying it.
02:48:21.000 Like, you're making something that's going to kill us all?
02:48:24.000 Are you real?
02:48:24.000 Well, you know, this is one of these frustrating places where I think it's perfectly obvious and should be to anybody who is trained in any related discipline that the story does not make sense.
02:48:39.000 That the chances that you are going to enhance a virus's infectivity and that it is going to get out and become endemic to humans far exceeds the chances that you are going to learn something by increasing its ability to infect human tissue that allows you to fend off some natural virus that emerges.
02:48:59.000 The story literally doesn't make sense.
02:49:01.000 It's a pyromaniac, an arsonist who works for the fire department.
02:49:06.000 Yep.
02:49:06.000 That's what it is.
02:49:07.000 That's what it is.
02:49:08.000 So, what we have to infer, and I'm borrowing from Robert Malone here, who at the Brownstone conference that I was recently at, pointed out that the mentality amongst guys like Fauci is identical to the one in Dr. Strangelove.
02:49:27.000 Ooh.
02:49:28.000 Yeah, it's a really deep point, right?
02:49:30.000 That mania about, you know, nuclear weapons and mineshafts and we can still win this one even though, you know, nuclear war is happening.
02:49:40.000 That same kind of mindset where these people are actually crazy enough to create new human pathogens for which they have no escape plan, right?
02:49:50.000 They're crazy enough to do that because in their demented minds, you know, there's going to be some biological war and we're going to need to have these weapons, right?
02:50:01.000 These people belong in a mental institution.
02:50:05.000 Creating new human pathogens is the exact opposite of creating health.
02:50:11.000 And you know there's going to be some shill who pops up and says, Brett is totally off with this.
02:50:16.000 What we have learned through this work is the reason why we're all alive today.
02:50:21.000 If it wasn't for their brave work, there are these people that just step in.
02:50:27.000 There'll be people that do it because they want online clout.
02:50:29.000 There's going to be people that are doing it because, you know, they're a part of a fucking chatbot network that's attacking this point.
02:50:37.000 But the reality is it's a crazy idea, especially if you've never come up with a fucking cure.
02:50:43.000 You've been studying these respiratory viruses for how long?
02:50:46.000 How much money have you spent?
02:50:47.000 And you know you got no cure.
02:50:49.000 You don't even have a clue.
02:50:50.000 You're doing a terrible job.
02:50:52.000 Not even a terrible job.
02:50:54.000 You know.
02:50:54.000 But if you were doing what you said you were doing, you're doing a terrible job.
02:50:57.000 But it makes way more sense that what you're doing is trying to make a terrible virus.
02:51:01.000 So you could have it.
02:51:02.000 It's almost like, I don't know, the analogy is a loose one.
02:51:06.000 But, you know, Munchausen by proxy, injuring people to save them, right?
02:51:15.000 The idea that the pandemic came from the same guy who then stepped in to play the hero is a little alarming.
02:51:28.000 It is a little alarming, especially because that wasn't initially known.
02:51:33.000 Right.
02:51:33.000 Oh, here's a hero.
02:51:33.000 Yeah.
02:51:35.000 You know, you've got, you know, Trump saying all kinds of crazy stuff.
02:51:39.000 And at least we have a sober scientist there to keep him in line.
02:51:44.000 And he was a soldier of the left.
02:51:45.000 Like, they were like, yes, he's our guy.
02:51:47.000 If you trust, we trust in Fauci and we trust in science.
02:51:51.000 It was crazy to watch.
02:51:52.000 Like, hey, guys, these are the same people we were all complaining about six months ago.
02:51:57.000 Yeah.
02:51:57.000 What the fuck are you talking about?
02:51:59.000 You all had no faith in the pharmaceutical drug companies a year ago.
02:52:03.000 And now all of a sudden you're trusting them?
02:52:05.000 It's absolutely wild.
02:52:05.000 Yeah.
02:52:07.000 Wild.
02:52:08.000 And it's wild how gullible a large swath of our society is.
02:52:14.000 And that's why I think, like, a better education for young people to at least give them a framework to understand what's happening to you and how you're getting bamboozled and why, why it's been going on as long as it's gone.
02:52:29.000 And how do you get your mind out of that?
02:52:31.000 See, I mean, look, I can see I'm really enthusiastic if we get through the immediate bottleneck that we face, that there is a way to build school that functions by not, you know, using this archaic mechanism where you're sitting people facing the chalkboard watching somebody scratch stuff on there.
02:52:55.000 School should be built out of exercises and experiences that teach these things through living them, right, that reinforce those patterns, not as abstractions on the board, but as experiences.
02:53:11.000 You could teach all sorts of things this way.
02:53:13.000 And then the person has it built in in some deep way rather than, you know, in some quadrant of their abstract thought library.
02:53:23.000 I totally agree if we're going to remain human, which I don't think we're going to.
02:53:28.000 So if we're not going to remain human and I'm not just saying like you and I are probably going to remain human, but I mean, as a species, if we're not going to remain human, it will be quaint to look back on the days.
02:53:39.000 Just like we look back on people to take a fucking horse across the country.
02:53:43.000 That's how we're going to look at it – you had to acquire data from, like, constant study and repetition.
02:53:50.000 That's how you got your skills when it's going to be like Neo in the Matrix.
02:53:53.000 They put that chip in his head and he goes, I know jujitsu.
02:53:56.000 Remember that?
02:53:57.000 Yeah, that's what it's going to be like.
02:53:57.000 Right.
02:53:58.000 It's going to be like that.
02:53:59.000 Like the idea of acquiring knowledge and skills by hard work and labor is going to be like before people figured out doors.
02:54:09.000 It's like this is a dumb way to do things.
02:54:11.000 Like we have a way more effective way.
02:54:13.000 Like why do you want to go through all the hardship to get information and to be intelligent and aware when you can just be intelligent and aware?
02:54:21.000 Like why do we think that it's because that was the only method to be intelligent and aware in the past?
02:54:26.000 Yeah, but Joe, you're going to sign up and you're going to take the chip.
02:54:30.000 We're all going to take the chip because we all want to be happy.
02:54:33.000 Just like everybody has a phone now.
02:54:35.000 Well, the problem though is we don't know how to be anything other than human.
02:54:43.000 We are losing our humanity without a plan for being something else, without a conceivable plan.
02:54:48.000 We are cavemen and cavemen is what we will be until we die.
02:54:52.000 We have no plan to live in the cities.
02:54:54.000 We have no plan.
02:54:55.000 We will hunt with flint.
02:54:57.000 We are cavemen.
02:54:59.000 Well, the farther we get from the mode that we evolved in, the more fucked up and directionless we find ourselves.
02:55:10.000 That's true.
02:55:11.000 However, that's the direction we're going.
02:55:13.000 So I say buckle up.
02:55:16.000 I don't know what else to say about it because it's all just – I feel like it's all just kind of mental masturbation right now because no one really knows what it's going to be like.
02:55:24.000 We could speculate.
02:55:25.000 We could prepare.
02:55:26.000 Well, that's the key thing is people have to admit that.
02:55:29.000 Right?
02:55:29.000 Yes.
02:55:30.000 All the people who want to tell you how it's going to be don't know.
02:55:33.000 Well, they're also trying to stop people from stopping them.
02:55:36.000 They don't want people to be alarmed.
02:55:38.000 And so they'll give you the most rose-colored glasses version except for Elon.
02:55:44.000 He was the only one that was saying – like there was a robot, one of those robot dogs, and he – I forget the exact quote.
02:55:51.000 But it was something to the tune of one day that's going to move so fast you could barely see it, and it's going to be shooting guns, and it's going to be powered by AI.
02:56:00.000 Yep.
02:56:00.000 But get ready.
02:56:01.000 Yeah.
02:56:02.000 Get ready.
02:56:03.000 Because that's what those things are going to be.
02:56:04.000 And that's true, too.
02:56:04.000 Yeah.
02:56:07.000 That's part of it, too.
02:56:08.000 It's like weapons are just going to be insane.
02:56:11.000 All jets are going to be fighter jets controlled by AI.
02:56:14.000 They're way better than fighter jets controlled by people.
02:56:15.000 Oh, of course.
02:56:16.000 The last thing you want is a fighter jet constrained by the fact that the pilot's going to black out if it pulls too many Gs.
02:56:21.000 Exactly.
02:56:22.000 Not just more Gs, but they're way better in dogfights.
02:56:25.000 They win 100 percent of the time when they do AI versus trained, effective elite pilots.
02:56:30.000 You know when they'll stop winning 100 percent of the time?
02:56:32.000 When?
02:56:33.000 When the other side has them, too.
02:56:35.000 Well, that's right.
02:56:35.000 Yeah.
02:56:36.000 That's like the mutually assured destruction argument.
02:56:39.000 Maybe that's our new one.
02:56:41.000 It used to be nukes.
02:56:42.000 Now it's AI.
02:56:43.000 Oh, it is.
02:56:43.000 Well, I'm hoping AI just takes the – when it becomes sentient and it is our new digital god, I hope it is just everybody calm the fuck down, settle down, live your life.
02:56:55.000 But now we're – you made the new boss.
02:56:59.000 We're going to be kind.
02:57:00.000 We're going to be benevolent dictators.
02:57:02.000 You've raised God a couple times.
02:57:05.000 I've been joking about that.
02:57:07.000 One of the reasons why I am is because I – when I got up this morning after my crazy dream and I went to the gym, I put on this documentary on the Sumerian Kings list because I've been really fascinated by this.
02:57:07.000 Oh, I know.
02:57:20.000 It's a really loony thing that they found in Iraq and in several different sites and it varies slightly, but it's all this list of people who ran the earth for tens of thousands of years.
02:57:34.000 That's their reign.
02:57:35.000 It was like tens of thousands of years.
02:57:37.000 And then there's this huge flood and then afterwards the timelines become way more realistic.
02:57:42.000 It's like 100 years.
02:57:44.000 Then he ran for – he was a king for 50 years.
02:57:46.000 But they have it documented to like eight kings over the entire course of their civilization, including the places that they ruled, which actually exist.
02:57:59.000 Like these are ancient cities that are actually built on top of even more ancient cities that are below them.
02:58:07.000 And these people in these bizarre kings lists, they're trying to say that this was an actual human being.
02:58:18.000 This was an actual human being that lived that long.
02:58:20.000 I don't know what that means.
02:58:22.000 But they're the ones that have all this crazy stuff with the Anunnaki and from heaven to earth came and that they have – they had an understanding of stuff that was like way beyond what we thought they were capable of.
02:58:34.000 They had the Pythagorean theorem.
02:58:36.000 They had that 1,000 years before Pythagoras, which is weird because this civilization sort of pops up out of nowhere.
02:58:45.000 That's why I bring up God a lot.
02:58:47.000 Well, hold on.
02:58:47.000 OK.
02:58:48.000 I want to get back to God.
02:58:49.000 But I will just say that there is this increasingly fascinating thread about a recurrent disaster cycle and the possibility that sophisticated civilizations get erased and that we – Rediscover.
02:59:05.000 Rediscover.
02:59:06.000 Exactly.
02:59:07.000 Yes.
02:59:07.000 And I wish that was a crazy story.
02:59:10.000 It sounds like it should be but I will say the evidence is far too compelling to dismiss it.
02:59:21.000 So I think we have to be open to that possibility and we seem to be heading into one of these catastrophic upheavals, which is something – while we're busy dicking around with climate change, which is not what we're pretending it is, we are not dealing with this hazard to our civilization and figuring out how to protect ourselves.
02:59:49.000 And one of many, right?
02:59:50.000 There's natural disasters we're not paying attention to.
02:59:53.000 There's 3I/ATLAS everybody paid attention to, which is a very weird thing.
02:59:57.000 But, yeah, I mean, you're completely right.
03:00:01.000 And it's – we're just such a strange species, what we choose to focus on.
03:00:09.000 It's very – we're very bizarre.
03:00:11.000 We're so illogical.
03:00:13.000 We're squandering the most spectacular conceivable opportunity and it's tragic that we can't do better.
03:00:21.000 It's really – it's sad because, you know, we're like that close.
03:00:27.000 You know, we figured out a lot and we're going to squander it over some kind of stupid game.
03:00:32.000 Also, I think what a huge disservice to not recognize that this is possibly a rebuilding of civilization, not just the emergence of civilization.
03:00:40.000 And the more they look, the more evidence points in that direction and the more people push back so hard.
03:00:46.000 They get so angry at the ideas of it.
03:00:48.000 They sure do.
03:00:49.000 They get so angry.
03:00:51.000 And every time a new discovery happens, a date gets pushed back and it gets pushed back again and pushed back again.
03:00:58.000 And that, you know, there was a – Michael Button had a video that he put out about there's some sort of inscriptions and writings on bone that they found in the Americas.
03:01:12.000 I believe it was in Mexico.
03:01:14.000 And it's completely fossilized and they measured the strata around this, you know, so they get a comparative age of the area.
03:01:24.000 And they're talking about it being 200,000 years old.
03:01:28.000 So that means 200,000 years ago possibly, if this is correct, there were humans in the Americas.
03:01:35.000 Right.
03:01:35.000 Which, you know, if you're of a disaster cycle mindset – How many have we been through?
03:01:45.000 Right.
03:01:45.000 Right.
03:01:46.000 And – but – so I guess the point is the fact that humans may have been here 200,000 years ago doesn't affect the story of how the humans that we know – know of here arrived after the last ice age, for example.
03:02:04.000 So those two things could be true simultaneously.
03:02:07.000 Of course.
03:02:09.000 And it's just amazing how small-minded academics are.
03:02:14.000 Why aren't they curious?
03:02:16.000 Like that's what's crazy.
03:02:17.000 Like why would you stick to your guns so much?
03:02:19.000 Like the poor people that were trying to dispute Clovis first, like that one guy – I forget his name.
03:02:28.000 Jamie, see if you can find that guy's name.
03:02:29.000 He became, like, attacked, like, ruthlessly by other archaeologists.
03:02:35.000 They just did not want to believe that this guy was correct and that there was people that were here long before the Clovis people.
03:02:42.000 And so they were smearing him.
03:02:44.000 And then White Sands, New Mexico, they find these footprints that are 22,000 years old.
03:02:49.000 So they know for a fact there was people here pre-Clovis, which is just nuts.
03:02:55.000 Like what did you guys do?
03:02:56.000 Why did you attack him?
03:02:58.000 Well, and what are the rules of the goddamn game?
03:03:00.000 Right.
03:03:01.000 The problem is that these – Michael Waters and Thomas Stafford of Texas A&M University.
03:03:07.000 So those guys are ruthlessly attacked and by who?
03:03:15.000 By the people that are supposed to be in charge of disseminating correct information at the highest level, which is nuts.
03:03:21.000 The problem is that we have come to accept a proxy, which is the consensus of a field, for the real indicator of correctness, which is predictive power.
03:03:40.000 And, you know, humans are just not good at this because for one thing, humans do get involved in a competition for power.
03:03:49.000 And so people will shut down a correct idea because it's not theirs and it will elevate somebody they don't want elevated.
03:03:56.000 So as far as I'm concerned – It's gross.
03:03:59.000 It's – It's so gross.
03:04:00.000 It's so common.
03:04:03.000 It's destructive of something our civilization is entitled to.
03:04:07.000 We're entitled to the productivity of scientific work and instead what we get is catfighting and it prevents the high-quality stuff from – It's embarrassing too.
03:04:19.000 When you see professional intellectuals who are catfighting on Twitter, you're like, good lord.
03:04:26.000 Like, do you not understand what that exposes about your character?
03:04:29.000 Like, that's all I need to know about you.
03:04:30.000 Right.
03:04:31.000 Like, you're gross.
03:04:32.000 This is gross.
03:04:33.000 This is a gross way for a really smart person to behave.
03:04:36.000 This is a – these are the words of a gross human being.
03:04:39.000 Absolutely.
03:04:39.000 And that's more common than not.
03:04:42.000 And that's what's really nuts is that anybody that's challenging any of the current consensus, you immediately get labeled like the worst names in the book.
03:04:51.000 And it's just – you get connected to the worst ideas in society and like, holy shit, you guys are like little kids.
03:04:59.000 Well, I can't stand it when somebody – somebody will try to shut me down.
03:05:07.000 I will be saying something and they'll come back at me as if I'm morally broken for making an analytical argument with which they disagree.
03:05:17.000 And my feeling is, first of all, if you know me and you've seen me be right before, then the fact that you and I disagree should cause you to have this thought.
03:05:28.000 You should think, huh, that's interesting that he disagrees with me.
03:05:33.000 Maybe he's wrong for the reason I think he is.
03:05:35.000 Or maybe he's right and I need to know.
03:05:37.000 I'll be better off if I do.
03:05:39.000 But you shouldn't be trying to silence me.
03:05:42.000 You should be trying to figure out whether I know something you don't.
03:05:46.000 Right?
03:05:47.000 But so frequently that is not people's response.
03:05:47.000 Right.
03:05:50.000 It is you must stop saying that.
03:05:53.000 Why?
03:05:53.000 Right?
03:05:54.000 I can be wrong.
03:05:55.000 Right?
03:05:56.000 Being wrong is part of how you get to be right.
03:05:58.000 So this instinct to get people to – people with whom you analytically disagree to stop speaking is totally counterproductive for our collective goal, which is to be better, to know more, to accomplish more.
03:06:16.000 And the only reason why people shut you down is they don't have a strong enough argument.
03:06:16.000 Right.
03:06:16.000 Right.
03:06:20.000 If they had a really strong argument and your argument was nonsense, they would tell you what was going on.
03:06:23.000 But to shut you down, to stop you from talking all together, it's like you must comply.
03:06:30.000 And it becomes a power struggle.
03:06:31.000 And it becomes a power struggle by people who feel virtuous.
03:06:34.000 Like they feel like they're in the right so they get a chance to exert that power ruthlessly because they're correct and you have to stop Hitler.
03:06:41.000 Yep.
03:06:42.000 Absolutely.
03:06:43.000 Yeah.
03:06:44.000 Dude, we just did three hours like that.
03:06:45.000 We did.
03:06:47.000 Thank you.
03:06:48.000 Thank you for everything.
03:06:49.000 It's always great to talk to you.
03:06:50.000 It's always a lot of fun.
03:06:51.000 It's a great show.
03:06:52.000 Glad to be here and it's good to see you again.
03:06:54.000 Dark Horse Podcast.
03:06:55.000 Tell everybody where they can find that.
03:06:57.000 What's your website?
03:06:58.000 Is it darkhorsepodcast.com?
03:07:00.000 Um, I wish I knew the website.
03:07:02.000 You don't know your own website.
03:07:03.000 That's funny.
03:07:04.000 Dark Horse Podcast.
03:07:05.000 People will find it.
03:07:06.000 But it's also, um, it's on, you're on everything, right?
03:07:06.000 Yeah, they will.
03:07:10.000 Uh, yes, we've been re-monetized and they have taken the cap off our channel on YouTube.
03:07:15.000 Oh, my goodness.
03:07:16.000 We're on, uh, .org.
03:07:17.000 .org.
03:07:17.000 There you go.
03:07:18.000 Um, you guys got hit during the COVID days, right?
03:07:21.000 We were demonetized for four plus years.
03:07:25.000 And what's more, what they did not acknowledge, they acknowledged that they demonetized us, but they capped our channel so it stopped growing and as soon as they re-monetized us, it started growing again.
03:07:25.000 Great.
03:07:34.000 Oh, God.
03:07:36.000 How gross.
03:07:38.000 He who controls the algorithm controls the narrative.
03:07:42.000 That's the devil.
03:07:43.000 Yeah.
03:07:44.000 All right.
03:07:45.000 Well, I love you, buddy.
03:07:46.000 Thank you for being here.